Sunday, October 4, 2015

Search and Ranking Updates

The Panda update rolled out by Google is still unfolding. The rollout will take months and will affect many sites as it unfurls. The update is a spam filter programmed to demote sites that violate the search engine's guidelines.

This augurs well for websites that were affected by earlier rollouts but have since rectified the problems. Hence websites that have removed spam elements will see favorable results as far as organic search is concerned.

In order to avoid being hit by such updates, one needs to follow the guidelines set up by the major search engines. Elements like external and internal duplicates and shallow, irrelevant content invite a penalty. Excessive use of tags, improper internal linking, and keyword stuffing are some of the other factors that invite Panda.

Saturday, September 26, 2015

Gearing Up Your Website

When gearing up your website for search engines, take a serious look at it first.

1) Does it have a look and feel that is soothing and justifies the topic? Does it create trust in your visitors?

2) Is it bogged down with unimpressive, heavy graphics that take an eternity to load, so that visitors leave in frustration?

3) Is it bogged down with a confusing number of links placed haphazardly? In short, is the navigation user friendly?

4) Does the site talk only of sales and advertisements, with no quality content? Does it communicate properly with visitors or prospective buyers?

5) Is the content persuasive and convincing? Does the site carry related pages with on-topic information about relevant technology and useful resources? Give people a reason to visit your website.

6) Does the site contain a long list of affiliate links? Is it bloated with affiliate banners that speak loudly of the affiliate programs?

7) Is your site secured by SSL? Does it contain relevant privacy policies and your contact information? Are there certifications or approvals on the site?

8) The affiliate program you join should be legal and hold a reputation on the Net; otherwise you will have a tough time selling online. Is the advertiser trustworthy? Does the company have a reputation and a brand image?

9) Does the site carry testimonials with contact information?

Structured Microdata - Schema

Google may soon incorporate structured data using the vocabulary provided by Schema.org into its ranking algorithms. By incorporating the markup one can enhance the search friendliness of one's site. The formats used are JSON-LD, the latest to be introduced and the most script friendly, along with RDFa and Microdata.
The schema creates a machine-readable language easily understood by the algorithms. The markup is embedded in HTML documents.

The attribute used to define an item is called itemscope, while its properties are defined with itemprop. Types are defined with itemtype.

For Local Search 

itemscope is used to declare the item, which in this case is a person:
<section itemscope itemtype="http://schema.org/Person">
Likewise, itemprop attributes are used to define the properties of that person, i.e. name, address, and work profile. The address would itself fall under its own itemtype.
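A minimal sketch of such Person markup, with hypothetical example values for the properties, might look like this:

```html
<!-- Microdata sketch for a person; all values are hypothetical examples -->
<section itemscope itemtype="http://schema.org/Person">
  <span itemprop="name">Jane Doe</span>
  <span itemprop="jobTitle">SEO Consultant</span>
  <!-- The address is itself an item with its own itemtype -->
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="addressLocality">Jabalpur</span>,
    <span itemprop="addressCountry">India</span>
  </div>
</section>
```

Here itemscope opens the item, itemtype names its type, and each itemprop attaches a property; the nested address shows how one item can sit inside another.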

This vocabulary is also supported by other search engines.

The markup data is also used for businesses, events, etc.

Here are some examples 

Organization Schema Markup: be sure to add the logo, corporate address, and social profile links.

Website Schema Markup: this helps generate the sitelinks search box feature, provided the search utility exists on the website.

Breadcrumbs Schema Markup: helps generate breadcrumb rich snippets.

Site Navigation Markup: helps search engines understand your site structure and navigation.

Video Markup using Schema: for sites with embedded video, this displays rich video snippets.

Schema can also be used for local businesses, restaurants, ratings, offers, persons, and more.
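As an illustration, Organization markup in the newer JSON-LD format could be sketched as follows; the company name, URLs, and profile links are hypothetical placeholders:

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "http://www.example.com",
  "logo": "http://www.example.com/logo.png",
  "sameAs": [
    "http://www.facebook.com/examplecompany",
    "http://twitter.com/examplecompany"
  ]
}
</script>
```

The sameAs links point to the company's social profiles; the block sits in the page's HTML and does not affect what visitors see.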

More information can be found at Search Engine Land.

Find more information at Google Developers.

Saturday, September 19, 2015

Search Engine Optimization: Site Stabbers?

Elements or Acts That Kill The Site Traffic

One of my clients' websites was ranking well on top search terms, with traffic touching half a million every month. Then, just as success knocked on his door, a downfall followed. He placed an ad above the fold, which means an advertisement above the main content.

In spite of repeated efforts to address the malady there was no compliance, and eventually the search engines discovered it and brought the site down. The same thing happened with another adamant client. Put just a couple of topic-related ads on your web pages if you plan to earn from them, and those well below the fold. Search engines can find out that a website is built for advertising purposes with little or no useful information for the visitors.

Content scraping and duplicated content have been discussed earlier.

Some More Errors      

It is very difficult to work with clients and designers who have little knowledge of SEO. Why is the H1 tag not used? Why is there no internal linking? "We would like to target our Mumbai-based site to America, so put USA in the title."

A site should be composed of natural elements and related content. Websites with shallow content rarely do well on competitive search terms. Bad architecture is another cause of downfall. Create user-friendly URLs with a well-defined hierarchy, and create silos if possible.
Restructuring a site invites changes in the URL structure of internal pages, and the search engines will not credit the new URLs with the standing of the older ones. Hence one remedy is placing 301 redirects and making sure that they remain in place. Changing from http to the more secure https also leads to changed URLs.
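Assuming an Apache server, such redirects can be sketched in an .htaccess file; the paths below are hypothetical:

```apache
# Permanently redirect a restructured page to its new URL (hypothetical paths)
Redirect 301 /old-page.html /new-section/new-page.html

# Redirect the whole site from http to https (requires mod_rewrite)
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

The 301 status tells search engines the move is permanent, so the old URLs' standing passes to the new ones.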

Bugs can introduce a noindex/nofollow robots meta tag into pages you wish the search engines to crawl, hence keep a sharp eye on these activities. This can also happen unknowingly when minor tweaks or edits are being made by the web designer.

Keep analyzing the webmaster tools so as to discover any anomaly in the process and on the website. Define the preferred URL and the target country. This tool will also point out duplicate titles and metas, which can harm your site traffic terribly if they concern the home page or the main pages of your website. Likewise, keep a lookout for broken links, or stagnant ones callously left behind by the designer when no longer of use.

Accidental removal or breaking of rel=canonical can also drastically bring down traffic.
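For reference, the canonical tag sits in the head of a duplicate or variant page and points to the preferred URL; the address below is a hypothetical example:

```html
<!-- In the <head> of the duplicate page; the URL is hypothetical -->
<link rel="canonical" href="http://www.example.com/preferred-page.html">
```

If this tag is removed or its href is broken, the duplicate pages start competing with the preferred one.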

Errors can also creep into the XML and HTML sitemaps, hence keep updating and checking them. The same goes for the robots.txt file, which is used purely to keep low-quality pages, or pages with shallow content, out of the purview of the search engines.
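A minimal robots.txt sketch, with hypothetical section names, might look like this:

```text
# Hypothetical example: keep thin-content sections away from all crawlers
User-agent: *
Disallow: /tag/
Disallow: /search/

# Point crawlers to the XML sitemap
Sitemap: http://www.example.com/sitemap.xml
```

A stray "Disallow: /" line here would block the entire site, which is exactly the kind of error worth checking for.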

Platforms where a CMS, or content management system, is in use can sometimes deliver the wrong content to search engines, so one should keep a check on what the spiders are being presented.

With complicated technology in use, many things can go wrong, hence keep tabs on it. One should also look out for updates in the case of the automated or complex platforms on which websites are built.

It is obvious that after the recent changes to its algorithms and updates, Google looks keenly at on-page factors, hence more attention has to be paid to the site itself. In earlier times, rabid link building and keyword spam were tolerated even in the case of badly built sites. Hence follow the guidelines set by the search engines if you wish to attain more traffic.

Monday, September 7, 2015

Panda Updates: What Are They?

Panda updates are rolled out from time to time in order to prevent on-page spam. Google does not like the misuse of on-page elements embedded purely to increase rank on the SERPs. Hence the company introduced a filter that can discover spam elements on a page and deal with them accordingly.

Content farms were the most targeted by this filter. Blogs and websites using inadequate content in order to rank higher were pulled down. I have come across many blogs acting as content farms which were of very low quality, written by novice writers. The articles were published on blogs, scraper sites, and websites in order to rank high and eventually benefit from the published ads.

Google also targets such sites by making changes to its algorithms. The search engine giant has been busy cleaning up the mess. Many sites that have escaped the filters will eventually come under the net in future rollouts.

Hence the moral of the story is that one should have unique, informative content if one wishes one's site to rank well and attain impressive traffic. Misuse of on-page elements like the H1, H2, and H3 tags, repeated keywords, and irrelevant, shallow copy will result in rankings going down.

Likewise, link farms, that is, portals having more links than is natural, are also a target. Spam link architecture built to promote a few pages is also discouraged.

It is anybody's guess what content the search engine regards as unique, but it is a simple approach that matters: no copied content, no internal duplicate content, no verbosity or garbage. Be sensible in order to avoid the online penalties which the search engines may impose.

Thursday, September 3, 2015

Common Search Engine Optimization Errors

As mentioned in the previous blog entry, duplicate content leads to a big downfall on the SERPs. A poor link architecture does as well: your links should establish a clear hierarchy with prime placement. Duplicate page titles, or titles not properly written or too short, will also affect your rankings.

Use the title tag to inform visitors about the page content and the location. For the location one should use schema markup. Do not stuff keywords artificially, and keep the length to a maximum of 75 characters including spaces. Your title should be effective in sending the right message and fetching an impression.

Meta descriptions, though not very active in ranking, are a must on all important pages. They should effectively add to the title's message and offer more details. Some add an address and phone number to the tag, as a portion is visible on the SERPs. The description should be about 250 characters including spaces. The message should be very clear regarding the offerings and hence compel the visitor to click.

Write a title and meta for all pages with the right meaning and an accurate description. This will also help when Google exhibits sitelinks for your website on the SERP. Add all meta elements as relevant; for example, use an index/follow or noindex/nofollow tag on all pages.
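Putting the above together, the head of a page might carry something like the following; all names and values are hypothetical:

```html
<head>
  <!-- Title under 75 characters, naming the offering and the location -->
  <title>Hotel Booking in Jabalpur | Example Travels</title>
  <!-- Meta description expanding on the title; a portion shows on the SERP -->
  <meta name="description" content="Book budget and luxury hotels in Jabalpur with Example Travels. Rooms from all leading hotels with instant confirmation and local support.">
  <!-- Explicit robots directive for pages you want crawled -->
  <meta name="robots" content="index, follow">
</head>
```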

Analytics code missing: you will not be able to gauge the performance of your site on the SERPs and analyze it for future action. The Google Analytics code is the most popular; it should be added to all pages before the </body> tag. Keep checking for any errors that may creep in.
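For reference, the standard analytics.js snippet of this period goes just before the closing </body> tag; the UA-XXXXXXXX-1 property ID below is a placeholder to be replaced with your own:

```html
<script>
  // Standard Google Analytics (analytics.js) loader snippet
  (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
  (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
  m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
  })(window,document,'script','//www.google-analytics.com/analytics.js','ga');

  ga('create', 'UA-XXXXXXXX-1', 'auto');  // placeholder property ID
  ga('send', 'pageview');
</script>
```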

Structured markup, or rich snippets, are required, hence we will take up this aspect separately. Rich snippets carry ratings and reviews, which can be effective as a conversion tool for impressions.

Do you forget to check for broken links and old unused code? One should check for these regularly, since the former affects site navigation while the latter slows down the page download.

Absence of sitemaps: both HTML and XML sitemaps are needed. The former is for visitors and the latter is for search engines.

About Website Contents: How to Write?

I have noticed, a number of times, websites with weak content doing fairly well on the Internet. Well, this will not work for long. Website content should be well researched in order to present unique information. In the case of cheaply built websites, the designers simply grab content from other sites and publish it. This is disastrous for those seeking an online presence and good traffic.

The content should be well written and easy to understand. This means structured sentences in logical order, and no grammatical or spelling mistakes. Avoid any sort of duplicate sentences and information; that is, do not repeat information again and again, since once is enough.

In order to write website content one needs to start from the whiteboard stage. Once the topic is finalized, draw up the concept. What is it all about? How should it be presented?

The next step is content planning, wherein you write down the topics of all the pages that have to be created. Please make sure that no extra content weakly related or unrelated to the topic is included. Hence, in a website about Jabalpur city, do not add information pages about Bhopal, Indore, etc. This will hardly benefit anyone unless the information is desired, and the search engines may see it as spam.

Once the content plan is complete and approved by the owner, you can begin to write. Using choice vocabulary for LSI optimization, write the content in short paragraphs, without being verbose. The word limit in the case of most websites is about five hundred per page.

Visitors read selectively, hence give appropriate headings using tags (h2, h3, etc.). This will attract visitors, since they can then limit themselves to just what they wish to read.

In the case of e-commerce websites, internal duplicate content is a serious issue; though tolerated by search engines, it does weaken your website's ranking. Hence take the time to write product descriptions in a unique manner, or at least as far as you can, since such sites may have many variants of very similar products.

Infuse search terms naturally, but limit them to under two percent, along with many related search terms. In the case of large portals with pages having similar content, one should use the robots noindex meta tag: <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">. Use rel=canonical in order to direct to the preferred pages: <link rel="canonical" href="">.

When the content is complete, please add a relevant title to all the pages and a meta description to the important pages, along with keywords.

Link the pages internally only if the panels do not already contain site links. Top panel links are the most valued, hence link important pages there. In the footer one can add functional pages like contact, booking, etc.

Publishing pages on HubPages, top blogs, and article sites will help you with website content writing for your clients. There are many publishing sites with editorial scrutiny which can help you learn to write well.