Saturday, September 19, 2015

Search Engine Optimization: Site Stabbers?

Elements or Acts That Kill Site Traffic

One of my client's websites was ranking well on top search terms, with traffic touching half a million visits every month. Then, just as success knocked on his door, the downfall followed: he placed an ad above the fold, which means an advertisement above the main content.

In spite of repeated efforts to address the malady, he would not comply, and eventually the search engines discovered it and brought the site down. The same thing happened with another adamant client. If you plan to earn from ads, put just a couple of topic-related ones on your web pages, and place them well below the fold. Search engines can find out when a website is built purely for advertisement, with little or no useful information for visitors.

Content scraping and duplicate content have been discussed earlier.

Some More Errors      

It is very difficult to work with clients and designers who have little knowledge of SEO. Why is the H1 tag not used? Why is there no internal linking? "We would like to target our Mumbai-based site to America, so put USA in the title."
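
Those basics take only a line or two of markup. A minimal sketch (the business, page names and wording are illustrative, not from any real client site):

    <!-- A locally targeted title, not "USA" on a Mumbai-based site -->
    <title>Widget Repairs in Mumbai | Example Co.</title>
    <!-- One H1 per page, describing its main topic -->
    <h1>Widget Repairs in Mumbai</h1>
    <!-- An internal link passing relevance to a related page -->
    <p>See our <a href="/widgets/pricing/">widget pricing</a> page.</p>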

A site should be composed of natural elements and related content. Websites with shallow content rarely do well on competitive search terms. Bad architecture is another downfall: create user-friendly URLs with a well-defined hierarchy, and create silos if possible.
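
To make the idea of a hierarchy and silos concrete, here is an illustrative URL structure (example.com and the categories are hypothetical):

    https://www.example.com/shoes/                     <- silo landing page
    https://www.example.com/shoes/running/             <- sub-category
    https://www.example.com/shoes/running/blue-racer/  <- product page
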
Restructuring a site changes the URL structure of internal pages, and the search engines will not credit the new URLs with the authority of the older ones. Hence one remedy is placing 301 redirects and making sure that they remain in place. Changing from http: to the more secure https: also changes every URL, so it needs the same treatment.
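
As a rough sketch, assuming an Apache server with mod_rewrite enabled (the paths are illustrative), the redirects could look like this in an .htaccess file:

    # Old page permanently moved to its new location
    Redirect 301 /old-page.html https://www.example.com/new-page/
    # Send every http: request to its https: equivalent
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]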

Bugs can introduce nofollow or noindex values into the robots meta tag of pages you wish the search engines to crawl, and this can also happen unknowingly when the web designer makes minor edits. Hence keep a sharp eye on these tags.
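
The tag in question sits in the head of the page, and the difference between an indexed page and an invisible one is a single word:

    <!-- Tells crawlers to skip this page and its links entirely;
         it must never creep onto pages you want in the index -->
    <meta name="robots" content="noindex, nofollow">
    <!-- An indexable page should carry this, or no robots tag at all -->
    <meta name="robots" content="index, follow">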

Keep analyzing the webmaster tools to discover any anomaly in the website. Define the preferred URL and the target country there. The tool will also point out duplicate titles and meta descriptions, which can harm your site traffic terribly if they concern the home page or the main pages of your website. Likewise, keep a lookout for broken links, or stale ones carelessly left behind by the designer once they were of no use.
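
Besides the webmaster tools report, a quick manual spot check is possible (assuming curl is installed; the URL is illustrative). The -I flag fetches only the headers, so you can see at once whether a link still returns 200 OK or has decayed into a 404:

    curl -I https://www.example.com/some-old-page/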

Accidental removal or breaking of the rel=canonical tag can also drastically bring down traffic.
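
For reference, the tag lives in the head of the page and points to the master copy of the URL (example.com and the path are illustrative):

    <!-- Declares which URL is the original, so duplicate or parameterised
         versions of the page do not compete with it in the index -->
    <link rel="canonical" href="https://www.example.com/products/widgets/">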

Errors can also creep into XML and HTML sitemaps, so keep updating and checking them. The same goes for the robots.txt file, which should be used purely to keep low-quality or shallow pages out of the purview of search engines.
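
A minimal robots.txt along those lines might read as follows (the blocked paths are illustrative):

    # Applies to all crawlers
    User-agent: *
    # Keep thin, low-quality sections from being crawled
    Disallow: /search/
    Disallow: /tag/
    # Point crawlers at the XML sitemap
    Sitemap: https://www.example.com/sitemap.xml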

Sites built on a CMS (content management system) can sometimes deliver the wrong content to search engines, so keep a check on what the spiders are being presented with.
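
One simple way to check (assuming curl and diff are available; the user-agent string and URL are illustrative) is to fetch a page once the way a crawler identifies itself and once as an ordinary browser, then compare the two:

    # Fetch the page as Googlebot announces itself
    curl -A "Googlebot/2.1 (+http://www.google.com/bot.html)" -o crawler.html https://www.example.com/
    # Fetch the same page as an ordinary browser
    curl -A "Mozilla/5.0" -o browser.html https://www.example.com/
    # Any difference is content the spiders see but visitors do not, or vice versa
    diff crawler.html browser.html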

With complicated technology in use, many things can go wrong, so keep tabs. One should also look out for updates to the automated or complex platforms on which websites are built.

It is obvious that after its recent algorithm changes and updates, Google gives equal weight to on-page factors, so more attention has to be paid to the site itself. In earlier times, rabid link building and keyword spam were tolerated even on badly built sites; no more. Hence follow the guidelines set by the search engines if you wish to attain more traffic.