Sunday, October 11, 2015

Accelerated Mobile Pages - AMP & Instant Articles

In order to accelerate the loading of mobile pages, Google is working on a methodology to make pages download faster on mobile, much like Facebook's Instant Articles. The project is expected to be open source and shareable.

This project by Google aims at speeding up downloads on all platforms. This is to be done by restricting JavaScript and limiting CSS; even the permitted HTML elements are restricted. The framework also reduces HTTP requests to the server and controls how images are downloaded.
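As a rough illustration, here is a minimal sketch of an AMP page. The URL and image names are placeholders, and the mandatory amp-boilerplate CSS (published by the AMP project) is abbreviated to a comment here.

<!doctype html>
<html amp>
<head>
  <meta charset="utf-8">
  <link rel="canonical" href="https://www.example.com/article.html">
  <meta name="viewport" content="width=device-width,minimum-scale=1">
  <style amp-boilerplate>/* the required boilerplate CSS from the AMP project goes here */</style>
  <script async src="https://cdn.ampproject.org/v0.js"></script>
</head>
<body>
  <h1>Article headline</h1>
  <!-- images use the amp-img element instead of img, so AMP can control loading -->
  <amp-img src="photo.jpg" width="600" height="400" layout="responsive"></amp-img>
</body>
</html>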

AMP also restricts how ads can be embedded into the client's website.

This project is participatory in nature and, being restrictive, may not receive the kind of applause expected. It may become a prerequisite for website owners to implement AMP; with the search engine you never know how compulsory it may become.

For more info, visit Ars Technica UK.

Similarly, Facebook's Instant Articles constitutes a powerful set of publishing tools. The aim is to offer a better reading experience, with auto-play videos, interactive maps and full-length articles.

Those willing to publish articles as partners of Instant Articles are promised compatibility with many tools and interfaces.

For more information, read Instant Articles on Wikipedia.

Sunday, October 4, 2015

Search and Ranking Updates

The Panda update rolled out by Google is still unfolding. This will take months and will affect many sites as it unfurls. The update is a quality filter programmed to demote sites which violate the guidelines of the search engine.

This augurs well for websites which were affected by earlier rollouts but have since rectified the problems. Hence websites that have removed spam elements will see favorable results as far as organic search is concerned.

In order to avoid being hit by such updates, one needs to follow the guidelines set by the major search engines. Elements like external and internal duplicates and shallow, irrelevant content invite a penalty. Excessive use of tags, improper internal linking and keyword stuffing are some of the factors that invite Panda.

Saturday, September 26, 2015

Gearing Up Your Website

When gearing up your website for search engines, take a serious peek at it first.

1) Does it have a look and feel that is soothing and justifies the topic?
Does it create trust in your visitor?

2) Is it bogged down with unimpressive, heavy graphics that take an eternity to load, so that visitors leave in frustration?

3) Is it bogged down with a confusing number of links placed haphazardly? In short, is the navigation user friendly?

4) Does the site talk only of sales and advertisements, with no quality content? Does it communicate properly with visitors or prospective buyers?

5) Is the content persuasive and convincing? Does the site carry related pages with on-topic information on the relevant technology and useful resources? Give people a reason to visit your website.

6) Does the site contain a long list of affiliate links? Is it bloated with affiliate banners that speak loudly of the affiliate programs?

7) Is your site secured by SSL? Does it contain relevant privacy policies and your contact information? Are there certifications or approvals on the site?

8) The affiliate program you join should be legal and hold a reputation on the Net, else you will have a tough time selling online. Is the advertiser trustworthy? Does the company have a reputation and brand image?

9) Does it carry testimonials with contact info?

Structured Microdata - Schema

Google may soon incorporate structured data, using the vocabulary provided by, in its ranking algorithms. By incorporating the markup one can enhance the search friendliness of one's site. The formats used are JSON-LD, which is the latest to be introduced and is more script friendly, along with RDFa and Microdata.
The schema creates a machine-readable vocabulary easily understood by the algorithms. The markup is embedded in HTML documents.

The attribute used to define an item is called itemscope, its properties are defined with itemprop, and types are declared with the itemtype attribute.

For Local Search 

itemscope is used to define the item, which in this case is a person:
<section itemscope itemtype="https://schema.org/Person">
Likewise, itemprop attributes are used to define the properties of that person, i.e. name, address and work profile. The address itself would fall under its own itemtype.
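To make this concrete, here is a small sketch of Person markup using Microdata; the name, job title and address values are placeholders.

<section itemscope itemtype="https://schema.org/Person">
  <span itemprop="name">Jane Doe</span>,
  <span itemprop="jobTitle">SEO Consultant</span>
  <!-- the address is itself an item of type PostalAddress -->
  <div itemprop="address" itemscope itemtype="https://schema.org/PostalAddress">
    <span itemprop="streetAddress">12 Example Street</span>,
    <span itemprop="addressLocality">Mumbai</span>
  </div>
</section>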

This vocabulary is also supported by other search engines.

The markup data is also used for businesses, events, etc.

Here are some examples:

Organization Schema Markup: be sure to add the logo, corporate address and social profile links (a JSON-LD sketch follows this list).

Website Schema Markup: this helps generate the sitelinks search box feature, provided the search utility exists on the website.

Breadcrumbs Schema Markup: helps generate breadcrumb rich snippets.

Site Navigation Markup: helps search engines understand your site structure and navigation.

Video Markup using Schema: for sites with embedded video, this displays rich video snippets.

Schema can also be used for local businesses, restaurants, ratings, offers, persons and more.
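As a sketch of the Organization markup mentioned above, in JSON-LD form; the company name, URLs and logo path are placeholders.

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Pvt Ltd",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/logo.png",
  "sameAs": [
    "https://www.facebook.com/example",
    "https://twitter.com/example"
  ]
}
</script>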

More information can be found at Search Engine Land.

Find more information at Google Developers.

Saturday, September 19, 2015

Search Engine Optimization: Site Stabbers?

Elements or Acts That Kill The Site Traffic

One of my clients' websites was ranking well on top search terms, with traffic touching half a million every month. Then, just as success knocked on his door, a downfall followed. He placed an ad above the fold, which means an advertisement above the main content.

In spite of repeated efforts to address the malady there was no compliance, and eventually the search engines discovered it and brought the site down. The same happened with another adamant client. Put just a couple of topic-related ads on your web pages if you plan to earn from them, and those well below the fold. Search engines can find out when a website is built for advertisement purposes with little or no useful information for visitors.

Content scraping and duplicate content have been discussed earlier.

Some More Errors      

It is very difficult to work with clients and designers who have little knowledge of SEO. Why is the H1 tag not used? Why is there no internal linking? "We would like to target our Mumbai-based site to America, so put USA in the title."

A site should be composed of natural elements and related content. Websites with shallow content rarely do well on competitive search terms. Bad architecture is another downfall: create user-friendly URLs with a well-defined hierarchy, and create silos if possible.
Restructuring a site invites changes in the URL structure of internal pages, and the search engines will not credit the new URLs with the standing of the older ones. Hence one remedy is placing 301 redirects and making sure that they remain in place. Changing from http to the more secure https also leads to changed URLs.
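For example, on an Apache server such redirects can be set in the .htaccess file. A minimal sketch, assuming mod_rewrite is enabled; the page names and domain are placeholders.

# permanently (301) redirect a single moved page
Redirect 301 /old-page.html https://www.example.com/new-page.html

# send all http traffic to https with a 301
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]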

Bugs can introduce a noindex/nofollow robots meta tag, such as <meta name="robots" content="noindex,nofollow">, into pages you wish the search engines to crawl. Hence keep a sharp eye on these activities; this can also happen unknowingly when minor tweaks or edits are being made by the web designer.

Keep analyzing the webmaster tools so as to discover any anomaly in the process and the website. Define the preferred URL and the target country. This tool will also point out duplicate titles and metas, which can harm your site traffic terribly if they concern the home page or the main pages of your website. Likewise keep a lookout for broken links, or stale ones carelessly left behind by the designer when no longer of use.

Accidental removal or breaking of the rel=canonical link element can also drastically bring down the traffic.
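The canonical tag sits in the head of the page and points at the preferred URL; a sketch with a placeholder address:

<link rel="canonical" href="https://www.example.com/preferred-page.html">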

Errors can also creep into XML and HTML sitemaps, hence keep updating and checking them. The same goes for the robots.txt file, which is used to keep low-quality pages, or pages with shallow content, out of the purview of search engines.
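A minimal robots.txt sketch; the disallowed directory and the sitemap URL are placeholders.

User-agent: *
Disallow: /thin-content/
Sitemap: https://www.example.com/sitemap.xml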

Technologies where a CMS (content management system) is in use can sometimes deliver the wrong content to search engines, so one should keep a check on what the spiders are being presented.

With complicated technology in use many things can go wrong, hence keep tabs. One should also look out for updates in the case of the automated or complex platforms on which websites are built.

It is obvious that after the recent changes to its algorithms and updates, Google looks keenly at on-page factors, hence more attention has to be paid to the site itself. In earlier times rabid link building and keyword spam were tolerated even on badly built sites. Hence follow the guidelines set by the search engines if you wish to attain more traffic.

Monday, September 7, 2015

Panda Updates: What Are They?

Panda updates are rolled out from time to time in order to prevent on-page spam. Google does not like the misuse of on-page elements that are embedded purely to increase rank on the SERPs. Hence the company introduced a filter that could discover spam elements on a page and deal with them accordingly.

Content farms were the most targeted by this filter. Blogs and websites using inadequate content in order to rank higher were pulled down. I have come across many blogs acting as content farms which were of very low quality, written by novice writers. The articles were published on blogs, scraper sites and websites in order to rank high and eventually benefit from the published ads.

Google also targets such sites by making changes to its algorithms. The search engine giant has been busy cleaning up the mess. Many sites that have escaped the filters will eventually come under the net in future rollouts.

Hence the moral of the story is that you should have unique, informative content if you wish your site to rank well and attain impressive traffic. Misuse of on-page elements like H1, H2 and H3 tags, repeated keywords, and irrelevant, shallow literature on a site will result in rankings going down.

Likewise link farms, i.e. portals having more links than is natural, are also a target. Spam link architecture designed to promote a few pages is also discouraged.

Though it is anybody's guess what content the search engine regards as unique, it is a simple approach that matters: no copied content, no internal duplicate content, no verbosity or garbage. Be sensible in order to avoid the online penalties which the search engines may impose.

Thursday, September 3, 2015

Common Search Engine Optimization Errors

As mentioned in the previous blog entry, duplicate content leads to a big downfall on the SERPs. A poor link architecture does as well; your links should establish a perfect hierarchy with prime placement. Duplicate page titles, titles not properly written, or titles that are too short will also affect your rankings.

Use the title tag to inform visitors about the page content and the location. For the location one should use schema markup. Do not stuff keywords artificially, and keep the length to a maximum of 75 characters including spaces. Your title should be effective in sending the right message and fetching an impression.

Meta descriptions, though not very active in ranking, are a must on all important pages. They should effectively add to the title's message and offer more details. Some add an address and phone number to the tag, as a portion is visible on the SERPs. The description should be about 150-160 characters including spaces. The message should be very clear regarding the offerings and hence compel the visitor to click.

Write the title and meta for all pages with the right meaning and an accurate description. This will also help when Google exhibits sitelinks for your website on the SERP. Add all meta elements as relevant; for example, use an index/follow or noindex/nofollow robots tag on all pages, as in the sketch below.
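A sketch of these head elements for a single page; the business name, description text and phone number are placeholders.

<head>
  <title>Widget Repair Services in Mumbai | Example Co</title>
  <meta name="description" content="Same-day widget repair across Mumbai with free pickup and a 90-day warranty. Call 022-0000-0000.">
  <meta name="robots" content="index, follow">
</head>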

A missing analytics code is another error: you will not be able to gauge the performance of your site on the SERPs and analyze it for future action. The Google Analytics code is the most popular; it should be added on all pages before the </body> tag. Keep checking for any errors that may creep in.
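For reference, the standard asynchronous Google Analytics (analytics.js) snippet looks like this; UA-XXXXX-Y is a placeholder for your own tracking ID.

<script>
  // loads analytics.js asynchronously and queues commands until it arrives
  (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
  (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
  m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
  })(window,document,'script','//www.google-analytics.com/analytics.js','ga');

  ga('create', 'UA-XXXXX-Y', 'auto');
  ga('send', 'pageview');
</script>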

Structured markup and rich snippets are also required, hence we will take up this aspect separately. Rich snippets carry ratings and reviews, which can be an effective conversion tool for impressions.

Do you forget to check for broken links and old, unused code? One should regularly check for these, since in the case of the former site navigation is affected, while the latter slows down page download.

Absence of sitemaps: both HTML and XML sitemaps are needed. The former is for visitors and the latter is for search engines, as in the sketch below.
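A minimal XML sitemap sketch following the sitemaps.org protocol; the URL and date are placeholders.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2015-09-03</lastmod>
  </url>
</urlset>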