Monday, September 7, 2015

Panda Updates: What Are They?

Panda updates are rolled out from time to time in order to curb on-page spam. Google does not like misuse of on-page elements embedded purely to increase rank on SERPs. Hence the company introduced a filter that can discover spam elements on a page and deal with them accordingly.

Content farms were the most targeted by this filter. Blogs and websites using inadequate content in order to rank higher were pulled down. I have come across many blogs acting as content farms, of very low quality and written by novice writers. The articles were published on blogs, scraper sites and websites in order to rank high and eventually benefit from the published ads.

Google also targets such sites by making changes to its algorithms. The search engine giant has been busy cleaning up the mess, and many sites that have escaped the filters so far will eventually come under the net in future roll-outs.

Hence the moral of the story is that you should have unique, informative content if you wish your site to rank well and attain impressive traffic. Misuse of on-page elements like H1, H2 and H3 tags, repeated keywords, and irrelevant or shallow writing will result in rankings going down.

Likewise link farms, that is, portals having more links than is natural, are also a target. Spam link architecture built to promote a few pages is discouraged as well.

Though it is anybody's guess what content the search engine regards as unique, it is a simple approach that matters: no copied content, no internal duplicate content, no verbosity or garbage. Be sensible in order to avoid the online penalties which search engines may impose.

Thursday, September 3, 2015

Common Search Engine Optimization Errors

As mentioned in the previous blog entry, duplicate content leads to a big fall on SERPs. A poor link architecture does as well: your links should establish a clear hierarchy with prime placement. Duplicate page titles, titles not properly written, or titles that are too short will also affect your rankings.

Use the title tag to inform visitors about the page content and the location; for the location one should use schema markup. Do not stuff keywords artificially, and keep the length to a maximum of 75 characters including spaces. Your title should be effective in sending the right message and fetching an impression.

Meta descriptions, though not very active in ranking, are a must on all important pages. They should effectively add to the title's message and offer more details. Some add an address and phone number to the tag, as a portion is visible on the SERPs. The description should be about 250 characters including spaces. The message should be very clear regarding the offerings and hence compel the visitor to click.

Write a title and meta description for all pages with the right meaning and an accurate description. This will also help when Google exhibits sitelinks of your website on the SERP. Add all meta elements as relevant; for example, use an index/follow or noindex/nofollow tag on all pages.
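As a sketch, a page head covering the points above might look like this (the business name, wording and phone number are invented for illustration):

```html
<head>
  <!-- Title: under ~75 characters, describes the content and the location -->
  <title>Heritage Hotel in Jabalpur | Example Stays</title>
  <!-- Meta description: expands on the title's message; a portion shows on the SERP -->
  <meta name="description" content="Example Stays offers budget and luxury rooms in central Jabalpur. Call +91-00000-00000 to book.">
  <!-- Robots directive: index this page and follow its links -->
  <meta name="robots" content="index, follow">
</head>
```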

Analytics code missing. Without it you will not be able to gauge the performance of your site on SERPs and analyze it for future action. The Google Analytics code is the most popular; it should be added to all pages before the </body> tag. Keep checking for any errors that may creep in.
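Placement-wise, the tracking snippet sits just before the closing body tag on every page; the comment below stands in for the actual snippet, which your analytics provider generates for your account:

```html
<body>
  <!-- ...page content... -->

  <!-- Paste the analytics snippet here, e.g. the Google Analytics
       tracking code generated for your property -->
</body>
```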

Structured markup, or rich snippets, is required; we will take up this aspect separately. Rich snippets carry ratings and reviews, which can be effective as a conversion tool for impressions.
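As a small preview of that separate discussion, a ratings rich snippet can be expressed with schema.org markup; this is a minimal JSON-LD sketch with a made-up product name and rating values:

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "27"
  }
}
</script>
```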

Do you forget to check for broken links and old unused code? One should regularly check for these, since in the case of the former site navigation is affected, while the latter will slow down page download.

Absence of sitemaps. Both HTML and XML sitemaps are needed: the former is for visitors and the latter is for search engines.
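For reference, a minimal XML sitemap follows the sitemaps.org protocol; the URL and date here are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/</loc>
    <lastmod>2015-09-03</lastmod>
  </url>
</urlset>
```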

About Website Contents: How to Write?

I have noticed a number of times websites with weak content doing fairly well on the Internet. Well, this will not work for long. Website content should be well researched in order to present unique information. In the case of cheaply built websites the designers simply grab content from other sites and publish it. This is disastrous for those seeking an online presence and good traffic.

The content should be well written and easy to understand. This means structured sentences in logical order, and no grammatical or spelling mistakes. Avoid any sort of duplicate sentences and information; do not repeat information again and again, once is enough.

In order to write website content one needs to start from the whiteboard stage. Once the topic is finalized, draw up the concept. What is it all about? How should it be presented?

The next step is content planning, wherein you write down the topic of all the pages that have to be created. Please make sure that no extra content which is weakly related or unrelated to the topic is included. Hence in a website about Jabalpur city do not add information pages about Bhopal, Indore, etc. This will hardly benefit unless the information is desired, and the search engines may see it as spam.

Once the plan is complete and approved by the owner you can begin to write. Using choice vocabulary for LSI optimization, write down the content in short paragraphs, without being verbose. The word limit in the case of most websites is about five hundred per page.

Visitors read selectively, hence give appropriate headings using tags (h2, h3, etc.). This will attract visitors, since they can then limit themselves to just what they wish to read.
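Keeping with the Jabalpur example above, a simple heading hierarchy might look like this (the headings and text are, of course, invented):

```html
<h1>Jabalpur City Guide</h1>

<h2>Places to Visit</h2>
<p>A short, focused paragraph on the attractions...</p>

<h2>Hotels and Stays</h2>
<h3>Budget Hotels</h3>
<p>Another short paragraph, so readers can skip straight to it...</p>
```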

In the case of e-commerce websites internal duplicate content is a serious issue; though tolerated by search engines, it does weaken your website's ranking. Hence take time and write product descriptions in a unique manner, well, as much as you can, since such sites may have many variants of very similar products.

Infuse search terms naturally but limit them to under two percent, along with many related search terms. In the case of large portals with pages having similar content, one should use the noindex meta tag, <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">, and use rel=canonical, <link rel="canonical" href="http://example.com/xyz">, in order to point to the preferred pages.

Once the content is complete, please add a relevant title to all the pages, and a meta description along with keywords to the important pages.

Link the pages internally only if the panels do not already contain those site links. Top panel links are the most valued, hence link important pages there. In the footer one can add functional pages like contact, booking, etc.

Publishing pages on HubPages, top blogs and article sites will help you in writing website content for your clients. There are many publishing sites with editorial scrutiny which can help you learn to write well.

Wednesday, September 2, 2015

Internet Tools You Should Know About

Well, the Internet is here to stay, and for good. The phenomenal reach of the Net and its amazing speed have brought about a paradigm shift in communication. It has changed the world and will keep doing so progressively.

The Internet is increasingly recognized as a marketing tool and has created a separate sphere of doing business, far different from land-based activities. Online businesses do not operate like land-based ones and are able to complete deals and contracts without much physical contact. Advertisement and promotion are being executed in a big way and are running parallel to offline ads and promotions.
 
There are websites offering business opportunities, seeking services and giving expert advice, to name a few. Many of these sites also offer a secure apparatus for receiving and sending payments.

Sites like Flipkart, Snapdeal and Amazon, to name a few, are excellent for purchasing products free of monopolies and overburdened pricing. These commercial bazaars offer wide choice and better pricing.

I have been providing website promotion services without even meeting my clients in most cases. Anyway, not everyone is here to do business. Entertainment, personal communication, grouping and cause promotion, all these activities take the cake. But then there is a thin line dividing the activities in cyberspace.

The simple tools have to be learned. For laymen, an understanding of Internet marketing websites like Facebook, Twitter, Myspace or StumbleUpon is essential. Many users are able to use these sites without much ado, but for many others a bit of understanding will take them further.

Blogging is another activity that has tremendous scope for those who wish to publish, promote or preserve their work on the Net. Similarly, Flickr and Picasa are platforms where you can preserve your precious images for eternity, well, as long as they exist, since many platforms fold on the Net.

Forums are essentially made for discovering the secrets of topics from the experiences of other people. Online communities help people discover their niche, promote views, share experiences and so on.

Learning basic optimization techniques would help many in their endeavors online. Website designers and owners benefit from understanding on-page elements. Small and medium business owners benefit the most by learning how to use the Net as a promotion tool.

I am advocating that laymen, businessmen, writers, scientists, hobbyists, just anybody who uses cyberspace, learn these online phenomena.

Those serious about an online presence, or interested in the promotion of websites, products, advertisements and propaganda, should master this field avidly.

Internet or digital marketing can be self-learned using online tutorials. One can also join Internet marketing classes for a better understanding of the platforms and techniques. Well, it is up to the individual how he or she wishes to proceed in order to learn the tools useful to them.

Friday, August 14, 2015

Making a Mobile Friendly Website

There is a tremendous increase in mobile phone usage all over the world. With this, along with technology that allows website visibility on the phone, more and more users are searching online. The design and development of websites differs where screening on mobiles is concerned.


Mobile-friendly features can be incorporated in a manner such that the friendliness relates to screening on our desktops as well as on our handsets.


It is a well-known fact that search engine algorithms have difficulty assessing JavaScript and Flash pages. The latter also drag websites down and slow their loading speed. Hence these elements have to be avoided or restricted.


Fetch as Google is an ideal tool in Webmaster Tools to ascertain that the site is being indexed properly. Another useful element is the robots.txt file, which enables search engines to accept or neglect pages for indexing. In case a robots.txt file is not incorporated, that means no pages are restricted.
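A small robots.txt sketch follows, with a hypothetical restricted folder; everything not disallowed remains open to crawlers, and the sitemap line points them to the pages you do want indexed:

```
User-agent: *
Disallow: /admin/

Sitemap: http://example.com/sitemap.xml
```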


An XML sitemap is another useful website element that helps SE robots find pages to index.


Some other points of interest are:


Links are too close together


Fonts are not of the proper size (too small or too large)


Viewport is not configured. 


Configuring the viewport meta tag:


<meta name="viewport" content="width=device-width, initial-scale=1">



The viewport also governs the relationship between device (hardware) pixels and CSS pixels.



The content of your pages should fit within the size specified in the viewport.  




Sunday, February 9, 2014

Using The Disavow Tool

The link building phenomenon shot sky high in the initial stages of search engine proliferation on the Internet. Sergey Mikhaylovich Brin and Larry Page created the algorithmic wonder that is Google, an information retrieval system that simply fetches what you are seeking online.

Since the phenomenal growth of search engine usage there has been a constant effort to improve the information retrieval system. This has resulted in a lot of updates, additions and changes in the evolving algorithms that Google uses for indexing and ranking. This very evolution has unnerved a lot of webmasters and SEO experts, who are actually the people behind the scenes with an important role to play in the future of search practices on the Internet. The frequent updates have baffled many webmasters and critically challenged their optimization skills time and time again. With every change the search engines force compliance with their guidelines, some of which are nit-picking.

The evolution of the search engine began from Brin's data mining, which eventually led to Google. The earlier algorithm relied heavily on keywords on the page as well as external links pointing to the site. Both factors created a revolution and are still relevant, albeit diluted. The search engine looks for far more signals now when indexing and ranking websites and blogs in response to the user's query or search term. The complexity of the metrics of this highly evolved system has in recent times become difficult for many webmasters to overcome.

The heavily relied-upon link building campaigns have now become more difficult, with links being classified by search engines as bad, weak, good and strong. A greater understanding of the web has been called into the picture. The search engines since their inception have created an industry around them, with link building being the most popular part. It takes little to realize that things will not always remain the same, hence the industry has to follow the changing dynamics. Webmasters with a shallow knowledge base will go down the history lane.

Inevitably, search engine optimization will always rely on external link building, unless the search engines' systems take over the ranking game totally and make organic search sterile. The smart operators realize very well the role webmasters play in the popularity of their search engines. Those who do not will linger behind forever and keep their search offerings to themselves, however good.

Unlike earlier times, link evaluation has become more critical, and a bad link can pull your site down on the SERP for a primary term. This means developing the skills to evaluate the link building exercise as well as the platform. The numbers game still has some importance left, but an equitable field has been created for those who cannot indulge in massive link building campaigns like the big firms.

It is not only the link wheel or the keyword profile that counts; the source counts as well. Association with relevant, unique content and editorial scrutiny is a must for a link to gain juice, as it is referred to in SEO parlance.

Sensible, quality link building is the need of the hour. This does not mean all papers should have scholarly impact. People are searching the Net for basic information relevant on a day-to-day basis to quench their information need. A scholarly paper has less importance for the average searcher, who is on the Net more for curiosity's sake, "Net gossip" as I call it.

Hence large link building campaigns have resulted in trash which, after the release of Panda and Penguin, has become internecine. The bad link profile, earlier innocuous, has now become something short of an autoimmune disease. Therefore it has become necessary to get rid of as many bad links as possible. In order to assist webmasters, Google has introduced the Disavow Tool.

Using the Disavow Tool, webmasters or website owners can neutralize unwanted backlinks that are pulling their site down on SERPs. For this purpose the links that have to be disavowed are uploaded as a text file to the tool online. Removing bad links will also mean the removal of oft-repeated anchor texts, which invite a Google penalty as well.
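The text file takes one entry per line: whole domains are prefixed with domain:, individual URLs are listed in full, and lines starting with # are comments. The domains below are invented for illustration:

```
# Contacted the webmaster, links still live
domain:spammy-directory.example

# Single bad page rather than the whole site
http://link-farm.example/low-quality-page.html
```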

As Matt Cutts, the Google spokesperson, says, disavowal has to be done cautiously, after efforts to remove the links manually have failed. Judging a link will be the next part of the story... bye for now...

Search Engine Optimization: A Shifting Paradigm

Many of the sites with aggressive promotion or link building have taken a heavy beating on the SERPs. This has happened after the roll-out of the Panda filter and the Penguin algorithmic changes. You may be surprised by the delay in my entries, since the last one was in May 2012.

Well, some of my sites experienced negative changes, while a couple with duplicate content unknown to me suffered very badly. It is obvious that the sites that were hit had been aggressively promoted beyond my own activities.

The Aftermath

Clients 

Rule One: The first rule of SEO for clients: get one who does not have any idea of SEO... but service them sincerely.

Rule Two: Interfering clients and those with half-baked knowledge of SEO do not last long. They keep moving from one aggressive link builder to another till the domain is exhausted and put out by the search engines forever.

Link Building

Rule Three: Rely as little as possible on paid links. Go for topically relevant paid links only if strong referral generation is achieved.

Rule Four: Relevancy cannot be sacrificed. Search engines are influenced by many other markups and feeds. Hence be careful what you put on Google Places, Plus, classifieds, etc.

Rule Five: Try to get natural back links with unique informative content on board. This applies to internal or resource pages as well.

Rule Six: Get backlinks from sources with acclaimed editorial scrutiny. This means writing expert content, staying away from quickfire postings on blogs and forums, and resorting to minimal submissions elsewhere.

Rule Seven: The number of backlinks counts, but only those with approval ratings as mentioned above. Thresholds have to be kept in mind. Example: one good article = ten bad articles.

Rule Eight: Keep searching for good emerging platforms. 

Rule Nine: Involve Social Media. 

Rule Ten: A good backlink platform has a good PageRank, is relevant to the topic of your website, and has few ads, properly placed; it has a small number of links per page, all relevant to the category, and its submissions/publications go through editorial scrutiny. It is certainly not made for unwarranted revenue generation. Use your experience to make out more about a good platform, and kindly inform me.

Rule Eleven: Aim at long-tail/secondary search terms for reaching the primary goal.

On Page Optimization 


Rule Twelve: Begin with unique site content. Sites with blemished content never reach anywhere on SERPs.

Rule Thirteen: Accurate content creation is a must for a successful website. Content creation is important for achieving results on targeted terms.

Rule Fourteen: User-friendly architecture. All important information should be easily found from anywhere on the site.

Rule Fifteen: Use the robots.txt file properly, or add directives in the meta tags for search engines to follow and index a page or avoid it.

Rule Sixteen: Page download speed: does the site load happily for you? Well, that is the landmark.

Rule Seventeen: For speed: reduce HTTP requests, define image dimensions, optimize images, use server-side includes, and keep the page length optimum. In all these matters consult an experienced web designer.

Rule Eighteen: A unique title and meta description for every page; this is a must.

Rule Nineteen: Try meta data on the home page that describes the ownership, services, etc.

Rule Twenty: Optimize alt text.

Rule Twenty One: Place your internal links right; the more useful ones go on top.

Rule Twenty Two: Place ads according to SE guidelines.

Rule Twenty Three: Properly format text and make it easily readable.

Rule Twenty Four: Minimize descriptive text. Do not use H1 tags for promotional purposes; the H1 should describe the page as well. Avoid excessive bold, underline and similar tags.

Rule Twenty Five: Keep keyword density at 3 to 4 percent. Use keywords in the content naturally, or else avoid them. Write to describe the offerings and not for search engines. Your content should make the targeting of the chosen search terms possible.

Rule Twenty Six: Work on LSI.

Rule Twenty Seven: Resort to Internal Linking if required.

There is more to come in time to come... keep hanging on.