Friday, February 5, 2016

Writing Content For Users

There has been excessive concern regarding search engines and how they index and rank content. In earlier times this worked, since the algorithms were in an early stage of evolution, but with constant additions and changes this is no longer the case.

In contemporary times it has become mandatory to add meaningful content relevant to the topic, or to the product in the case of e-commerce sites. It is not necessary to add content just for the sake of it; what is desirable is meaningful content that is readable and useful. The placement of content is of importance as well. Too much content above the fold will hide or push down visual elements, which is not good for attracting users. It is better to add a paragraph followed by an image or a video.

In the case of product descriptions the read-more element is very useful (a small sketch is given after this paragraph). Since adding too many lines will push the product image away and distract the visitor, it is better to add just two lines followed by a read-more link for those interested in reading the description further. The format of a product/item description may include reviews, ratings, specifications, an about section and more. More angles of the image are usually added, so fit in that aspect as well. Creating product pages is a challenging job; done hastily, it will put the user off in one go. In the case of many similar products it is tempting to repeat descriptions, which is a dead giveaway. Hire a content writer and get unique descriptions.
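
As an illustration only, here is a minimal sketch of a read-more element using the plain HTML details and summary tags; the class name and the sample text are my own placeholders, not part of any particular platform:

<div class="product-description">
  <p>Hand-stitched leather wallet with six card slots and a coin pocket.</p>
  <details>
    <!-- collapsed by default; the summary line acts as the read-more toggle -->
    <summary>Read more</summary>
    <p>Full-grain leather, RFID-blocking lining, available in tan and black.
       Dimensions: 11 x 9 cm. Ships in recyclable packaging.</p>
  </details>
</div>

This keeps the product image visible above the fold while the full description stays one click away.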

Remember, when it comes to online success, usability wins hands down. A better user experience is the right formula for success; as for search engines, they have become more sophisticated and can decipher your naturally written content to index and rank it properly.

Monday, January 18, 2016

Content Creation for Websites & SEO

With dynamic changes in search engine algorithms it is becoming more and more difficult for average webmasters to rank websites. This is because the rules have been tightened and are being implemented more stringently day by day. Link building without any authority or editorial scrutiny hardly works. Outbound links are important, but not the way they were earlier. Third-rate content will not work at all.

Recently the Panda filter was incorporated into the algorithms by Google, bringing all sites under the scanner. Although the search engines have set preferences regarding which sites do better on SERPs, there is plenty of scope to fetch organic traffic. (One such example of this preference is the favor accorded to major brands.)

The algorithms are constantly evolving, bringing about perplexing changes. This evolution keeps challenging webmasters as to how to perform better on SERPs. One issue that has been fixed by Panda is that there is no scope now for loosely written content created merely as an opportunity to link. Scammers had a field day till the rot was stemmed. Many blogs and websites were thrown out, giving greater opportunity to those with relevant and properly written unique content.


Website & Off-Page Content

Unique content means that which best explains the offerings of the site yet stays out of the glut. Hence if you are writing about your site, which is on hotels, you should be more descriptive, using important and LSI search terms in the write-up. Use words like rooms, suite, cottages, bathroom, accommodation, tariff, service, amenities, facilities, ambiance etc.

When we talk about Internet Marketing, greater emphasis is accorded to search engine optimization. For off-page promotion, content creation in terms of articles, blogs and videos becomes important. So does what is there on your website or portal.

Though unnecessary information is discouraged, it is imperative that the interface contains relevant content. Sites with thin content will not rank well unless they cater to some very unique and rare search term. Hence the first criterion before taking on a new project is to evaluate the site content along with the link architecture, usability, usage of tags, internal linking etc.

There are hundreds of articles on how content should be written according to the new guidelines. Usage of excessive keywords is not advisable at all. Add relevant LSI terms or, simply put, be descriptive without repeating the important phrases that define the topic. Remember, search terms are important since the indexing and ranking process depends much on them; logical, isn't it?

When I say avoid the glut, I mean do not venture into a topic which is already saturated; rather spend your efforts on something rare or less popular. Well, for survival webmasters cannot be choosy, and they have to take on projects as they come.

For websites and content marketing, well researched effort will always pay. This means doing extensive keyword research using the various tools available online. You must write naturally, in a flow, as writers do. Dig out aspects which have been ignored by others. Provide complete information about the products, services, topics or whatever the subject may be. In many instances location should be added - use structured markup wherever you can, as in the sketch below.
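
As a rough illustration of location markup, here is roughly what it could look like using schema.org's LocalBusiness vocabulary in JSON-LD; the business name, address and phone number are invented placeholders:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Hotel",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "12 Forest Road",
    "addressLocality": "Bhopal",
    "addressRegion": "Madhya Pradesh",
    "addressCountry": "IN"
  },
  "telephone": "+91-00000-00000"
}
</script>

The block sits anywhere in the page HTML and simply restates, in machine-readable form, details that should already appear in the visible text.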

This way you will give enough information to the search engines as well as make the content an interesting read. The latter is a prerequisite. In case you are not able to naturally incorporate search terms in your article, do not worry. The SEs are becoming smarter and smarter these days at digging out the crux.

Another suggestion that I wish to give to content writers is to work on numerous platforms which have editorial scrutiny for the content published there. If you can score well on such platforms you are bound to become a better writer.

Without overdoing it, divide the write-up into relevant paragraphs and decorate it within limits with tags, headings, italics, etc. Add relevant, highly optimized, lightweight pictures with proper alt tags, as in the sketch below.
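
For instance, the body of an article could be structured roughly like this; the headings, file name and text are placeholders of my own, not a prescribed template:

<h2>Choosing the Right Room</h2>
<p>Most resorts offer <em>cottages</em> as well as standard rooms; compare the tariff and amenities before booking.</p>
<img src="cottage-exterior.jpg" alt="Wooden cottage with a private verandah">
<h2>Facilities and Service</h2>
<p>List the facilities - parking, restaurant, conference hall - in a short, scannable paragraph.</p>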

As this article states, content has become very important. If you cannot write yourself then hire a content writer for your website or promotional efforts. Internet Marketing is heavily dependent on the written word, be assured.

Thursday, January 7, 2016

Website Accessibility, Usability, On-Page SEO & Design Factors

Design & Content Guidelines 

Make a well defined link architecture using a few H tags, with the H1 tag being used only once. The font size and the text-to-background contrast should all make readability easier. The font size used should preferably be 12 or more, according to the design.

When writing content use unique language not copied from any source. Include the words that people type to search for relevant information on the Internet.

Use low-weight but high-resolution images, labeled and added with the correct alt text. The height and width attributes should be in place.

Write all important information in text, especially headings, since the search engines cannot read text embedded in images. Label each block of content so visitors know what they are going to read about. A sketch covering these points follows below.
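
To tie these guidelines together, here is a rough, non-authoritative sketch; the site name, sizes and file path are placeholder assumptions of mine:

<style>
  body { font-size: 14px; color: #222222; background: #ffffff; } /* readable size and contrast */
</style>
<h1>Hill View Resort</h1>              <!-- the only H1 on the page -->
<h2>Accommodation</h2>                 <!-- labelled content block -->
<p>Rooms, suites and cottages with tariff and amenities described in plain text.</p>
<img src="suite.jpg" alt="Deluxe suite with balcony" width="640" height="480">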

One of the important factors in on-page optimization is website accessibility. This applies to both users and search engines.

The configuration of the robots.txt file is important; it will help keep the less desired pages out of search engine crawls while informing the engines of the important pages, as in the short example below.
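
A minimal sketch of such a file, assuming a hypothetical /admin/ directory to block and a hypothetical sitemap location, might look like this:

User-agent: *
Disallow: /admin/
Sitemap: https://www.example.com/sitemap.xml

The Disallow line keeps the unwanted section out of crawls, while the Sitemap line points the robots to the pages you do want indexed.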

Certain scripts may not gel well with some browsers and different kinds of devices such as the iPad.


Check whether the server is well configured. Use the DNS Check Tool.

When you conduct a ping and traceroute test, the load time should be less than one minute. A lower load time means your website is comparatively faster than many other websites.

Mobile Friendly 

Use the viewport meta tag to make the site more mobile friendly, as in the sketch below. Place reasonable distance between the links and use the font size stated above. Avoid the use of Flash and heavy images.
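
For reference, the commonly used form of the viewport meta tag, placed in the page head, is:

<meta name="viewport" content="width=device-width, initial-scale=1">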

Use rich snippets to best describe your website to search engines. The use of structured data should be limited, and at present it is restricted to products, reviews, events, local businesses, videos and software applications. An illustrative product snippet follows below.
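
As a hedged illustration of product rich snippets (the product name, rating and price are invented placeholders, and there is no guarantee any snippet will actually be displayed), structured data in JSON-LD could look roughly like this:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Handmade Leather Wallet",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.4",
    "reviewCount": "27"
  },
  "offers": {
    "@type": "Offer",
    "price": "1499",
    "priceCurrency": "INR"
  }
}
</script>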

For better understanding read the guidelines set up by the search engines, such as the Google Guidelines.

Friday, December 25, 2015

SEO: HTTPS Over HTTP

On the Webmaster blog Google announced its preference for HTTPS over the HTTP protocol. This was bound to happen, since the search company is quite conscious of visitor privacy and security. Simply put, this protocol encrypts messages before they are sent and decrypts them on arrival, on top of the HTTP protocol.

HTTPS is a security protocol which carries a connection encrypted by TLS, or Transport Layer Security. The main objective is to secure visitor interests: to prevent malicious packet injection and ad inserts, to stop information espionage, and of course to protect privacy. Hence data exchange takes place in a secured environment for the benefit of the user.

Earlier this protocol was mainly used for electronic payment systems over the WWW or Web. It was also preferred by corporate portals and e-mail providers in order to attain a high level of secrecy.

Browser software and certification authorities should be trustworthy in this type of data exchange. Some of the certification authorities are: Symantec, Comodo, GoDaddy and GlobalSign.

The encryption protection layer should also be trustworthy and provide real security. HTTPS can be called reasonably secure and does prevent tampering and malicious attacks, but it is not a hundred percent foolproof.

Keeping in mind the good aspects of this protocol, the search engine company will give preference to it over HTTP, which is not as secure.

In this case the port used is 443 and the URL begins with https://

For More Information visit:  HTTPS

Read at Google Webmaster Blog for more information on HTTPS 

robots.txt: What It Is

Search engines visit websites, blogs and other online portals to scan and then store or cache the information. This is then used to rank them according to relevancy. In a large portal the whole website may be indexed according to the criteria of the search engines. Most search engines use the indexed web pages collectively to rank the site. This means pages that are not important, or that hold information not meant for the engines, are also indexed.

In order to specify to the robots which pages to index and which to ignore, a common protocol, robots.txt, is used. A file is uploaded to the server along with the pages in the root directory. This is the robots.txt file, which the robots visit first, and they index the pages accordingly. But remember, this protocol may not be adhered to by some robots, especially those with malicious intent. All the popular search companies, however, adhere to this public standard.

The file contains the following command/commands:

User-agent: *
Disallow: /

More instructions are given below: 

To block the entire site, use a forward slash.

Disallow: /


To block a directory and everything in it, follow the directory name with a forward slash.

Disallow: /junk-directory/

To block a page, list the page.

Disallow: /private_file.html

To remove a specific image from Google Images, add the following:

User-agent: Googlebot-Image
Disallow: /images/dogs.jpg

To remove all images on your site from Google Images:

User-agent: Googlebot-Image
Disallow: /

To block files of a specific file type (for example, .gif), use the following:

User-agent: Googlebot
Disallow: /*.gif$

More information can be had from: Robots.txt

Also visit Robotstxt.org

This information can also be given in the robots meta tag, which should be present on every page that needs it. Another method is to use the X-Robots-Tag HTTP header, which is placed in the server response and can then work for all pages. Care should be taken that the directives are proper and that no important page is barred. This also applies to CMS portals. A small example of the meta tag is given below.
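
As a small reference sketch (the noindex choice here is just an example, not a recommendation for any particular page), the robots meta tag goes in the page head like this:

<meta name="robots" content="noindex, follow">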

You have to regularly check the directives so that over time no misinformation or wrong rule creeps in. There are a number of tools that can help you keep things in line.
 

Wednesday, December 23, 2015

More About Accelerated Mobile Pages

As mentioned in my earlier blog, these are stripped-down HTML pages that download fast. This development also highlights the limitations of web design and development technology: the inability to handle heavily loaded pages points towards a design limitation. There are also elements that may be necessary, or fit their role, but are nevertheless not suitable for display on mobiles or smartphones. AMP should be able to take care of that. Well, time will tell.

AMP pages use what is termed diet HTML. Author-written JavaScript is not allowed, but an off-the-shelf script library can be used; among other things it enables images to download only when you scroll to that point. This may also be incorporated into many platforms in time to come. A minimal page sketch is given below.
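
For orientation only, a bare-bones AMP page looks roughly like this; the mandatory amp-boilerplate style block is abbreviated here, the image path, sizes and canonical URL are placeholders, and the authoritative template is the one on ampproject.org:

<!doctype html>
<html amp>
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width,minimum-scale=1">
  <link rel="canonical" href="https://www.example.com/regular-page.html">
  <style amp-boilerplate>/* required boilerplate CSS, copied verbatim from the AMP project */</style>
  <script async src="https://cdn.ampproject.org/v0.js"></script>
</head>
<body>
  <h1>Article Title</h1>
  <!-- amp-img replaces the normal img tag and loads lazily as you scroll -->
  <amp-img src="photo.jpg" width="600" height="400" layout="responsive" alt="Sample photo"></amp-img>
</body>
</html>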

There are a lot of restrictions when building AMP pages. If the rules are not followed the page will fail the Google validation test. The tool for this test is built into Google's Chrome browser.

These pages are designed for fast download and readability, not for interactivity. Thus they can be easily cached, preloaded and pre-rendered. This is amazing, and in time to come Google will create AMP versions of regular pages and present them to visitors instantly.

Further information can be had from:

ampproject.org – The main project web page, where you'll find a technical intro, tutorial, GitHub repository, and more
dis.tl/amp-pages – Further information on AMPs and how they work

Sunday, October 11, 2015

Accelerated Mobile Pages - AMP & Instant Articles

In order to accelerate the download of mobile pages, Google is working on a methodology to make pages download faster on mobile, just like Facebook's Instant Articles. This will possibly be open source code and shareable.

This project by Google aims at speeding up downloads on all platforms. This is to be done by eliminating JavaScript and limiting CSS; even the HTML elements are restricted. The open source framework also reduces HTTP server queries and restricts image download in a certain manner.

AMP also restricts how ads can be embedded into the client's website.

This project is participatory in nature and may not receive the kind of applause expected, since it is restrictive in nature. It may become a prerequisite for website owners to implement AMP; with the search engine you never know how compulsory it may become.

For more info: Visit Ars Technica UK

Similarly, Instant Articles also constitutes a powerful publishing tool. The aim is to offer a better reading experience, with auto-play videos, interactive maps and full-length articles.

Those willing to publish articles as partners of Instant Articles are promised compatibility with many tools and interfaces.

For more information read: Instant Articles on Wikipedia.