Sunday, May 27, 2018

Understanding The Domain Name System - DNS

DNS is short for Domain Name System.

Every website hosted on a web server is reached through a domain name, and the DNS manages these names. A domain name is alphabetic in nature, hence it has to be translated into numbers which the system understands. The translated number is called an IP address, the addressing scheme that the Internet understands. Hence when you type the domain name websitepromotion-india.com, it is actually translated into an IP address; the mapping is stored in DNS records.

Using this service: http://www.hcidata.info/host2ip.cgi I can find that the IP address of my domain is:
182.50.130.158
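The same lookup can be done programmatically. Here is a minimal Python sketch using the operating system's own resolver; it queries "localhost", which resolves without any network access, and you can substitute any real domain name:

```python
import socket

# Ask the operating system's resolver to translate a hostname into an
# IPv4 address, just as the browser does before connecting.
# "localhost" resolves locally; replace it with any real domain
# (e.g. websitepromotion-india.com) to query DNS proper.
ip = socket.gethostbyname("localhost")
print(ip)   # 127.0.0.1
```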

When a user types a URL into the web browser, the query is passed to the operating system's resolver, which forwards it to the DNS servers, starting at the root, until the query is matched appropriately, that is, the IP address is located at the authoritative name server for the domain.

The IP address in the DNS record points to the host server, and the query is dutifully sent there. Once the name is resolved, the files on the web host server are fetched and rendered in the browser of your computer. The transfer itself runs over TCP, or Transmission Control Protocol, and on top of TCP runs HTTP, the Hypertext Transfer Protocol, which carries the web content. Lookup tools act as resolvers, returning basic information about the name. Note that many domains can share a single IP address on the same server, while a busy domain may resolve to several addresses.

All these services are managed by various organisations so that browser queries resolve smoothly. For resolution of a domain name there are thirteen designated root server addresses, each served by many machines worldwide; the resolver begins its lookup with these 13 servers and is referred down the hierarchy from there.

For registering a domain name you have to go through a registrar accredited by ICANN. This stands for Internet Corporation for Assigned Names and Numbers. The organisation operates the Internet Assigned Numbers Authority - IANA, and is in charge of maintaining the DNS root zone, which lists all the TLDs and their authoritative name servers.

A fee has to be paid for registration. After this you are the owner of the domain, and its details can be looked up using WHOIS services on the Internet.

More on DNS

At the top of the hierarchy is the TLD, or top-level domain, which may be generic or country specific. For example, .com, .net and .org are generic TLDs, while .eu and .in are country-code TLDs.

Directly under this hierarchy is an SLD or 2LD, a second-level domain name; examples are co.uk and ac.uk. The former is used by companies, while the latter is used by institutions, usually educational ones.

More information on SLDs can be had here: About SLD

Sunday, May 13, 2018

Internet Protocol Suite - Communication Protocol

The Internet layer makes internetworking possible; its primary protocols are defined below. In order to route traffic to the right address, an IP address number is assigned to each computer on the Internet.

IP addresses are written in a dotted format so that humans can read and remember them.

Ex. 192.192.1.1

IP addresses are managed by the Internet Assigned Numbers Authority together with five regional registries responsible for the regional address databases.

They are:

  • African Network Information Centre (AFRINIC)
  • American Registry for Internet Numbers (ARIN)
  • Asia-Pacific Network Information Centre (APNIC)
  • Latin America and Caribbean Network Information Centre (LACNIC)
  • Réseaux IP Européens Network Coordination Centre (RIPE NCC)


Commonly known as TCP/IP, the Internet Protocol Suite is the set of protocols that makes transmission of data across the interconnected network of computers possible. Within the suite, data can also be authenticated and encrypted, for example with IPsec or TLS, before being sent down the stream.

TCP stands for Transmission Control Protocol; it provides a reliable, ordered flow of bytes between host computers on the Internet. The WWW, file transfer and email messaging all rely on TCP.
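To make the idea of a TCP byte stream concrete, here is a self-contained Python sketch that opens a TCP connection over the loopback interface and echoes a few bytes back. Nothing here is specific to any real service; the port is chosen by the operating system:

```python
import socket
import threading

# A minimal loopback demonstration of a TCP byte stream: a listening
# socket accepts one connection and echoes back whatever it receives.
def echo_once(server: socket.socket) -> None:
    conn, _ = server.accept()
    with conn:
        conn.sendall(conn.recv(1024))

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))      # port 0: let the OS pick a free port
server.listen(1)
t = threading.Thread(target=echo_once, args=(server,))
t.start()

client = socket.create_connection(server.getsockname())
client.sendall(b"hello over TCP")
echoed = client.recv(1024).decode()
print(echoed)                      # hello over TCP
client.close()
t.join()
server.close()
```

The same reliable, ordered delivery is what HTTP, FTP and email all build on.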

IP is an abbreviation for Internet Protocol. Its main function is to deliver packets from the source host to the destination host, using the IP address in the packet header to route them. Hence every packet carries the IP addresses of both the source and the destination host. The main hardware involved is the router, which forwards the datagrams along the way.

IP has two versions in use: Internet Protocol version 4, or IPv4, and IPv6; the latter is less widely deployed, but its usage is growing. IPv4 defines an IP address as a 32-bit number, while IPv6 defines it as a 128-bit number.
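The difference in address width can be checked with Python's standard ipaddress module. The addresses below are from the reserved documentation ranges, not real hosts:

```python
import ipaddress

v4 = ipaddress.ip_address("192.0.2.1")     # an IPv4 documentation address
v6 = ipaddress.ip_address("2001:db8::1")   # an IPv6 documentation address

print(v4.version, v4.max_prefixlen)   # 4 32  -> 32-bit address space
print(v6.version, v6.max_prefixlen)   # 6 128 -> 128-bit address space
```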

Saturday, May 12, 2018

Web Hosting Control Panel

The web hosting control panel connects to the server, or space on the hosting computer, and enables the various activities done to manage your website. The panel is provided by the hosting company and is accessed using a username and password. It is a GUI-based interface, easy to understand and operate.

The panel is a collection of different programs for managing various aspects of the website. Some of the activities done on a regular basis are uploading files and images, creating FTP accounts for downloading your site onto a folder on your computer, messaging, managing the server, and setting up analytics, to name a few.

The file manager is most important, as it lets you download a file onto your computer and edit it using an HTML editor. After the editing is complete and the changes saved, you can upload the page to its respective folder using FTP. Alternatively, you can edit the file directly in the control panel and save it right there.

Some of the popular hosting panels in use are:
  • Plesk
  • cPanel
  • ZPanel 
  • CentOS Web Panel
  • Kloxo MR 
  • Ajenti 
Servers where sites are hosted require a firewall and high-security software; the panel should be able to support such installations. Sometimes, in the case of complex technology-based websites, managing files may be difficult. In order to edit such a portal one needs to become familiar with the site structure and how to control it through the web hosting control panel.


Friday, May 11, 2018

FTP - File Transfer Protocol

When you have to download website files from a server, the process used is the File Transfer Protocol, abbreviated FTP. In simpler terms, it is a network protocol that enables transfer of the website files held on a hosting server to a client, which may be a folder on your computer or laptop.

An FTP tool usually shows two columns: the one on the left lists the folders on your computer or laptop, into which you can direct the files. For authentication a dialogue box is provided; by default the sign-in is often done in clear text, but there are various options for protecting the username and password. Encryption using SSL/TLS is preferred. TLS, or Transport Layer Security, has superseded SSL, or Secure Sockets Layer. Symmetric cryptography is used to transmit data between the two networked computers (the methodology is too complex to be explained here), offering private and secure data transmission.

Another protocol in use is SSH FTP, or SFTP, which provides file access, transfer and management over a trusted data stream.

Many FTP software tools are available for download on the Internet. The IP address, username and password are obtained from the hosting service, which configures them. The protocol used is likewise configured with details obtained from the hosting service.

Most FTP tools are available on the Internet without any charge, but the ones meant for highly secured transmission may come at a price. This applies to both the FTP client and the FTP server; the latter is essential for server management and interacts with the former.
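As an illustration, uploading a single file over FTP secured with TLS can be sketched with Python's standard ftplib module. The host, username, password and file names here are placeholders to be replaced with the details supplied by your hosting service:

```python
from ftplib import FTP_TLS

def upload_file(host: str, user: str, password: str,
                local_path: str, remote_name: str) -> None:
    """Upload one file to an FTPS server (FTP over TLS)."""
    ftps = FTP_TLS(host)
    ftps.login(user, password)
    ftps.prot_p()                  # encrypt the data channel as well
    with open(local_path, "rb") as f:
        ftps.storbinary(f"STOR {remote_name}", f)
    ftps.quit()

# Example call with placeholder credentials:
# upload_file("ftp.example.com", "user", "secret",
#             "index.html", "public_html/index.html")
```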

Thursday, May 10, 2018

About Hreflang Language Specific Attribute

Code Sample

<link rel="alternate" href="http://example.com" hreflang="en-us" />

The hreflang attribute informs search engines which language you are using on the page. The annotation, introduced by Google, is placed in the head section of the page along with other elements such as the title, keywords and description.

Different language pages are created when a local audience is targeted. The page could be in any language, and the attribute informs the search engine about it. This is essential in order not to confuse the search engine.

Hence designers can create pages in different languages for the audiences targeted, making the site more conversant with its visitors and decreasing the bounce rate. Visitors land on the right language page when the algorithms detect that their IP address belongs to a particular country where that language is spoken.

Even variants of a single language can be targeted with the attribute, using annotations such as hreflang="es-es" for Spanish in Spain and hreflang="es-mx" for Spanish in Mexico. But this targeting is limited to countries only, not continents or unions of countries.

Google and Yahoo support this attribute, while Bing supports the language meta tag instead. Language codes should follow ISO 639-1 and, for regional targeting, ISO 3166-1 Alpha 2.

Hreflang tag generator tools are also available online, making your work easier. Multiple attributes can be placed on one page, depending upon the relevancy.
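A tiny generator of this kind is easy to sketch in Python; the language-to-URL mapping below is a hypothetical example:

```python
# A minimal hreflang tag generator: map language-region codes to the
# corresponding page URLs (the URLs here are hypothetical examples).
pages = {
    "en-us": "http://example.com/",
    "es-es": "http://example.com/es/",
    "es-mx": "http://example.com/mx/",
}

tags = [
    f'<link rel="alternate" href="{url}" hreflang="{code}" />'
    for code, url in pages.items()
]
print("\n".join(tags))
```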

This attribute is preferred for international targeting in place of canonicalisation.

Wednesday, May 9, 2018

Anchor Text And Its Importance

Anchor text is a word or a phrase which is used to create a hyperlink. 

Example of HTML linking. 

<a href="http://www.xyz.com">anchor text</a> 

The phrase between the tags is the anchor text. This is very important for search engine optimisation, as indexing and ranking algorithms use it to understand what the linked page is all about. In order to retrieve contextual information, a description of the page topic and its popularity, the algorithms analyse a number of anchor texts besides the link itself.
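As a toy sketch of how a crawler might collect anchor texts and their targets, here is a small parser built on Python's standard html.parser (the class name and sample markup are illustrative, not any real crawler's code):

```python
from html.parser import HTMLParser

# Collect (href, anchor text) pairs from a page, standard library only.
class AnchorCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.anchors = []          # list of (href, anchor text) pairs
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.anchors.append((self._href, "".join(self._text).strip()))
            self._href = None

parser = AnchorCollector()
parser.feed('<p>Read about <a href="http://www.xyz.com">anchor text</a> here.</p>')
print(parser.anchors)   # [('http://www.xyz.com', 'anchor text')]
```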

In order to provide relevant results on SERPs, search engines use the anchor extensively. Hence webmasters use popular anchor text for ranking, but because this can be manipulated, the importance of the anchor as a ranking factor has been diluted. Overuse of anchors for ranking can often result in penalisation or a drop in ranking of the website being promoted.

Keywords have always been used as anchor text by webmasters. Earlier they used the same anchor text again and again to rank the site on that keyword. This does not work any more. Hence it is better to use, sparingly, naturally occurring words and phrases that describe the page well, nothing more.

Search engines have factored hundreds of metrics into their ranking algorithms, hence repeated use of one anchor is bound to fail.

Contextual information is deciphered using semantics, that is, correlated words and phrases in the anchor and the surrounding content.

There are a few types of anchor texts:

An exact match is a phrase that matches the page's target term. For example, if the page is about "website promotion", then that phrase is an exact match, while "top website promotion" would be a partial match. Brand names can also be used in hyperlink creation, or the link can just be left naked, with the URL itself as the anchor.

Friday, April 13, 2018

Topic Clustering A New Approach to Search Engine Optimisation

Search algorithms are constantly evolving, and so is the need for webmasters to keep updating themselves in order to produce results. The evolution of search engines is natural, since there is a constant need to filter out trash and weak or spammy content and to stay in sync with ever-changing technology and trends.

Hence Google has come up with many updates, like Panda, Penguin and RankBrain, to name a few. The evolution of the indexing and ranking mechanisms means changes in SEO practices. In the initial stages, weak content with excess keywords worked. It does not any more. Hence more and more effort is being put into creating content on web pages in order to persuade search engines to rank a site higher.

In the initial stages, creating content meant a series of pages relevant to the main topic. These were interlinked using search terms, often repeated, but it worked. As search engines explore the content on websites, they have begun to appreciate deep content clusters that are relevant and add to the information.

Hence, in topic clustering, a pillar page is created which is linked reciprocally to other web pages or subdomains carrying dense but relevant information. The anchor used to link to the pillar page can be repeated in order to signal the relevancy of the topic to search engines. This information architecture is helping sites rank better and better. It can be subdivided using silos, creating one cluster of pages that provides one set of information while another cluster provides a different set, all relevant to the main topic.

In topic clusters, dense content and unique information are a must in order to qualify as useful pages. Even slight repetition could lead to a penalty. I would further add that blogs containing relevant and anecdotal content should also be linked to the cluster, provided they are updated regularly.

Content creation for search engine optimisation is part of the requisite skill set, in fact a skill of the highest order. Success depends much upon the creativity of the webmaster and his knowledge of optimisation. Another advantage of good content creation is the accumulation of naturally gained links, thanks to the useful information contained.

Yes links still matter much!     

          

Wednesday, February 7, 2018

All About External Link Building


Albeit many metrics are now included in the ranking and indexing of websites, links stand out. They stand out as a decisive factor and as indicators of topic and niche. Though mass-scale link building is no longer the criterion of performance on the SERPs, a large number of authoritative links do help.

Natural links are the best for any website and the most liked by the search companies; nevertheless, a lot of hard work and ingenuity is required to earn them. Content with niche information, utilities, graphics and videos creates an ambience that invites linking. Easier said than done, but all webmasters should aim at achieving a large number of natural links.

Another good option is to create content, add a link, and then post it to an article site with editorial scrutiny. This applies to creating external links on all authoritative platforms with a good online reputation.

The most important aspect of external link creation is relevancy. The external link should be embedded in content which is topic relevant. But while doing this, posting should not be done at spam sites or link farms built for this very purpose. In case of doubt, add the "nofollow" attribute to escape the wrath of the search engine. There are many "nofollow" sites where a link can be embedded safely in a write-up or article. But even here editors will not allow spamming. Hence create content which is unique every time and post it on these sites to build referrals.

When you embed a link, do it in a sentence which contains relevant information and is not more than about 100 characters. Use a logical linking practice with a suitable anchor, and do not manipulate the sentence just to squeeze in a primary keyword. A large number of related anchors, perfectly suited to the content, should be used.
A large list of relevant anchors used naturally helps the algorithms determine on which keywords the site should rank. It no longer works to use one type of targeted anchor excessively. The indexing criteria have evolved much further and can decipher what the site is all about.

There are many platforms available for attractive content publication. These are manned by editors, and strict editorial scrutiny is maintained. This is the way forward for search engine professionals now. These sites can offer quality linking, but a lot of hard work is required in order to understand their criteria.

A blog is also a good option for link building, but the write-ups should be of a serious nature. As with all content creation, topic relevancy is of utmost importance. Do not just create haphazard content as an option for linking. A large number of anecdotal and informative blog entries is the order of the day.

Hence numbers have no meaning if the activity is not executed rightly. Qualitative linking should be done in a search engine optimisation campaign. Remember, quality and not quantity counts. Have patience for the results to come through.

Use content creation to gain popularity on social networking websites, blog aggregators and online communities. The reputation you build up gradually will offer long-term stability and a high position to your client's website.

Not only this: even unlinked brand mentions in content can reap a great harvest. Natural language processing and artificial intelligence are inching their way into the whole process of indexing and ranking. This is slowly changing the paradigm, which webmasters will have to keep in tandem with or leave the trade.

You are not a search engine professional if you stop analysing and stop reading informative articles on this topic. The Internet is constantly evolving, and so are the search engines.

Saturday, February 3, 2018

Search Engine Optimisation A Common Sense Approach

Search engine optimisation is a never-ending ball game: an exercise to rank websites on the first page of the SERPs. Day by day, as search engines and their algorithms evolve, the going gets harder, or often more obfuscated.

Webmasters seek quick-fire solutions to rank their clients' websites and get impressive traffic. The going is not easy any more, since the algorithms have become more cryptic, so to say. Deciphering them while staying in tandem with the guidelines set by the search engines is becoming a riddle and a difficult task.

In earlier times spamming was easy and one could get away with it easily. Not all webmasters were spammers, but the activities were more effective, especially mass link building. With the introduction of Panda and Penguin a paradigm shift has taken place. It is no longer easy to spam the search engines with content farms and ineffective link building.

Though link building is still effective, a word has been added to the whole process: authoritative. Building large numbers of authoritative links, those passing strict editorial scrutiny, is good, but it is no longer the panacea for viral results. Nowadays even brand mentions, online reputation and community participation can make a site popular, hence one should utilise social media effectively.
In recent times a plethora of metrics has been put into the calculation for indexing and ranking. On-page factors, technical SEO inputs and sound link architecture, along with unique content and snippets, have come into prominence. Page speed is slowly gaining importance as mobile search volume increases, along with voice search.

User Query & Intent 

Most of the principles of search engine optimisation remain in place, albeit diluted to varying degrees. Ranking is now a sum total of them all. But the buck does not stop here, as websites are ranked primarily on the basis of user query relevancy. Unless there are gross anomalies or spamming, websites with the answer to the user's query will always surface on top. The rest of the metrics come into the picture to decide between competitors answering the query as desired.

The evolution of the algorithms has been carried out with user satisfaction in mind. A product in high demand is placed in a prominent, eye-catching position in a department store. So the website that seems most suitable to answer the query is placed high on the SERPs.

Determination of user intent has come about through methods that measure click-through and bounce rates. By analysing user preference over time, a clear picture is emerging of what online visitors seek. User intent is now gauged effectively, hence if you type "Amazon", out pop links to the e-commerce company and not to the river in South America. Visitors all over the globe are overwhelmingly seeking the company.

Hence, all factors being equal, it is important to understand user intent before optimising a website. For example, say you are optimising a website for a hotel. When a visitor types "hotels in Jabalpur", he will find well-known portals offering details of various hotels in the town. The websites of individual hotels will be ranked well below. This is often misconstrued as brand preference favoured by the search engines. In fact it is user intent and search volume, measured over the years, that are in play.

Search engines will always surface sites that seem to answer the user query effectively. Those which do not receive impressive click-throughs, or which maintain a high bounce rate, will be pushed back in spite of all the other metrics that are efficiently entrenched.

Hence, while creating a website, write content that satisfies the queries within your topic that are most often typed into the search box. Try to figure out how you will beat the contenders on page one. Make the site mobile friendly, offer a good user experience, and yes, quality hosting with minimum outages will also help. Add preferred elements like videos, optimised graphics, photos and internal/outgoing links that further the information. Add a company blog, which is essential to keep the content pouring in.

Add effective titles, metas and snippets to help search engines come to a correct conclusion about what your site is all about. Keep building links as well, but maintain quality, else your site could be penalised.