Showing posts with label DNS. Show all posts

Tuesday, December 12, 2017

Domain Name & DNS

IP addresses are hard to remember, which is why domain names came into the picture. A domain name is associated with a website and resolves to its assigned IP address. Registering a domain name is an essential first step in setting up a website.

DNS

DNS, the Domain Name System, is what connects a domain name to its associated IP address. The system keeps track of domain names and the IP addresses assigned to them. Each DNS server holds a section of the domain database; when the right address is not found on one server, the request is referred to another until the corresponding IP address is located.
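This referral process can be sketched with a toy resolver. The server names and the tiny zone tables below are entirely made up for illustration; real resolvers speak the DNS protocol over UDP/TCP to actual name servers.

```python
# Toy illustration of iterative DNS resolution: each "server" knows only
# its section of the namespace and refers the resolver onward.
# All names and addresses here are invented.

ROOT = {"com.": "tld-server"}  # the root knows which server handles each TLD
SERVERS = {
    "tld-server": {"example.com.": "ns1.example.com."},          # TLD knows the authoritative NS
    "ns1.example.com.": {"www.example.com.": "93.184.216.34"},   # NS knows the A record
}

def resolve(name: str) -> str:
    """Follow referrals from the root until an IP address is found."""
    tld = name.split(".")[-2] + "."   # e.g. "com."
    server = ROOT[tld]                # ask the root which server handles the TLD
    while True:
        zone = SERVERS[server]
        for suffix, target in zone.items():
            if name.endswith(suffix):
                if target[0].isdigit():   # looks like an IP address: done
                    return target
                server = target           # otherwise it is a referral; ask that server
                break
        else:
            raise LookupError(name)

print(resolve("www.example.com."))  # 93.184.216.34
```

The point of the sketch is that no single server holds the whole database; resolution is a chain of referrals.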

When a domain name is registered, it is linked to the IP address of the hosting server where the site's directory resides. When a domain name is typed into the address bar of the browser, a request is made to the DNS server linked with it, which resolves the name to that address. This is how a website becomes visible to us upon request.

While configuring TCP/IP, a primary DNS server address is assigned, which acts as a gateway to a number of name servers and makes the search for the right IP address possible.
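That configured resolver is what the operating system consults whenever a program looks up a hostname; Python's standard library exposes this through the `socket` module. A minimal sketch:

```python
import socket

# Ask the system's configured DNS resolver (the "primary DNS" from the
# TCP/IP settings) to translate a hostname into an IPv4 address.
def lookup(hostname: str) -> str:
    return socket.gethostbyname(hostname)

# "localhost" resolves locally, without touching a remote name server.
print(lookup("localhost"))  # 127.0.0.1
```

Looking up a real domain with the same call would go out through the configured name servers described above.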

Domain Name Registration 

To register a domain name you need the services of a domain registrar, who registers the name with ICANN. A fee is paid to the registrar for one year or more, depending on your choice. Once the registration expires your site will no longer be reachable, since the associated name is no longer yours. You do get a grace period of forty-five days to renew the name; after that it returns to an open pool and anyone can register it in their own name.

WHOIS

WHOIS lookup sites provide complete details of a domain name: the registrant's name, address and contact details, along with the name server details. The dates of registration, last update and expiry can also be found there.

Monday, November 20, 2017

Website Crawl Errors & More

Search engine robots crawl your website to discover pages, which are then indexed and ranked. Crawling is part of the ranking mechanism and therefore an important search engine optimisation concern. The site errors shown in the Google Search Console are an indicator of design problems that can create substantial crawl issues.

Crawl issues usually arise from a badly configured robots.txt file, which robots scan to find out their permissions. This file tells a robot which pages to crawl and which to skip. If the configuration is wrong, the whole site can fall outside the purview of the search engines, and the crawl error rate will approach a hundred percent. In that case, look into the root directory in the control panel immediately and rectify the file. The absence of a robots.txt file means there are no instructions, and the crawler will wade through all the pages it can reach.
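Python's standard `urllib.robotparser` module can check what a given robots.txt permits; the rules below are an invented example, not a real site's file.

```python
from urllib.robotparser import RobotFileParser

# An invented robots.txt: block /private/ for all robots, allow everything else.
rules = """
User-agent: *
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Which URLs may a crawler fetch under these rules?
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))  # True
print(parser.can_fetch("Googlebot", "https://example.com/private/x"))   # False
```

Running your own rules through a parser like this is a quick way to catch the misconfigurations described above before a search engine does.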

Another source of errors is scripts that the robots are unable to decipher. Errors can also occur when pages are blocked, a directory is mistakenly deleted, or pages simply do not exist because they were deleted by mistake or deliberately.

A low error rate may point to a few isolated problems, while a high one suggests the site is wrongly configured or overburdened with too many pages.

Other problems:

DNS Errors: DNS stands for Domain Name System. The domain is parked on a DNS server, which is linked to the hosting server. Issues within this linkage, or some problem with routing, may cause DNS errors.

Server Errors: The server is the space where your site's files and directories are hosted. An error may show up as slow accessibility, where the server times out, or as a breach by a virus or malware and related problems. It can also result from a misconfigured firewall or operating system protection. Servers have a mechanism to block too many queries at once; this can sometimes go haywire and prevent Googlebot from accessing the files, which usually happens with dynamic URLs.
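When going through server logs, the failures described above can be grouped roughly by what the crawler saw. A simplified classifier, with category names chosen purely for illustration (they are not Search Console's own labels):

```python
# Rough classification of crawl failures as a crawler might record them.
# Category names are illustrative, not official Search Console terms.
def classify(status, dns_ok=True):
    if not dns_ok:
        return "dns error"       # the name never resolved
    if status is None:
        return "timeout"         # server too slow or unreachable
    if status >= 500:
        return "server error"    # 5xx: failure on the server side
    if status in (401, 403):
        return "blocked"         # firewall or auth refused the crawler
    if status == 404:
        return "not found"       # page deleted or never existed
    return "ok"

print(classify(503))                  # server error
print(classify(None))                 # timeout
print(classify(200, dns_ok=True))     # ok
```

Sorting errors into buckets like these makes it easier to tell a DNS or routing problem from a genuine server-side failure.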

Webmasters can control how their site is crawled; usually, pages of no importance are blocked.

There can be many other issues; if enough material is not available, the best course is to visit a forum and discuss the problem.

The Fetch as Google tool in the Search Console is ideal for checking the crawlability of your website. Keeping crawling fluid is important for indexing and ranking, and hence for SEO.