
Friday, December 20, 2019

SEO Audit: Status Codes & Redirects

What are status codes? A server's response to a browser's request!

Whenever you query using a web browser you get a response from a server, usually a web page which the search engine algorithms have indexed and ranked so that the reply is a fitting answer to the query. In most cases you get ten answers, as web pages, on the results page.

If you type a web page URL into the address bar of the browser you will see the page if it is active. But if the page does not respond you will get a status code detailing the status received from the server. This is the response code, or status code, we are discussing here. In most instances you will not notice the code. Redirects are placed/coded in server headers.

Types of Status Codes

200 OK - Everything is all right; here is your page.

301 Redirect (Moved Permanently) - the page has moved permanently to another location or URL. This is often used when the URL or domain has changed and the link value of the old address has to be transferred to the new one. This is a useful redirect for SEO consultants when dealing with a change to a client's website. The 301 redirect is usually set up on the server to direct the search engine bot to the new domain or web page. The coding differs depending upon the server type; a minimal illustration appears after the list below.

*Note: if only the hosting server is changed, this is not applicable, because the domain name stays the same and is simply pointed at the new hosting server's IP address.

302 Redirect (Moved Temporarily) - in this case the page has been moved temporarily, perhaps for a short period of time.

401 Status Code (Unauthorized) - prevents entry of visitors and requires a username and password.

403 Status Code (Forbidden) - prevents entry by all means.

404 Status Code (Not Found) - this is a dead-end code, used when the web page exists no more. Its status is uncertain, unlike the 410 Status Code (Gone), where the server knows that the page is gone for good.

500 Internal Server Error - something has gone wrong at the server end.

503 Service Unavailable - the service at the server end is not available for some reason, for example overload or maintenance.

  • 303 See Other - the response to the request can be found at another URL.
  • 307 Temporary Redirect - a temporary redirect defined in HTTP/1.1 that keeps the request method unchanged.
  • 308 Permanent Redirect - a permanent redirect defined in a later RFC that likewise keeps the request method unchanged.
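To make the 301 set-up concrete, here is a minimal sketch using Python's built-in http.server module. It is an illustration only - on a live site the rule would normally sit in the web server's own configuration (Apache, Nginx, IIS and so on) - and the target domain below is a placeholder.

from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectHandler(BaseHTTPRequestHandler):
    # Answer every request with a 301 pointing at the new address,
    # keeping the requested path intact.
    def do_GET(self):
        self.send_response(301)  # Moved Permanently
        self.send_header("Location", "https://www.new-domain.example" + self.path)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), RedirectHandler).serve_forever()

A search engine bot receiving this response associates the old URL with the new one, and the link value of the old address is passed along.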

There are desktop tools, and online services besides, to check the status codes of responses.
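As an illustration, not a recommendation of any particular tool, a few lines of Python using the standard urllib module are enough to see the status code a server returns; the URL below is only an example.

import urllib.request
import urllib.error

def check_status(url):
    # Return the status code the server sends back for a URL.
    # urlopen follows redirects, so a 301/302 chain ends at the final
    # page and that page's code is what gets reported here.
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            return response.status            # e.g. 200
    except urllib.error.HTTPError as err:
        return err.code                       # e.g. 404, 403, 500

print(check_status("http://www.websitepromotion-india.com"))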

====================================================
Uday provides Digital Marketing (SEO) services and website content in English.
He teaches Digital Marketing in Jabalpur, his home town in India.
Contact: pateluday90@hotmail.com
Mobile: 9755089323 

Friday, May 11, 2018

FTP - File Transfer Protocol

When you have to download website files from a server, the process used is the File Transfer Protocol, or FTP for short. In simpler terms, it is a network protocol that enables the transfer of website files held on a hosting server to a client, which may be a folder on your computer or laptop.

An FTP tool usually shows two panes; the one on the left lets you direct the files to a folder on your computer or laptop. For authentication a dialogue box is provided; by default the sign-in is often done in clear text, but there are various options for protecting the username and password, and encryption using SSL/TLS is preferred. TLS (Transport Layer Security) is the successor to SSL (Secure Socket Layer). Symmetric cryptography is then used to transmit data between the two networked computers (the methodology is too complex to be explained here), which offers private and secure data transmission.

Another protocol in use is SSH File Transfer Protocol, or SFTP; this provides file access, transfer and management over a trusted data stream.

Many FTP programs and tools are available on the Internet for such downloads. The IP address (or host name), username and password are obtained from the hosting service, which configures them; the protocol to use is confirmed with the hosting service as well.
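As a rough sketch of what such a tool does under the hood, Python's standard ftplib module can log in over TLS, list the remote directory and download a file. The host name, credentials and directory below are placeholders that would come from your hosting service.

from ftplib import FTP_TLS

HOST = "ftp.example.com"     # placeholder - use the address given by your host
USER = "username"            # placeholder
PASSWORD = "password"        # placeholder

ftps = FTP_TLS(HOST)
ftps.login(USER, PASSWORD)   # authenticate on the control connection
ftps.prot_p()                # switch the data connection to TLS as well
ftps.cwd("public_html")      # website directory - the name varies by host
print(ftps.nlst())           # list the remote files

# Download one file into the current local folder.
with open("index.html", "wb") as local_file:
    ftps.retrbinary("RETR index.html", local_file.write)

ftps.quit()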

Most FTP tools are available on the Internet without any charge, but the ones meant for highly secure transmission may be available for a price. This applies to both the FTP client and the FTP server; the latter is essential for server management and interacts with the former.

Tuesday, December 19, 2017

Web Browser Software & Application Protocols

What you see, download, listen to and record has all become possible thanks to the web browser. It is nothing but a software application made interactive in such a manner that all of the above has become a reality. This is what has made the Internet so popular - the windowed operating system and the web browser.

For retrieving all that is contained in a website, blog, video or audio file, the web browser interprets the URL. Modern browsers are equipped with a GUI, or graphical user interface. The user or visitor is the retriever for whom the browser fetches information as per the URL. The person either inputs the URL in the address bar or, in the case of SERPs, clicks on the desired link; similarly one can click on an external link embedded in a website or blog. Plugins assist in retrieving content types that are not handled by the browser itself, while browser extensions enhance its functionality for a better user experience. Web browsers also come with many associated features such as history and bookmarking, usually managed from the settings.

Some of the popular web browsers are Firefox, Google Chrome, Opera, Internet Explorer, Safari and Microsoft Edge.

HTTP or HTTPS      

Enabling this retrieval is HTTP, the HyperText Transfer Protocol; HTTPS is the secured version, where data is transferred in encrypted form, and is therefore the most preferred. The protocol handles the request and the subsequent response for the web browser. In simple terms, it enables the client to receive a response from the server where the website is hosted. This is not just one request but rather a sequence of requests to the server, carried over TCP, the Transmission Control Protocol. A substantial number of requests may arise for the images, content, audio and scripts embedded in the HTML page, all using the HTTP protocol. A web address consists of the protocol, http:// or https://, usually followed by www, then the domain name, which ends in .com, .edu, .org, a country-specific extension and so on.

Ex. 

http://www.websitepromotion-india.com 

This information travels over TCP and then IP (Internet Protocol), and the response travels back to the user's browser. The final part of the domain name above (.com) is known as the top-level domain, or TLD for short.
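To show what one such request and response cycle looks like, here is a small sketch with Python's standard http.client module using the example address above; a real browser repeats this for every image, stylesheet and script on the page.

import http.client

# One HTTP request and response over a TCP connection.
conn = http.client.HTTPConnection("www.websitepromotion-india.com", 80, timeout=10)
conn.request("GET", "/")                   # ask for the home page
response = conn.getresponse()
print(response.status, response.reason)    # e.g. 200 OK, or a redirect code
print(response.getheader("Content-Type"))  # what kind of content came back
html = response.read()                     # the HTML itself
conn.close()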

Name Servers & ICANN

The name server is where the domain's records are hosted. Name servers are supported by the root name servers of the root zone, which direct each request toward the servers that can convert the alphabetical name into its numerical address. There are thirteen root server addresses, coordinated by an organisation called ICANN, whose full form is the Internet Corporation for Assigned Names and Numbers.

This name server record is tied to the hosting server where the website is placed, hence the request is quickly connected to the right address.

Server Queries & Downloading Speed

Though not directly related, server queries end up making the download slower, as the browser is able to send only a few requests at a time. Hence the HTML page should be designed so that few requests are needed, which results in a faster download.

Website download speed is usually judged by comparison among the websites on the SERPs that respond to a user request in the form of a keyword typed on the Google page. Hence, compared with the fastest website downloading in one second, the one that takes five seconds to download is deemed slower.
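A very rough way to time a single page download is sketched below with Python's standard library; the URL is only an example, and proper speed tools measure much more than this.

import time
import urllib.request

def download_time(url):
    # Measure how long it takes to fetch the full HTML of one page.
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=30) as response:
        response.read()
    return time.perf_counter() - start

print(f"{download_time('http://www.websitepromotion-india.com'):.2f} seconds")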

Tuesday, December 12, 2017

Domain Name & DNS

It is too difficult to remember IP addresses, hence domain names have come into the picture. Domain names are associated with websites and resolve into the assigned numbers. It is essential to register a domain name for a website.

DNS

In order to connect a domain name to its associated IP address, DNS, the Domain Name System, is required. The system keeps track of domain names and their associated IP addresses. Each DNS server contains a section of the domain database, and when the right address is not located in that server the request is diverted to another, till eventually the corresponding IP address is located.

In the case of a registered domain name, the name is paired with the IP address of the hosting server where the site's directory resides. This is how a website becomes visible to us upon request. When a domain name is typed into the address bar of the browser, a request is made to the DNS server associated with it.

While configuring TCP/IP a primary DNS address is assigned, which is actually a gateway to a number of name servers and makes the search for the right IP address possible.
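To see DNS resolution in action, the sketch below asks the system's resolver, which in turn walks the name servers just described, for the addresses behind a domain name; the domain is only an example.

import socket

domain = "www.websitepromotion-india.com"

# One IPv4 address for the name.
print(socket.gethostbyname(domain))

# Every address the resolver returns for the name on port 80.
for family, _, _, _, sockaddr in socket.getaddrinfo(domain, 80):
    print(sockaddr[0])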

Domain Name Registration 

In order to register a domain name one needs the services of an ICANN-accredited domain registrar. A fee is involved, which you pay to the registrar for one year or more, depending on your choice. After the registration expires your site will not be visible, since the associated name is no longer yours. You do get around forty-five days to renew the name, after which it goes into an open pool so that anyone can register it in their own name.

Who Is? 

These sites provide complete details of a domain name: the name of the registrant along with address and contact details, the name server details, and the dates of registration, last update and expiry.
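WhoIs is itself a simple text protocol on port 43, so the same details can be fetched without a lookup website at all. The sketch below queries the IANA WhoIs server, which replies with the registry responsible for the domain's TLD; that registry's own WhoIs server can then be queried for the registrant details.

import socket

def whois_query(domain, server="whois.iana.org"):
    # Send a plain WHOIS query (one line ending in CRLF) and read the reply.
    with socket.create_connection((server, 43), timeout=10) as sock:
        sock.sendall((domain + "\r\n").encode())
        reply = b""
        while True:
            chunk = sock.recv(4096)
            if not chunk:
                break
            reply += chunk
    return reply.decode(errors="replace")

print(whois_query("example.com"))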

Monday, November 20, 2017

Website Crawl Errors & More

Search engine robots crawl your website to discover pages and then index and rank them. This is part of the ranking mechanism and is an important search engine optimisation issue. The site errors shown in Google Search Console are an indicator of design problems on the site which can create substantial crawl issues.

Crawl issues usually arise from a badly configured robots.txt file, which the robots read to find out their permissions. This file tells the robots which pages to crawl or not. If the configuration is wrong, the whole site can be placed out of the purview of the search engines, which will certainly show crawl errors to the tune of a hundred percent. Hence immediately look into the directory in the control panel and rectify the file. The absence of a robots.txt file means that there are no instructions and the crawler will wade through all the pages if possible.
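Python's standard urllib.robotparser module reads a robots.txt file the same way a crawler does, so it offers a quick check that the permissions say what you think they say; the URLs below are only examples.

import urllib.robotparser

robots = urllib.robotparser.RobotFileParser()
robots.set_url("http://www.websitepromotion-india.com/robots.txt")
robots.read()                                  # fetch and parse the file

# May Googlebot crawl the home page?
print(robots.can_fetch("Googlebot", "http://www.websitepromotion-india.com/"))

# May any crawler reach a hypothetical /private/ section?
print(robots.can_fetch("*", "http://www.websitepromotion-india.com/private/"))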

Another reason for errors creeping in is scripts that the robots are not able to decipher. Errors can also happen when pages are blocked, a directory is mistakenly deleted, or pages simply no longer exist because they have been mistakenly or deliberately deleted.

Low error rates usually just mean that part of the site is wrongly configured or that it is overburdened with too many pages.

Other problems:

DNS Errors: DNS here refers to the domain name server where the domain is parked; that DNS record is tied to the hosting server. Issues within this linkage, or some problem with the routing, may cause DNS errors.

Server Errors: the server is the space where your site's files and directories are hosted. There could be an error in the form of slow accessibility when the server times out, a breach by a virus or malware, and related problems. It can also happen due to a misconfigured firewall or operating-system protection. There is a mechanism to block too many server queries at once; this can sometimes go haywire and prevent Googlebot from accessing the files, which usually happens with dynamic URLs.
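A quick way to tell the two classes of error apart from your own machine is sketched below with Python's urllib: a reply carrying a 5xx code means the server answered but failed, while a URLError (failed DNS lookup, refused connection, timeout) means it could not be reached at all. The URL is only an example.

import urllib.request
import urllib.error

try:
    with urllib.request.urlopen("http://www.websitepromotion-india.com/", timeout=5) as r:
        print("OK:", r.status)
except urllib.error.HTTPError as err:
    # The server answered, but with an error code such as 500 or 503.
    print("Server error:", err.code, err.reason)
except urllib.error.URLError as err:
    # DNS failure, refused connection or timeout - the server was not reached.
    print("Could not reach the server:", err.reason)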

Webmasters can control how their site should be crawled; usually pages of no importance are blocked.

There can be many other issues; if enough material is not available, the best approach is to visit a forum and discuss the problem.

The Fetch as Google tool in Search Console is ideal for checking the crawlability of your website. Keeping crawling fluid is important for indexing and ranking, hence for SEO.