
Tuesday, February 23, 2021

Web Vitals Are Quality Signals That Rank A Site! & Measuring Tools

 Although not yet fully integrated, these are future unified quality signals that Google is going to include in its vast set of ranking metrics. Web vitals are issues every webmaster should understand and improve with the help of web developers. 

Mobile friendliness and good download speed are among the metrics that search engines have emphasised from time to time. In fact, making the site responsive is essential for better performance on the SERPs (Search Engine Result Pages).   

User experience is of utmost importance, and no search engine is going to ignore it. UX is measured by your performance and is not limited to a few metrics; rather, it is holistic in nature. What makes visitors stick to a site, spend time there and interact are the signs of a good on-site experience, and you bring this about through the character of your web page, its attributes and a smooth technical experience.     

On-page content and attributes are the most vital ranking factors, but they are assimilated on the page and have to be delivered smoothly, without any glitch. Ranking therefore depends on a combination of factors: a site with excellent content relevant to the query will eventually fail if it takes an eternity to download and has functions that are ill designed and poorly coded.    

Thus the core web vitals revolve around download speed, visual stability and interactivity. Visual stability means that attributes such as images, graphs, video and text remain stable and do not shift around, or even out of sight, while the page loads. Interactivity means that the interactive functions work well and do not fail when they are most needed.      
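At the time of writing Google reports these three vitals as Largest Contentful Paint (loading), Cumulative Layout Shift (visual stability) and First Input Delay (interactivity). As a rough idea of what the measuring tools are tracking, here is a minimal browser sketch using the standard PerformanceObserver API; it simply logs LCP candidates and adds up layout-shift scores, which simplifies how CLS is actually calculated in the tools.

```typescript
// Minimal sketch: observe the raw browser entries behind two of the vitals.
// Assumes a modern Chromium-based browser; run it in the page's own script.

interface LayoutShiftEntry extends PerformanceEntry {
  value: number;           // how much the layout moved
  hadRecentInput: boolean; // shifts right after user input are excluded
}

// Largest Contentful Paint: the render time of the biggest element so far.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    console.log('LCP candidate at', entry.startTime, 'ms');
  }
}).observe({ type: 'largest-contentful-paint', buffered: true });

// Cumulative Layout Shift: sum the scores of unexpected shifts.
let clsScore = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as LayoutShiftEntry[]) {
    if (!entry.hadRecentInput) {
      clsScore += entry.value;
    }
  }
  console.log('CLS so far:', clsScore.toFixed(3));
}).observe({ type: 'layout-shift', buffered: true });
```

In practice you would rely on Google's own tools or its web-vitals library rather than raw observers, but the sketch shows what "loading" and "visual stability" boil down to.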

There are tools that measure these aspects and grade page performance in one of three bands (the sketch after this list shows the cut-offs):

Good (Green) 
Needs Improvement (Orange) 
Poor (Red)
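The cut-off points between these bands are published by Google: roughly 2.5 and 4 seconds for Largest Contentful Paint, 100 and 300 milliseconds for First Input Delay, and 0.1 and 0.25 for Cumulative Layout Shift, assessed at the 75th percentile of real-user data. The small classifier below is a hypothetical helper, not part of any tool, showing how a measured value falls into one of the three colours.

```typescript
// Hypothetical classifier using the thresholds Google publishes for the
// three Core Web Vitals ("good" / "needs improvement" / "poor").

type Rating = 'Good (Green)' | 'Needs Improvement (Orange)' | 'Poor (Red)';

// [good upper bound, poor lower bound] per metric, in the metric's own unit.
const thresholds = {
  LCP: [2500, 4000], // milliseconds
  FID: [100, 300],   // milliseconds
  CLS: [0.1, 0.25],  // unitless score
} as const;

function rate(metric: keyof typeof thresholds, value: number): Rating {
  const [good, poor] = thresholds[metric];
  if (value <= good) return 'Good (Green)';
  if (value <= poor) return 'Needs Improvement (Orange)';
  return 'Poor (Red)';
}

console.log(rate('LCP', 1800)); // Good (Green)
console.log(rate('CLS', 0.31)); // Poor (Red)
```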

Webmasters should be able to access and analyse these vitals and engage the development team or designer to bring about improvements. These parameters fall within the purview of web developers.     

Get Familiar with some reports and tools here:

Chrome User Experience Report (CrUX) 

This report is currently limited to the download (loading) aspects of a site, but more metrics are going to be added. 
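The field data behind this report can also be queried programmatically. Here is a hedged sketch, assuming access to the Chrome UX Report API's records:queryRecord endpoint; the API key and the origin below are placeholders.

```typescript
// Sketch: ask the Chrome UX Report API for field data about an origin.
// CRUX_API_KEY and the origin are placeholders; the response shape is
// summarised from the public API and should be checked against its docs.

const CRUX_API_KEY = 'YOUR_API_KEY';

async function queryCrux(origin: string): Promise<void> {
  const response = await fetch(
    `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${CRUX_API_KEY}`,
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ origin }),
    }
  );
  const data = await response.json();
  // Each metric carries a 75th-percentile value and a histogram of users.
  const lcp = data?.record?.metrics?.largest_contentful_paint;
  console.log('LCP p75 for', origin, ':', lcp?.percentiles?.p75, 'ms');
}

queryCrux('https://www.example.com');
```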

Search Console Core Web Vitals Report 

Poor page loading has a severe negative effect on the bounce rate and is a big turn-off, making the visitor jump to another site promising the same answer. Read the report pertaining to these issues in Google Search Console. 

For example, if page load time increases from one second to three seconds, the probability of a bounce increases by about thirty-two percent. 

Although most SEO specialists or webmasters rely on the default character of their clients' websites, those with good development support should scrutinize these vitals and get things improved. A substantial impact can be assumed once these metrics are fully rolled out, but time will tell. Hence we need to get familiar with Core Web Vitals right now.    

After the issues have been identified in the report, rectification has to be done in the developer's lab. Here is an article on following up the Search Console report.   

Some of the issues that crop up after analysis will improve your understanding of the technical aspects of a website. Well and good if you can make the improvements yourself; otherwise rely on the developer's lab.  

SUGGESTIONS 

Eliminate render-blocking resources

Properly size images

Remove unused JavaScript

Serve images in next-gen formats

Reduce initial server response time

Accessibility

Elements with an ARIA `[role]` that require children to contain a specific `[role]` are missing some or all of those required children.

Image elements do not have `[alt]` attributes

`[user-scalable="no"]` is used in the `<meta name="viewport">` element or the `[maximum-scale]` attribute is less than 5.

Preload key requests

Does not use HTTPS

You can use this site performance tool to evaluate your site and find suggestions. 
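Assuming the tool in question is PageSpeed Insights (the original link is not preserved here), the same kind of audit can also be pulled over its public v5 API. A rough sketch; the key is a placeholder and the response field names should be verified against the current documentation:

```typescript
// Sketch: run a PageSpeed Insights (Lighthouse) audit over the public API.
// PSI_API_KEY is a placeholder; field names follow the v5 response shape.

const PSI_API_KEY = 'YOUR_API_KEY';

async function auditPage(url: string): Promise<void> {
  const endpoint =
    'https://www.googleapis.com/pagespeedonline/v5/runPagespeed' +
    `?url=${encodeURIComponent(url)}&key=${PSI_API_KEY}&strategy=mobile`;

  const data = await (await fetch(endpoint)).json();
  const lighthouse = data?.lighthouseResult;

  console.log('Performance score:',
    (lighthouse?.categories?.performance?.score ?? 0) * 100);

  // Failing audits roughly correspond to the suggestions listed above,
  // e.g. "render-blocking-resources" or "uses-responsive-images".
  for (const [id, audit] of Object.entries<any>(lighthouse?.audits ?? {})) {
    if (typeof audit.score === 'number' && audit.score < 0.9) {
      console.log('Needs attention:', id, '-', audit.title);
    }
  }
}

auditPage('https://www.example.com');
```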

=============================================  

Uday provides search engine optimization services for digital marketing. He provides website content and content for authoritative links. 

He also teaches digital marketing in his home town Jabalpur. He can be contacted at: 

pateluday90@hotmail.com

09755089323          

Sunday, December 30, 2018

Fetch as Google & AngularJS

In order to be well documented by the search engines, the robot should be able to traverse the site thoroughly and with ease. This is imperative for indexing and ranking by search engines like Google. Although some search engines have become more efficient at negotiating scripts, the problem still persists. Using a pre-rendering platform can assist the robots in deciphering the site properly, since a fully cached version is presented to them by such platforms.   
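A common pre-rendering pattern is to detect crawler user agents at the server and hand them a cached, fully rendered copy of the page. Here is a minimal sketch, assuming a Node.js server with Express; getPrerenderedHtml() is a hypothetical stand-in for whatever cache the pre-rendering platform maintains.

```typescript
// Sketch of the pre-rendering idea: bots get cached, fully rendered HTML,
// ordinary visitors get the normal script-driven page. Express and
// getPrerenderedHtml() are assumptions for illustration only.

import express from 'express';
import path from 'node:path';

const app = express();

const BOT_PATTERN = /googlebot|bingbot|yandex|baiduspider|duckduckbot/i;

// Hypothetical cache lookup; a real platform renders the page in a
// headless browser ahead of time and stores the resulting HTML.
async function getPrerenderedHtml(pagePath: string): Promise<string> {
  return `<html><body><h1>Cached render of ${pagePath}</h1></body></html>`;
}

app.get('*', async (req, res) => {
  const userAgent = req.headers['user-agent'] ?? '';
  if (BOT_PATTERN.test(userAgent)) {
    // Crawler: serve the static snapshot so nothing depends on scripts.
    res.send(await getPrerenderedHtml(req.path));
  } else {
    // Human visitor: serve the normal application shell.
    res.sendFile(path.resolve('public', 'index.html'));
  }
});

app.listen(3000);
```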

Unfortunately, some scripts impede the movement of the robot, and this can result in the site being only partially indexed. This is more problematic when the site uses scripts that provide dynamic on-page functionality, such as AngularJS. These scripts make it difficult to render the HTML of navigational links and indexable content. Although AngularJS was made by Google, it is quite susceptible to this problem, which can affect your ranking and hence your organic search traffic.

AngularJS is used to create dynamic web pages; it works on the client side, without the need for the web page to refresh every time an action is requested. Using this framework a mix of elements can be changed quickly. The script is highly useful on e-commerce sites, where various product dimensions have to be exhibited quickly. There is no need in this case to request new pages or refresh; this decreases the bounce rate by improving the user experience.  
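This same behaviour is what makes such pages harder for a crawler that does not execute scripts: the HTML that arrives from the server is only a shell, and the product details appear later through script. Below is a framework-agnostic sketch of the pattern, using plain DOM calls rather than AngularJS; the /api/products endpoint and the Product shape are made up for illustration.

```typescript
// Client-side rendering in miniature: the initial HTML contains only an
// empty container, and the visible content is injected by script after a
// data request. A crawler that does not run this script sees the shell.
// The /api/products endpoint and the Product shape are assumptions.

interface Product {
  name: string;
  price: number;
}

async function showProducts(): Promise<void> {
  const list = document.querySelector('#product-list');
  if (!list) return;

  const products: Product[] = await (await fetch('/api/products')).json();

  // Swap the product cards in place, with no full page refresh.
  list.innerHTML = products
    .map((p) => `<li>${p.name} - ${p.price.toFixed(2)}</li>`)
    .join('');
}

showProducts();
```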

It is important that the elements on a web page do not make rendering or crawling difficult. To check whether your website is being properly rendered, use the Google tool available in Search Console called Fetch as Google. This tool is available only for a site that has been registered in that Search Console account; you cannot check any other site without registering it. The support provides an accurate analysis of how the robot has gone through your site and hence informs you about the indexing status of the whole site as well as of a single page. Fetch as Google works for both desktop and mobile sites, and the documentation in Search Console is detailed enough to let you know the extent of indexing being done by the search engine. 

When you register at Search Console and sign in, you will find this support amongst the other utilities. 
If you leave the Fetch as Google field empty, it will check the root of the site. You can also check the status of a single fetch using the support.
=============================================

Uday provides SEO Services and Website Contents
He teaches Internet Marketing at Jabalpur
Contact: pateluday90@hotmail.com 

Monday, November 20, 2017

Website Crawl Errors & More

Search engine robots crawl your website to discover pages and then index and rank them. This is part of the ranking mechanism and an important search engine optimisation issue. The site errors shown in Google Search Console are an indicator of design problems that can create substantial crawl issues.  

Crawl issues usually arise from a badly configured robots.txt file, which the robots read to find out their permissions. This file tells the robot which pages to crawl and which not to. If the configuration is wrong, the whole site can fall outside the purview of the search engines, which will certainly show crawl errors to the tune of a hundred percent. Hence immediately peek into the directory in the control panel and rectify the file. The absence of a robots.txt file means that there are no instructions, and the crawler will wade through all the pages if possible. 
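As an illustration (the paths here are made up), a minimal robots.txt that lets crawlers into everything except one private directory and points them to the sitemap looks like this:

```
User-agent: *
Disallow: /private/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```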

Another reason for errors creeping in is scripts that the robots are not able to decipher. Errors can also appear when pages are blocked, a directory is mistakenly deleted, or pages simply do not exist because they have been mistakenly or deliberately deleted.     

A low error rate may simply mean that part of the site is wrongly configured or that it is overburdened with too many pages. 

Other problems:

DNS Errors: DNS stands for Domain Name System, where the domain is parked; the DNS server is clubbed with the hosting server. Issues within this linkage may cause a DNS error, or some problem with the routing may exist.    

Server Errors: The server is the space where your site's files and directories are hosted. The error could take the form of slow accessibility when the server times out, or there could be a breach by a virus or malware and related problems. It could also happen due to a misconfigured firewall or operating-system protection. There is a mechanism to block too many server queries; this can sometimes go haywire and prevent Googlebot from accessing the files. This usually happens with dynamic URLs. 

Webmasters can control how their site should be crawled; usually pages of no importance are blocked.

There can be many other issues; the best approach is to visit a forum and discuss the issue if enough material is not available. 

The Fetch as Google tool in Search Console is ideal for checking the crawlability of your website. Keeping the crawling fluid is important for indexing and ranking, and hence for SEO.