Monday, 29 June 2020

A Beginner's Guide to Google Search Console

How to Use Google Search Console


Google Search Console is a free web service offered by Google to check your website’s status on SERPs. Search Console is Google's main channel of communication with webmasters. You can monitor your website and troubleshoot any problems using Search Console. You will also be able to understand your site's performance on Google, including page visibility, total impressions, total clicks, and so on. The former name of Search Console was Google Webmaster Tools. It can alert you when Google encounters spam or other issues on the website. In many cases you don’t have to log in separately to verify your web page; Search Console can verify it automatically. Through this tool, you can see how your pages are indexed on SERPs and can request indexing of a new or modified web page. By submitting your site to Search Console, you are asking Google to verify your site and to index it. The steps involved in verifying a website on Google Search Console are:
  • Open google.com/webmasters/tools and paste the URL of the site to be verified.
  • In the next window, confirm the verification process. There are five ways to verify your site; if you are using a free Blogger site like me, you can go for the second option, i.e. HTML tag verification. I will explain the five types of verification later on.
  • Select the HTML tag, copy it, and paste it into the header HTML code of your Blogspot theme. If you are using WordPress, simply paste it into the Search Console tag box available in the general settings. Then come back to the Search Console window and click the "Verify" option.
  • Now the Search Console home page will open. Paste the URL of the website into the URL inspection box and press the Enter key. The result will be that the URL is not on Google.
  • Click the live test button available in this window. The result will now be that the URL is available on Google.
  • Click the "Request indexing" button and confirm the request.
Now you have successfully submitted your website to Google. Using the Search Console tool, it’s easy to maintain your website's presence on Google. For better SEO, you can modify your website at any time, submit it to Search Console for verification, and request indexing again.

The next part is verifying that a particular web page of your website can be crawled, cached, and indexed by Google. Paste the URL of the corresponding web page into the URL inspection box and repeat the same process.
If you follow these steps, your website will surely be indexed in SERPs. You can repeat this process for every blog post you add to your blog.


There are five ways to verify your website using Google Search Console:
  1. HTML file: verify the website by uploading an HTML file to the root directory of the website. If you are on a free hosting platform, this method may not be available, because you may not have access to the root directory of the server.
  2. HTML tag: copy the HTML tag and paste it into the header of the blog site. If you are using WordPress, simply paste it into the Search Console tag box.
  3. Google Analytics: if you already have a Google Analytics account, open the verification tab, select the Google Analytics radio button, and click verify.
  4. Google Tag Manager: as long as the Tag Manager snippet is active on your site, you can follow the same procedure as Google Analytics verification.
  5. Domain name provider: instead of pasting code into your pages, you can add a TXT record at your DNS provider for verification.
Take some time to set up a Google Search Console account; it will help you reach your target audience and get indexed on SERPs.
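As a sketch of the HTML tag method (option 2), Search Console gives you a meta tag like the one below to paste inside the head of your home page. The content token here is a made-up placeholder; Google generates a unique value for each property:

```html
<head>
  <!-- Placeholder token: Search Console generates the real value for your site -->
  <meta name="google-site-verification" content="AbC123exampleTokenXYZ" />
  <title>My Blog</title>
</head>
```

Once the tag is live on the page, clicking "Verify" in Search Console lets Google fetch the page and confirm ownership.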

Wednesday, 24 June 2020

Different Procedures Involved in On-page Optimization

On-page Optimization 

On-page optimization is the practice of optimizing a web page to get a better rank in the SERPs, and hence to increase organic traffic to your website. On-page optimization concentrates on the exact-match keyword in the title, meta description, and contents of the web page. A web page has two sections: the head section and the body section.

Head section Optimization

The head section of a web page has the URL, title, and meta description. It's very important to optimize your head section; doing so properly can boost your ranking in SERPs. Always try to include the focus keyword in the URL of the web page, so the search engine can easily crawl the URL and store it in the correct database. The next part to be optimized is the title of the web page.

Snippet

If we search for any query, there will be a number of results in the SERP; a snippet is a single result within it. Generally, a snippet consists of the URL, title, and meta description of the page. The focus keyword of your search will appear in the snippet, highlighted in the description.

SEO Title Optimization


In order to rank well in SERPs, you need to optimize your web page titles for the search engine. A well-written, relevant title is very important for getting users' attention, and an optimized title can increase traffic to your web pages. The title is the primary factor the search engine uses to determine whether your web page is relevant to a search query. To get organic traffic to your website, focus on the title tag, which is the HTML form of the title shown on SERPs. Try to include the focus keyword in the title, and make sure the title gives a clear idea of the page's contents so the search engine can rank it better. In short, if you want a higher ranking on SERPs, you should focus on title tag optimization.

Dos & Don'ts of Title Optimization

The title of a web page is a major factor in getting more organic traffic, and hence a higher rank in SERPs. So while writing a title, it is very important to avoid mistakes that can limit your website's reach to the relevant audience. Avoid spelling mistakes, grammar mistakes, and unfamiliar abbreviations. As per Google's title optimization guidelines, never write the title in all capital or all lowercase letters. Likewise, the title shouldn't be either too long or too short: it's advisable to use a maximum of 55 to 60 characters. Pixel width is also important; if the width exceeds 512 pixels, the title can't be displayed completely and may be cut off mid-word on SERPs. If you choose a very short title, the Google search engine may not be able to store the web page in the correct database, so never use fewer than three words. A weak title also gives Google little quality content to fetch, which can cost you relevant audience.

By optimizing our web page, we are competing with other pages targeting the same keyword for a higher rank in SERPs. But if we give the same title to two inner pages of our own website, it may result in keyword cannibalization between them. To avoid this fight between our own pages, always use unique titles. If we don't give a title at all, Google may show the H1 or H2 from the body of the page instead. Always try to write an impressive title, since it's very beneficial for getting more click-throughs. When we submit a web page to Google, Search Console may show title errors under HTML improvements; we can correct them and resubmit to the Google search engine.
By doing all these optimization practices, we are helping Google crawl our page easily and store it in the correct database, so that it shows our web page for relevant users' queries.
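Putting those guidelines together, the title tag sits in the head section of the page's HTML. The wording below is invented for illustration and keeps within the 55 to 60 character advice above:

```html
<head>
  <!-- Focus keyword near the front, unique per page, roughly 55-60 characters -->
  <title>Title Tag Optimization: A Beginner's On-Page SEO Guide</title>
</head>
```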

Meta Description Optimization


The meta description of a web page is the content shown in the snippet just below the title. It is a short paragraph about the contents of the page, written in a meta tag in the HTML. Keyword meta tags were formerly used as a ranking criterion, but Google later announced that meta tags play no role in ranking. Even though it is not a ranking factor, the meta description has a vital role in click-through rate, so it's very important to optimize it correctly.

Elements to consider while writing Meta description

The meta description should follow a concise, telegraphic style. It can be of any length, but it's advisable to limit it to 155-160 characters, or within 1,024 pixels of width. For a blog or article, limit the description to about 155 characters, since Google may also show the publication date. Always give a clear, impressive idea of the content; quality wording and the focus keyword are an added boost to click-through rate. The meta description may reuse sentences from the page itself, but it shouldn't be copied from other sites: it should be relevant and unique. If the given description is not relevant to the title, or is too thin, Google may generate a meta description itself, using sentences from the content, the description of our page from the Open Directory Project, or even the HTML code of our page.
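As a sketch, the meta description is written as a meta tag in the head section. The description below is a hypothetical example kept within the character limit discussed above:

```html
<head>
  <!-- Hypothetical description, unique to the page and under ~160 characters -->
  <meta name="description"
        content="Learn how to write page titles and meta descriptions that improve click-through rates from Google search results, with practical on-page SEO tips.">
</head>
```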

If our pages are getting impressions but no click-throughs, we can rewrite the meta description with more of a marketing angle, highlighting things like star ratings, discounts, or service specialties.

Body Portion Optimization

The body portion of a web page contains the page's content and images. Body portion optimization includes content optimization, keyword optimization, anchor tag optimization, and image optimization. The most visible part of the body is the first heading of the content, i.e. the H1, since its font size is the largest; hierarchically, the next most visible part is the H2, and so on. We should optimize the H1 first. We can write the H1 as long as we want, but it's advisable to limit it to 6 or 7 words; an overly long H1 won't look good to website visitors. No spelling or grammar mistakes are allowed in the H1, and it should be relevant to the content. A web page should have only one H1, otherwise it may lead to keyword cannibalization. Two contrasting H1s on one page may confuse the Google search engine, and Google may then store the page in a temporary database called the sandbox; a page in the sandbox gets no reach or ranking improvements. The next most visible part is the H2. A web page is supposed to have only one H2, but for a very broad topic we can use more than one on the same page. We can use as many H3s as we want, but the content below each heading should be relevant and match it.
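The heading rules above can be sketched as a simple HTML outline; the headings themselves are invented for illustration:

```html
<body>
  <h1>On-Page Optimization Guide</h1>    <!-- exactly one H1 per page -->

  <h2>Head Section Optimization</h2>     <!-- usually one H2; more only for broad topics -->
  <h3>Title Tags</h3>                    <!-- as many H3s as needed -->
  <h3>Meta Descriptions</h3>

  <h2>Body Section Optimization</h2>
  <h3>Content and Keywords</h3>
</body>
```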

Content Optimization 


The content of a web page is any piece of information provided on it, and it can directly affect the page's ranking. The website should always be user-friendly to attract organic traffic. A web page with rich content can rank higher in SERPs if it's optimized correctly. Always try to write simple sentences, not exceeding 20 words, and use understandable phrases. Each sentence should end with a full stop, and give a space after every full stop and comma. If we follow this correctly, the web page can score well in Google's readability assessment. Try to include the focus keyword in the first or second sentence of the content.

Keyword Optimization 


It’s very crucial to discover, research, analyze, and select the correct search engine keywords for the website. Choosing the best keyword for your page is very important; a wrong selection can make your entire effort vain. Keywords play a major role in both organic traffic and PPC. Keyword optimization is the toughest part of SEO, since you are not the only one doing it: every other webmaster is optimizing for the same terms. To get the maximum reach for the website, you need deep research on your target keywords. By choosing the correct focus keyword, we aim to get qualified traffic from search engines, matching the exact query of the users. Good quality content along with the focus keyword will help the page index higher in SERPs.

Keyword density is the percentage occurrence of the keyword in the whole content of the web page. It can be used to check the relevance of a page for a focus keyword. Old-school SEO articles (describing practices formerly used but no longer effective) spread many myths, for instance that a keyword density of 5% is very good for boosting ranking, but that anything above 7% may get the page flagged for keyword stuffing. Google later explained that keyword density has no role in ranking, though it is still better to include the keyword a few times in the content so that the Google search engine understands the page is relevant to the focus keyword. Another myth was that writing focus keywords in bold improves rank; in fact it does not affect ranking at all. It is still advisable to bold some words for readability: bolding keywords helps users scan the content without skipping sentences, since readers focus on bold words first and infer the content from them. Introducing the focus keyword right after a full stop or a comma is also said to help. Always keep in mind that optimization is not only for the Google spider but also for users, to give them a better reading experience.

Anchor Tag Optimization 


The anchor tag is the HTML element used to add a hyperlink from one website to another site or location. Anchor text is the visible text that the hyperlink shows when linking to another web page. Never overuse the focus keyword as anchor text, since it can reduce the ranking of your website. A good proportion of natural anchor text acts as fuel for the crawling of the Google spider. Don't overdo the optimization practice, and avoid hyperlinks to or from link-spammy sites. To earn more links to your page, you can engage in relevant guest blogging, but never overdo it: that can lead to link spamming. Proper anchor text will help you rank better in SERPs.
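As a hedged illustration of these points, the URL and anchor wording below are placeholders; note how the anchor text describes the target page instead of repeating an exact-match keyword:

```html
<!-- Descriptive anchor text rather than a stuffed exact-match keyword -->
<a href="https://example.com/on-page-seo">this beginner-friendly guide to on-page SEO</a>
```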

Image Optimization


Image optimization is comparatively the easiest form of optimization. Use relevant images that match the content of your text, and reduce their file size for faster loading of the site. Add alt text to your images; it helps the search engine recognize what each image shows. If you also add an image title, its name is displayed when the user hovers over the image. Both the alt text and the image title help the search engine scan the image more easily. Image optimization also plays a role in getting indexed higher in the SERPs.
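A minimal sketch of an optimized image tag, with a placeholder file name; the alt text describes the image for the search engine, and the title appears when the user hovers:

```html
<!-- Compressed file, descriptive alt text, and a hover title -->
<img src="images/search-console-dashboard.png"
     alt="Google Search Console performance report showing clicks and impressions"
     title="Search Console performance report"
     width="800" height="450">
```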

A Detailed Description of Google's Strong Algorithm Updates

Google Strong Algorithm Updates 


Google PANDA Update 

The Google Panda update was a vital modification to the Google search results algorithm aimed at content spamming. This update rolled out on February 23, 2011. The Panda update promises good-quality search results by eliminating low-quality content and thin pages, and it plays a crucial part in how Google ranks web pages. Any kind of content spamming will be caught by this update, giving users a better experience on Google.


Categories of Content Spamming
  • Content duplication or Content plagiarism 
  • Low-quality content, spelling mistakes, grammatical errors
  • Content spinning 
  • Thin pages
  • Automated contents
Later on, Panda received many follow-up updates. Panda 4.0 was the major one: with it, Google decided to merge Panda into its main search results algorithm. With the introduction of the Panda updates, many thin-page websites were eliminated from the SERP, content-rich websites rose in ranking, and webmasters started producing quality content, so the overall quality of the SERP improved.
 
Penguin Update

After the successful impact of the Panda update, Google came up with another strong update, the Google Penguin update, launched on April 24, 2012. The Penguin update targets link spamming in order to keep search results on the SERP high quality.


Categories of link spamming
  • Paid links
  • Link exchange 
  • Link farming
  • Low-quality directory link submission 
  • Comment spamming
  • Wiki spamming
  • Excessive guest blogging
A major revision to Penguin was the Penguin 4.0 update. After it, Penguin became a real-time filter in Google's search results algorithm: whenever link spamming is noticed, Google acts immediately. This, too, improved the search results on the Google SERP.

Pigeon Update 

The Pigeon update is one of Google's local algorithm updates, improving the ranking of local search results on the SERP. It also affects the results shown in Google Maps. With this update, Google is able to show local results for situations where we need them, such as finding a restaurant, salon, or supermarket. The criteria for doing local SEO are:

  • The website should submit its full details on Google My Business
  • The web page should contain all contact information and the address
  • Add the contact number to local phone directories like Justdial and Ask dial
  • Do social media marketing, preferably localized
Hummingbird Update 

The Hummingbird update rolled out in 2013 and was announced about a month later, on September 26th. Unlike Panda and Penguin, Hummingbird is not a punishment-based update. It's a significant change to the Google algorithm intended to give users a better experience on the search engine: with it, Google tried to understand the intention behind a search and provide the best, most descriptive results.

RankBrain Update


Using artificial intelligence, Google tries to provide results by understanding the context of the search. Google confirmed the use of the RankBrain algorithm on October 26, 2015. The search results automatically adapt to location, time, situation, and so on.

Mobilegeddon Update


This update to the Google search engine algorithm rolled out on April 21, 2015. With it, Google aims to favor mobile-friendly websites in search results over desktop-only websites, since the latter are difficult to use on mobile phones. There were two major releases of the Mobilegeddon update, i.e. Mobilegeddon 1 and Mobilegeddon 2.

Parked Domain Update

This update aims to eliminate parked domains from the SERP. A parked domain is a registered domain that is not linked to any website: you book the domain for future use but currently do nothing with it. In earlier times Google would show such parked domains on the SERP, but after this update, Google eliminates them.

Exact Match Domain (EMD)


This filter update was released in September 2012. The EMD update focuses on eliminating websites that have the exact search keyword as their domain name but only low-quality content. If such websites improve the quality and richness of their content, they may regain their rankings.

Pirate Update


The Google Pirate update, introduced in August 2012, aims to demote sites hosting pirated content. We can complain to Google under the Digital Millennium Copyright Act (DMCA), implemented in the US back in 1998. The Pirate update, or DMCA penalty, changes Google search results directly to remove pirated websites that violate copyright rules and regulations.

Monday, 22 June 2020

History and Evolution of Google's SEO


History and Evolution of Search Engine Optimization





SEO is the process of improving the rank of a website so that it appears in organic search results. Search engine optimization uses different kinds of optimization practices to increase a website's rank, and a higher rank brings more traffic to the website. To study the history of search engine optimization, we first need an idea of the history of Google, since SEO practices were introduced during the evolution of the Google search engine.


History of Google
Google was officially launched in 1998 by Larry Page and Sergey Brin to market Google Search, which has become the most used web-based search engine. Larry Page and Sergey Brin, students at Stanford University in California, developed a search algorithm at first known as "BackRub" in 1996, with the help of Scott Hassan and Alan Steremberg.

Google works using three processes:

1. Crawling

2. Caching

3. Indexing

Crawling :

Using a programmed algorithm, Google scans the entire website and stores the data in the corresponding database. The crawler is also known as the spider, web crawler, bot, robot, etc.

Caching :

After scanning a web page, Google takes a snapshot of it and keeps it. This process is called caching.

Indexing :

As the name implies, indexing means that a website is ranked according to its content, quality, and many other parameters.

In its early days, Google was not a problem-free search engine; it had many hurdles. Initially, if a person searched a query on Google, the result was delivered after 24 hours by mail; only later did Google become a real-time search engine. Even then it had many problems, and many search results were missing for relevant queries. At the time of the World Trade Center attack in America, this became a very big issue, since Google was not able to give good search results on the event. Google held a meeting with search engineers to solve this problem and took their opinions. They explained that many websites could not be crawled, and discussed how to make web pages crawlable by the spider; the search engineers pointed out that only the webmasters could do that work.

Google then decided to publish optimization guidance for webmasters. Some search engineers opposed this, but trust and acceptance from the public was Google's major concern, so at last they published a starter guide for webmasters. Webmasters started to practice optimization, which improved Google's database and search results. But some webmasters overdid it and badly affected Google's search results, and there began the process of the evolution of Google.


Evolution of Google    


Initially, Google was content-specific, or niche-specific: the more we used the focus keyword on the website, the better the chances of a good ranking in Google search results. This ultimately resulted in keyword stuffing, which is a black hat SEO technique.

Types of SEO

Three types of SEO

  • Black hat SEO 

  • White hat SEO 

  • Grey hat SEO 

White hat SEO 

It’s the correct way of practicing SEO: purely ethical and following Google's guidelines. It helps improve the ranking of the website on the search engine results page (SERP).

Black hat SEO 


It’s an unethical and highly risky way of optimization. Even though it can be effective at getting a high rank in the SERP, once it is caught, the page will be penalized. Never use black hat SEO techniques.

Grey hat SEO 


Grey hat SEO doesn’t have a precise definition; it sits somewhere between white hat and black hat SEO: practices intended to increase the ranking of a page, for which the webmaster adopts some black hat techniques. If you are using a grey hat SEO technique, you probably wouldn't want to tell Google about it.

Because of keyword stuffing and many other black hat SEO techniques, the quality of search results declined. Google then changed the algorithm to be link-specific, meaning the ranking of a website depended on the hyperlinks it received from other websites. But this also hurt Google's search results, since many websites started selling links for money, again producing low-quality results. So Google made the next evolution: the algorithm became quality-link-specific. The quality of a website was determined using a parameter called PageRank, measured out of 10. Based on a site's backlink profile, PageRank was Google's first algorithmic computation used to determine the rank of a website; Google checks more than 200 parameters of a website to decide it. PageRank reflects the trust value of a website, and Google counts links from high-PageRank websites more heavily. As a result, web pages with high PageRank tried to sell links at a higher cost, which again led to low-quality search results. To overcome this, Google made another update.


Passing the juice




Link juice is a term used to indicate how strong a website is: a website with very high link juice, or equity, has high strength and quality. Passing the juice means that when a website gives a hyperlink, some of its equity is passed along with the link. So as the number of links given increases, the PageRank of the linking website may be reduced. Links from on-topic websites have a positive impact on PageRank. The hyperlink is given using an anchor tag; if we want the link to be a reference only, rel="nofollow" should be included in the anchor tag, and then no link juice will be transferred through it.
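The difference can be sketched with two anchor tags (the URLs are placeholders): the first passes link juice to the target, while the second, marked rel="nofollow", is treated as a reference only:

```html
<!-- A normal link: some equity passes to the target page -->
<a href="https://example.com/related-guide">related guide</a>

<!-- A reference-only link: rel="nofollow" asks Google not to pass link juice -->
<a href="https://example.com/citation" rel="nofollow">original source</a>
```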

Google Ads



Google Ads is a platform used to show ads on the Google search engine on a prepaid basis. We can customize our ads on Google according to our needs. Its former name was Google AdWords. It works on a CPC model, i.e. cost per click; another name for Google Ads is Google PPC (pay per click).

Google Adsense



Google AdSense is a simple way of earning money from Google by displaying ads on your website. There are certain criteria to get AdSense approval from Google.

Later on, Google started to respond more in an interactive way. Google tried to auto-fill or suggest our query by storing our previous searches.


Bounce rate




The bounce rate is the percentage of visitors who leave a website after viewing only one page, which reflects how little time people spend on the site. A high bounce rate badly affects the ranking of the site.

Personalized search result

Google documents our search history and gives more personalized search results. This kind of search result powers both re-targeting and up-sell types of ads in our results.

With the next evolution, Google tried to become a personal assistant. Later on, the ranking of a website also came to depend on social media signals: a website with more shares on social media got a higher rank. Webmasters then chased social media shares, which resulted in low-quality search results, so Google refined the social media signal by introducing social media authority value. This means ranking depends not only on the number of shares but also on how many people interacted and on the influence of those people.

After this modification, Google became a better search engine. Then Google made some strong algorithm updates.



