SEO 101

This guide covers SEO from A to Z: what SEO is, its types, methods, and techniques, the history and evolution of search, Google algorithm changes, and practical SEO tips.

Table of Contents

1.0 What is SEO
1.1 SEO Methods
1.2 SEO Types
1.3 History of SEO & Google Search
1.4 Search Engine Basics
1.5 Why is SEO Important
1.6 Should I hire an SEO professional, consultant, or agency?
1.7 Google Algorithm Updates  
1.8 Google 101
1.9 EAT

1.0 What is SEO

Search Engine Optimization (SEO) is the process of making changes to your website's design and content so that it ranks higher when a user searches in a search engine. Optimizing your website increases its visibility among the users you want to reach.


1.1 Methods of SEO

1.1.1 White Hat SEO

White Hat SEO is the practice of ranking your website without breaking the rules, that is, by always following the terms and conditions framed by the search engines. Failing to follow White Hat practices may get your website penalized or even banned by Google and other search engines.

Following the White Hat SEO method is undoubtedly the best way to keep your website ethical, reliable, sustainable, and successful. Examples include fast loading times, mobile friendliness, quality long-form content, and easy page navigation.

1.1.2 Black Hat SEO

Black Hat SEO is the opposite of White Hat SEO: it is the practice of ranking websites using techniques and strategies that break the search engines' rules.

Here are the techniques that should be avoided:

  • Automatically generated content

Content generated programmatically and published not to help users but merely to manipulate search rankings.

  • Creating pages with little or no original content

The content published on your page should not be copied or cobbled together from other web pages.

  • Sneaky redirects

Redirecting the user or search engine to a URL different from the one they requested. (Using JavaScript to redirect users can be a legitimate practice in some cases.)

  • Doorway pages

Pages created only to rank highly in search engines. They lead users to a destination different from what they expected and fill search results with multiple similar pages.

  • Participating in affiliate programs without adding sufficient value

Sites that reuse the same content across many web pages without adding anything useful. Sites that simply republish content from affiliate networks will suffer in Google's search rankings.

  • Creating pages with malicious behavior, such as phishing or installing viruses, trojans, or other badware

Pages that download and install software without the user's permission. Google focuses not only on displaying the most relevant search results but also on keeping its users safe.

  • Participating in link schemes

Placing links on websites purely to manipulate page ranking. This covers both links pointing to your site and outgoing links from your site.

  • Cloaking

Cloaking is presenting different content or URLs to search engines than to users, so visitors see something other than what was indexed.

  • Hidden text or links

Hiding text or links so that visitors cannot easily see them, purely to manipulate search engine rankings. Text can be hidden in several ways, such as:

  • Using white text on a white background
  • Placing text behind images
  • Hiding a link behind a small character, such as a hyphen in a paragraph
  • Scraped content

Content copied from other sites and posted without any additional or useful information to make the site unique. In some cases it can also lead to copyright issues.

  • Loading pages with irrelevant keywords

Pages filled with keywords that are not even relevant to the page, which creates a negative user experience and harms the site's ranking.

  • Abusing structured data markup

Pages that abuse structured data markup may receive a less favorable ranking and become ineligible for rich results in Google Search, since Google wants to keep the search experience high quality for users.

  • Sending automated queries to Google

Sites should not send automated queries to Google, such as programmatically checking how a web page ranks in Google search results, without Google's permission.

1.1.3 Grey Hat SEO

Grey Hat SEO is not simply something between White Hat and Black Hat SEO; it covers practices whose rules Google has not clearly defined. It is important for bloggers to understand Grey Hat SEO: handled well it causes no negative consequences, but handled badly it can cost you thousands of visitors in lost traffic.

1.2 SEO Types

SEO has two main types:

1.2.1 On-Page SEO

On-Page SEO covers the measures taken within your website, such as optimizing the meta description, HTML code, title tags, and alt tags, to improve its position in search rankings. It also covers overall content quality, page performance, and content structure.
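To make this concrete, here is a minimal sketch of the on-page elements mentioned above; the page, product, and image names are hypothetical placeholders:

    <!DOCTYPE html>
    <!-- A hypothetical product page illustrating common on-page elements -->
    <html lang="en">
      <head>
        <!-- Title tag: concise, descriptive, unique per page -->
        <title>Handmade Leather Wallets | Example Store</title>
        <!-- Meta description: the snippet search engines may show under the title -->
        <meta name="description" content="Browse handmade leather wallets crafted from full-grain leather.">
      </head>
      <body>
        <h1>Handmade Leather Wallets</h1>
        <!-- Alt text describes the image for users and for search engines -->
        <img src="/images/brown-leather-wallet.jpg" alt="Brown full-grain leather bifold wallet">
      </body>
    </html>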

1.2.2 Off-Page SEO

Off-Page SEO deals with the ranking factors that occur outside your website and are not fully under your control, such as backlinks from other sites, domain authority, and social promotion.

1.3 History of SEO & Google Search

The history of SEO can be traced back to 1991, with the launch of the first website. SEO works hand in hand with search engine marketing, and the story officially picked up speed around 1997, when the term came into common use. At the time, the World Wide Web was still a very new concept, largely unknown to the average person.

Before "Search Engine Optimization" became the accepted term, other names doing the rounds included:

  • Search Engine Placement
  • Search Engine Positioning
  • Search Engine Ranking
  • Search Engine Registration
  • Search Engine Submission
  • Website Promotion

The initial years saw on-page activities as the only form of performing SEO. The purpose included, among other things, ensuring the content was right and relevant with precise HTML tags and complete with external and internal links.

As search engines became household names all over and more families and friends became connected to the Internet, there was greater ease in searching for information. However, the problem lay in the assessment of the quality of that information, which was made available to the user.

The year 2005 was a landmark year in the world of search engines. In January that year, Google joined hands with Yahoo and MSN to bring out the nofollow attribute, which helped reduce spammy links and comments on sites, with a particular focus on blogs.

Then in June the same year, Google introduced the personalized search service. It began to display search results more relevantly based on the user’s search and browser history.

Then, later in November, the domain saw the launch of Google Analytics, which is used globally to measure campaign ROI and website traffic.

In recent years, the landmark announcement was Google's 2015 update, which made it known that websites that weren't mobile-friendly would be ranked lower in search results. It ushered in the era of responsive websites designed to work across devices, in conjunction with keywords and content.

1.3.1 History of Google

In 1996, Larry Page and Sergey Brin, who were computer science students at Stanford University, began to build a search engine that they initially called BackRub.

Larry Page helped conceive a system that crawled the internet to find out which pages linked with other web pages, as that could lead to the creation of a new search engine prototype.

Along with Sergey Brin's mathematical expertise, the two created the PageRank algorithm, named after Larry's surname, to rank search results based on their linking behavior. These two technologies laid the foundation for the most powerful search engine in the world, which launched on Stanford University's private network in August 1996.

As the index grew to connect a large number of links and the search engine returned far more accurate results, they thought of renaming BackRub to Googol, after the mathematical term for one followed by a hundred zeros. Then, when someone who had to pay them accidentally wrote the cheque with the name spelled "Google", the spelling stuck. And thus, Google came about in 1998!

Larry and Sergey's mission, "to organize the world's information and make it universally accessible and useful", is still at the core of all their endeavors.

1.4 Search Engine Basics

1.4.1 Webmaster Guidelines

The webmaster guidelines are a ready reckoner on how to navigate using Google’s expert advice.

Helping Google find, index, and rank a site takes some expert assistance, and the General Guidelines section covers the essential aspects.

The illicit practices that can get a website removed through a spam action can be avoided by following the Quality Guidelines. Typically, a manual or algorithmic spam action leads to removal from the Google index, so the site may not show up for search queries on Google or on Google partner sites.

The Content-specific Guidelines, which cover making websites AMP-friendly and helping Googlebot handle Ajax-powered experiences, are also part of this section, along with tips on progressive enhancement.

1.4.2 General guidelines

1.4.2.1. Search Engine Optimization (SEO) Starter Guide

This guide helps anyone who owns, manages, monetizes, or promotes online content through Google Search. SEO is often about making small modifications to parts of the website; individually the changes may seem incremental, but combined they can produce a noticeable improvement.

1.4.2.2. Secure the site with HTTPS

Hypertext Transfer Protocol Secure (HTTPS) is a protocol for internet communication which safeguards the integrity and confidentiality of data between the user’s personal computer and the website.

Transport Layer Security protocol (TLS) secures the data sent via HTTPS, which provides three essential layers of protection:

  • Encryption
  • Data integrity
  • Authentication

Here are some of the best practices when implementing HTTPS:

  • Using robust security certificates.
  • Using the server-side 301 redirects.
  • Verifying that Google can crawl and index the HTTPS pages.
  • Support HTTP Strict Transport Security (HSTS).
  • Considering the use of HSTS preloading.
  • Adding the new HTTPS property to Search Console when migrating from HTTP.
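As a rough sketch of two of these practices, the exchange below shows what a server-side 301 redirect from HTTP to HTTPS and an HSTS response header can look like at the HTTP level; example.com and the max-age value are illustrative, not prescriptive.

A request to the old HTTP URL receives a permanent redirect:

    GET /page HTTP/1.1
    Host: example.com

    HTTP/1.1 301 Moved Permanently
    Location: https://example.com/page

Responses served over HTTPS then include an HSTS header, telling browsers to keep using HTTPS on future visits:

    HTTP/1.1 200 OK
    Strict-Transport-Security: max-age=31536000; includeSubDomains; preload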

1.4.2.3. Keep a simple URL structure.

An unwantedly high number of URLs can be caused by several things, including:

  • A set of items with additive filtering.
  • Generating documents dynamically.
  • URLs have a problematic parameter.
  • Sorting of parameters.
  • Having irrelevant parameters in the URL, like referral parameters, for example.
  • Having calendar issues.
  • Presence of broken relative links.

To avoid potential problems with the URL structure, consider the following steps:

  • Consider using a robots.txt file to block Googlebot's access to problematic URLs (see the sketch below).
  • Avoid using session IDs in URLs wherever possible.
  • Add a nofollow attribute to links pointing to dynamically created future calendar pages, especially if the website has an infinite calendar.
  • Shorten URLs where possible by trimming unnecessary parameters.
  • Check the website for broken relative links.
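For illustration, a minimal robots.txt sketch along these lines; the paths and parameter names are hypothetical and would need to match your own site:

    User-agent: Googlebot
    # Block dynamically generated, effectively infinite calendar pages
    Disallow: /calendar/
    # Block URLs that differ only by session or referral parameters
    Disallow: /*?sessionid=
    Disallow: /*?ref=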

1.4.2.4. Qualify the outbound links to Google

Remember that sitemaps and links from other websites can lead to a linked page, which means it can still be crawled.

Using a robots.txt Disallow rule can prevent Google from following a link to a page on the same site.

Allowing the page to be crawled but applying a noindex robots rule can prevent Google from indexing it.
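As a hedged illustration of these controls, the snippet below shows a robots meta tag that lets a page be crawled but keeps it out of the index, alongside outbound links qualified with rel attributes; the URLs are placeholders:

    <!-- Allow crawling but keep this page out of Google's index -->
    <meta name="robots" content="noindex">

    <!-- Qualify outbound links you do not want to vouch for -->
    <a href="https://example.com/partner-offer" rel="nofollow">Partner offer</a>
    <a href="https://example.com/sponsored-review" rel="sponsored">Sponsored review</a>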

1.4.2.5. Tag site for child-directed treatment

Usually, marking the website and app as child-directed will not affect Google Search rankings. It’s advisable to keep the below in mind:

  • The entire domain or parts of it can be tagged and treated as child-directed.
  • It may take a while for the designation to take effect across Google services, but web pages under a tagged domain or directory are treated as tagged.
  • Google limits the number of domains and sub-domains that can be included.
  • Tagging individual ad units as child-directed gives better control over managing the content.

1.4.2.6. Browser compatibility

Every browser interprets website code a little differently, which means a site can appear slightly different to visitors using different browsers. Here are some quick steps to improve browser compatibility:

  • Test the site in as many browsers as possible.
  • Write good, clean HTML.
  • Specify the character encoding.
  • Consider accessibility.

1.4.2.7. Avoid creating duplicate content.

Some quick examples of non-malicious duplicate content are:

  • Discussion forums that generate both regular pages and stripped-down pages targeted at mobile devices.
  • Store items that are shown or linked via multiple, distinct URLs.
  • Showing a printer-only version of web pages.

Some proactive steps to resolve duplicate content issues to help visitors view the proper content are listed below:

  • Try using 301s.
  • It pays to be consistent.
  • It’s advisable to use top-level domains.
  • One must syndicate carefully.
  • Keep boilerplate repetition to a minimum.
  • It’s best to avoid publishing stubs.
  • Have an understanding of the content management system.
  • Minimize publishing similar content.
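One common way to point Google at the preferred version of duplicated content is a rel="canonical" link element; a minimal sketch with hypothetical URLs:

    <!-- Placed in the <head> of the printer-only or parameter-laden duplicate -->
    <link rel="canonical" href="https://example.com/leather-wallets">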

1.4.2.8. Make the links crawlable.

Using proper <a> tags

An <a> tag with an href attribute is the only kind of link Google and its crawlers can reliably follow. Links in other formats, for example those without an href that rely on script events, may not be found by Google's crawlers.

Link to resolvable URLs

Make sure the URL in the <a> tag's href is a proper, resolvable web address that Googlebot can contact and send requests to.
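A quick sketch of the difference, using placeholder URLs: Google can follow the first link because it is an <a> tag with a resolvable href, while the second relies only on a script event and may never be discovered.

    <!-- Crawlable: a normal anchor with a resolvable URL -->
    <a href="https://example.com/products/wallets">Wallets</a>

    <!-- Not reliably crawlable: no href, navigation happens only in JavaScript -->
    <span onclick="window.location='/products/wallets'">Wallets</span>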

1.4.2.9. Best practices for website testing with Google Search

Trying out different versions of a website, or of sections of a site, to collect data on how users react is broadly called website testing.

A/B Testing

Running a test by creating multiple versions of a webpage, each with its own URL, is called A/B testing. The test routes some visitors from the original URL to a variation URL to observe their behavior and decide which page is more effective.

Multivariate Testing

Using software to change various parts of a webpage is called multivariate testing. Each combination of variations is shown to users and analyzed statistically for effectiveness. One thing to note is that only a single URL is involved; the variations are inserted into the page dynamically.

It is advisable to make sure that testing variations in page URLs and page content has only a nominal effect on the site's performance in Google Search.

Listed below are quick pointers to keep in mind when testing site variations and avoid the adverse effects on Google Search:

  • It's best not to cloak the test pages.
  • Use rel="canonical" links on the variation URLs, pointing back to the original page.
  • Use 302 (temporary) redirects rather than 301 (permanent) redirects for the test (see the sketch after this list).
  • Run the experiment only as long as required.
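To make the last two pointers concrete, a variation page in an A/B test can declare the original URL as canonical, and the redirect into the test should be a temporary 302; a sketch with placeholder URLs:

    <!-- In the <head> of the variation page https://example.com/landing-b -->
    <link rel="canonical" href="https://example.com/landing">

Redirecting a test visitor from the original URL to the variation:

    HTTP/1.1 302 Found
    Location: https://example.com/landing-b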

1.4.3 Content-Specific Guidelines

1.4.3.1. AMP on Google Search guidelines

All the guidelines to make a Google-friendly site are also applicable to AMP. Here, we cover additional guidelines which are aimed explicitly at AMP on Google Search.

Ideally, the AMP page should follow the AMP HTML specification. Wherever possible, users should be able to experience the same content and complete the same actions on AMP pages as on the corresponding canonical pages. The AMP URL scheme must also make sense to users.

AMP pages should be valid so that they work as required for users and remain eligible for AMP-related features. Note that pages with invalid AMP are ineligible for some Search features. Any structured data added to the page must follow the structured data policies.
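A common pattern, sketched here with hypothetical URLs, is to pair the canonical page and its AMP version so Google can discover both:

    <!-- On the canonical (non-AMP) page -->
    <link rel="amphtml" href="https://example.com/article.amp.html">

    <!-- On the AMP page -->
    <link rel="canonical" href="https://example.com/article.html">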

Why aren’t my AMP-specific features appearing on my tablet or desktop?

On Google, AMP-specific features such as the Top Stories carousel are currently available only on mobile. AMP itself works across most device types, including desktop and tablets, but as of now there are no plans to expand AMP-specific features to non-mobile platforms.

Are AMP pages mobile-only?

No, not really. AMP pages can be viewed on all device types, and responsive design is typically coded into them.

How does AMP look on a desktop?

AMP pages display equally well on mobile and desktop screens. If AMP supports all the functionality a site needs, the site owner may consider creating standalone AMP pages to serve both desktop and mobile visitors. Note, however, that AMP on desktop does not get the AMP-specific features in Google Search results.

1.4.3.2. AJAX-enhanced sites

Design AJAX-powered sites for accessibility

Ajax helps improve a site's UX, create dynamic pages, and build powerful web applications.

The downside is that, much like Flash, Ajax can make a site challenging to index unless it is implemented carefully. Among the many issues, the two that stand out are ensuring the content is visible to search engine bots and enabling them to follow the navigation.

Note that Googlebot can find it difficult to navigate a site through JavaScript, even though it understands the HTML link structure.

It's always a safe bet to create sites that provide plain HTML links to content, so that Google and other engines can crawl it.

Design for accessibility

The site's accessibility can be tested easily by previewing it with JavaScript turned off in the browser, or by using Lynx, a text-only browser. A text-only view helps identify content that would otherwise be hard for Googlebot to see, such as text embedded in Flash or in images.

Avoid iFrames, or link their contents separately

Content shown in iFrames may not be indexed and may not appear in Google Search, so it's recommended not to rely on iFrames to display content. If you do include iFrames, provide additional text-based links to the same content so that Googlebot can crawl and index it.

Develop with progressive enhancement

When building the site's structure and navigation, it's a good idea to use only HTML first. Ajax support can be added later, once the content, links, and pages are in place. This serves a dual purpose: modern browsers give users the benefits of Ajax, while the HTML keeps Googlebot able to crawl the site.

Follow the guidelines

It is fine to provide different experiences to different users, but make sure the content remains the same.

1.4.3.3. Images and video

 Google Image best practices

One option worth knowing is that you can opt out of inline linking of your images in Google Images. Beyond that, an excellent UX means:

Providing proper context:

Ensure visuals are relevant to the topic, and show images where they bring original value.

Optimizing placement:

Images placed near the relevant text are optimized best. Where it makes sense, place the most important image near the top of the page.

Not embedding relevant text inside images:

Avoid embedding text in images, especially important text such as menu items and page headings, so the content stays accessible to everyone. Keep text in HTML and provide alt text for pictures.

Creating informative and high-quality sites:

An ideal combination is good content matched with relevant pictures; together they provide context and actionable results. Google uses the page content to generate snippets for images, so content quality is essential for ranking the pictures.

Creating device-friendly websites:

The site's design must be responsive across devices, as more searches now happen on mobile than on desktop. One way to check how pages perform on mobile is to use Google's mobile-friendly test and act on its feedback.

Creating a good URL structure for the images:

Organize images so that their URLs are constructed logically, as Google uses the URL path and filename to better understand the pictures.
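Pulling a few of these points together, a sketch of an image reference with a logical URL path, a descriptive filename, and alt text; all names are placeholders:

    <img src="https://example.com/images/wallets/brown-leather-bifold.jpg"
         alt="Brown leather bifold wallet photographed on a wooden table">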

1.4.3.4. Podcasts

A podcast can be discovered and indexed by Google and made visible through mechanisms that include:

  • Google Search results as a link to a podcast page or as an embedded episode player
  • Google Search App for Android (ideally v6.5 or higher version of the Google Search App)
  • Google Podcasts app
  • Google Home
  • Content Action for the Google Assistant
  • Android Auto

1.4.3.5. Mobile

Mobile viewing on feature phones:

For feature phones, Google Web Search lets users search the same index that is used for desktop browsers. Google also adapts image and text formatting to make results usable on these devices.

Web Light: For Faster and lighter searches from mobile pages

For users on slow mobile connections, Google shows fast, lightweight versions of pages using a technology named Web Light. These pages load faster and consume far less data, which is achieved by transcoding web pages on the fly into an optimized version for slow clients.

1.4.3.6. Google Discover

In Discover, Google shows a summary of pages as cards in a scrolling list of topics that users can browse on their devices.

Based on a user's interactions with Google and its products, Discover surfaces a variety of content, from videos and sports news to entertainment and financial news. It acts as a content hub for users.

1.4.3.7. Resources for developing mobile-friendly pages

As most users search from their mobiles, website designers must adopt a mobile-friendly UI/UX.

Here are a few pointers for smartphones and feature phones:

  • Develop the search results page based on the type of phone.
  • Use the App Indexing Search Preview to see how the page will appear.
  • Use the robots.txt file to block user agents that are specific to mobile devices.
  • Use PageSpeed Insights to analyze pages and get tips on making them load faster.

1.4.3.8. Mention mobile billing charges upfront

When a site incurs usage charges, it should communicate them clearly to users. When Google detects that a website does not inform its users adequately, the Chrome browser displays a warning before the page loads.

Given below are a few best practices that ensure users are informed well about any such mobile charges that might incur:

  • Let the billing information be displayed.
  • Make the billing information visible to the user.
  • Clearly state the critical and essential information.
  • Make sure that the fee structure is understandable along with billing frequency.

1.4.3.9. Link the Android app with a website

An Android app on the Google Play Store can be associated with a website property in Search Console. Once they are linked, Google can crawl and index the app's content so it can appear in search results.

Benefits of linking apps:

  • Free HTTP URL mapping between the website and the app.
  • Website pages linked with the app can open in the app by default from mobile search results.

1.4.4 Quality Guidelines

Let's have a quick look at the quality guidelines that have been put forth by Google Search.

1.4.4.1 Automatically generated content

This means programmatically generated content. When such content does not help users but exists only to manipulate search rankings, Google may take action against it. Examples include:

  • Text that contains keywords but is not useful to the reader.
  • Text translated by an automated tool and published without human review or curation.
  • Text generated by automated processes, such as Markov chains.
  • Text generated through automated synonymizing or obfuscation techniques.
  • Text generated by scraping search results or Atom/RSS feeds.
  • Content stitched together from other pages without adding any significant value.

1.4.4.2 Sneaky redirects

This involves diverting visitors to a URL other than the one they requested. There are genuine reasons for redirects, such as moving to a new web address or consolidating several pages into one.

The problem arises when redirects are deceptive and show users content different from what is made available to crawlers, which violates the Google Webmaster Guidelines. It is similar to cloaking, in that Googlebot and users see different content.

1.4.4.3 Link schemes

Link schemes manipulate PageRank in Google search results and violate the Google Webmaster Guidelines. They include both links pointing to your pages and outgoing links from your site to other websites.

Some examples of link schemes that impact negatively are:

  • Buying and selling links for money exchange or goods and services.
  • Excessive cross-linking of partner pages.
  • Guest posting with keyword-enriched anchor text for article marketing.
  • Inserting a link into terms of service or similar contracts, without giving the third parties who own the content a choice.
  • Unnaturally placed links at the footers or templates of various sites and optimized forum comments.

1.4.4.4 Cloaking

Cloaking refers to presenting different content or URLs to search engines than to human users, which is a violation because visitors end up seeing something other than what they expected.

Some typical examples of cloaking are:

  • Serving a page of Flash or images to users while serving HTML text to search engines.
  • Inserting text or keywords into a page only when the requester is a search engine rather than a human visitor.
  • Hackers using cloaking to make their hack harder for the site owner to detect.

1.4.4.5 Hidden Text and Links

This means hiding text or links in website content to manipulate Google's search rankings. Content can be hidden deceptively in many ways, some of which are listed below:

  • Using white text on a white background.
  • Placing text behind images.
  • Using Cascading Style Sheets (CSS) to position text off-screen.
  • Setting the font size to 0.
  • Hiding a link by linking only a single small character, for example a hyphen in the middle of a paragraph.

1.4.4.6 Doorway Pages

These are pages created solely to rank highly for particular search queries, which can lead to multiple similar pages appearing in the search results.

Some quick examples of doorways are below:

  • Having several domain names targeted at specific regions that all funnel users to a single page.
  • Pages generated only to funnel visitors into the actually usable portion of the site.
  • Substantially similar pages that look more like search results than a clearly defined, browsable hierarchy.

1.4.4.7 Scraped Content

Content taken from other, often high-profile, sites in the belief that it adds weight, without being relevant or unique, is "scraped" content.

Some quick examples of scraping are:

  • Republished and copied content from other sites that lack originality.
  • Content copied from other websites and republished with only slight modifications, such as substituted synonyms, to escape plagiarism detection.
  • Content reproduced from feeds of other sites that provide no benefits and lack uniqueness to users.
  • Embedded content from other sites like images and videos without adding any significant value to the visitors.

1.4.4.8 Affiliate Programs

Original content has its pride of place, and websites must add value to users to participate in affiliate programs.

Some quick pointers on affiliate programs are:

  • Product descriptions from affiliate networks often appear unchanged across many affiliate websites. This can hurt Google search rankings because those sites add no value to differentiate the content, such as additional detail about the product category, purchase location, or pricing.
  • It’s advisable to choose an appropriate product category for targeted audiences who can build communities across blogs, discussion forums, and user views.
  • Update the content and keep it relevant to increase the chances of being crawled by Googlebot.

1.4.4.9 Irrelevant Keywords

"Keyword stuffing" means loading web pages with keywords to manipulate a site's ranking in Google Search, and it creates a negative experience for users.

Some quick examples of keyword stuffing are:

  • Phone number listings that have no significant value.
  • Blocks of text listing the cities and states a webpage is trying to rank for.
  • Repeating the same phrases and words so often that it sounds unnatural.

1.4.4.10 Creating pages with malicious behavior

Distributing content or software on a website that behaves differently from what the user expected violates the Google Webmaster Guidelines.

This includes content that, for example, downloads or executes files on the user's computer without permission, or that otherwise fails to comply with Google's Unwanted Software Policy (GUSP).

Google only aims to present the users with search results relevant to their queries and keep them safe on the internet.

Some quick examples of malicious behavior are:

  • Manipulation of the location of the content on a page to trick the user into thinking they are clicking on a specific button or link, while a different part of the page registers the click.
  • Promotion or installation of software that injects new pop-ups or ads on web pages or even swaps the existing ones with other ads.
  • Inclusion of unwanted files in a download requested by the user.
  • Installation of viruses, trojans, ads, malware, and spyware on the computer of the user.
  • Changes in the search preferences or the homepage in the user’s browser without formal consent.

1.4.4.11 User-generated spam

Spam added by malicious users to sites that let them create pages or add content is called user-generated spam.

Some usual specimens of spammy user-generated content are:

  • Spam accounts on free host sites.
  • Spammy posts on forum threads with malicious links.
  • Comment spam on blogs and insights sections of sites.

1.4.4.12 Ways to Prevent Comment Spam

Here are some essential pointers on how to protect sites from malicious comment spam:

  • Think before enabling the comments section.
  • Turn on comment and profile creation moderation.
  • Use anti-spam tools.
  • Use “nofollow” or more specific attributes.
  • Prevent untrusted content from showing in search.
  • Get help from the community.
  • Repeated attempts at spamming can be prevented by using a blacklist.
  • Monitor the site for spam content.
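For the "nofollow or more specific attributes" pointer above, a sketch of how links submitted by commenters might be qualified; the URL is a placeholder:

    <!-- Links posted by commenters are marked as user-generated content -->
    <a href="https://example.com/commenter-site" rel="ugc nofollow">Commenter's site</a>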

1.4.4.13 Report spam paid links or malware.

Spam 

Google takes spam seriously. If a site is reported as spam, an appropriate investigation is undertaken, and the webspam team takes further action to counter it.

Paid Links

Buying and selling links that pass PageRank can distort search results, so Google works to reduce their impact. Taking part in such link schemes violates the Google Webmaster Guidelines and can negatively affect search rankings.

Malware

Websites infected with malicious and harmful software must be reported to Google to initiate necessary remedial action.

1.5 Why is SEO Important

SEO is a cost-effective method of boosting business online. It helps enhance the website's UX, drive more traffic, and generate leads. As the site begins to rank higher, repeat customers and a loyal base follow. A quick roundup of the multiple benefits is given below:

  • Better Traffic

While getting excellent visibility, it also increases the number of website visitors.

  • Higher Leads

Target the audience that is already interested in your products and services.

  • Greater Revenue

The increase in targeted leads makes it much easier to turn them into paying customers.

  • Increased Brand Awareness

The brand has a higher recall value among customers to give an edge over all the competition in the industry.

  • Enhanced Business Opportunities

The new traffic, qualified leads, and brand awareness all contribute to new growth opportunities.

  • Enhanced Trust and Control

Once the site reaches the top of the search results, customers perceive the brand as an industry authority.

1.6 Should I hire an SEO professional, consultant, or agency?

This is a question on everybody's mind. It's best to request an SEO audit report from the service provider; in the end, it is the quality of their work, their success rate, and client testimonials that matter.

The quality of their reports makes it easy to judge how well the provider can help achieve your marketing objectives.

The two main factors to keep in mind while selecting an SEO provider are cost and quality. Judge the quality of an SEO service by asking the provider the following questions:

  • How will SEO performance be demonstrated during the project?
  • For which keywords can the website rank higher?
  • How much website traffic can be expected every month?
  • How long will it take to rank the website?
  • Who are the ranking competitors?

The SEO service provider must offer the answers ideally as a free audit report, instead of theoretical explanations.

Only trusted and reliable SEO service providers can offer a satisfactory audit report that is easy to understand and not peppered with jargon.

1.7 Google Algorithm Updates

Google brings out algorithmic updates from time to time, which helps improve aspects of SEO. These are much anticipated by watchers in the SEO community and have a fair share of advancements and changes.

1.7.1 Panda

In July 2015, Google rolled out a Panda update (version 4.2), which was a data refresh. Google said from the start that it would take months to roll out fully, so many site owners felt no immediate impact and saw no clear signs of the change. Google had earlier released Panda 4.1 in September 2014, which affected 3-5% of search queries.

1.7.2 Penguin


After almost two years of speculation, Google brought out the Penguin update in September 2016, followed shortly by another one in October. The second one in October reversed all the previous penalties after the rollout of the new code. Penguin, along with Panda, is a major algorithmic update that significantly impacts search results.

1.7.3 Hummingbird


The Hummingbird algorithm update, released in September 2013, drew parallels to Caffeine. It powered changes to semantic search and the Knowledge Graph for many months.

1.7.4 Pigeon


In July 2014, Google's major local algorithm update, called Pigeon, was rolled out in the US and later in the United Kingdom, Canada, and Australia. The update ended months of speculation about the changes it would bring to local search results and patterns.

1.7.5 Mobile-friendly Update


It was an update released in May 2016, a little over a year after the first version of the mobile-friendly update. This update gave an additional ranking signal boost for searches on sites that were mobile-friendly. The impact was not much, though, as most websites were already mobile-friendly by then.

1.7.6 RankBrain


News of this update broke in October 2015, although Google had actually rolled it out earlier in the year. The significant announcement was that machine learning had been part of the algorithm for months, and Google described it as the third most influential ranking factor.

1.7.7 Possum


This update was brought out in September 2016 and was an interesting one, as the local SEO community noticed quite a difference in pack results. There is data to show that it also heavily impacted organic search results.

1.7.8 Fred


In March 2017, Google rolled out this major update, which had a noticeable impact across the SEO community. Gary Illyes, a webmaster trends analyst at Google, jokingly referred to the update as "Fred", and the name stuck. Gary later clarified that there was nothing official about the name.

1.8 Google 101

1.8.1 How Google Search Works?

Learn how Google discovers, crawls, and serves web pages.

Technically, Google gets information from many different sources, including:

  • From web pages
  • Through user-submitted content like Google My Business and Maps user submissions
  • By book scanning
  • From public databases on the Internet and many other likely sources.

Usually, three necessary steps are followed by Google to generate web page results:

Crawling:

The essential exercise is to find out what pages exist on the web. There is no single central registry of all web pages, so Google continually looks for new pages and adds them to its list of known pages. This process of discovery is known as crawling.

Some pages are known because Google has crawled them before. Other pages are discovered when Google follows a link from a known page to a new one. Still others are discovered when a website owner submits a list of pages (a sitemap) for Google to crawl. Managed web hosts, such as Blogger or Wix, may also tell Google about newly created or recently updated pages.
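A sitemap is simply an XML file listing the URLs you want Google to know about; a minimal sketch with a placeholder URL and date:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/new-page</loc>
        <lastmod>2020-06-01</lastmod>
      </url>
    </urlset>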

Indexing:

When Google discovers a page, it tries to understand what the page is about; this is called indexing. Google analyzes the page and its content, and catalogs the images and video files embedded on it, as part of this effort. Google stores this information in its index, an enormous database held on a large number of computers.

Serving (and ranking):

As and when a user types a query, Google begins to find the most relevant answer from its index based on many criteria. While Google determines the highest quality answers, it also factors other considerations that provide the best user experience and the most appropriate solution.

Google also considers the user's location, language, and device. For example, a search for "motorcycle repair shops" would show different results to a user in Chennai than to a user in Singapore. Keep in mind that Google does not accept payment to rank pages higher; ranking is done programmatically.

1.9 EAT


Though it sounds like yet another acronym, E-A-T in SEO stands for Expertise, Authoritativeness, and Trustworthiness. Google's Quality Raters' Guidelines (QRG) lay out this concept in great detail. Demonstrating excellent E-A-T both on and off the website can help improve rankings on Google over time.

1.9.1 How is EAT evaluated?

E-A-T stands for Expertise, Authoritativeness, and Trustworthiness. While the three concepts are related, they are not identical; each is evaluated against its own criteria. Let's look at them in detail.

1.9.1.1 Expertise:

Expertise is the possession of specialized skills and knowledge in a core field.

When a person has a high level of knowledge or skill in a particular field, they are said to have expertise. Google evaluates expertise at the level of the content creator, the subject matter expert, rather than at the organizational or website level.

To get this right, one has to be sound in the chosen field, with formal qualifications and adequate experience. For critical topics, formal education is what allows one to write with confidence. For example, a chartered accountant is a far better choice to draft an article or blog on accounts and taxes than someone who has merely read a few articles on the subject.

1.9.1.2 Authoritativeness:

When someone has a reputation among subject matter experts and social media influencers, they are considered an authority on the topic. They become the first point of contact for an opinion or clarification.

Some raters surf the web to get an insight into the individual while evaluating their authoritativeness.

1.9.1.3 Trustworthiness:

The content on the website must be trustworthy and legitimate. It is how a website gains trust for its content being transparent and accurate.

While rating a website, one of the first things raters look at is who owns and is responsible for the published content. This is especially important for content that answers critical topics and queries.
