
Google search ranking factors 2022

100 factors that affect a website's ranking in search engines, and the most critical factors that prevent you from reaching good positions in the search results.

SEO ranking factors - a historical overview

  • The ranking algorithm (link-based) was launched for the first time. It was during this period that PageRank was invented. The algorithm was based on the transfer of link weight: the more resources link to a page, the higher its PR and the higher its place in the search results; likewise, the higher the PR of the pages linking to a site, the more weight those links pass. On September 4, 2001, the algorithm was patented by Google Inc. (A minimal sketch of how link weight propagates follows this list.)
  • Google presented a new development to the general public: an algorithm called Hilltop, which allowed PR to be calculated more accurately. This algorithm also takes into account the geography and freshness of a document.
  • After that, Google began notifying webmasters not to leave links on suspicious websites and "link dumps".
  • When calculating PageRank, the dynamic rank of a document began to be taken into account.
  • At the same time, search results were separated for commercial and non-commercial queries.
  • Link manipulation, hidden links, and hidden text began to harm resources.
  • Dominic - Google started counting and evaluating backlinks differently.
  • This algorithm modified the search results by removing keyword-stuffed pages, pages with non-unique content, and sites with purchased inbound links.
  • Austin - an addition to the famous Florida update; Google continued to crack down on invisible text and keyword spamming.
  • Google began to distinguish between synonyms, and keyword analysis was handled differently.
  • Google launched the nofollow attribute (shared with Microsoft and Yahoo). The main goal was to fight spam, but it also had a big impact on links.
  • Bourbon - said to have affected duplicate content (i.e., the conflict between www and non-www versions).
  • Sitemaps in XML format - HTML sitemaps were replaced with the XML format, which let webmasters speed up indexing by submitting sitemaps through the webmaster panel (a minimal sitemap-generation sketch follows this list).
  • Personalized search - the search engine started using search history based on personal preferences. The effect was small, but it was decided to keep using history for this purpose.
  • Jagger - the targets of this update were link farms and reciprocal, purchased, and questionable links.
  • Big Daddy - the technical foundations were updated: canonicalization, 301/302 redirects, and more.
  • Austin - this algorithm began to take the trust of a resource into account and significantly reduced the visibility of less trusted resources. After this, young sites that had not yet gained trust could not get into the TOP 5.
  • Vince - large brands gained a big advantage (according to optimizers' observations).
  • rel="canonical" - the canonical tag got a new treatment: canonical tags are now supported, as announced by Google, Yahoo, and Microsoft. This allowed webmasters to send canonical signals to robots.
  • Caffeine preview - Google announced a planned infrastructure change. New features included real-time SERP generation, an expanded index, and faster crawling and indexing. From this point on, the content and frequency of updates to a web resource became much more significant. Internal optimization (internal linking and usability) increases Google's trust in a site. Sanctions were introduced for spam and technical problems: a site that takes a long time to load and contains broken links can be downgraded in the search results.
  • Brand update - one of the more unusual Google updates. Search results could now contain the same domain several times (previously, pages from a single domain could not appear together on one results page).
  • Social signals - social signals from Twitter and Facebook are now taken into account by Bing and Google when ranking.
  • Negative reviews - Google started taking user and business reviews into account after The New York Times exposed DecorMyEyes.
  • Overstock.com - the large, well-known site Overstock.com was punished for black hat SEO. A month later the same thing happened to JCPenney.
  • Anti-spam update - scandalous cases of over-spamming received a response from Google in this update, which affected about a quarter of all search queries.
  • Panda ("Farmer") - one of the largest updates, affecting about 12% of Google's search results. The changes targeted weak content and content farms. It reached Europe in April.
  • +1 button - Google introduced its +1 button, a kind of response to the competition from Facebook and Twitter. From then on, users could influence search results with it.
  • Panda 2.0 - all English-language queries were affected by this update. New factors were taken into account, such as users blocking sites through Chrome or through search. In effect, Panda is a "scavenger" of Google's own results: the algorithm clears the results of doorways, satellite sites, sites that exist only to host ads and links, and sites with non-unique content. Errors in the text (style, grammar, spelling), how well tags and meta tags (title, h1, etc.) match the page content, and keyword stuffing are all taken into account.
  • Schema.org - Yahoo, Google, and Microsoft introduced a set of new HTML markup to webmasters, which enriched the search results with new result types.
  • Advanced links - Google made an update for brand-related queries.
  • Pagination - to improve the fight against duplicate content, Google introduced two new attributes: rel="prev" and rel="next". "View all" pages and automatic canonicalization were also improved.
  • Query encryption - Google announced that from now on search queries would be encrypted. Webmasters ran into problems when working with popular keywords.
  • Search, plus Your World - big changes in how personalized results are formed. Information from social networks was added to the search results, and buttons finally appeared to turn off personalization.
  • Page Layout algorithm - also known as "Baby Panda" or "Top Heavy". The algorithm analyzes how useful a site's content is on the first screen (above the fold). Sites with heavy advertising above the fold began to rank lower.
  • Penalty for ad placement - from now on, Google penalizes an abundance of ads at the very top of a site. It was said that Panda works in much the same way.
  • Venice - Google began to take users' regions into account when forming organic results. Users in different regions began to receive different search results.
  • Panda 3.4 - this update, whose main purpose was to lower the positions of sites with poor content, affected 1.6% of search queries, although the indexing principle stayed the same.
  • Penguin - a new algorithm affecting 3.1% of search queries, aimed at combating search spam and able to take both external and internal factors into account. Most sites using spam techniques were demoted or removed from the search results. Penguin can also analyze the quantity, quality, and rate of change of a site's link mass.
  • April package of 52 changes - Google began taking data from previous user queries into account when forming results. The index grew by 15%, regional search improved, and snippet formation changed.
  • Panda 3.9 - the next update of this algorithm, aimed at removing sites with low-quality content from the search results; like the previous two versions, it affected about 1% of queries.
  • DMCA penalty (Pirate) - from now on, Google penalizes websites that use copyright-infringing content.
  • Panda 4.0 - after this update the search algorithm became more refined, the approach to detecting "low-quality" sites changed, and the ranking of English-language results changed. It later became known that the results affected by Panda differed by country; English-language results, for example, changed by 7.5%.
  • Penguin 2.0 - the update affected queries in different languages, including 2.3% of English-language queries. Thanks to new technology, Penguin became better at finding web spam: the more spam on a site, the lower the web resource is ranked.
  • Hummingbird - search became more "human". From now on, Google returns more relevant results, building them not by relying on keywords alone but by trying to find synonyms for the search query.
  • Penguin 2.1 - as before, Google responds to suspicious sites and anchor lists.
  • Pigeon - from now on, for geo-dependent queries Google returns the most informative, local results for the user.
  • Penguin 3.0 - this Google update once again made it clear that web spam is not welcome. Many sites sank in the results; English-language queries, according to Moz, were affected by less than 1%.
  • Mobilegeddon - from April 21, 2015, Google began to favor mobile-friendly content and launched the new Mobilegeddon algorithm. In mobile search results, sites with an adapted (mobile) version now have priority. A tool for checking a site's mobile version appeared.
  • Google Possum - the tasks of "Possum" are to diversify local results, worsen the ranking of spam content, and exclude several similar pages of one company from appearing together in the search results.
  • HTTPS - Google warned in 2016, and announced in 2017, that sites needed to switch to the HTTPS protocol and that it would label sites that collect data while still working over the insecure HTTP protocol, which it did, and eventually for all sites, not only those collecting data.
  • Google Speed Update (mobile) - the loading speed of the mobile version became a ranking factor for Internet resources. Sites with popular, high-quality content do not drop in the search results even if their loading speed on mobile devices is low.
  • Google Medic Update - this update mainly affected sites dedicated to health and medical issues, and to a lesser extent financial and legal resources. Through the introduction of "neural matching", built using artificial intelligence, Google tries to "understand" the relationship between the words in a user's query and the concepts behind the phrase. Articles with rewritten, low-quality content dropped in the search results, while texts written by specialists (doctors) were largely unaffected.
  • Core Update - a large-scale update that covered all areas, but the changes affected medical and financial sites the most. The update is intended to improve the relevance of search results for users, especially in YMYL topics.
  • Core Update - another core algorithm update; there are no official answers about what it includes or what it affects. But on closer inspection you can see an impact on ranking from hosting speed, plugins and site theme, and design responsiveness. There is also an emphasis on fighting misinformation, for example in the wake of the coronavirus epidemic.
  • 100% mobile-first indexing - with the rise of mobile traffic in the 2010s, Google began to focus more on mobile-friendliness as a ranking factor in 2015. A year later, mobile-first indexing was introduced, meaning that Google crawls, and therefore ranks, your site's pages according to the mobile version of that content.
  • Product Reviews - the purpose of Google's April 2021 Product Reviews update is to encourage product pages that don't just summarize a list of products but provide detailed data, insightful analysis, and original content.
  • Link Spam Update - you should be especially careful with abusers if your site allows user interaction through comments, forums, etc.
  • Page Experience - in addition to mobile-first indexing, Google introduced a specific set of metrics through a user-experience update called Core Web Vitals. These are not new metrics but rather newly prioritized factors for quantifying page user experience (a sketch of the commonly cited thresholds follows this list).
  • The importance of user interaction has more than doubled this year (5% → 11%).
  • The first factor is the consistent release of engaging content, which stayed at 26% in 2022, which matters given the large share of user experience. Google continues to reward trusted publishers of high-quality content by letting them rank within days of publishing for the keywords they target.
  • It can take a new website over six months to earn that status, but consistently publishing content is how it gets there, and this remains the most important factor in Google's algorithm.
  • Keywords in title meta tags dropped more than any other factor on our list (22% → 17%) because Google increased the automatic generation of title tags under certain conditions. This is a unique factor: choosing the right keywords for every page of your website is a prerequisite for ranking. While it is less important than last year, it is still necessary.
  • The value of niche expertise has increased slightly (12% → 13%), which shows the importance of hub-and-spoke SEO, in which a high-level page targeting a head keyword links to, and is linked from, pages that target related keywords.
  • Backlinks decreased very slightly (16% → 15%), indicating a gradual decline in a factor that once weighed more. While it is still the fourth most important factor in the algorithm, it harks back to a time when Google's artificial intelligence had not yet evolved to judge the quality and authenticity of web content and needed other websites to do so.
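
To make the link-weight transfer described in the PageRank item above concrete, here is a minimal sketch of the classic PageRank power iteration in Python. The toy link graph, the damping factor of 0.85, and the iteration count are illustrative assumptions; this is not Google's production algorithm.

```python
# Minimal PageRank power-iteration sketch (illustrative only).
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}              # start from a uniform distribution
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                         # dangling page: spread its weight evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:                                    # each outlink passes an equal share of weight
                for target in outlinks:
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# Toy graph: "a" receives links from both "b" and "c", so it ends up with the highest rank.
demo = {"a": ["b"], "b": ["a"], "c": ["a"]}
print(pagerank(demo))
```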
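
For the XML sitemap item above, this is a small sketch that writes a minimal sitemap.xml with the Python standard library. The URLs and change frequency are placeholders; the tag set (urlset, url, loc, lastmod, changefreq) follows the public sitemaps.org protocol.

```python
# Generate a minimal sitemap.xml (sitemaps.org protocol) using the standard library.
import xml.etree.ElementTree as ET
from datetime import date

def build_sitemap(urls, path="sitemap.xml"):
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = date.today().isoformat()
        ET.SubElement(url, "changefreq").text = "weekly"
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

# Placeholder URLs; a real sitemap would list the site's actual indexable pages.
build_sitemap(["https://example.com/", "https://example.com/blog/first-post"])
```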
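
For the Page Experience / Core Web Vitals item above, this sketch compares measured values against the "good" thresholds Google publishes for these metrics (LCP ≤ 2.5 s, FID ≤ 100 ms, CLS ≤ 0.1). The sample measurements are invented; in practice they would come from field data such as the Chrome UX Report or a lab tool such as Lighthouse.

```python
# Compare sample Core Web Vitals values against Google's published "good" thresholds.
GOOD_THRESHOLDS = {
    "LCP_s": 2.5,   # Largest Contentful Paint, seconds
    "FID_ms": 100,  # First Input Delay, milliseconds
    "CLS": 0.1,     # Cumulative Layout Shift, unitless
}

def assess(measurements):
    """Mark each metric as 'good' or 'needs work' relative to the thresholds."""
    return {
        metric: "good" if value <= GOOD_THRESHOLDS[metric] else "needs work"
        for metric, value in measurements.items()
    }

# Invented sample values, for illustration only.
print(assess({"LCP_s": 3.1, "FID_ms": 80, "CLS": 0.05}))
```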

Top SEO ranking factors

  • links on suspicious websites and "link dumps"
  • hidden links and hidden text harm resources
  • keyword-stuffed pages
  • pages with non-unique content
  • sites with purchased inbound links
  • invisible text and keyword spamming
  • duplicate content (www vs. non-www)
  • link farms; reciprocal, purchased, and questionable links
  • sanctions for spam and technical problems (for example, a site that takes a long time to load and contains broken links can be downgraded in the search results)
  • black hat SEO
  • weak content and content farms
  • doorways, satellite sites, sites that exist only to place ads and links, and sites with non-unique content
  • heavy advertising above the fold (Google penalizes an abundance of ads at the very top of a site)
  • spam techniques (most sites using them are demoted or removed from Google search)
  • low-quality content (such sites are removed from the search results)
  • Google penalizes websites for using copyright infringing content
  • web spam (the more of it a site has, the lower the web resource is ranked)
  • suspicious sites and anchor lists
  • several similar pages of the same company (excluded from appearing together in the search results)
  • no HTTPS protocol
  • sites that collect data while continuing to work over the insecure HTTP protocol (a sketch of a simple HTTPS-redirect check follows this list)
  • link spam in comments and forums
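
Since working over insecure HTTP appears in the list above, here is a minimal sketch that checks whether the plain-HTTP version of a site redirects to HTTPS. It assumes the third-party requests library is installed; the domain is a placeholder.

```python
# Check whether http:// redirects to https:// for a given host (sketch, not a full audit).
import requests

def redirects_to_https(host, timeout=10):
    response = requests.get(f"http://{host}/", timeout=timeout, allow_redirects=True)
    # After following redirects, the final URL should use the https scheme.
    return response.url.startswith("https://")

if __name__ == "__main__":
    print(redirects_to_https("example.com"))
```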

Search engine ranking factors in 2022

Search engine ranking factors with medium importance are listed below:

  • Page Rank
  • nofollow for external links
  • sitemap in XML format
  • natural links
  • canonicalization, 301/302 redirects, etc. (a simple on-page audit sketch follows this list)
  • resource trust (young sites that have not yet gained trust cannot get into the TOP 5)
  • big brands get a big advantage
  • the content and frequency of updates to a web resource become highly significant
  • internal optimization (i.e., internal linking and usability) increases Google's trust in the site
  • sanctions for technical problems
  • slow loading and broken links (such a site can be downgraded in the search results)
  • social signals from Twitter and Facebook are taken into account by Bing and Google when ranking
  • negative reviews (Google takes user and company reviews into account when ranking)
  • quality and unique content
  • +1 button - with its help, users are able to influence search results
  • Schema.org - Yahoo, Google and Microsoft introduced a set of new HTML tags to webmasters
  • pagination - to improve the fight against duplicate content, the search engines introduced two new attributes: rel="prev" and rel="next"
  • results personalized by user preference
  • advertising penalty
  • users' regions are taken into account when forming organic search results
  • low content quality lowers a site's positions
  • quantity/quality/rate of link mass change
  • data from previous user queries is taken into account when generating results
  • geographic location (for geo-dependent queries, the search engine returns the most informative, local results)
  • in mobile search results, priority is given to sites with an adapted (mobile) version
  • mobile loading speed is a ranking factor for Internet resources (sites with popular, high-quality content do not drop in the search results even if their mobile loading speed is low)
  • rewritten articles and low-quality content rank worse
  • hosting speed, plugins, the website theme, and design responsiveness influence ranking
  • 100% mobile-first indexing (the search engine ranks your site's pages according to the mobile version of the content)
  • detailed product cards are encouraged (ones that don't just summarize a list of products but provide detailed data, insightful analysis, and original content)
  • Core Web Vitals user-experience metrics (the importance of user interaction more than doubled this year, 5% → 11%)
  • consistent release of engaging content (the most important factor, which stayed at 26% in 2022)
  • choosing the right keywords for every page of your website is a prerequisite for ranking
  • niche expertise increased slightly (12% → 13%), showing the importance of hub-and-spoke SEO, in which a high-level page targeting a head keyword links to, and is linked from, pages that target related keywords
  • backlinks decreased very slightly (16% → 15%), indicating a gradual decline in a factor that once weighed more; while still the fourth most important factor in the algorithm, it harks back to a time when Google's artificial intelligence could not yet judge the quality and authenticity of web content and needed other websites to do so
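
To tie together several of the on-page items above (title tag, canonical tag, mobile viewport), here is a minimal audit sketch based on the Python standard-library HTML parser. The checks and the sample HTML are illustrative assumptions, not an official Google tool.

```python
# Minimal on-page check for three signals: <title>, rel="canonical", and the viewport meta tag.
from html.parser import HTMLParser

class HeadAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.found = {"title": False, "canonical": False, "viewport": False}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "link" and attrs.get("rel") == "canonical" and attrs.get("href"):
            self.found["canonical"] = True
        elif tag == "meta" and attrs.get("name") == "viewport":
            self.found["viewport"] = True

    def handle_data(self, data):
        if self._in_title and data.strip():
            self.found["title"] = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

# Illustrative markup; in practice the HTML would be fetched from the live page.
sample_html = """
<html><head>
  <title>Example page</title>
  <link rel="canonical" href="https://example.com/page">
  <meta name="viewport" content="width=device-width, initial-scale=1">
</head><body></body></html>
"""
audit = HeadAudit()
audit.feed(sample_html)
print(audit.found)  # expected: all three signals present
```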


