Full list of Google 2024 Ranking Algorithms

At the end of May 2024, roughly 2,500 pages of internal documentation from the American corporation Google leaked online. They describe in detail how Google’s search algorithm actually works.

The documentation also describes exactly what data Google collects from web pages, sites and searches, explains how web pages are analyzed and ranked, and shows what factors affect the ranking of small sites.

Google has several dozen ranking systems (algorithms). They evaluate billions of pages every day while taking hundreds of factors into account – all to give you the most useful results in a fraction of a second.

Here’s what these systems are called and what they’re responsible for.

BERT is an artificial intelligence system that “guesses” the essence of a query in much the same way our brains do – it works, so to speak, at the intersection of entities.

What concept lies at the intersection of two others, “king” and “woman”? “Queen”, of course! Your brain worked this out in a split second, and algorithms mastered such a simple case long ago. BERT now learns (very successfully) on far more complex cases to understand what a person is actually looking for – even when the query is not phrased in the most precise terms.
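
The classic “king + woman → queen” demonstration actually comes from static word embeddings such as word2vec rather than from BERT itself, but it illustrates the underlying idea of doing arithmetic on meaning. A minimal sketch, assuming the gensim library and a pretrained word2vec file are available locally (the file name is a placeholder):

```python
# Toy illustration of "concept arithmetic" with static word embeddings.
# This is not BERT and not Google's ranking code - just the classic
# word2vec analogy demo, assuming a pretrained model file is on disk.
from gensim.models import KeyedVectors

# Placeholder path: e.g. the public GoogleNews-vectors-negative300.bin file.
vectors = KeyedVectors.load_word2vec_format(
    "GoogleNews-vectors-negative300.bin", binary=True
)

# "king" - "man" + "woman" lands closest to "queen".
result = vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=1)
print(result)  # e.g. [('queen', 0.71...)]
```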

Google has separate systems that help people quickly find information in crisis situations: personal crises involving threats of violence or murder, and natural disasters.

The second is SOS Alerts, which shows messages from the authorities. These include emergency numbers and websites, translations of useful phrases, maps and more – see Google Help for details.

An algorithm can find thousands or even millions of pages with relevant content – but some of them may completely duplicate each other’s content. This is a useless result for the user, so by default Google hides duplicates.
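
Google’s actual deduplication pipeline is not public. As a rough illustration of the general idea only, here is a minimal near-duplicate filter based on word shingles and Jaccard similarity (the function names and the 0.9 threshold are invented for the sketch):

```python
# Minimal near-duplicate detection sketch: pages whose shingle sets overlap
# heavily are treated as duplicates, and only the first one is kept.
def shingles(text: str, k: int = 5) -> set[tuple[str, ...]]:
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

def jaccard(a: set, b: set) -> float:
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def dedupe(pages: list[str], threshold: float = 0.9) -> list[str]:
    kept: list[str] = []
    kept_shingles: list[set] = []
    for page in pages:
        s = shingles(page)
        if all(jaccard(s, seen) < threshold for seen in kept_shingles):
            kept.append(page)
            kept_shingles.append(s)
    return kept
```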

Google’s ranking system takes into account the words contained in a domain name – for it, this is one of the signals of content relevance. But the same system recognizes that domains like “best-places-to-dine” exist solely to grab top positions, and it does not give them extra weight when generating results.

Fresh content is not by definition higher quality than content published long ago. But Google’s algorithms understand when the freshness factor matters more, and push recent publications higher in the results. For example, a review of a new movie will rank above old news about the start of its production, and news of a major earthquake near you will rank above a Wikipedia article.
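
How strongly freshness is weighted for a given query is not public; purely as a toy illustration, a “query deserves freshness” boost could be modeled as an exponential decay blended into a base relevance score (the half-life and the weights below are invented for the sketch):

```python
def freshness_boost(age_days: float, half_life_days: float = 7.0) -> float:
    """Exponential decay: 1.0 for brand-new content, 0.5 after one half-life."""
    return 0.5 ** (age_days / half_life_days)

def score(relevance: float, age_days: float, freshness_weight: float = 0.3) -> float:
    """Blend topical relevance with a freshness boost (weights are arbitrary)."""
    return (1 - freshness_weight) * relevance + freshness_weight * freshness_boost(age_days)

# A fresh review of a new movie can outrank an older, slightly more relevant page.
print(score(relevance=0.80, age_days=1))    # ~0.83
print(score(relevance=0.85, age_days=400))  # ~0.60
```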

This algorithm focuses on surfacing more helpful content in the results – content written by people, for people.

Google has systems that focus on how pages link to each other – by doing this, they understand what pages are about and which ones might be most useful.

In 2016, Google retired the PageRank toolbar – the public face of one of its most famous link analysis algorithms – and SEO experts had been writing about the “death” of the system as far back as 2014. However, no one knows for sure whether the algorithm is still in use.
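
Whether and how Google still uses PageRank internally is unknown, but the textbook formulation is well documented. A minimal sketch of the classic power-iteration computation over a tiny link graph (0.85 is the usual textbook damping factor; the graph is made up):

```python
# Textbook PageRank via power iteration - an illustration of link analysis,
# not Google's production system.
def pagerank(links: dict[str, list[str]], damping: float = 0.85,
             iterations: int = 50) -> dict[str, float]:
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
        rank = new_rank
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
print(pagerank(graph))  # "c" ends up with the highest score
```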

Google’s systems for surfacing local news content work, as the company’s blog puts it, “when it is appropriate”.

This isn’t Google News, which was shut down in Russia in the spring, but one of the quick results blocks. Perhaps the two algorithms work in conjunction, though: we tried to test this one on about a dozen queries – and didn’t see anything relevant.

This is an artificial intelligence system capable of both understanding and generating language. It is not used for general search ranking, only for certain “applications”, such as finding information about the COVID-19 vaccine.

This is the AI Google uses to understand what queries and pages are saying – and to match those entities to each other.

These are algorithms that aim to prioritize original content. If content is duplicated within a site for some reason, the owner can simply mark one of the pages as canonical – Google provides special markup for this purpose.
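
The markup in question is the rel="canonical" link element placed in a page’s <head>, a documented public Google feature. A minimal sketch that reads it with Python’s standard-library HTML parser (the URLs are made up):

```python
# Extract a page's canonical URL from its <link rel="canonical"> element
# using only the standard library.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    def __init__(self) -> None:
        super().__init__()
        self.canonical: str | None = None

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            attrs = dict(attrs)
            if attrs.get("rel") == "canonical":
                self.canonical = attrs.get("href")

html = """
<html><head>
  <link rel="canonical" href="https://example.com/original-article">
</head><body>Duplicate copy of the article...</body></html>
"""

finder = CanonicalFinder()
finder.feed(html)
print(finder.canonical)  # https://example.com/original-article
```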

Google’s algorithms demote sites whose content is removed for either of two reasons:

– copyright infringement;

– publication of personal information.

In essence, a site that ends up in such disputes signals to the search engine that something is wrong with its content policy – after all, its content is being removed on legal grounds.

This algorithm evaluates criteria that indicate a good or bad experience with a page: loading speed, mobile-friendliness, absence of intrusive interstitial ads, and a secure connection. Other things being equal, the search engine ranks higher the pages that perform better on these criteria.
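
One way site owners check these signals themselves is Google’s public PageSpeed Insights API. A minimal sketch, assuming the v5 endpoint and response fields as described in the public documentation (verify them against a live response; an API key may be required for heavier use):

```python
import json
import urllib.parse
import urllib.request

def core_web_vitals(page_url: str) -> dict[str, str]:
    # Query the public PageSpeed Insights v5 endpoint for a mobile run.
    query = urllib.parse.urlencode({"url": page_url, "strategy": "mobile"})
    api = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?" + query
    with urllib.request.urlopen(api) as resp:
        data = json.load(resp)
    # Pull a few Lighthouse audits related to page experience.
    audits = data["lighthouseResult"]["audits"]
    wanted = ["largest-contentful-paint", "cumulative-layout-shift", "total-blocking-time"]
    return {name: audits[name]["displayValue"] for name in wanted}

print(core_web_vitals("https://example.com/"))
```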

An algorithm that analyzes not the page as a whole but individual sections, or even passages, of content – this makes search even better.

This is a system that does essentially the same thing as Helpful Content – that is, it identifies the most useful information and prioritizes it – but for product reviews specifically. Otherwise it works the same way: the algorithm evaluates reviews like regular content, weighing the author’s expertise and first-hand experience above all.

The algorithm “studies” the meaning of words it encounters for the first time, and it does so in conjunction with the language in which the query is made. For example, the system will understand the query “banker” from an American as “a person who works in a bank”, but for a British user Google will also include results with the meaning “railroad locomotive” (an additional sense of the word banker in British English).

Google is very careful about the quality of information, including information that changes rapidly. When the algorithms are not sure that specific data is reliable, they recommend other ways to perform the search – ways that will lead to reliable results.

As a general rule, Google doesn’t show more than two pages from the same site in the top results, to keep a single site from monopolizing them. However, common sense still takes precedence – if several pages from one site really are more relevant to a query than all the others, an exception is made.
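
As a rough illustration of that “at most two per site” rule, here is a minimal re-ranking filter (the data and the limit parameter are invented for the sketch; the real system’s exceptions are far more nuanced):

```python
# Sketch of a site-diversity filter: keep the ranked order, but allow
# at most two results per host.
from collections import Counter
from urllib.parse import urlparse

def diversify(ranked_urls: list[str], per_site_limit: int = 2) -> list[str]:
    seen = Counter()
    kept = []
    for url in ranked_urls:
        host = urlparse(url).netloc
        if seen[host] < per_site_limit:
            kept.append(url)
            seen[host] += 1
    return kept

results = [
    "https://site-a.com/1", "https://site-a.com/2", "https://site-a.com/3",
    "https://site-b.com/1", "https://site-c.com/1",
]
print(diversify(results))
# ['https://site-a.com/1', 'https://site-a.com/2',
#  'https://site-b.com/1', 'https://site-c.com/1']
```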

The internet contains a huge amount of spam which, if not eliminated, will prevent the search engine from showing the most useful and relevant results. Google uses a number of spam detection systems. Spam, alas, is constantly improving – but so are the algorithms.
