Searchmetrics has been providing an annual, in-depth analysis of search engine rankings since 2012. However, the ranking factors specifically tailored to the market leader, Google, appeared in this form for the last time at the end of 2016. Searchmetrics explains that the era of general SEO checklists is over and has begun to use a more sophisticated approach to search engine optimization: in the future, the company’s annual SEO report will include sector-specific analyses. One of the reasons for this is Google’s AI system, RankBrain, which is much more flexible than traditional search algorithms thanks to machine learning. In June 2016, Google employee Jeff Dean disclosed to technology journalist Steven Levy that RankBrain is involved in the processing of all search engine requests. According to Levy’s report on Backchannel, the AI system is one of the three most important factors in search engine rankings.

With the current study 'Ranking factors – rebooting for relevance', Searchmetrics presents a catalog of factors that decide where a website places in the search engine results. The results of the investigation are primarily intended as guidelines for individual, sector-specific analyses. As in previous years, the study is based on a set of 10,000 search-related keywords. Current findings were interpreted with a view to previous investigations. We have summarized the most important results from the Searchmetrics study.

Content factors

Strengthened by the recent changes to the Google core algorithm, one thing is certain for Searchmetrics: a website’s content has to be the focus when optimizing for the search engine. One main factor here is content relevance, which only became a ranking factor in 2016. Optimizing for individual keywords, on the other hand, is losing significance in favor of holistic text design.

Content relevance is becoming the main ranking factor

Good SEO content is characterized by how closely it corresponds to what the user is looking for. However, this differs from search query to search query. The challenge content marketers face is that they have to answer as many questions as possible with just one text. To achieve this, content is created as holistic texts that take into account different aspects of a topic and are optimized for several keywords on a semantically related topic. Holistic content therefore aims to achieve good search results for several relevant keywords, while the individual keyword fades into the background.

The Searchmetrics study analyzed the content relevance of texts in relation to the search term used. This was carried out on the basis of linguistic corpora and the concept of semantic relationships. The result isn’t surprising:

                'The URLs with the highest content relevance are those in position 3 to 6'

The relevance score decreases steadily for subsequent search results. Positions 1 and 2 need to be considered separately in this study. According to Searchmetrics, these are generally occupied by websites of well-known brands, which benefit from factors such as recognizability, user trust, and brand image when it comes to the Google ranking, so their position isn’t necessarily due to the content being relevant.
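Searchmetrics doesn’t publish its relevance formula, so the following is only a rough sketch of the underlying idea: scoring a page by how well its vocabulary overlaps a set of semantically related query terms, here via cosine similarity over simple bag-of-words vectors. The terms and texts are invented for illustration.

```python
import math
from collections import Counter

def cosine_relevance(query_terms, page_text):
    """Toy relevance proxy: cosine similarity between a query term list
    and a page's bag-of-words vector. Real systems use far richer
    semantic models, but the ranking intuition is the same."""
    q = Counter(query_terms)
    p = Counter(page_text.lower().split())
    dot = sum(q[t] * p[t] for t in q)
    norm = math.sqrt(sum(v * v for v in q.values())) * \
           math.sqrt(sum(v * v for v in p.values()))
    return dot / norm if norm else 0.0

# A page covering several semantically related terms scores higher than
# one that merely repeats the single keyword over and over.
holistic = "mortgage rates lender interest credit loan repayment"
repetitive = "mortgage mortgage mortgage mortgage"
query = ["mortgage", "interest", "loan"]
print(cosine_relevance(query, holistic) > cosine_relevance(query, repetitive))  # True
```

This also illustrates why pure keyword repetition loses out: stuffing one term inflates the page vector’s length without adding overlap with the rest of the topic vocabulary.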

The average number of words is increasing

The average word count of well-ranking landing pages has been steadily increasing for years. According to Searchmetrics, this reflects a more intensive analysis of the respective topic areas in terms of holistic content creation.

                'The average number of words increased by 50 percent in 2016'

The comparison of desktop content and mobile landing pages reveals significant differences in word count: according to the analysis, desktop versions of a website are on average 30 percent longer than their mobile counterparts.

The keyword in the title is losing relevance

One motto is still widespread in the world of SEO: the keyword needs to be in the title, and placed as far forward as possible. The findings of the Searchmetrics study show whether the search engine still agrees with this assumption:

                'In 2016 only 45 percent of the top 20 URLs had the keyword in the title'

This development can also be explained by the holistic approach to content creation, in which texts are optimized for topics rather than keywords. Google’s AI system is now capable of analyzing semantic relationships without the need for exact keywords.

A similar development can be seen by looking at headlines and descriptions: according to Searchmetrics, only one third of all top 20 landing pages have the keyword in the main heading (H1).

User signals

To determine whether users are satisfied with the proposed URLs, Google doesn’t have to rely solely on indirect factors such as the semantic analysis of website content. In addition to the search engine itself, Google products such as the Chrome web browser, the web analytics service Google Analytics, the advertising system AdWords, and the mobile operating system Android provide detailed information on users’ online behavior.

By analyzing user signals such as clicks, bounce rates, and average retention time against its huge database, Google can find out whether a website delivers what it promises. The Searchmetrics study provides helpful input here for website operators and SEO consultants.

The average click-through rate of positions 1 to 3 is 36 percent

Users put a lot of trust in the search engine’s relevance analysis. This was already indicated in the 2014 Searchmetrics study, when the average click-through rate (CTR) of the analyzed top URLs was determined.

Consequently, websites in position 1 receive the most clicks: for pole position, the Searchmetrics team calculated an average CTR of 44%. The percentage decreases the further down the page a website is placed; at position 3, the CTR is already down to 29%.

                'The average click-through rate of position 1 to 3 is 36 percent.'
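The three reported figures are mutually consistent, and the CTR for position 2 can be derived from them. Note that this value is inferred here from the stated averages, not reported directly by the study:

```python
# CTR figures as reported by Searchmetrics.
avg_top3 = 0.36   # average CTR of positions 1-3
ctr_pos1 = 0.44   # position 1
ctr_pos3 = 0.29   # position 3

# Implied by the three-position mean: (p1 + p2 + p3) / 3 = avg_top3.
ctr_pos2 = 3 * avg_top3 - ctr_pos1 - ctr_pos3
print(f"implied CTR for position 2: {ctr_pos2:.0%}")  # 35%
```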

There’s a clear increase in the CTR at position 11: the first website on the second search results page actually receives more clicks than the websites positioned at the bottom of the first page.

Compared to 2014, the average CTR of the top 10 URLs in the Google ranking has risen sig­nif­i­cant­ly.

The bounce rate on the first search results page has increased to about 40 percent

In addition to the CTR, the so-called bounce rate is also taken into account when assessing a website’s relevance. It shows how many users return to Google.com after clicking on a search result without looking at any other URLs on the domain they’re visiting.
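As a minimal sketch of the metric itself (not Searchmetrics’ actual measurement pipeline), the bounce rate can be computed as the share of visits that view exactly one page before returning to the results:

```python
def bounce_rate(sessions):
    """sessions: list of page-view counts per visit on the clicked domain.
    A visit with exactly one page view counts as a bounce."""
    if not sessions:
        return 0.0
    bounces = sum(1 for views in sessions if views == 1)
    return bounces / len(sessions)

# Invented example data: 4 of 10 visits leave after a single page,
# i.e. a 40 percent bounce rate.
print(bounce_rate([1, 3, 1, 2, 1, 5, 2, 1, 4, 2]))  # 0.4
```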

For the investigated URLs, the Searchmetrics team found an average bounce rate of 40% for the top 10 positions, an increase compared to 2014. There’s a clear deviation for the top two positions.

According to the analysts, however, a website’s relevance can only partly be determined by the bounce rate. Users have different reasons for leaving a website, and they may leave quickly even after finding exactly what they were looking for. This is possible, for example, with quick search queries that can easily be answered by looking through a glossary or dictionary.

One explanation for the detected increase in the average bounce rate could be the high quality of Google’s algorithm, according to the Searchmetrics team. Thanks to the new AI system, the search engine is now able to display the URL that best fits the user’s needs. This means that fewer URLs need to be clicked on to find the desired information.

The average retention time is increasing

What’s especially important for the relevance assessment of a website is the time between clicking a result and leaving the site. Web analytics can accurately measure the time a user spends on a website (time on site). Here, Searchmetrics provides an average value for which there could be different reasons.

                'The time on site for the top 10 URLs is 3 minutes and 43 seconds.'

Comparing the value of the current study with that of 2014, a significant increase in retention time can be seen for the top 10 URLs. This development can be attributed to the fact that website operators have started providing visitors with better quality content since they want to rank well with Google.

Although a high retention time can indicate high-quality content, a short stay doesn’t necessarily mean that your website is of low quality. For example, a user searching for a weather report or the results of last weekend’s soccer match won’t need to spend long on the website.

Technical factors and usability

Creating high-quality, valuable content is just one part of search engine optimization. Even the best content won’t reach the top positions in the search results if the basic technical requirements are missing. Websites must also be evaluated on factors such as performance, security, and usability. In terms of the technical provision of web content, two main SEO trends stood out in 2016: on top of general factors such as page loading time and file size, more and more website operators are striving to make mobile-optimized content available to their visitors. In addition, Searchmetrics stresses the importance of transport encryption via HTTPS for the search engine ranking.

HTTPS has become a necessity in 2017

Webmasters planning to forego HTTPS transport encryption in 2017 will find it hard to improve their position in the search results. Back in September 2016, Google announced that it would classify HTTP sites as 'unsafe' from 2017 onwards if sensitive data is processed on them. You don’t have to be an SEO expert to figure out how much this affects a user’s willingness to click on a website. Not least for this reason, the number of HTTPS-encrypted websites is growing steadily: while only 14% of top 10 websites were based on HTTPS the previous year, Searchmetrics recorded a significant increase to 33% in its current study.

                'One third of websites in the top 10 are now based on HTTPS encryption.'
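Moving an existing site to HTTPS typically starts with a permanent server-level redirect. As a sketch, assuming an nginx server with a TLS certificate already installed (the domain name is a placeholder):

```nginx
# Redirect all plain-HTTP traffic to the HTTPS version of the site.
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://$host$request_uri;
}
```

The permanent (301) redirect also signals to Google that the HTTPS URL is the canonical one, so the old HTTP addresses don’t compete with it in the index.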

Mobile-friendliness is a prerequisite for a good ranking

Search queries from mobile devices are constantly increasing. According to a Google report from 2015, the search engine market leader receives more search queries from mobile devices than from desktop computers in the USA and Japan. Google’s mobile-first strategy shows what a difference mobile friendliness makes to the search engine ranking: in October 2016, the company announced that it would use the mobile index as its main index for web search in the future. It is therefore assumed that this will also affect the ranking for desktop web search. Webmasters who are still neglecting their mobile users should consider redesigning their website for 2017. Mobile subdomains, responsive web design, and dynamically served content are all good ways of making a site more mobile-friendly.
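Of the approaches mentioned, responsive web design is usually the simplest starting point: one HTML document adapts its layout to the screen width. A minimal sketch (the class name and 600px breakpoint are chosen purely for illustration):

```html
<!-- Tell mobile browsers to use the device width instead of a zoomed-out desktop layout. -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  .sidebar { float: right; width: 30%; }
  /* Below the breakpoint, stack the sidebar under the main content. */
  @media (max-width: 600px) {
    .sidebar { float: none; width: 100%; }
  }
</style>
```

Without the viewport meta tag, most mobile browsers render the page at a desktop width and scale it down, so media queries like the one above never take effect.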

Social signals

As in 2014, the 2016 Searchmetrics study showed a strong correlation between top search engine rankings and so-called social signals: communicative signals such as shares, likes, and comments on social networks like Facebook, Twitter, Pinterest, and Google+. Nevertheless, analysts still find it difficult to establish a causal link between a good Google ranking and social media presence.

Backlinks

The number of backlinks to a website still correlates somewhat with its placement in the search results. However, link building is fading into the background as a result of new developments in search engine optimization. According to Searchmetrics, a website’s backlink profile is no longer the decisive factor for impressing Google, but rather one of many.

Today, Google is able to assess a website’s value through semantic context and direct user signals, and content relevance has replaced the backlink profile as the key quality feature. Even if a website attracts numerous inbound links from reputable websites, webmasters need to ask themselves whether the content they offer actually meets users’ expectations.

Backlinks are unlikely to disappear completely from the Google algorithm in the future; this was also confirmed by the Searchmetrics study. However, purely backlink-oriented search engine optimization is now outdated.

                'The correlations for backlinks are still high, but their importance for the ranking will continue to decrease.'

Ranking factors in correlation with the search engine position

The following graphic shows the rank correlations of general ranking factors for the top 20 URLs analyzed by Searchmetrics, as well as how they developed compared to the previous year. Ranking factors that were collected for the first time in 2016, as well as recalculated factors, are marked with an asterisk.

Click here to download the infographic of the Top 20 Rank Correlations from Google.com

Conclusion: the future of search engine optimization

The Searchmetrics study 'Ranking Factors – Rebooting for Relevance' shows that there are still factors that correlate with a good Google ranking. However, these factors can no longer be transferred to just any website: the different demands that people and machines place on websites from different sectors can’t be properly captured by general ranking factors.

At the time the current analysis was published, the Searchmetrics team announced more sophisticated follow-up studies that examine the needs of individual sectors separately. The annual study on ranking factors and rank correlations will from now on be based on sector-specific keyword sets (e.g. for e-commerce, health, or finance). Webmasters can use the 2016 Searchmetrics study as a simple SEO checklist and as a way to interpret trends and predictions in the field of search engine optimization on a sector-specific basis.
