
For years, backlinks have been considered one of the most important factors for ranking on Google’s search engine. In 2016, the company even confirmed as much when a senior search quality strategist said that the top ranking factors were links, content, and RankBrain.

According to new comments from Google’s Gary Illyes, an analyst for Google Search, things have changed since then.

What Was Said

During a panel at Pubcon Pro, Illyes was asked directly whether links are still one of the top three ranking factors. In response, here is what he said:

“I think they are important, but I think people overestimate the importance of links. I don’t agree it’s in the top three. It hasn’t been for some time.”

Illyes even went as far as to say there are cases where sites have absolutely zero links (internal or external) yet consistently rank in the top spot because they provide excellent content.

The Lead Up

Gary Illyes isn’t the first person from Google to suggest that links have lost the SEO weight they used to carry. Last year, during a Google SEO Office Hours session, Duy Nguyen from the search quality team stated that links had lost much of their impact:

“First, backlinks as a signal has a lot less significant impact compared to when Google Search first started out many years ago. We have robust ranking signals, hundreds of them, to make sure that we are able to rank the most relevant and useful results for all queries.”

Other major figures at Google, including Matt Cutts and John Mueller, have been predicting this shift for years. As far back as 2014, Cutts (a leading figure at Google at the time) said:

“I think backlinks still have many, many years left in them. But inevitably, what we’re trying to do is figure out how an expert user would say, this particular page matched their information needs. And sometimes backlinks matter for that. It’s helpful to find out what the reputation of the site or a page is. But, for the most part, people care about the quality of the content on that particular page. So I think over time, backlinks will become a little less important.”

Ultimately, this shift was bound to happen because search has become so much more complex. With each search, Google considers the intent behind the search, the actual query, and personal information to help tailor the search results for each user. With so much in flux, we have reached a point where the most important ranking signals may even differ based on the specific site that is trying to rank.

It’s a question we all have dealt with at least once or twice, and one that rarely has a satisfying answer: “Why did my Google rankings suddenly drop?”

Sometimes, a simple audit will reveal a technical hiccup or issue that is dragging down your rankings. Just as often, though, everything appears to be working as it should, yet you are suddenly further down the page or missing from the first page entirely.

In this situation, Google’s John Mueller says there are four major reasons for sites to lose rankings. 

John Mueller Explains Why Sites Lose Rankings

In a recent Google Webmaster Central chat, Mueller was asked why a publisher who had ranked well for “seven or eight years” had suddenly lost rankings for three different sites. Notably, the person asking the question couldn’t find any signs of problems in their inbound or outbound links, and all the sites used the same keywords (they sell similar products by different brands). 

Of course, Mueller couldn’t get too specific with his answer because he didn’t have actual data or analytics on the sites. Still, he did his best to address four general reasons sites may suddenly rank worse.

1) Rankings Are Temporary

Once a site is ranking at the top for its ideal keywords, many site owners feel like they have accomplished their mission and will continue to rank there. Unfortunately, John Mueller says that rankings are malleable and change constantly.

Mueller explained:

“In general, just because the site was appearing well in search results for a number of years does not mean that it will continue to appear well in search results in the future.

These kinds of changes are essentially to be expected on the web, it’s a very common dynamic environment”

2) The Internet Is Always Changing

The reason why rankings are so prone to fluctuations is that the internet itself is always changing. New sites are being created every day, links might die, competitors might improve their own SEO, and people’s interests change.

Any one of these can have a big impact on the search results people see at any given time.

As Mueller put it:

“On the one hand, things on the web change with your competitors, with other sites…”

3) Google Changes Its Algorithms

To keep up with the constantly changing internet, Google itself has to regularly overhaul how its search engine interprets and ranks websites. 

To give you one idea of how this plays out: a few years ago, search results were absolutely dominated by “listicles” (short top-5 or top-10 lists). Over time, people got tired of the shallow information these lists provided and how easily they could be abused as clickbait. Google recognized this and tweaked its algorithm to prioritize in-depth content focused on a single topic or issue. A listicle can still rank on Google today, but it is considerably harder than it used to be.

As Mueller simply explained:

“On the other hand, things on our side change with our algorithms in search.”

4) People Change

This one has been touched on throughout Mueller’s list, but it really gets to the heart of what Google does. What people expect out of the internet is constantly changing, and it is Google’s job to keep up with these shifts.

In some cases, this can mean that people outright change how they search. For example, short keyword queries like “restaurants near me” or “fix Samsung TV” were for years the main way people found information. As voice search has become widespread and people have grown accustomed to using search engines constantly, queries have expanded to frequently include full sentences or questions like “What is the best Chinese restaurant in midtown?”

At the same time, what people expect out of the same queries is also shifting with technological innovation and content trends. 

Mueller describes the situation by saying:

“And finally on the user side as well, the expectations change over time. So, just because something performed well in the past doesn’t mean it will continue to perform well in search in the future.”

Always Be Monitoring and Improving

The big theme behind all of these reasons sites lose rankings is that they are standing still while the world moves past them. To maintain your high rankings, your site has to be constantly in motion – moving with the trends and providing the content users want and expect from sites at any given time. 

This is why successful sites are also constantly monitoring their analytics to identify upcoming shifts and respond to any drops in rankings as soon as they happen.
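As a rough sketch of what that kind of automated monitoring could look like, here is a minimal Python example that flags a week-over-week drop in organic clicks. The CSV file name, its “clicks” column, and the 20% threshold are all hypothetical choices for illustration, not anything prescribed by Google or Mueller:

import csv

def weekly_totals(path):
    """Sum daily organic clicks into consecutive 7-day buckets, oldest first.

    Assumes the CSV is ordered oldest-to-newest and has a 'clicks' column.
    """
    with open(path, newline="") as f:
        daily = [int(row["clicks"]) for row in csv.DictReader(f)]
    return [sum(daily[i:i + 7]) for i in range(0, len(daily) - 6, 7)]

weeks = weekly_totals("organic_clicks.csv")  # hypothetical analytics export
if len(weeks) >= 2 and weeks[-1] < 0.8 * weeks[-2]:
    print("Alert: organic clicks fell more than 20% week over week")

Something this simple won’t tell you why a drop happened, but it shortens the time between a ranking shift and your response to it.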

If you want to see the full response, watch the video below (it starts with Mueller’s response but you can choose to watch the entire Webmaster Central office-hours discussion if you wish).

Google has been encouraging webmasters to make their sites as fast as possible for years, but now it is making speed an official ranking factor.

The company announced this week that it will be launching what it is calling the “Speed Update” in July 2018, which will make page speed an official ranking signal for mobile searches.

Google recommends checking your site’s speed using its PageSpeed Insights report, as well as using tools like Lighthouse to measure page speed and improve your loading times.
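If you want to pull those numbers programmatically, here is a minimal sketch using the current public PageSpeed Insights API (v5). The page URL is a placeholder, and for regular use Google asks you to supply an API key:

import requests

# Public PageSpeed Insights v5 endpoint.
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def mobile_performance_score(page_url):
    """Fetch a page's Lighthouse performance score (0.0-1.0) on mobile."""
    params = {
        "url": page_url,
        "strategy": "mobile",        # the Speed Update concerns mobile search
        "category": "performance",
    }
    response = requests.get(PSI_ENDPOINT, params=params, timeout=60)
    response.raise_for_status()
    data = response.json()
    return data["lighthouseResult"]["categories"]["performance"]["score"]

print(mobile_performance_score("https://example.com"))  # placeholder URL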

As Google’s Zhiheng Wang and Doantam Phan wrote in the announcement:

“The ‘Speed Update,’ as we’re calling it, will only affect pages that deliver the slowest experience to users and will only affect a small percentage of queries. It applies the same standard to all pages, regardless of the technology used to build the page. The intent of the search query is still a very strong signal, so a slow page may still rank highly if it has great, relevant content.”

While Google says the update will only affect a “small percentage of queries”, it is impossible to tell exactly how many will be impacted. Google handles billions of queries a day, so a small piece of that could still be a substantial number of searches.

This is the first time page speed will be made a ranking factor for mobile searches, but it has been a ranking factor on desktop since 2010. It makes sense to expand this to mobile, since there is a wealth of evidence showing that mobile users prioritize loading time when clicking search results. If a page doesn’t load within three to five seconds, they are likely to leave and find another relevant result.

Everyone wishes there were a simple recipe to guarantee you’ll rank at the top of the search engines, but Google’s Gary Illyes says there is no such thing. In fact, there isn’t even a consistent set of top-three ranking factors for all content.

Instead, Illyes explains that the top ranking factors for web pages vary depending on the query being searched. By that logic, factors like links might be used to verify that something is newsworthy, while page speed, content quality, and keyword usage may matter more for other types of content.

John Mueller, another prominent figure at Google, joined the discussion to suggest that worrying about optimizing for specific ranking factors is “short-term thinking.”

Surprisingly, Illyes takes it even further by saying that links – often viewed as one of the most important signals for a website – are often not a factor in the search results at all. Long-tail search queries, in particular, are likely to pull up content with few to no links.

While this can be discouraging to brands or businesses looking for specific ways to improve their site and rank higher, the overall message is clear: a holistic approach that prioritizes people’s needs and desires is bound to benefit you, while myopically focusing on specific factors will eventually leave you behind.

As Mueller suggests – if you build something awesome, Google will come.

HTTPS

It has now been two years since Google announced it would be making HTTPS a minor ranking signal, and a recent study from Moz shows just how many sites have made the switch since then.

After Google’s announcement, there was an initial surge in sites changing from HTTP to HTTPS, but many held back to assess just how important the security protocol was to the search engine and ultimately decided it wasn’t worth the risk. HTTPS is only a minor factor in Google’s ranking algorithm, and there has been concern about the potential risks of making a site-wide switch.

To check how far along the transition is, Dr. Pete Meyers from Moz compiled data to see just how close Google is to moving the web over to HTTPS.

Before Google started including HTTPS in its algorithm, Meyers says, only around 7% of all pages featured on the first page of Google search results used the more secure protocol. A week after the switch, that number had climbed to 8%. Since then, it has risen steadily, passing 30% this year.

Moz reports that “as of late June, our tracking data shows that 32.5% (almost one-third) of page-1 Google results now use the ‘https:’ protocol.”

However, Meyers says he is not convinced that everyone who has made the switch was motivated by algorithms and ranking signals. Instead, he believes it is a sign that Google’s PR campaign to make HTTPS more attractive and desirable for sites is working.

Meyers also says that within another year to a year and a half, we are likely to see 50% of the sites on the first page of search results using HTTPS, which he predicts will lead Google to strengthen the ranking signal.

Ultimately, many are still hesitant about migrating their entire site to HTTPS, given the risks that come with site-wide changes like this. However, Dr. Meyers says it is wise to keep an eye on how many sites in your industry are using the protocol and to watch for any upcoming algorithm updates that may make HTTPS even more prominent in search results.
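That kind of spot check is easy to approximate yourself. Here is a minimal sketch in Python; the domain list is purely illustrative, and the test only looks at whether a plain-HTTP request gets redirected to an HTTPS URL:

import requests

def uses_https(domain):
    """True if a plain-HTTP request ends up on an https:// URL after redirects."""
    try:
        response = requests.get("http://" + domain, timeout=10, allow_redirects=True)
        return response.url.startswith("https://")
    except requests.RequestException:
        return False

# Illustrative list; swap in the sites you actually track in your industry.
domains = ["example.com", "example.net", "example.org"]
https_share = sum(uses_https(d) for d in domains) / len(domains)
print("{:.0%} of sampled sites serve HTTPS".format(https_share))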

Every year, Moz publishes a complete review of the search ranking factors that most influenced the search results pages for the year. Now, they have released their latest study, which they say is their largest yet.

The study attempts to lift the veil on Google’s search ranking factors by surveying industry experts and running correlation studies against actual search results and rankings. This year, Moz surveyed over 150 leading search experts and combined their responses with data from its own correlation studies, as well as data from SimilarWeb, DomainTools, and Ahrefs.
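Moz’s exact methodology is its own, but as a rough sketch of what a rank-correlation check looks like, here is a Python example computing a Spearman correlation between one candidate factor and ranking position. All the numbers are made up, and “linking root domains” is just an illustrative factor:

from scipy.stats import spearmanr

# Hypothetical top-10 results for one query; position 1 is best.
positions = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
# Made-up values for one candidate factor (e.g., linking root domains).
linking_root_domains = [420, 380, 95, 310, 60, 44, 80, 12, 25, 9]

rho, p_value = spearmanr(linking_root_domains, positions)
# A strongly negative rho means pages with more of the factor tend to sit
# at better (numerically lower) positions.
print("Spearman rho = {:.2f} (p = {:.3f})".format(rho, p_value))

Keep in mind that correlation studies like this show association, not causation, which is why Moz pairs them with expert surveys.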

The most notable finding from the new study is that, despite continuous cries of “links are dead,” domain-level and page-level links still show the strongest association with high Google rankings. The weakest factors in the study were social metrics, TLDs, and basic on-page markup such as schema.

The infographic below summarizes the findings of the study, but you can also see the full study for more in-depth details.

[Infographic: Moz search ranking factors study summary]

Every year, Moz details the local ranking factors it can identify in Google’s algorithm to help small businesses get a leg up in the listings. Earlier this week, it announced the release of this year’s findings, and everything seems… surprisingly the same.

Analysts found only a few notable changes; the results are largely the same as last year’s. Still, David Mihm highlighted a few important things to notice in the findings, including:

1. Behavioral signals, such as click-through rate, are more of a factor this year than in years past.
2. With Pigeon’s release, experts are saying domain authority is more of a signal today.
3. Google may have tuned up the proximity-to-searcher factor as well.

You can see the charts from the study below, or you can get more details from the results over at Moz.

[Chart: Moz local ranking factors, 2014]

[Chart: Moz localized ranking factors, 2014]