
We all know that the search results you get on mobile and the ones you get on desktop devices can be very different – even for the same query, made at the same time, in the same place, logged into the same Google account. 

Have you ever found yourself asking exactly why this happens?

One site owner did and recently got the chance to ask one of Google’s Senior Webmaster Trends Analysts, John Mueller.

In a recent SEO Office Hours session, Mueller explained that a wide range of factors – including the device you are searching from – decide which search results get returned for a query, and why that is.

Why Are Mobile Search Rankings Different From Desktop?

The question put to Mueller specifically asked why there is still a disparity between mobile and desktop search results after the launch of mobile-first indexing for all sites. Here’s what was asked:

“How are desktop and mobile ranking different when we’ve already switched to mobile-first indexing?”

Indexing and Ranking Are Different

In response to the question, Mueller first clarified that indexing and ranking are not exactly the same thing. Instead, they are more like two parts of a larger system.

“So, mobile-first indexing is specifically about that technical aspect of indexing the content. And we use a mobile Googlebot to index the content. But once the content is indexed, the ranking side is still (kind of) completely separate.”

Although mobile-first indexing was a significant shift in how Google brings sites into its search engine and understands them, it actually had little direct effect on most search results.

Mobile Users and Desktop Users Have Different Needs

Beyond the explanation about indexing vs. ranking, John Mueller also said that Google returns different rankings for mobile and desktop searches because the two reflect potentially different in-the-moment needs.

“It’s normal that desktop and mobile rankings are different. Sometimes that’s with regards to things like speed. Sometimes that’s with regards to things like mobile-friendliness.

“Sometimes that’s also with regards to the different elements that are shown in the search results page.

“For example, if you’re searching on your phone then maybe you want more local information because you’re on the go. Whereas if you’re searching on a desktop maybe you want more images or more videos shown in the search results. So we tend to show …a different mix of different search results types.

“And because of that it can happen that the ranking or the visibility of individual pages differs between mobile and desktop. And that’s essentially normal. That’s a part of how we do ranking.

“It’s not something where I would say it would be tied to the technical aspect of indexing the content.”

With this in mind, there’s little need to be concerned if you aren’t showing up in the same spot for the same exact searches on different devices.

Instead, watch for big shifts in which devices people are using to access your pages – a heavy skew toward one device can be a sign the other experience is driving visitors away. If your users are overwhelmingly using phones, assess how your site can better serve the needs of desktop users. Likewise, a majority of traffic coming from desktop devices may indicate you need to assess your site’s speed and mobile-friendliness.

If you want to hear Mueller’s full explanation and even more discussion about search engine optimization, check out the SEO Office Hours video below:

Despite the differences in how the pages are used, created, and generally thought about, Google’s John Mueller says the search engine sees no difference between “blog posts” and “web pages.”

In a recent SEO hangout, Mueller was asked by site owner Navin Adhikari why the blog section of his site wasn’t getting the same amount of traffic as the rest of his site. This, combined with the way Google emphasizes content within its guidelines, had made Adhikari suspect that the search engine may be ranking blog content differently. That would explain why the rest of his site performed consistently well while the blog underperformed.

However, Mueller says this isn’t the case. In fact, Mueller explained that the distinction between blog content and other areas of a site is something the search engine cannot even see – and even if it could, it is not something the company would heavily factor into results.

Google’s John Mueller Says Google Sees All Pages Similarly

In most cases, Mueller says the distinction between “blog posts” and “web pages” is entirely artificial. It is something provided for convenience in a website’s content management system (CMS) to help creatives generate content without needing coding skills and to keep pages organized.

So, while the blog part of your site may seem entirely separate to you while you are creating posts, from Google’s perspective it is just another subsection of your site.

“I don’t think Googlebot would recognize that there’s a difference. So usually that difference between posts and pages is something that is more within your backend within the CMS that you’re using, within WordPress in that case. And it wouldn’t be something that would be visible to us.

“So we would look at these as if it’s an HTML page and there’s lots of content here and it’s linked within your website in this way, and based on that we would rank this HTML page.

“We would not say oh it’s a blog post, or it’s a page, or it’s an informational article. We would essentially say it’s an HTML page and there’s this content here and it’s interlinked within your website in this specific way.”

Why A Blog May Underperform

If Google wasn’t ranking Adhikari’s blog differently, why would his blog specifically underperform? Mueller has some ideas.

Without access to in-depth data about the site, Mueller speculated that the most likely issue in this case would be how the blog is linked to from other pages on the site.

“I think, I mean, I don’t know your website so it’s hard to say. But what might be happening is that the internal linking of your website is different for the blog section as for the services section or the other parts of your website.

“And if the internal linking is very different then it’s possible that we would not be able to understand that this is an important part of the website.

“It’s not tied to the URLs, it’s not tied to the type of page. It’s really like we don’t understand how important this part of the website is.”

One way to signal this importance is to include a feed of links to new content on the homepage of your site, as in the sketch below. This quickly establishes that your blog content is important to your audience.
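
As a rough illustration, here is a minimal sketch of one way to build such a feed on a WordPress site (the CMS mentioned in the question), using WordPress’s standard REST API route. The endpoint is real, but the element ID and post count are placeholder assumptions:

```typescript
// Minimal sketch: list the five newest blog posts on the homepage so
// crawlers (and visitors) see the blog linked from an important page.
// Assumes a WordPress backend and its standard /wp-json/wp/v2/posts
// route; the "recent-posts" element ID is a placeholder.
interface WpPost {
  link: string;
  title: { rendered: string };
}

async function renderRecentPosts(): Promise<void> {
  const res = await fetch("/wp-json/wp/v2/posts?per_page=5");
  const posts: WpPost[] = await res.json();

  const list = document.getElementById("recent-posts");
  if (!list) return;

  for (const post of posts) {
    const item = document.createElement("li");
    const anchor = document.createElement("a");
    anchor.href = post.link;
    anchor.textContent = post.title.rendered;
    item.appendChild(anchor);
    list.appendChild(item);
  }
}

renderRecentPosts();
```

Note that links rendered only by client-side JavaScript depend on Google executing that script; a server-rendered list of links achieves the same goal more reliably.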

To hear Mueller’s full response and more discussion on the best search engine optimization practices for Google, check out the full SEO Office Hours video below:

Have you gotten your brand’s website ready for the upcoming Google Page Experience ranking signal update? 

If not, Google Developer Martin Splitt says there’s no need to panic. 

In an interview on the Search Engine Journal Show on YouTube, host Loren Baker asks Splitt what advice he would give to anyone worried their site isn’t prepared for the update set to launch in mid-June. 

Giving a rare peek at the expected impact of the impending update, Splitt reveals the Page Experience signal update isn’t going to be a massive game-changer. Instead, it is more of a “tiebreaker.”

As a “lightweight ranking signal,” just optimizing your site’s Page Experience metrics isn’t going to launch you from the back of the pack to the front. If you are competing with a site that performs exactly the same in every other area, however, it will give you the leg up for the better position in the search results.

Don’t Ignore The Update

While the Page Experience update isn’t set to radically change up the search results, Splitt says brands and site owners should still work to optimize their site with the new signals in mind. 

Ultimately, making your page faster, more accessible on a variety of devices, and easier to use is always a worthwhile effort – even if it’s not a major ranking signal. 

As Splitt says:

“First things first, don’t panic. Don’t completely freak out, because as I said it’s a tiebreaker. For some it will be quite substantial, for some it will not be very substantial, so you don’t know which bucket you’ll be in because that depends a lot on context and industry and niche. So I wouldn’t worry too much about it.

“I think generally making your website faster for users should be an important goal, and it should not just be like completely ignored. Which is the situation in many companies today that they’re just like ‘yeah, whatever.’”

As for how he thinks brands should approach the update, Splitt recommended focusing on new projects and content rather than prioritizing revamping your entire site upfront. 

“…For new projects, definitely advise them to look into Core Web Vitals from the get-go. For projects that are already in maintenance mode, or are already actively being deployed, I would look into making some sort of plan for the mid-term future — like the next six months, eight months, twelve months — to actually work on the Core Web Vitals and to improve performance. Not just from an SEO perspective, but also literally for your users.”
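
If you want to see where your own site stands before making that plan, Google’s open-source web-vitals JavaScript library reports the Core Web Vitals from real browsers. A minimal sketch, assuming the library’s v3 API names (earlier releases exported getCLS/getFID/getLCP instead):

```typescript
// Minimal sketch: log Core Web Vitals for the current page view.
// Assumes the open-source `web-vitals` package (v3 naming).
import { onCLS, onFID, onLCP } from "web-vitals";

function report(metric: { name: string; value: number }): void {
  // In production you would likely send this to an analytics endpoint.
  console.log(`${metric.name}: ${metric.value}`);
}

onCLS(report); // Cumulative Layout Shift
onFID(report); // First Input Delay
onLCP(report); // Largest Contentful Paint
```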

Much of the discussion focuses on the perspective of SEO professionals, but it includes several bits of relevant information for anyone who owns or manages a website for their business. 

To hear the full conversation, check out the video below from Search Engine Journal:

For many small-to-medium businesses, appearing in search results around their local area is significantly more important than popping up in the results for someone halfway across the country. 

This raises a question, though: how many of the countless searches made every day are actually locally based?

We now have the answer to that question thanks to a new tool released by LocalSEOGuide.com and Traject Data.

What Percent Of Searches Are Local?

Working together, the companies analyzed over 60 million U.S. search queries and found that over a third (approx. 36%) of all queries returned Google’s local pack – indicating the search was location-based. 

Perhaps the biggest surprise in the data is how consistent locally-based searches remained throughout the year. Following an uptick in early 2020 (likely driven by the coronavirus pandemic), the rate stayed around 36% for the rest of the year. The only notable exception came in September, when the data shows a marked dip in locally-driven searches.

This data shows just how important it is for even strictly local brands to establish themselves online and optimize for search engines. Otherwise, you might be missing out on a big source of potential business.

Other Features In The Local Pack-O-Meter

Along with data on the appearance of local packs in Google search results, the Local Pack-O-Meter includes information on several other search features. These include:

  • Knowledge Graphs
  • “People Also Ask” Panels
  • Image Boxes
  • Shopping Boxes
  • Ads
  • Related Searches
  • And more

Though the tool’s current form doesn’t include ways to filter the information more selectively, there is plenty to take from it when planning which search features you need to prioritize and which can be put on the back burner.

To explore the Local Pack-O-Meter for yourself, click here.

Throughout 2020, approximately 65% of searches made on Google were “zero-click searches”, meaning that the search never resulted in an actual website visit.

Zero-click searches have been steadily on the rise, having already reached 50% in June 2019 according to an earlier study from online marketing expert Rand Fishkin; the new 2020 figure comes from research Fishkin published with SimilarWeb.

The steep rise in these types of searches between January and December 2020 is particularly surprising because it was widely believed zero-click searches were largely driven by mobile users looking for quick answers. Throughout 2020, however, most of us were less mobile than ever due to Covid restrictions, social distancing, and quarantines.

The findings of this latest report don’t entirely disprove this theory, though. Mobile devices still saw the majority of zero-click Google searches. On desktop, less than half (46.5%) were zero-click searches, while more than three-fourths (77.2%) of searches from mobile devices did not result in a website visit.

Study Limitations

Fishkin acknowledges that his reports do come with a small caveat. Each analysis used different data sources and included different searching methods, which may explain some of the variance. Additionally, the newer study – which included data from over 5.1 trillion Google searches – had access to a significantly larger data pool compared to the approximately one billion searches used in the 2019 study.

“Nonetheless, it seems probable that if the previous panel were still available, it would show a similar trend of increasing click cannibalization by Google,” Fishkin said in his analysis.

What This Means For Businesses

The most obvious takeaway from these findings is that people are increasingly finding the information they are looking for directly on the search results page, rather than needing to visit a web page for more in-depth information.

It also means that attempts to regulate Google are largely failing.

Many have criticized the search engine – and even pursued legal action against it, with varying levels of success – for abusing its access to information on websites by showing that information in “knowledge panels” within search results.

The argument is that Google is stealing copyrighted information and republishing it on its own site. Additionally, this practice potentially gives searchers less reason to click through to the sites the information came from, meaning Google is contributing to falling click-through rates while continuing to make money off of its results pages.

Ultimately, Google is showing no signs of slowing down on its use of knowledge panels and direct answers within search results. To adjust to the rise of zero-click searches, brands should put more energy into optimizing their content to appear in knowledge panels (increasing your brand awareness) and diversify their web presence with social media activity to directly reach customers.
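
One common starting point for knowledge panel optimization is structured data. As a hedged example, here is a minimal sketch that injects schema.org Organization markup as JSON-LD; all of the business details below are placeholders:

```typescript
// Minimal sketch: add schema.org Organization markup so search engines
// can more easily build knowledge panels for the brand.
// Every value below is a placeholder for illustration.
const orgSchema = {
  "@context": "https://schema.org",
  "@type": "Organization",
  name: "Example Local Business",
  url: "https://www.example.com",
  logo: "https://www.example.com/logo.png",
  sameAs: [
    "https://www.facebook.com/example",
    "https://twitter.com/example",
  ],
};

// Serialize the markup into the page head so crawlers can read it.
const script = document.createElement("script");
script.type = "application/ld+json";
script.text = JSON.stringify(orgSchema);
document.head.appendChild(script);
```

As with any structured data, server-rendering the markup is generally safer than injecting it with JavaScript.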

In a recent Google Search Central SEO session, Google’s John Mueller shed light on a way the search engine’s systems can go astray – keeping pages on your site from being indexed and appearing in search.

Essentially, the issue comes from Google’s predictive approach to identifying duplicate content, which can incorrectly flag unique pages as duplicates based on their URL patterns alone.

Google uses the predictive system to increase the efficiency of its crawling and indexing by skipping over content that is just a copy of another page. By leaving these pages out of the index, Google reduces the chance of showing repetitious content in its search results and allows its indexing systems to reach other, more unique content more quickly.

Obviously, the problem is that content creators could unintentionally trigger these predictive systems when publishing unique content on similar topics, leaving quality content out of the search engine.

John Mueller Explains How Google Could Misidentify Duplicate Content

In response to a question from a user whose pages were not being indexed correctly, Mueller explained that Google uses multiple layers of filters to weed out duplicate content:

“What tends to happen on our side is we have multiple levels of trying to understand when there is duplicate content on a site. And one is when we look at the page’s content directly and we kind of see, well, this page has this content, this page has different content, we should treat them as separate pages.

The other thing is kind of a broader predictive approach that we have where we look at the URL structure of a website where we see, well, in the past, when we’ve looked at URLs that look like this, we’ve seen they have the same content as URLs like this. And then we’ll essentially learn that pattern and say, URLs that look like this are the same as URLs that look like this.”

He also explained how these systems can sometimes go too far and Google could incorrectly filter out unique content based on URL patterns on a site:

“Even without looking at the individual URLs we can sometimes say, well, we’ll save ourselves some crawling and indexing and just focus on these assumed or very likely duplication cases. And I have seen that happen with things like cities.

I have seen that happen with things like, I don’t know, automobiles is another one where we saw that happen, where essentially our systems recognize that what you specify as a city name is something that is not so relevant for the actual URLs. And usually we learn that kind of pattern when a site provides a lot of the same content with alternate names.”
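
To make the idea concrete, here is a toy sketch of the kind of pattern learning Mueller describes. It is emphatically not Google’s actual system – just an illustration of predicting duplication from URL shape; the URLs and hashes are invented:

```typescript
// Toy illustration: learn that URLs differing only in their final
// (e.g. city name) segment keep serving identical content, then predict
// that unseen URLs of the same shape are duplicates and skip them.
function urlShape(url: string): string {
  // Collapse the last path segment: /plumbers/springfield -> /plumbers/*
  return url.replace(/\/[^/]+$/, "/*");
}

// Map each learned URL shape to the distinct content hashes seen under it.
const seenContent = new Map<string, Set<string>>();

function recordPage(url: string, contentHash: string): void {
  const shape = urlShape(url);
  const hashes = seenContent.get(shape) ?? new Set<string>();
  hashes.add(contentHash);
  seenContent.set(shape, hashes);
}

function likelyDuplicate(url: string): boolean {
  // If every crawled page under this shape had identical content,
  // assume a new matching URL is a duplicate too.
  const hashes = seenContent.get(urlShape(url));
  return hashes !== undefined && hashes.size === 1;
}

// Two "different" city pages that served the same content...
recordPage("https://example.com/plumbers/springfield", "hash-a");
recordPage("https://example.com/plumbers/shelbyville", "hash-a");
// ...teach the system to skip a third city page sight unseen.
console.log(likelyDuplicate("https://example.com/plumbers/ogdenville")); // true
```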

How Can You Protect Your Site From This?

While Google’s John Mueller wasn’t able to provide a foolproof solution or prevention for this issue, he did offer some advice for sites that have been affected:

“So what I would try to do in a case like this is to see if you have this kind of situations where you have strong overlaps of content and to try to find ways to limit that as much as possible.

And that could be by using something like a rel canonical on the page and saying, well, this small city that is right outside the big city, I’ll set the canonical to the big city because it shows exactly the same content.

So that really every URL that we crawl on your website and index, we can see, well, this URL and its content are unique and it’s important for us to keep all of these URLs indexed.

Or we see clear information that this URL you know is supposed to be the same as this other one, you have maybe set up a redirect or you have a rel canonical set up there, and we can just focus on those main URLs and still understand that the city aspect there is critical for your individual pages.”
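
In practice, the rel canonical Mueller mentions is just a link element in the page’s head pointing at the preferred URL. Here is a minimal sketch of generating one for the small-city/big-city case he describes – the domain, paths, and mapping are hypothetical:

```typescript
// Minimal sketch: point near-duplicate location pages at a canonical URL.
// The mapping and domain are hypothetical examples.
const canonicalMap: Record<string, string> = {
  // The small-city page shows the same content as the big-city page,
  // so the big-city URL is declared canonical.
  "/services/small-city": "https://example.com/services/big-city",
};

function canonicalTag(path: string): string {
  // Pages without an entry are canonical for themselves.
  const target = canonicalMap[path] ?? `https://example.com${path}`;
  return `<link rel="canonical" href="${target}">`;
}

console.log(canonicalTag("/services/small-city"));
// -> <link rel="canonical" href="https://example.com/services/big-city">
```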

It should be clarified that duplicate content or pages impacted by this problem will not hurt the overall SEO of your site. So, for example, having several pages tagged as being duplicate content won’t prevent your home page from appearing for relevant searches. 

Still, the issue has the potential to gradually decrease the efficiency of your SEO efforts, not to mention making it harder for people to find the valuable information you are providing. 

To see Mueller’s full explanation, watch the video below:

Blog comments are a tricky issue for many business websites. 

On one hand, everyone dreams of building a community of loyal customers who follow every post and regularly have healthy discussions in the comments. Not only can comments be helpful for other potential customers, they tend to help Google rankings and can inspire future content for your site.

On the other hand, most business-based websites receive significantly more spam than genuine comments. Even the best anti-spam measures can’t prevent every sketchy link or comment on every post. For the most part, these are more annoying than an actual problem. However, if left completely unmonitored, spam could build up and potentially hurt your rankings.

This can make it tempting to just remove comments from your blog entirely. If you do, you don’t have to worry about monitoring comments, responding to trolls, or weeding out spam. After all, your most loyal fans can still talk about your posts on your Facebook page, right?

Unfortunately, as Google’s John Mueller recently explained, removing comments from your blog is likely to hurt more than it helps. 

John Mueller Addresses Removing Blog Comments

In a Google Search Central SEO hangout on February 5, Google’s John Mueller explored a question from a site owner about how Google factors blog comments into search rankings. Specifically, they wanted to remove comments from their site but worried about potentially dropping in the search results if they did. 

While the answer was significantly more complicated, the short version is this:

Google does factor blog comments into where they decide to rank web pages. Because of this, it is unlikely that you could remove comments entirely without affecting your rankings. 

How Blog Comments Impact Search Rankings

Google sees comments as a separate but significant part of your content. So, while it recognizes that comments may not directly reflect your own content, they do reflect things like engagement and occasionally provide helpful extra information.

This also means that removing blog comments is essentially removing a chunk of information, keywords, and context from every blog post on your site in the search engine’s eyes. 

However, John Mueller didn’t go so far as to recommend keeping blog comments rather than removing them. That decision depends on several factors, including how many comments you’ve received, what type of comments you’ve gotten, and how much they have added to your SEO.

As Mueller answered:

“I think it’s ultimately up to you. From our point of view we do see comments as a part of the content. We do also, in many cases, recognize that this is actually the comment section so we need to treat it slightly differently. But ultimately if people are finding your pages based on the comments there then, if you delete those comments, then obviously we wouldn’t be able to find your pages based on that.

So, that’s something where, depending on the type of comments that you have there, the amount of comments that you have, it can be the case that they provide significant value to your pages, and they can be a source of additional information about your pages, but it’s not always the case.

So, that’s something where I think you need to look at the contents of your pages overall, the queries that are leading to your pages, and think about which of these queries might go away if comments were not on those pages anymore. And based on that you can try to figure out what to do there.

It’s certainly not the case that we completely ignore all of the comments on a site. So just blindly going off and deleting all of your comments in the hope that nothing will change – I don’t think that will happen.”

It is clear that removing blog comments entirely from your site is all but certain to affect your search rankings on some level. Whether this means a huge drop in rankings or potentially a small gain, though, depends entirely on what type of comments your site is actually losing. 

To watch Mueller’s full answer, check out the video below:

When Google releases a major algorithm update, it can take weeks or months to fully understand the effect. Google itself tends to be tight-lipped about the updates, preferring to point website owners and businesses to its general webmaster guidelines for advice on an update. 

Because of all this, we are just starting to grasp what Google’s recent algorithm updates did to search results. One thing that has quickly become apparent, though, is that one of the biggest losers from Google’s 2020 algorithm updates has consistently been online piracy.

This is most clear in a new end-of-year report from TorrentFreak and piracy tracking company MUSO.

How Google’s Algorithm Updates Affected Digital Piracy

Overall, the analysis shows that search engine traffic to piracy sites fell by nearly a third from December 2019 to November 2020. Notably, the two big drops behind this loss of traffic line up perfectly with Google’s algorithm updates during that period.

In January 2020, piracy traffic began dwindling shortly after the January 13th core update. 

After experiencing a short uptick at the start of the COVID pandemic in March, the May 4th core update then hit online pirates even harder, sending piracy traffic plummeting. 

Early indications from the public and some analysts suggest the December 2020 core update continued this trend, though it is too early to know for sure. 

Interestingly, TorrentFreak and MUSO say they corroborated the findings of their report with operators of one of the largest torrent websites online:

“To confirm our findings we spoke to the operator of one of the largest torrent sites, who prefers to remain anonymous. Without sharing our findings, he reported a 35% decline in Google traffic over the past year, which is in line with MUSO’s data.”

Is Google Completely Responsible?

It should be noted that while Google’s algorithm updates likely played a large role in the decline of search traffic to piracy sites, other factors almost certainly contributed as well. 

TorrentFreak’s report shows that direct traffic to piracy-related sites experienced a gradual 10% decline over the course of the year, suggesting overall interest in pirating content may have fallen somewhat on its own.

Additionally, 2020 was a unique year with less content coming out than usual. The COVID pandemic disrupted pretty much every industry, including creative industries. Music releases were pushed back or cancelled as it became difficult to safely record in studios. The closing of theaters led to the delay of many major movies, and TV creators had to completely rework how they wrote and filmed their shows. 

With less content from major studios and artists, it is highly likely users just had less available content that they were interested in pirating. 

Why This Matters

The good news is that the vast majority of business-related websites have absolutely nothing to do with online piracy and therefore should be safe from these effects of Google’s most recent algorithm updates. 

The less good news is that Google’s core algorithm updates are designed to impact a huge portion of websites around the globe, and certainly had impacts outside the realm of digital piracy. 

Still, we felt it important to highlight a real-world example of how a major Google algorithm update can impact an entire industry on a wide scale within search results.

Ultimately, the takeaway for most website owners is that keeping an eye on your analytics is essential.

If you are watching, you can respond to major shifts like this with new strategies, optimization, and even ask Google to recrawl your site. If you aren’t monitoring your analytics, however, you could lose a huge chunk of your traffic from potential customers with no idea why.

I don’t think it is an overstatement to say that 2020 changed everything for businesses around the world – no matter what industry you are in. The spread of COVID-19 accelerated the migration of small and local businesses to the internet, making having an online presence no longer an option but a necessity. 

In turn, these changes have had a massive impact on digital marketing, driving a wave of new competition and seismic shifts in how we connect with customers every day. 

For better or worse, many of these changes are bound to stick around well into 2021, influencing the ways we shop, advertise, and connect with customers for the foreseeable future. 

With this in mind, predicting next year’s search trends is a little easier than it has been in the past, with some clear indicators of what businesses need to do to stay relevant and efficient in a post-COVID world. 

The 5 Online Marketing Trends You Need To Know In 2021

The Effects of COVID Will Linger

The most obvious trend brands will need to be prepared for in 2021 will continue to be the ongoing COVID-19 pandemic. While vaccinations are finally rolling out and we can be optimistic that something resembling normality will return relatively soon, it is also clear that many shopping habits and consumer behaviors have permanently changed.

For example, virtual events and trade shows are all but guaranteed to stick around. Not only do they provide an easier and more affordable way to bring together top members of your industry from around the country, they do it without massively interrupting your day-to-day operations.

Likewise, many customers will continue to prefer using online ordering and curbside pickup from local businesses out of convenience well after social distancing is a thing of the past. 

Social Media Purchasing Goes Mainstream

For years, social media has been a major tool for consumers to find and learn about new products they otherwise would never have known about. Recently, though, platforms have been expanding to let shoppers not just find products, but buy them right then and there.

The ease of going from discovering something cool to making a purchase without ever having to leave your current app is fueling a rush to provide the best social shopping experience, and this trend is only going to get bigger in 2021.

We Are Past Peak Facebook

Facebook has been the undeniable king of social media for more than a decade now, but the platform has been facing increasing challenges that are getting hard to deny. 

In sheer numbers, the social network still far outranks any other platform out there, but its user base is aging, with younger demographics turning to hipper alternatives like Instagram, Snapchat, and TikTok.

Add in the continuous issues with the spread of fake news, concerns about echo chambers, a relatively recent data breach scandal, and recent calls for the breakup of Facebook’s extended network of services (including Instagram and WhatsApp), and it quickly becomes clear Facebook is past its prime and is no longer the single platform you should be focusing on.

Video Content Is The Standard

For the past few years, my year-end lists have consistently included one thing – video content has been increasingly important for brands looking to maintain effective marketing and outreach. 

Well, call 2020 the tipping point, because video content is no longer “on the rise”. It is the standard and it is here to stay. 

While blog content remains important for technical SEO and connecting audiences with some specific types of information, the data makes it very clear that consumers prefer the quick, digestible, and entertaining nature of videos over long, often repetitive blog posts. 

At this point, rather than clicking through to your blog, shoppers are more likely to check out your YouTube and Instagram pages when trying to find out the details of what you offer and why they should choose you over the competition.

Mobile SEO Is Now an Oxymoron

Since Google introduced its “Mobile-First Search Index,” the writing has been on the wall. Having a mobile-friendly website was no longer an option or convenience. Mobile-optimized websites were quickly becoming the first thing anyone – including search engines – was likely to see when checking out your brand.

With the recent announcement that Google would be dropping all desktop-only websites from its primary index starting in March 2021, the final nail is being pounded into the coffin. To be included on search results from the biggest search engine in the world, your website must be compatible with all the current mobile-friendly standards. 

With all this in mind, the age of considering separate SEO tactics and strategies for mobile users is long gone. There is just “SEO” and you must plan for mobile users if you want to have a chance of succeeding. 


We are all hoping that 2021 is a little less chaotic and a bit smoother than the past year has been. Still, even if we have the most tranquil year in history, there are bound to be a number of surprising new twists and factors in how Google ranks websites and content for users. If you want to remain competitive in an increasingly digital world, it is important that you stay up to date with all the latest from Google and be prepared to respond. 

Google confirmed this week that its most recent broad core update, which began rolling out on December 3, 2020, is now completely rolled out to all search users.

Google’s SearchLiaison account announced yesterday that “the December 2020 Core Update rollout is complete,” following almost two weeks of anxious waiting from webmasters and SEOs.

What We Know

Google is notoriously tight-lipped about its “secret recipe” used to rank websites around the world. Still, this update was big enough that the search engine felt it necessary to alert the public when the December core update started rolling out. 

This may simply be because the update rollout is global, affecting all users in all countries, across all languages, and across all website categories. 

However, early signs suggest the algorithm update was uncommonly big, with many reporting huge gains or losses in organic traffic from search engines. 

What Is a Broad Core Update?

Google’s “broad core updates” are essentially a tuneup of the search engine’s systems. Rather than adding a specific feature, targeting a singular widespread issue like linkspam, or reprioritizing a ranking signal, a core update more subtly tweaks Google’s existing systems. This can mean rebalancing the impact of some search signals, refining Google’s indexing tools, or any other combination of changes.

What To Do If You Are Affected

The first thing any webmaster should do is thoroughly check their analytics to see whether they have experienced a significant change in search traffic.

If you have, you will be disappointed to hear that Google has not provided any specific guidance for how to recover from this update. In fact, the company suggests a negative impact from a core update may not even reflect any actual problems with your website.

What the search engine does offer is a series of questions to consider if you have been affected by a recent core update. Though not as useful as actual suggestions for fixing lost rankings, these questions can help you assess your site and identify areas for improvement before the next broad core update.