
For many small-to-medium businesses, appearing in search results around their local area is significantly more important than popping up in the results for someone halfway across the country. 

This raises a question, though: how many of the countless searches made every day are actually local?

We now have the answer to that question thanks to a new tool released by LocalSEOGuide.com and Traject Data.

What Percent Of Searches Are Local?

Working together, the companies analyzed over 60 million U.S. search queries and found that over a third (approx. 36%) of all queries returned Google’s local pack – indicating the search was location-based. 

Perhaps the biggest surprise in the data is that locally based searches remained largely consistent throughout the year. Following an uptick in early 2020 (likely driven by the coronavirus pandemic), the rate hovered around 36% for the rest of the year. The only notable exception came in September, when the data shows a sharp drop in locally driven searches. 

This data shows just how important it is for even strictly local brands to establish themselves online and optimize for search engines. Otherwise, you could be missing out on a major source of potential business.

Other Features In The Local Pack-O-Meter

Along with data on the appearance of local packs in Google search results, the Local Pack-O-Meter includes information on several other search features. These include:

  • Knowledge Graphs
  • “People Also Ask” Panels
  • Image Boxes
  • Shopping Boxes
  • Ads
  • Related Searches
  • And more

Though the current version of the tool doesn't offer much in the way of filtering, there is still plenty here to help you decide which search features to prioritize and which to put on the back burner. 

To explore the Local Pack-O-Meter for yourself, click here.

Throughout 2020, approximately 65% of searches made on Google were “zero-click searches”, meaning that the search never resulted in an actual website visit.

Zero-click searches have been steadily on the rise, reaching 50% in June 2019 according to a study published by online marketing expert Rand Fishkin and SimilarWeb.

The steep rise in these types of searches between January and December 2020 is particularly surprising because it was widely believed zero-click searches were largely driven by mobile users looking for quick answers. Throughout 2020, however, most of us were less mobile than ever due to Covid restrictions, social distancing, and quarantines.

The findings of this latest report don’t entirely disprove this theory, though. Mobile devices still saw the majority of zero-click Google searches. On desktop, less than half (46.5%) were zero-click searches, while more than three-fourths (77.2%) of searches from mobile devices did not result in a website visit.

Study Limitations

Fishkin acknowledges that his reports do come with a small caveat. Each analysis used different data sources and included different searching methods, which may explain some of the variance. Additionally, the newer study – which included data from over 5.1 trillion Google searches – had access to a significantly larger data pool compared to the approximately one billion searches used in the 2019 study.

“Nonetheless, it seems probable that if the previous panel were still available, it would show a similar trend of increasing click cannibalization by Google,” Fishkin said in his analysis.

What This Means For Businesses

The most obvious takeaway from these findings is that people are increasingly finding the information they are looking for directly on the search results page, rather than needing to visit a web page for more in-depth information.

It also means that attempts to regulate Google are largely failing.

Many have criticized the search engine, and some have even pursued legal action (with varying levels of success), for abusing its access to information on websites by showing that information in "knowledge panels" within search results.

The argument is that Google is stealing copyrighted information and republishing it on its own site. Additionally, the practice gives searchers less reason to click through at all, meaning Google is contributing to falling click-through rates while making more money in the process.

Ultimately, Google is showing no signs of slowing down on its use of knowledge panels and direct answers within search results. To adjust to the rise of zero-click searches, brands should put more energy into optimizing their content to appear in knowledge panels (increasing brand awareness) and diversify their web presence with social media activity that reaches customers directly.
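The report doesn't prescribe a specific tactic for this, but one common starting point for knowledge panel eligibility is Organization structured data (schema.org markup) on your site. Below is a minimal, hedged sketch in Python that simply assembles the JSON-LD snippet; the brand details are hypothetical placeholders.

```python
import json

# Hypothetical brand details -- replace with your own. Organization markup
# (schema.org) is one common input Google uses when building brand knowledge
# panels; it is not a guarantee of getting one.
organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Plumbing Co.",
    "url": "https://www.example.com",
    "logo": "https://www.example.com/logo.png",
    "sameAs": [  # social profiles help Google connect your brand's presence
        "https://www.facebook.com/exampleplumbing",
        "https://www.instagram.com/exampleplumbing",
    ],
}

# Paste the printed <script> block into the <head> of your homepage.
print('<script type="application/ld+json">')
print(json.dumps(organization, indent=2))
print("</script>")
```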

In a Google Search Central SEO session recently, Google’s John Mueller shed light on a way the search engine’s systems can go astray – keeping pages on your site from being indexed and appearing in search. 

Essentially, the issue stems from Google's predictive approach to identifying duplicate content based on URL patterns, which can incorrectly flag unique pages based on their URLs alone. 

Google uses this predictive system to make its crawling and indexing more efficient by skipping over content that is just a copy of another page. By leaving these pages out of the index, Google is less likely to show repetitious content in its search results and can reach other, more unique content more quickly. 

The problem, of course, is that content creators can unintentionally trigger these predictive systems when publishing unique content on similar topics, leaving quality pages out of the index. 

John Mueller Explains How Google Could Misidentify Duplicate Content

In a response to a question from a user whose pages were not being indexed correctly, Mueller explained that Google uses multiple layers of filters to weed out duplicate content:

“What tends to happen on our side is we have multiple levels of trying to understand when there is duplicate content on a site. And one is when we look at the page’s content directly and we kind of see, well, this page has this content, this page has different content, we should treat them as separate pages.

The other thing is kind of a broader predictive approach that we have where we look at the URL structure of a website where we see, well, in the past, when we’ve looked at URLs that look like this, we’ve seen they have the same content as URLs like this. And then we’ll essentially learn that pattern and say, URLs that look like this are the same as URLs that look like this.”

He also explained how these systems can sometimes go too far and Google could incorrectly filter out unique content based on URL patterns on a site:

“Even without looking at the individual URLs we can sometimes say, well, we’ll save ourselves some crawling and indexing and just focus on these assumed or very likely duplication cases. And I have seen that happen with things like cities.

I have seen that happen with things like, I don’t know, automobiles is another one where we saw that happen, where essentially our systems recognize that what you specify as a city name is something that is not so relevant for the actual URLs. And usually we learn that kind of pattern when a site provides a lot of the same content with alternate names.”
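To make the idea concrete, here is a toy sketch of the pattern-learning Mueller describes. It is purely illustrative (not Google's actual system, and the URLs are hypothetical): if pages that differ only by a city slug keep serving identical content, the city segment gets treated as irrelevant and URLs matching that pattern may be assumed to be duplicates.

```python
import hashlib
from collections import defaultdict
from urllib.parse import urlparse

# Toy illustration only -- NOT Google's actual system. It mimics the idea
# Mueller describes: if URLs that differ only in one path segment (here, a
# city name) keep returning identical content, that segment can be learned
# as irrelevant and similar URLs may be assumed to be duplicates.
pages = {
    "https://example.com/locations/springfield/plumbing": "Our plumbing services ...",
    "https://example.com/locations/shelbyville/plumbing": "Our plumbing services ...",
    "https://example.com/locations/ogdenville/plumbing": "Our plumbing services ...",
}

def url_pattern(url: str) -> str:
    """Collapse the city segment into a wildcard, e.g. /locations/*/plumbing."""
    parts = urlparse(url).path.strip("/").split("/")
    parts[1] = "*"  # assumes a /locations/<city>/<service> structure
    return "/" + "/".join(parts)

content_hashes = defaultdict(set)
for url, html in pages.items():
    content_hashes[url_pattern(url)].add(hashlib.sha256(html.encode()).hexdigest())

for pattern, hashes in content_hashes.items():
    if len(hashes) == 1:
        print(f"{pattern}: identical content across cities -> likely collapsed as duplicates")
    else:
        print(f"{pattern}: content differs -> pages treated individually")
```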

How Can You Protect Your Site From This?

While Google's John Mueller wasn't able to provide a foolproof solution or prevention for this issue, he did offer some advice for sites that have been affected:

“So what I would try to do in a case like this is to see if you have this kind of situations where you have strong overlaps of content and to try to find ways to limit that as much as possible.

And that could be by using something like a rel canonical on the page and saying, well, this small city that is right outside the big city, I’ll set the canonical to the big city because it shows exactly the same content.

So that really every URL that we crawl on your website and index, we can see, well, this URL and its content are unique and it’s important for us to keep all of these URLs indexed.

Or we see clear information that this URL you know is supposed to be the same as this other one, you have maybe set up a redirect or you have a rel canonical set up there, and we can just focus on those main URLs and still understand that the city aspect there is critical for your individual pages.”
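As a minimal illustration of the consolidation Mueller suggests (hypothetical URLs again, and only appropriate when the smaller-city pages really do show the same content), each near-duplicate page gets a rel=canonical tag in its <head> pointing at the page you want indexed:

```python
# Hypothetical example of the consolidation Mueller describes: point
# near-duplicate "small city" pages at the "big city" page via rel=canonical.
canonical_map = {
    "https://example.com/locations/shelbyville/plumbing": "https://example.com/locations/springfield/plumbing",
    "https://example.com/locations/ogdenville/plumbing": "https://example.com/locations/springfield/plumbing",
}

for page, canonical in canonical_map.items():
    # The printed <link> element belongs in the <head> of each duplicate page.
    print(f'{page}\n  <link rel="canonical" href="{canonical}">\n')
```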

It should be clarified that duplicate content or pages impacted by this problem will not hurt the overall SEO of your site. So, for example, having several pages tagged as being duplicate content won’t prevent your home page from appearing for relevant searches. 

Still, the issue has the potential to gradually decrease the efficiency of your SEO efforts, not to mention making it harder for people to find the valuable information you are providing. 

To see Mueller’s full explanation, watch the video below:

Blog comments are a tricky issue for many business websites. 

On one hand, everyone dreams of building a community of loyal customers who follow every post and regularly hold a healthy discussion in the comments. Not only can those discussions be helpful for other potential customers, but comments tend to help Google rankings and can inspire future content for your site. 

On the other hand, most business websites receive significantly more spam than genuine comments. Even the best anti-spam measures can't prevent every sketchy link or comment on every post. For the most part, these are more annoying than an actual problem. If left completely unmonitored, however, spam can build up and potentially hurt your rankings.

This can make it tempting to just remove comments from your blog entirely. If you do, you don’t have to worry about monitoring comments, responding to trolls, or weeding out spam. After all, your most loyal fans can still talk about your posts on your Facebook page, right?

Unfortunately, as Google’s John Mueller recently explained, removing comments from your blog is likely to hurt more than it helps. 

John Mueller Addresses Removing Blog Comments

In a Google Search Central SEO hangout on February 5, Google’s John Mueller explored a question from a site owner about how Google factors blog comments into search rankings. Specifically, they wanted to remove comments from their site but worried about potentially dropping in the search results if they did. 

While the answer was significantly more complicated, the short version is this:

Google does factor blog comments into how it decides to rank web pages. Because of this, it is unlikely that you could remove comments entirely without affecting your rankings. 

How Blog Comments Impact Search Rankings

Google sees comments as a separate but significant part of your content. So, while it recognizes that comments may not directly reflect your own content, they do signal engagement and occasionally provide helpful extra information. 

This also means that removing blog comments is essentially removing a chunk of information, keywords, and context from every blog post on your site in the search engine’s eyes. 

However, John Mueller didn't go so far as to recommend keeping blog comments rather than removing them. That decision depends on several factors, including how many comments you've received, what type of comments they are, and how much they have added to your SEO.

As Mueller answered:

“I think it’s ultimately up to you. From our point of view we do see comments as a part of the content. We do also, in many cases, recognize that this is actually the comment section so we need to treat it slightly differently. But ultimately if people are finding your pages based on the comments there then, if you delete those comments, then obviously we wouldn’t be able to find your pages based on that.

So, that’s something where, depending on the type of comments that you have there, the amount of comments that you have, it can be the case that they provide significant value to your pages, and they can be a source of additional information about your pages, but it’s not always the case.

So, that’s something where I think you need to look at the contents of your pages overall, the queries that are leading to your pages, and think about which of these queries might go away if comments were not on those pages anymore. And based on that you can try to figure out what to do there.

It’s certainly not the case that we completely ignore all of the comments on a site. So just blindly going off and deleting all of your comments in the hope that nothing will change – I don’t think that will happen.”

It is clear that removing blog comments entirely from your site is all but certain to affect your search rankings on some level. Whether this means a huge drop in rankings or potentially a small gain, though, depends entirely on what type of comments your site is actually losing. 

To watch Mueller’s full answer, check out the video below:

When Google releases a major algorithm update, it can take weeks or months to fully understand the effect. Google itself tends to be tight-lipped about the updates, preferring to point website owners and businesses to its general webmaster guidelines for advice on an update. 

Because of all this, we are just starting to grasp what Google's recent algorithm updates did to search results. One thing that has quickly become apparent, though, is that one of the biggest losers of Google's 2020 algorithm updates has consistently been online piracy. 

This is most clear in a new end-of-year report from TorrentFreak and piracy tracking company MUSO.

How Google’s Algorithm Updates Affected Digital Piracy

Overall, the analysis shows that search engine traffic to piracy sites fell by nearly a third from December 2019 to November 2020. Notably, the two big drops behind this loss of traffic line up with Google's algorithm updates earlier in the year. 

In January 2020, piracy traffic began dwindling shortly after the January 13th core update. 

After experiencing a short uptick at the start of the COVID pandemic in March, the May 4th core update then hit online pirates even harder, sending piracy traffic plummeting. 

Early indications from the public and some analysts suggest the December 2020 core update continued this trend, though it is too early to know for sure. 

Interestingly, TorrentFreak and MUSO say they corroborated the findings of their report with the operator of one of the largest torrent sites online:

“To confirm our findings we spoke to the operator of one of the largest torrent sites, who prefers to remain anonymous. Without sharing our findings, he reported a 35% decline in Google traffic over the past year, which is in line with MUSO’s data.”

Is Google Completely Responsible?

It should be noted that while Google’s algorithm updates likely played a large role in the decline of search traffic to piracy sites, other factors almost certainly contributed as well. 

TorrentFreak's report shows that direct traffic to piracy-related sites experienced a gradual 10% decline over the course of the year, suggesting overall interest in pirating content may have fallen somewhat on its own. 

Additionally, 2020 was a unique year with less content coming out than usual. The COVID pandemic disrupted pretty much every industry, including creative industries. Music releases were pushed back or cancelled as it became difficult to safely record in studios. The closing of theaters led to the delay of many major movies, and TV creators had to completely rework how they wrote and filmed their shows. 

With less content from major studios and artists, it is highly likely users just had less available content that they were interested in pirating. 

Why This Matters

The good news is that the vast majority of business-related websites have absolutely nothing to do with online piracy and therefore should be safe from these effects of Google’s most recent algorithm updates. 

The less good news is that Google's core algorithm updates are designed to affect a huge portion of websites around the globe, and they certainly had effects outside the realm of digital piracy. 

Still, we felt it important to highlight a real-world example of how a major Google algorithm update can impact an entire industry's presence in search results. 

Ultimately, the takeaway for most website owners is that keeping an eye on your analytics is essential.

If you are watching, you can respond to major shifts like this with new strategies and optimizations, or even ask Google to recrawl your site. If you aren't monitoring your analytics, however, you could lose a huge chunk of traffic from potential customers with no idea why.

I don’t think it is an overstatement to say that 2020 changed everything for businesses around the world – no matter what industry you are in. The spread of COVID-19 accelerated the migration of small and local businesses to the internet, making having an online presence no longer an option but a necessity. 

In turn, these changes have had a massive impact on digital marketing, driving a wave of new competition and seismic shifts in how we connect with customers every day. 

For better or worse, many of these changes are bound to stick around well into 2021, influencing the ways we shop, advertise, and connect with customers for the foreseeable future. 

With this in mind, predicting next year’s search trends is a little easier than it has been in the past, with some clear indicators of what businesses need to do to stay relevant and efficient in a post-COVID world. 

The 5 Online Marketing Trends You Need To Know In 2021

The Effects of COVID Will Linger

The most obvious trend brands will need to be prepared for in 2021 will continue to be the ongoing COVID-19 pandemic. While vaccinations are finally rolling out and we can be optimistic about a relatively soon return to something resembling normality, it is also clear that many shopping habits and consumer behaviors have permanently changed. 

For example, virtual events and trade shows are all but guaranteed to stick around. Not only do they provide an easier and more affordable way to bring together top members of your industry from around the country, they do so without massively interrupting your day-to-day operations. 

Likewise, many customers will continue to prefer using online ordering and curbside pickup from local businesses out of convenience well after social distancing is a thing of the past. 

Social Media Purchasing Goes Mainstream

For years, social media has been a major tool for consumers to find and learn about new products they otherwise would never have known about. Recently, though, platforms have been expanding to let shoppers not just find products, but buy them right then and there. 

The ease of going from discovering something cool to making a purchase without ever leaving your current app is fueling a rush to provide the best social shopping experience, and this trend is only going to get bigger in 2021. 

We Are Past Peak Facebook

Facebook has been the undeniable king of social media for more than a decade now, but the platform has been facing increasing challenges that are getting hard to deny. 

In sheer numbers, the social network still far outranks any other platform out there, but its user base is aging, with younger demographics turning to hipper alternatives like Instagram, Snapchat, and TikTok. 

Add in the continuous spread of fake news, concerns about echo chambers, a relatively recent data breach scandal, and calls for the breakup of Facebook's extended network of services (including Instagram and WhatsApp), and it quickly becomes clear Facebook is past its prime and is no longer the single platform you should be focusing on. 

Video Content Is The Standard

For the past few years, my year-end lists have consistently included one thing – video content has been increasingly important for brands looking to maintain effective marketing and outreach. 

Well, call 2020 the tipping point, because video content is no longer “on the rise”. It is the standard and it is here to stay. 

While blog content remains important for technical SEO and connecting audiences with some specific types of information, the data makes it very clear that consumers prefer the quick, digestible, and entertaining nature of videos over long, often repetitive blog posts. 

At this point, rather than clicking through to your blog, shoppers are more likely to check out your YouTube or Instagram page when trying to find out what you offer and why they should choose you over the competition. 

Mobile SEO Is Now Redundant

Since Google introduced its "Mobile-First Search Index," the writing has been on the wall. Having a mobile-friendly website was no longer optional. Mobile-optimized websites were quickly becoming the first thing anyone – including search engines – was likely to see when checking out your brand. 

With the recent announcement that Google will drop all desktop-only websites from its primary index starting in March 2021, the final nail is being pounded into the coffin. To be included in search results from the biggest search engine in the world, your website must meet current mobile-friendly standards. 

With all this in mind, the age of considering separate SEO tactics and strategies for mobile users is long gone. There is just “SEO” and you must plan for mobile users if you want to have a chance of succeeding. 


We are all hoping that 2021 is a little less chaotic and a bit smoother than the past year has been. Still, even if we have the most tranquil year in history, there are bound to be a number of surprising new twists and factors in how Google ranks websites and content for users. If you want to remain competitive in an increasingly digital world, it is important that you stay up to date with all the latest from Google and be prepared to respond. 

Google confirmed this week that its most recent broad core update, which began rolling out on December 3, 2020, is now completely rolled out to all search users.

Google's SearchLiaison account announced that "the December 2020 Core Update rollout is complete" yesterday, following almost two weeks of anxious waiting from webmasters and SEOs.

What We Know

Google is notoriously tight-lipped about its “secret recipe” used to rank websites around the world. Still, this update was big enough that the search engine felt it necessary to alert the public when the December core update started rolling out. 

This may simply be because the update rollout is global, affecting all users in all countries, across all languages, and across all website categories. 

However, early signs suggest the algorithm update was uncommonly big, with many reporting huge gains or losses in organic traffic from search engines. 

What Is a Broad Core Update?

Google's "broad core updates" are essentially a tune-up of the search engine's systems. Rather than adding a specific feature, targeting a singular widespread issue like link spam, or prioritizing a particular ranking signal, a core update more subtly tweaks Google's existing systems. This can mean rebalancing the impact of some search signals, refining Google's indexing tools, or any combination of other changes. 

What To Do If You Are Affected

The first thing any webmaster should do is thoroughly check their analytics to see whether they have experienced a significant change in search traffic. 

If you have, you will be disappointed to hear that Google has not provided any specific guidance for how to recover from this update. In fact, the company suggests a negative impact from a core update may not even reflect any actual problems with your website.

What the search engine does offer is a series of questions to consider if you have been affected by a recent core update. Though not as useful as actual suggestions for fixing lost rankings, these questions can help you assess your site and identify areas for improvement before the next broad core update.

With the announcement that Google will begin including the "Core Web Vitals" (CWV) metrics in its search algorithm starting next year, many are scrambling to make sense of what exactly these metrics measure and how they work.

Unlike metrics such as "loading speed" or "dwell time," which are direct and simple to understand, Core Web Vitals combine a number of factors that can get very technical.

To help you prepare for the introduction of Core Web Vitals as a ranking signal next year, Google is sharing a comprehensive guide to what CWV measure and how they can affect your website. 

What Are Core Web Vitals?

The first thing to understand is what exactly Core Web Vitals are. Simply put, CWV are a combination of three specific metrics assessing your page's loading speed, responsiveness, and visual stability. These metrics look very technical at first, but the gist is that your site needs to load quickly and provide a smooth, stable, easy-to-use experience. As for the specifics, Core Web Vitals include the following three metrics (a quick threshold check is sketched after the list):

  • Largest Contentful Paint (LCP): Measures loading performance. To provide a good user experience, sites should strive to have LCP occur within the first 2.5 seconds of the page starting to load.
  • First Input Delay (FID): Measures interactivity. To provide a good user experience, sites should strive to have an FID of less than 100 milliseconds.
  • Cumulative Layout Shift (CLS): Measures visual stability. To provide a good user experience, sites should strive to have a CLS score of less than 0.1.
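For a quick sanity check, the "good" thresholds above are easy to encode. This is just a rough sketch; the class and field names are made up for illustration, and real measurements would come from tools like PageSpeed Insights or Chrome's field data.

```python
from dataclasses import dataclass

# "Good" thresholds from the list above (LCP <= 2.5 s, FID <= 100 ms, CLS <= 0.1).
# The class and field names here are illustrative, not an official API.
@dataclass
class WebVitals:
    lcp_seconds: float       # Largest Contentful Paint
    fid_milliseconds: float  # First Input Delay
    cls_score: float         # Cumulative Layout Shift

def passes_core_web_vitals(v: WebVitals) -> dict:
    return {
        "LCP": v.lcp_seconds <= 2.5,
        "FID": v.fid_milliseconds <= 100,
        "CLS": v.cls_score <= 0.1,
    }

print(passes_core_web_vitals(WebVitals(lcp_seconds=3.1, fid_milliseconds=80, cls_score=0.05)))
# -> {'LCP': False, 'FID': True, 'CLS': True}
```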

Importantly, in the new guide, Google reaffirmed its intention to start using Core Web Vitals as a ranking signal in 2021. 

“Starting May 2021, Core Web vitals will be included in page experience signals together with existing search signals including mobile-friendliness, safe-browsing, HTTPS-security, and intrusive interstitial guidelines.”

Does Every Page Need To Meet CWV Standards?

In the help document, Google explains that the Core Web Vitals standards it set out should be seen as a mark to aim for, but not necessarily a requirement for good ranking. 

Q: Is Google recommending that all my pages hit these thresholds? What’s the benefit?

A: We recommend that websites use these three thresholds as a guidepost for optimal user experience across all pages. Core Web Vitals thresholds are assessed at the per-page level, and you might find that some pages are above and others below these thresholds. The immediate benefit will be a better experience for users that visit your site, but in the long-term we believe that working towards a shared set of user experience metrics and thresholds across all websites, will be critical in order to sustain a healthy web ecosystem.

Will Core Web Vitals Make or Break Your Site?

It is unclear exactly how strongly Core Web Vitals metrics will be able to affect your site when they are implemented, but Google’s current stance suggests they will be a significant part of your ranking.

Q: How does Google determine which pages are affected by the assessment of Page Experience and usage as a ranking signal?

A: Page experience is just one of many signals that are used to rank pages. Keep in mind that intent of the search query is still a very strong signal, so a page with a subpar page experience may still rank highly if it has great, relevant content.

Other Details

Among the Q&A, Google also gives a few important details on the scope and impact of Core Web Vitals.

Q: Is there a difference between desktop and mobile ranking? 

A: At this time, using page experience as a signal for ranking will apply only to mobile Search.

Q: What can site owners expect to happen to their traffic if they don’t hit Core Web Vitals performance metrics?

A: It’s difficult to make any kind of general prediction. We may have more to share in the future when we formally announce the changes are coming into effect. Keep in mind that the content itself and its match to the kind of information a user is seeking remains a very strong signal as well.

The full document covers a wide range of technical issues which will be relevant for any web designer or site manager, but the big picture remains the same. Google has been prioritizing sites with the best user experience for years, and the introduction of Core Web Vitals only advances that effort. 

Find out more about Core Web Vitals here.

Google is adding a new set of ranking signals to its search engine algorithm in the coming year, according to an announcement this week. 

The search engine says it will begin factoring “Core Web Vitals” as a ranking signal starting in May 2021, combining with already existing user experience-related ranking signals. 

Google has been measuring Core Web Vitals since earlier this year, assessing the speed, responsiveness, and stability of web pages. 

These factors are what Google calls the Core Web Vitals:

  • Largest Contentful Paint (LCP): Measures loading performance. To provide a good user experience, sites should strive to have LCP occur within the first 2.5 seconds of the page starting to load.
  • First Input Delay (FID): Measures interactivity. To provide a good user experience, sites should strive to have an FID of less than 100 milliseconds.
  • Cumulative Layout Shift (CLS): Measures visual stability. To provide a good user experience, sites should strive to have a CLS score of less than 0.1.

These signals will be joining the already announced page experience signals:

  • Mobile-friendliness
  • Safe-browsing
  • HTTPS-security
  • Intrusive interstitial guidelines

As Google explains in its announcement:

"These signals measure how users perceive the experience of interacting with a web page and contribute to our ongoing work to ensure people get the most helpful and enjoyable experiences from the web."

Based on recent data assessments, this should concern the majority of websites out there. A study published in August suggests less than 15% of all websites would pass a Core Web Vitals assessment if the signals were implemented today. 
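If you want to see where your own pages stand before the rollout, Google's PageSpeed Insights API (v5) returns real-user field data alongside lab results. The sketch below is an assumption-laden starting point: the endpoint is the public v5 URL, but the exact response keys are from memory, so verify them against the API documentation (and add an API key for anything beyond light use).

```python
import json
import urllib.parse
import urllib.request

# Query the PageSpeed Insights v5 API for a URL's field data (CrUX).
# NOTE: the response key names below are a best recollection of the v5
# schema -- double-check them against Google's API docs before relying on this.
def fetch_field_vitals(page_url: str) -> dict:
    endpoint = (
        "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?"
        + urllib.parse.urlencode({"url": page_url, "strategy": "mobile"})
    )
    with urllib.request.urlopen(endpoint) as resp:
        data = json.load(resp)
    metrics = data.get("loadingExperience", {}).get("metrics", {})
    return {
        name: m.get("category")  # e.g. "FAST", "AVERAGE", "SLOW"
        for name, m in metrics.items()
        if name in (
            "LARGEST_CONTENTFUL_PAINT_MS",
            "FIRST_INPUT_DELAY_MS",
            "CUMULATIVE_LAYOUT_SHIFT_SCORE",
        )
    }

if __name__ == "__main__":
    print(fetch_field_vitals("https://example.com/"))
```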

The search engine has also hinted at the potential to introduce new labels in search results, highlighting pages with the best user experience. Though nothing is set in stone, this would provide even more motivation for pages trying to maintain the best place in search results. 

For more information about updating your site for Core Web Vitals, you can explore Google's resources and tools here.

A lot has changed at Google over the past few years, but one thing remains the same – the majority of people will click the top link on any search result page. 

A new study of over 80 million keywords and billions of search results found that an average of 28.5% of users will click the top organic result for a given search. 

From there, the average CTR declines sharply. Listings in second place receive an average of 15% of clicks, while third place falls to 11%. 

By the time you get to the last listing of a results page, links receive only a 2.5% click-through rate. 

You can imagine what the CTRs for anything after the first page would be like. 
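Those averages are easy to turn into back-of-the-envelope traffic estimates. The sketch below uses only the positions the study reports (1, 2, 3, and the last listing on page one) and a made-up monthly search volume; treat the output as rough planning numbers, not guarantees.

```python
# Average organic CTRs cited in the study: position 1 = 28.5%, 2 = 15%,
# 3 = 11%, and the last listing on page one = 2.5%. The monthly search
# volume below is a made-up example figure.
avg_ctr = {1: 0.285, 2: 0.15, 3: 0.11, 10: 0.025}

monthly_searches = 5_000  # hypothetical search volume for one keyword

for position, ctr in avg_ctr.items():
    print(f"Position {position:>2}: ~{int(monthly_searches * ctr):,} expected clicks/month")

# Moving from position 3 to position 1 on this keyword:
gain = monthly_searches * (avg_ctr[1] - avg_ctr[3])
print(f"Estimated extra clicks from ranking #1 instead of #3: ~{int(gain):,}/month")
```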

Other Factors Influencing Search CTRs

Unsurprisingly, there is quite a bit of variance in the actual click-through rates for some results pages. In the study, Sistrix found click-through rates for listings in the first position swung from 13.7% to almost 50%. 

While the relevance of the top listing has some effect on its CTR, the study suggests another major factor is the SERP layout. 

For example, search results including sitelinks extensions significantly outperformed those without. 

On the other hand, the study found that search results including featured snippets had a significant negative impact, dropping click-through rates by at least 5% on average. 

Similarly, knowledge panels reduced the average CTR from 28% to 16%.

In these situations, the researchers believe users don’t feel the need to investigate further when provided with quick answers directly within the search results pages:

“The CTR in the first two organic positions drops significantly compared to the average. Many users appear to find the information they are looking for in the Knowledge Panel – especially on their smartphones, where each time a page is loaded it takes a lot of time.”

For more information, you can explore the full study report here.