Google My Business is an essential tool for any local business trying to spread its name online. It is also deceptively complicated.

At first glance, GMB seems very simple and easy to set up. You just fill out a few forms, answer a few questions, upload a couple of pictures, and presto! You’ve got a GMB listing.

Actually optimizing that listing to ensure it appears in nearby customers’ searches, however, is where things get complicated. 

As usual, Google is remarkably non-transparent about how it ranks local searches. Still, a few things have become apparent over the years. It is pretty much undeniable that having a lot of 5-star reviews will help you rank better. On the other hand, there is reason to believe some sections have absolutely no impact on your local rankings. To get to the truth of how the algorithm works, we have to look at data from tests.

Recently, MozCon speaker Joy Hawkins shared some findings she and her team have made from their own tests and data about which GMB sections help you rank better.

Which Google My Business Sections Affect Rankings

1) Business Name

Sometimes the simplest things can become unbelievably complicated. You almost certainly chose your business name well before making a listing, and you can’t exactly change it now. 

Unfortunately, this puts some businesses at a disadvantage while others get a natural step up. 

According to Hawkins, businesses with a keyword in their name get a boost in local rankings. There is one thing you can do, though.

As she explains:

“The real action item would be to kind of look to see if your competitors are taking advantage of this by adding descriptive words into their business name and then submitting corrections to Google for it, because it is against the guidelines.”

2) Categories

This is another section that seems like it should be very simple. You can check up to 10 boxes that match your business, including everything from Aboriginal Art Gallery to Zoo. Where this becomes tricky is ensuring the categories you choose remain the most accurate for your business.

Hawkins’ team found that Google updates its list of categories between two and ten times each month on average. In some cases, it adds new categories that may be a more specific match for your business. Other times, it may entirely remove categories it feels are irrelevant or unnecessary.

Either way, it is up to you to keep your business categorized properly to protect your ranking.

3) Website

The vast majority of businesses use their homepage as the website link on every listing, including Google My Business. It makes sense, and it works perfectly fine.

What Hawkins found, though, is that some businesses actually benefit from linking to a more specific page of their site. For example, a business with multiple locations can link each listing to its specific location page, so searchers land exactly on the store you are directing them to.

In this section, there is no agreed-upon best practice. Instead, Hawkins says to test several pages over time to ensure you are maximizing your exposure.

4) Reviews

I mentioned it above, but it bears repeating: the number of positive reviews absolutely affects your ranking in local search results.

There is a small catch, however. According to what Hawkins’ team has seen, increasing the number of reviews on your listing may have diminishing returns.

“So for example, if you’re a business and you go from having no reviews to, let’s say, 20 or 30 reviews, you might start to see your business rank further away from your office, which is great. But if you go from, let’s say, 30 to 70, you may not see the same lift. So that’s something to kind of keep in mind.”

Still, reviews have consistently been shown to be a major ranking factor AND they improve the click-through rate of listings. This is obviously an area you will want to invest some energy in.

If you want to learn a little more about how these sections impact your rankings or you want to see which fields have absolutely no effect, you can read Joy Hawkins’ original post here.

The past six months have seen upheavals in just about every area of life, from schooling, to work, to our daily shopping habits. Now, a report from BrightEdge suggests these shifts are going to continue at unprecedented levels through the holiday shopping season. 

The report, based on an evaluation of eCommerce clients across a wide range of industries, predicts a historic online holiday shopping season for a massive range of interests – emphasizing the need to start preparing now.

What Changed

It’s no secret that the COVID-19 pandemic has forced many to do more online shopping this year. In-store shortages, quarantines, and general concern with public gatherings have made online shopping a go-to choice for both essentials and luxuries.

Interestingly, the analysis suggests that though shoppers are doing significantly more online shopping compared to 2019, the revenue per order remains relatively stable. 

The report offers two theories for why this shift is occurring:

“Here are our hypotheses:

1. Shoppers shifted purchasing behaviors online during the first few weeks of the COVID-19 pandemic and are more keenly aware of their budgets – refraining from placing big-ticket purchases online, while stocking up on more essential goods or affordable luxuries.

2. Shoppers started buying more frequently online after COVID-19, supplanting offline purchases. This would include the buy-online, pickup-curbside behavior that may include cheaper items that were previously always purchased in-store.”

Black Friday Goes Virtual

Another major factor contributing to the predicted surge in online shopping during the 2020 holiday shopping season is the cancellation of many Black Friday events. 

Not only does this mean consumers will be forced to look elsewhere for big deals, it is expected that many retailers will push huge online sales for Black Friday to make up for lost revenue. Cyber Monday is also expected to gain an even higher profile this year.

What You Can Do

With all these factors in mind, BrightEdge has one recommendation for brands trying to regain their footing this holiday season – invest in search engine optimization. 

As the report says:

“An impressive 60% of consumers have been shopping online more often since COVID-19, and of that group, 73% plan to continue after the pandemic. What digital marketers and SEOs have long known is finally coming to fruition: online shopping is convenient and easy. Now the trick is to make SEO important within your organization.”

Other Takeaways

In the conclusion of the report, BrightEdge offered a few key insights into the current behavior of online shopping and what they expect to see in the future:

  • “The research suggests that shoppers browse more frequently, leading to more purchases and overall revenue, though these purchases are smaller in value.
    This could be because shoppers are becoming more aware of their budgets – refraining from placing big-ticket purchases online while stocking up on more essential goods or affordable luxuries.
  • As we enter Q4 and the holiday shopping season, search is helping reveal radical changes in real-time.
  • The traditional customer journey is being radically altered in many industry sectors of the economy. As a result, it has never been more important to truly understand how consumer behavior [is changing] and use this understanding to drive engaging experiences.”

To view the full report, check out BrightEdge’s complete holiday shopping guide here.

It’s a question we all have dealt with at least once or twice, and one that rarely has a satisfying answer: “Why did my Google rankings suddenly drop?”

Sometimes, a simple audit will reveal a technical hiccup or issue that is downgrading your rankings. Just as often, though, it appears everything is working as it should but you are suddenly further down the page or not even on the first page anymore. 

In this situation, Google’s John Mueller says there are four major reasons for sites to lose rankings. 

John Mueller Explains Why Sites Lose Rankings

In a recent Google Webmaster Central chat, Mueller was asked why a publisher who had ranked well for “seven or eight years” had suddenly lost rankings for three different sites. Notably, the person asking the question couldn’t find any signs of problems in their inbound or outbound links, and all the sites used the same keywords (they sell similar products by different brands). 

Of course, Mueller couldn’t get too specific with his answer because he didn’t have actual data or analytics on the sites. Still, he did his best to address four general reasons sites may suddenly rank worse.

1) Rankings Are Temporary

Once a site is ranking at the top for its ideal keywords, many site owners feel like they have accomplished their mission and will continue to rank there. Unfortunately, John Mueller says that rankings are malleable and change constantly.

Mueller explained:

“In general, just because the site was appearing well in search results for a number of years does not mean that it will continue to appear well in search results in the future.

These kinds of changes are essentially to be expected on the web, it’s a very common dynamic environment”

2) The Internet Is Always Changing

The reason why rankings are so prone to fluctuations is that the internet itself is always changing. New sites are being created every day, links might die, competitors might improve their own SEO, and people’s interests change.

Each and every one of these can have a big impact on the search results people see at any given time. 

As Mueller put it:

“On the one hand, things on the web change with your competitors, with other sites…”

3) Google Changes Its Algorithms

To keep up with the constantly changing internet, Google itself has to regularly overhaul how its search engine interprets and ranks websites. 

To give you one idea how this plays out, a few years ago search results were absolutely dominated by “listicles” (short top 5 or top 10 lists). Over time, people got tired of the shallow information these types of lists provided and how easily they could be abused as clickbait. Google recognized this and tweaked its algorithm to better prioritize in-depth information hyper-focusing on a single topic or issue. Now, though a listicle can still rank on Google, it is considerably harder than it used to be.

As Mueller simply explained:

“On the other hand, things on our side change with our algorithms in search.”

4) People Change

This is one that has been touched upon throughout the list Mueller gave, but it really gets to the heart of what Google does. What people expect out of the internet is constantly changing, and it is Google’s job to keep up with these shifts. 

In some cases, this can mean that people outright change how they search. For example, simple keywords like “restaurants near me” or “fix Samsung TV” were the main tool people used to find information for years and years. As voice search has become widespread and people have gotten more accustomed to using search engines all the time, queries have expanded to frequently include full sentences or phrases like “What is the best Chinese restaurant in midtown?”

At the same time, what people expect out of the same queries is also shifting with technological innovation and content trends. 

Mueller describes the situation by saying:

“And finally on the user side as well, the expectations change over time. So, just because something performed well in the past doesn’t mean it will continue to perform well in search in the future.”

Always Be Monitoring and Improving

The big theme behind all of these reasons sites lose rankings is that they are standing still while the world moves past them. To maintain your high rankings, your site has to be constantly in motion – moving with the trends and providing the content users want and expect from sites at any given time. 

This is why successful sites are also constantly monitoring their analytics to identify upcoming shifts and respond to any drops in rankings as soon as they happen.

If you want to see the full response, watch the video below (it starts with Mueller’s response but you can choose to watch the entire Webmaster Central office-hours discussion if you wish).

Bing has officially completed the launch of its new Bing Webmaster Tools, which streamlines the old tool suite while offering a number of new features. 

According to the announcement, the process managed to condense the old version’s 47 unique links to just 17 different links without losing any of the functionality previously available. This was done by bundling redundant or related functions into more powerful tools.

At the same time, Bing announced it had introduced a new URL inspection tool, a Robots.txt testing feature, a site scan tool, and revamped webmaster guidelines. 

Choose Your Bing Webmaster Tools

For now, webmasters can choose to use the new or old version of Bing Webmaster Tools. The old suite is available at https://www.bing.com/webmaster/. The new version can be found at https://www.bing.com/webmasters/.

However, the old version won’t be sticking around for too long. The announcement says it will be disabled sometime next month. 

Enhanced Tools

While streamlining the platform, Bing expanded the functionality of several tools. These updates include:

  • Backlinks lists backlinks for any site, including similar websites.
  • Keyword Research lets you filter data by countries, languages, and devices. 
  • URL Submission is better streamlined for easier navigation. This includes simplifying URL submission via the Bing WordPress plugin for faster indexing. 
  • SEO Reports provides improved classification of errors or issues. 

New Tools

Along with the consolidated and enhanced tools from the old version of Bing Webmaster Tools, the company revealed several new tools. These include:

  • URL Inspection: A beta feature that allows Bing to inspect crawled versions of your site for potential indexing issues.
  • Site Scan: A site audit tool that crawls and checks your site for common SEO issues which may affect your search ranking. 
  • Robots.txt Tester: Check your robots.txt file using the same inspection tools Bing uses to verify your URLs.
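For those curious how a tool like the Robots.txt Tester works under the hood, Python’s standard library includes a comparable parser. The sketch below is only an approximation for illustration (the file contents and crawler name are made up), not Bing’s actual implementation:

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt that blocks one directory but allows everything else.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Allow: /
"""

# Parse the rules and check whether specific URLs may be crawled,
# approximating what a robots.txt tester does for each URL you enter.
parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

print(parser.can_fetch("bingbot", "https://example.com/index.html"))          # True
print(parser.can_fetch("bingbot", "https://example.com/private/report.html")) # False
```

Real crawlers differ slightly in how they resolve conflicting Allow/Disallow rules, which is exactly why a tester built into the search engine itself is valuable: it reflects that engine’s exact behavior.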

Google appears to be testing the idea of “upgrading” Google My Business profiles with a special “Google Guaranteed” badge for a $50 monthly fee.

Twitter user Tom Waddington shared a screenshot of a promotional page within the GMB dashboard offering the profile upgrade.

What Is Google Guaranteed?

The search engine has been playing with the “Google Guaranteed” badge since last year, though it has typically been used in Local Service Ads for home services businesses. 

To be eligible for the badge, businesses must meet a number of criteria including certification, licensing, and background checks. 

The idea appears to be to inspire more confidence in brands listed in Google’s local results by highlighting those who have been vetted. 

Why Would Anyone Pay For This?

On its face, the idea of paying $50 a month for what amounts to a stamp of approval sounds a little silly. However, the badge comes with some backing which may help customers feel more at ease.

Along with the Google Guarantee badge, businesses which pass the screening process are also backed with a customer satisfaction guarantee. If a customer finds your business through the search engine and is not satisfied with the results, Google will refund the amount paid up to $2,000.

Along with this aspect, there is always the issue of getting ahead of your competition. Any little advantage can be the key to standing apart from your competitors.

Just an “Experiment” … For Now

When asked about the program via email, a Google spokesperson told Search Engine Journal:

“We’re always testing new ways to improve our experience for our advertisers, merchants, and users. This experiment will show the Google Guaranteed badge on the business profile. We don’t have anything additional to announce right now.”

A lot has changed at Google over the past few years, but one thing remains the same – the majority of people will click the top link on any search result page. 

A new study of over 80 million keywords and billions of search results found that an average of 28.5% of users will click the top organic result for a given search. 

From there, the average CTR for results sharply declines. Listings in second position receive an average of 15% of clicks, while third place falls to 11%.

By the time you get to the last listing of a results page, links receive only a 2.5% click-through rate. 

You can imagine what the CTRs for anything after the first page would be like. 
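To make those averages concrete, a quick back-of-the-envelope calculation shows the click gap between positions (the 10,000-search volume is an arbitrary example; the percentages are the study’s averages):

```python
# Average CTR by organic position, per the figures cited above.
avg_ctr = {1: 0.285, 2: 0.15, 3: 0.11, 10: 0.025}

# Estimated monthly clicks for a keyword with 10,000 monthly searches.
searches = 10_000
clicks = {pos: round(searches * ctr) for pos, ctr in avg_ctr.items()}
print(clicks)  # {1: 2850, 2: 1500, 3: 1100, 10: 250}
```

In this example, moving from position 10 to position 1 is worth more than a tenfold increase in traffic, which is why even small ranking shifts can have an outsized business impact.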

Other Factors Influencing Search CTRs

Unsurprisingly, there is quite a bit of variance in the actual click-through rates for some results pages. In the study, Sistrix found click-through rates for listings in the first position swung from 13.7% to almost 50%. 

While the relevance of the top listing has some effect on its CTR, the study suggests another major factor is the SERP layout. 

For example, search results including sitelinks extensions significantly outperformed those without. 

On the other hand, the study found that search results including featured snippets had a significant negative impact, dropping click-through rates by at least 5% on average. 

Similarly, knowledge panels reduced the average CTR from 28% to 16%.

In these situations, the researchers believe users don’t feel the need to investigate further when provided with quick answers directly within the search results pages:

“The CTR in the first two organic positions drops significantly compared to the average. Many users appear to find the information they are looking for in the Knowledge Panel – especially on their smartphones, where each time a page is loaded it takes a lot of time.”

For more information, you can explore the full study report here.

Google is making a change to how some featured snippets function by taking users directly to the associated text when clicked.

Featured snippets are the highlighted search results that appear at the top of some results pages, showing a specifically relevant bit of text.

The company announced the update through its Google SearchLiaison Twitter account, which posted:

“As we have done with AMP pages since December 2018, clicking on a featured snippet now takes users to the exact text highlighted for HTML pages, when we can confidently determine where the text is.”

While it is a relatively small change, it makes featured snippets even more useful to searchers (and thus, more essential for businesses to put in place). 

Surprisingly, the company says there is no additional code or special markup needed to prepare your featured snippets for this change. 

Instead, the search engine is essentially using a trick that highlights specific text by tweaking the URL for each snippet.

As Roger Montti explained over at Search Engine Journal, this is a feature previously used for Accelerated Mobile Pages (AMP), which has the dual benefits of being easy to implement and trackable.
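The underlying mechanism appears to be the browsers’ Text Fragments feature: appending “#:~:text=” plus URL-encoded text to a link tells supporting browsers (such as Chrome) to scroll to and highlight that text. As a rough sketch (the helper function and URL are illustrative, not Google’s actual code):

```python
from urllib.parse import quote

def text_fragment_url(page_url: str, snippet: str) -> str:
    """Build a link that scrolls to and highlights `snippet` in browsers
    that support the Text Fragments (#:~:text=) syntax."""
    return f"{page_url}#:~:text={quote(snippet)}"

url = text_fragment_url("https://example.com/guide", "featured snippets")
print(url)  # https://example.com/guide#:~:text=featured%20snippets
```

Because the highlight lives entirely in the URL fragment, no changes to the target page are required, which matches Google’s statement that no special markup is needed.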

Google appears to be testing the idea of integrating its normal web search into search results on YouTube. 

Reddit users have been reporting seeing results and links to traditional web pages when doing searches on the video platform, as you can see in the screenshot below:

(Screenshot: a Google web result shown within YouTube search results)

In the screenshot, Google places a single web page result among the video results, with an option to click the link or jump to a search directly on Google.

The test appears to be limited, with many (including myself) being unable to replicate it. However, there are enough reports to conclude this is a legitimate test and not a glitch or hoax. 

So far, reaction to the move has been mixed. Many have decried the potential new feature as “annoying” and said it would “ruin the YouTube experience.”

However, there have also been those who see potential in the concept, saying it could make it easier to leap to Google when YouTube doesn’t provide the results someone is hoping for.

As one user described:

“Sometimes I’m looking for a tutorial but I want a video explaining it, and if it doesn’t exist now I have the option to do a quick Google search in the app.”

Personally, I see some utility in integrating a single, non-obtrusive link within video search results. Obviously, those searching on YouTube are primarily looking for video content, but there are certainly scenarios where users move back and forth between YouTube and Google. This would be a convenient option for those situations.

Google will soon be updating its search ranking algorithm with a new ranking signal. This new signal will combine a number of existing signals with a recently introduced set of metrics known as Core Web Vitals.

The search engine says the goal of the new update is to better rank pages based on the quality of users’ experiences with the site. 

In addition to the new ranking signal, the company announced a few other changes it will be making to its systems in the near future:

  • Incorporating page experience metrics into rankings for Top Stories in Search on mobile
  • Removing the AMP requirement for content to be shown in Top Stories

The “New” Ranking Signal

While the new signal is being called the Page Experience Signal, it actually combines the recently introduced Core Web Vitals (which measure loading speed, interactivity, and visual stability) with a few existing search ranking signals. The existing signals being brought under this umbrella include:

  • Mobile-friendliness
  • Safe-browsing
  • HTTPS-security certification
  • Following intrusive interstitial guidelines

As the company said in its announcement:

“The page experience signal measures aspects of how users perceive the experience of interacting with a web page. Optimizing for these factors makes the web more delightful for users across all web browsers and surfaces, and helps sites evolve towards user expectations on mobile.”

How To Monitor Your Core Web Vitals

To help prepare webmasters for the coming update, Google has also created a new report section within Search Console. The goal is for the new report to replace the need for a suite of tools aimed at specific issues such as page speed and mobile-friendliness.

The tool can also filter URLs based on whether they are rated “Poor,” “Needs Improvement,” or “Good.”

When Will The Update Happen

While the update doesn’t really change all that much regarding how webmasters and SEO specialists should approach managing sites, the company sees it as important enough to give a significant notice ahead of the release. 

In fact, Google says these changes to the algorithm will not happen before 2021. Additionally, the search engine will provide another notice six months before the update is rolled out.

In recent weeks, LinkedIn has been updating the algorithm it uses to rank content with new signals like “dwell time,” or how long users spend with each piece of content.

Even better, the company has also revealed its ranking recipe, using a blog post to dig deep into exactly how it ranks content.

How LinkedIn Ranks Content

Similar to other major algorithms like those used by Facebook, YouTube, and Google, LinkedIn tries to tailor users’ feeds to their specific interests and niches. To do this, LinkedIn follows a specific process.

When a user logs on, there tend to be tens of thousands of potential posts the social network could choose to show them. To filter these down, the algorithm first applies a lightweight ranking algorithm referred to as a “first-pass candidate generation layer.” This helps choose which posts the user might see based on a number of factors, including connections and keywords.

From here, the algorithm now has to determine what order these posts will be shown in. 

As the company describes, “If Alice’s connection Bob recently shared an interesting article, what determines where Bob’s post will appear in Alice’s feed?”

For this, LinkedIn looks at what it calls “viral actions” which include:

  • Reacts
  • Shares
  • Comments

Based on individual users’ actions, the algorithm weighs these interactions to determine which posts are most likely to create engagement.
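As a purely illustrative sketch of how such weighting might work (the action weights and probabilities below are invented; LinkedIn has not published its actual model), a second-pass ranker could score each candidate post like this:

```python
# Hypothetical weights for each "viral action" (invented for illustration).
WEIGHTS = {"react": 1.0, "comment": 3.0, "share": 2.0}

def engagement_score(post: dict) -> float:
    """Combine predicted probabilities of each action into one score."""
    return sum(WEIGHTS[action] * prob for action, prob in post["predicted"].items())

# Candidate posts with made-up predicted action probabilities for one user.
posts = [
    {"id": "a", "predicted": {"react": 0.30, "comment": 0.02, "share": 0.05}},
    {"id": "b", "predicted": {"react": 0.10, "comment": 0.15, "share": 0.01}},
]

# Order the feed by descending score, as a second-pass ranker might.
ranked = sorted(posts, key=engagement_score, reverse=True)
print([p["id"] for p in ranked])  # ['b', 'a']
```

Here post “b” wins despite fewer predicted reactions because comments carry more weight in this toy model.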

How Dwell Time Fits Into This

While LinkedIn’s algorithm has largely been successful at curating a feed with content most likely to generate user actions, the company says it has noticed some downsides to this approach. 

Specifically, actions like clicks and shares are relatively rare compared to the total number of people seeing each piece of content. In the grand scheme, focusing on binary metrics like clicks may miss out on other, more passive forms of engagement that reflect quality content.

In other words, LinkedIn’s old system could see simple measures like whether someone clicked a post, but it wasn’t factoring in more complex metrics like how long a person was spending with a piece of content after taking action. 

This creates problems when content simply doesn’t live up to its promise, or when users share misleading posts to drive clicks.

When this happens, people might click on a post and almost immediately return to their feed.

With the old system, these posts would get rewarded for the number of clicks made, despite the content being unsatisfying. 

Because of these issues, LinkedIn says accounting for dwell time provides numerous advantages for its algorithm:

(Image: LinkedIn’s summary of the benefits of factoring in dwell time)

How This Affects You

Overall, this update should have very little negative impact on those already creating informative and engaging content on the professional social network. If anything, you may benefit as the new algorithm punishes those sharing clickbait.

However, it is unclear if LinkedIn’s latest system also accounts for the overall length of content. This could potentially create issues where shorter updates might be downplayed over more in-depth content simply because people spend less time with each individual post. This may be something to keep in mind as the impact of this update takes effect.