
Google has confirmed that it is “slowly” rolling out the Page Experience update for desktop search results.

Back in November, the search engine notified webmasters that it planned to extend the Page Experience update – originally limited to just mobile search results – to desktop search results by February 2022. 

While the update is rolling out on schedule, the company says it will not be completed until closer to the end of March. 

What You Should Know About The Desktop Page Experience Update

For the most part, the algorithm update looks identical to the update introduced to mobile search results last year. As such, the most important aspect of the update is the use of Core Web Vitals metrics to measure website performance.

“This means the same three Core Web Vitals metrics: LCP, FID, and CLS, and their associated thresholds will apply for desktop ranking. Other aspects of page experience signals, such as HTTPS security and absence of intrusive interstitials, will remain the same as well.”
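If you want to see how these three metrics look for your own visitors, the snippet below is a minimal field-measurement sketch. It assumes Google's open-source web-vitals JavaScript package (version 3, where the handlers are onCLS, onFID, and onLCP) and a placeholder /analytics endpoint; neither is prescribed by Google's announcement.

```typescript
// A minimal sketch for collecting Core Web Vitals from real visitors,
// assuming the open-source "web-vitals" package (v3): npm install web-vitals
import { onCLS, onFID, onLCP, type Metric } from 'web-vitals';

// "/analytics" is a placeholder endpoint for this example, not a real API.
function reportMetric(metric: Metric): void {
  const body = JSON.stringify({
    name: metric.name,   // "CLS", "FID", or "LCP"
    value: metric.value, // milliseconds for FID/LCP, unitless score for CLS
    id: metric.id,       // unique ID for this page load
  });
  // sendBeacon keeps working while the page is unloading, unlike plain fetch.
  navigator.sendBeacon('/analytics', body);
}

onCLS(reportMetric);
onFID(reportMetric);
onLCP(reportMetric);
```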

One factor is being dropped from the desktop Page Experience update, however: for obvious reasons, this version removes the mobile-friendliness signal that was built into the original mobile rollout.

What Does This Mean For Your Site?

If your site has been performing well in mobile search results, you are probably fairly safe from the newer desktop version of the Page Experience update. However, if you’ve seen drops in search visibility or performance in mobile search over the past year, this update is likely to compound your pain. 

To help you predict how the algorithm update will impact you, Google Search Console is launching a new report specifically dedicated to Page Experience metrics for desktop versions of sites. 

This report is available in the Page Experience tab of Google Search Console, immediately under the mobile report.

Google confirmed this week that its most recent broad core update, which began rolling out on December 3, 2020, has now fully rolled out to all search users.

Google’s SearchLiaison account announced yesterday that “the December 2020 Core Update rollout is complete,” following almost two weeks of anxious waiting from webmasters and SEOs.

What We Know

Google is notoriously tight-lipped about its “secret recipe” used to rank websites around the world. Still, this update was big enough that the search engine felt it necessary to alert the public when the December core update started rolling out. 

This may simply be because the update rollout is global, affecting all users in all countries, across all languages, and across all website categories. 

However, early signs suggest the algorithm update was uncommonly big, with many reporting huge gains or losses in organic traffic from search engines. 

What Is a Broad Core Update?

Google’s “broad core updates” are essentially a tuneup of the search engine’s systems. Rather than adding a specific feature, targeting a singular widespread issue like linkspam, or prioritizing a ranking signal, a core update more subtly tweaks Google’s existing systems. This can mean rebalancing the impact of some search signals, refining Google’s indexing tools, or any combination of other changes. 

What To Do If You Are Affected

The first thing any webmaster should do is thoroughly check their analytics to determine whether they have experienced a significant change in search traffic. 

If you have, you will be disappointed to hear that Google has not provided any specific guidance for how to recover from this update. In fact, the company suggests a negative impact from a core update may not even reflect any actual problems with your website.

What the search engine does offer is a series of questions to consider if you have been affected by a recent core update. Though not as useful as actual suggestions for fixing lost rankings, these questions can help you assess your site and identify areas for improvement before the next broad core update.

It’s a question we all have dealt with at least once or twice, and one that rarely has a satisfying answer: “Why did my Google rankings suddenly drop?”

Sometimes, a simple audit will reveal a technical hiccup or issue that is downgrading your rankings. Just as often, though, everything appears to be working as it should, but you are suddenly further down the page or not even on the first page anymore. 

In this situation, Google’s John Mueller says there are four major reasons for sites to lose rankings. 

John Mueller Explains Why Sites Lose Rankings

In a recent Google Webmaster Central chat, Mueller was asked why a publisher who had ranked well for “seven or eight years” had suddenly lost rankings for three different sites. Notably, the person asking the question couldn’t find any signs of problems in their inbound or outbound links, and all the sites used the same keywords (they sell similar products by different brands). 

Of course, Mueller couldn’t get too specific with his answer because he didn’t have actual data or analytics on the sites. Still, he did his best to address four general reasons sites may suddenly rank worse.

1) Rankings Are Temporary

Once a site is ranking at the top for its ideal keywords, many site owners feel like they have accomplished their mission and will continue to rank there. Unfortunately, John Mueller says that rankings are malleable and change constantly.

Mueller explained:

“In general, just because the site was appearing well in search results for a number of years does not mean that it will continue to appear well in search results in the future.

These kinds of changes are essentially to be expected on the web, it’s a very common dynamic environment”

2) The Internet Is Always Changing

The reason why rankings are so prone to fluctuations is that the internet itself is always changing. New sites are being created every day, links might die, competitors might improve their own SEO, and people’s interests change.

Each and every one of these can have a big impact on the search results people see at any given time. 

As Mueller put it:

“On the one hand, things on the web change with your competitors, with other sites…”

3) Google Changes Its Algorithms

To keep up with the constantly changing internet, Google itself has to regularly overhaul how its search engine interprets and ranks websites. 

To give you one idea of how this plays out, a few years ago search results were absolutely dominated by “listicles” (short top 5 or top 10 lists). Over time, people got tired of the shallow information these lists provided and how easily they could be abused as clickbait. Google recognized this and tweaked its algorithm to better prioritize in-depth information focused on a single topic or issue. Now, though a listicle can still rank on Google, it is considerably harder than it used to be.

As Mueller simply explained:

“On the other hand, things on our side change with our algorithms in search.”

4) People Change

This is one that has been touched upon throughout the list Mueller gave, but it really gets to the heart of what Google does. What people expect out of the internet is constantly changing, and it is Google’s job to keep up with these shifts. 

In some cases, this can mean that people outright change how they search. For example, simple keywords like “restaurants near me” or “fix Samsung TV” were the main tool people used to find information for years and years. As voice search has become widespread and people have gotten more accustomed to using search engines all the time, queries have expanded to frequently include full sentences or phrases like “What is the best Chinese restaurant in midtown?”

At the same time, what people expect out of the same queries is also shifting with technological innovation and content trends. 

Mueller describes the situation by saying:

“And finally on the user side as well, the expectations change over time. So, just because something performed well in the past doesn’t mean it will continue to perform well in search in the future.”

Always Be Monitoring and Improving

The big theme behind all of these reasons sites lose rankings is that they are standing still while the world moves past them. To maintain your high rankings, your site has to be constantly in motion – moving with the trends and providing the content users want and expect from sites at any given time. 

This is why successful sites are also constantly monitoring their analytics to identify upcoming shifts and respond to any drops in rankings as soon as they happen.

If you want to see the full response, watch the video below (it starts with Mueller’s response but you can choose to watch the entire Webmaster Central office-hours discussion if you wish).

Google will soon be updating its search ranking algorithm with a new ranking signal. This new signal will combine a number of existing signals with a recently introduced set of metrics known as Core Web Vitals. 

The search engine says the goal of the new update is to better rank pages based on the quality of users’ experiences with the site. 

In addition to the new ranking signal, the company announced a few other changes it will be making to its systems in the near future:

  • Incorporating page experience metrics into rankings for Top Stories in Search on mobile
  • Removing the AMP requirement for content to be shown in Top Stories

The “New” Ranking Signal

While the new signal is being called the Page Experience Signal, it actually combines the recently introduced Core Web Vitals metrics with a few existing search ranking signals. The existing signals being brought under the Page Experience umbrella include:

  • Mobile-friendliness
  • Safe-browsing
  • HTTPS-security certification
  • Following intrusive interstitial guidelines

As the company said in its announcement:

“The page experience signal measures aspects of how users perceive the experience of interacting with a web page. Optimizing for these factors makes the web more delightful for users across all web browsers and surfaces, and helps sites evolve towards user expectations on mobile.”

How To Monitor Your Core Web Vitals

To help prepare webmasters for the coming update, Google has also created a new report section within Search Console. The goal is for the new report to replace the need for a suite of tools aimed at specific issues such as page speed and mobile-friendliness.

The report can also filter pages based on whether their metrics are rated “Poor,” “Needs Improvement,” or “Good.”
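Those labels map to Google’s published Core Web Vitals thresholds: “Good” means an LCP of 2.5 seconds or less, an FID of 100 ms or less, and a CLS of 0.1 or less, while anything above 4 seconds, 300 ms, or 0.25 respectively is “Poor.” The sketch below shows one way to apply that bucketing to your own field data; the helper names are illustrative, not part of any Google tool.

```typescript
// Published Core Web Vitals thresholds as [goodUpTo, poorAbove] pairs.
// LCP and FID are in milliseconds; CLS is a unitless layout-shift score.
const THRESHOLDS = {
  LCP: [2500, 4000],
  FID: [100, 300],
  CLS: [0.1, 0.25],
} as const;

type MetricName = keyof typeof THRESHOLDS;
type Rating = 'Good' | 'Needs Improvement' | 'Poor';

// Bucket a single field measurement the same way the report labels pages.
function rate(metric: MetricName, value: number): Rating {
  const [good, poor] = THRESHOLDS[metric];
  if (value <= good) return 'Good';
  if (value <= poor) return 'Needs Improvement';
  return 'Poor';
}

console.log(rate('LCP', 3100)); // "Needs Improvement" (3.1 s LCP)
console.log(rate('CLS', 0.05)); // "Good"
```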

When Will The Update Happen?

While the update doesn’t really change all that much regarding how webmasters and SEO specialists should approach managing sites, the company sees it as important enough to give a significant notice ahead of the release. 

In fact, Google says these changes to the algorithm will not happen before 2021. Additionally, the search engine will provide another notice six months before the rollout begins.

Google has announced it is rolling out a widespread update to its search engine algorithm, which it has simply titled the ‘January 2020 Core Update’.

The update began rolling out late yesterday and will affect how the search engine ranks all web pages around the world. However, as it is a “broad core” update, there is no specific issue or ranking signal being prioritized like in past mobile or speed-related updates.

Rather, Google’s recommendations for optimizing for this update remain the same as past core updates, which can be found here.

In the past, Google has described its broad core updates using a metaphor:

“One way to think of how a core update operates is to imagine you made a list of the top 100 movies in 2015. A few years later in 2019, you refresh the list. It’s going to naturally change. Some new and wonderful movies that never existed before will now be candidates for inclusion. You might also reassess some films and realize they deserved a higher place on the list than they had before.”

While the update is unlikely to radically shift search engine rankings, Google’s announcement of the update is relatively uncommon. Typically, Google prefers to quietly roll out broad updates and only confirm core updates when they relate to specific issues or are widely recognized.

This may signal that Google expects relatively large impacts on some search results, though it will take some time for the full impact of the update to become apparent.

Have you ever wondered exactly how Google works? How it sorts through the billions upon billions of web pages to find the best results for users?

The latest video in the company’s “Search for Beginners” series helps explain the basics behind how the search engine functions, including crawling, indexing, and ranking sites in its search results – specifically from the perspective of a business owner trying to get their site ranking well.

While the video doesn’t get into more advanced concepts like Search Engine Optimization, it lays out a very clear picture of how the broad strokes of online search engines work.

If you’ve just set up your first website or you’re considering investing in online marketing, this clip will be enlightening and point you towards some valuable resources like the Google Webmaster guidelines, SEO starter guide, and Google Webmasters portal.

A new study suggests that although high-ranking sites may be well optimized for search engines, they are failing to make their sites accessible to a large number of actual people – specifically, those with visual impairments.

The study from Searchmetrics used Google Lighthouse to test the technical aspects of sites ranking on Google. Unsurprisingly, it showed that high-ranking websites were largely fast, built on up-to-date web technologies, and relatively secure.

However, the analysis revealed that these high-ranking websites were lagging behind when it came to accessibility for those with disabilities.

Based on scores from Google’s own tools, the average overall score for accessibility for sites appearing in the top 20 positions on the search engine was 66.6 out of 100.

That is the lowest score of the four ranking categories analyzed in the study.

Google’s Lighthouse accessibility score analyzes a number of issues that go unnoticed by many users but are hugely important for those with disabilities or impairments – such as color contrast and the presence of alt tags that give context to visual elements.

As Daniel Furch, director of marketing EMEA at Searchmetrics, explains, this can be a major issue for sites that are otherwise performing very well on search engines:

“If you don’t make your site easily accessible to those with disabilities, including those with impaired vision, you cut yourself off from a large group of visitors.

Not only is it ethically a good idea to be inclusive, but also obviously you could be turning away potential customers. And some sites have even faced lawsuits for failing on this issue.”
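To get a feel for the kind of check Lighthouse automates, the sketch below flags images that are missing alt text. It is a simplified illustration, not Lighthouse’s actual audit logic, and the function name is made up for this example.

```typescript
// A simplified illustration of one accessibility check of the kind Lighthouse
// automates: list <img> elements that have no alt attribute at all.
// (alt="" is intentionally allowed, since it marks purely decorative images.)
function findImagesMissingAlt(doc: Document = document): HTMLImageElement[] {
  return Array.from(doc.querySelectorAll<HTMLImageElement>('img')).filter(
    (img) => img.getAttribute('alt') === null
  );
}

// Paste into the browser console to see offending images on the current page.
console.log(findImagesMissingAlt().map((img) => img.src));
```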

A lot of people have come to think of search engine optimization and content marketing as separate strategies these days, but Google’s John Mueller wants to remind webmasters that both are intrinsically linked. Without great content, even the most well-optimized sites won’t rank as high as they should.

The discussion was brought up during a recent Google Webmaster Central hangout where one site owner asked about improving rankings for his site.

Specifically, he explained that he couldn’t find any technical issues using Google’s tools and wasn’t sure what else he could do to improve performance.

Here’s the question that was asked:

“There are zero issues on our website according to Search Console. We’re providing fast performance in mobile and great UX. I’m not sure what to do to improve rankings.”

Mueller responded by explaining that it is important to not forget about the other half of the equation. Just focusing on the technical details won’t always lead to high rankings because the content on the site still needs to be relevant and engaging for users.

The best way to approach the issue, in Mueller’s opinion, is to ask what issues users might be having with your products or services and what questions they might ask. Then, use content to provide clear and easily available answers to these questions.

In addition to these issues, Mueller noted that some industries have much stronger competition for rankings than others. If you are in one of these niches, you may still struggle to rank as well as you’d like against competitors who have been maintaining informative, well-designed sites for longer.

You can read or watch Mueller’s answer in full below, starting at 32:29 in the video:

“This is always kind of a tricky situation where you’re working on your website for a while, then sometimes you focus on a lot of the technical details and forget about the bigger picture.

So what I would recommend doing here is taking your website and the queries that you’re looking [to rank] for, and going to one of the webmaster forums.

It could be our webmaster forum, there are lots of other webmaster forums out there where webmasters and SEOs hang out. And sometimes they’ll be able to look at your website and quickly pull out a bunch of issues. Things that you could be focusing on as well.

Sometimes that’s not so easy, but I think having more people look at your website and give you advice, and being open to that advice, I think that’s an important aspect here.

Another thing to keep in mind is that just because something is technically correct doesn’t mean that it’s relevant to users in the search results. That doesn’t mean that it will rank high.

So if you clean up your website, and you fix all of the issues, for example, if your website contains lots of terrible content then it still won’t rank that high.

So you need to, on the one hand, understand which of these technical issues are actually critical for your website to have fixed.

And, on the other hand, you really need to focus on the user aspect as well to find what are issues that users are having, and how can my website help solve those issues. Or help answer those questions.”

Online marketing can be a scary world sometimes. You invest time and money on building up your brand and improving your online visibility, but it can all come crashing down overnight. It’s terrifying to think of, but it is the reality of the internet – everything is always changing.

While there is no 100% guaranteed way to protect your brand from this risk, you can take a few steps to help you sleep easier and feel assured you won’t wake up to a marketing nightmare. As long as you cover these bases, you’ll be safe from the most common disasters that befall brands online.

Don’t let your domain get snatched up

Website domains are kind of a funny thing. They are the foundation you build all your online marketing efforts on. But once you have one set up, it is incredibly easy to forget about. It’s one of those things that seems like it should last forever.

But, of course, that’s not the case. If you aren’t keeping an eye on things, it can be easy to one day wake up and discover your domain registration has expired. That alone can be enough to make for a stressful morning, but in some cases you may find you have lost your domain entirely.

Having your domain snatched up when it lapses is shockingly common and can happen to businesses of any size. Even huge brands and public figures like the Dallas Cowboys and Microsoft have dropped the ball and had to pay huge amounts to regain control of their domains.

Thankfully, most registrars now offer services to help you maintain control of your domain. Some allow for auto-renewal, while others offer text message warnings before your domain expires. In the case of GoDaddy, the service keeps your domain safe for almost 20 days after it expires so that you can get it back.

Watch your rankings (properly)

I know more than a few business owners who maintain the same ritual. Every few days, they pull up their browser of choice and check their rankings for a few specific keywords. So long as they remain fairly high on the first page, they stay happy.

If that sounds like you, I have some bad news. The search results you see aren’t necessarily what others are seeing and you may have been sliding down the page for everyone else all this time.


Every major search engine customizes search results for individual users based on a variety of factors, including demographic data, location information, and past search behavior. So, if you’re regularly visiting your website or searching from inside the office, you’re pretty much guaranteed to appear high up when you search for your company.

Instead, get a real look at how you are performing by using an analytics service. There are a number of free services available, but I always recommend Google Analytics as a starting point. The learning curve can feel steep, but once you’ve got the hang of it, you can quickly find everything you want to know about your site’s performance.

Take control of your reputation

Stop me if you’ve heard this one before: a small business is gaining steam and popularity. Suddenly it all comes screeching to a halt as web traffic dries up and the website stops converting. The culprit? A single bad review in the right place.

It is perhaps the scariest campfire story you can tell a business owner. Sites like Yelp have become notorious for destroying local businesses that handled a negative review the wrong way.

In some cases, the business owners just try to ignore the bad review while it festers and drives away any interested customers. Unfortunately, ignoring the problem doesn’t make it go away. In even worse situations, a business owner can let their emotions get to them and lash out at the reviewer – a move pretty much guaranteed to make you go viral for reasons you don’t want.

Neither of these is the right approach, and both can cause you headaches for a long time to come.

Thankfully, there is a right way to make sure your online reputation remains stellar no matter how scathing a review you get from a disgruntled customer – address it head-on every time. Rather than letting it get to you, keep a level head and treat every customer how you’d like to be treated. You’d be shocked to see just how many angry customers can be flipped into brand advocates so long as they feel their voice is heard.

The most important thing here is to watch all the major places where people are likely to be talking about your business and to be ready to respond appropriately. This includes Yelp, Twitter, and Facebook. Check at least once a day to make sure you haven’t gotten any new reviews that need your attention. While you’re at it, it never hurts to take the time to thank anyone who leaves a positive review or mention.

Google’s Penguin algorithm has been a core part of the search engine’s efforts to fight spam and low-quality content for years, but it has always been its own thing. The algorithm ran separately from Google’s core algorithm and was refreshed periodically. But that is all changing.

Starting today, Penguin is running in real-time as part of Google’s primary algorithm in all languages.

What Does This Mean?

In the past, the Penguin algorithm has been relatively static. When it was updated or refreshed, it would dish out new penalties and lift them from sites that had successfully gone through the reconsideration process. The only problem was that these updates were sporadic, at best. In fact, the last update was over 700 days ago.

By turning Penguin into a real-time part of its algorithm, Google is speeding up the entire system so penalties can be given when a site is flagged and those who have resolved their problems can lose their penalty more quickly.

According to Google, Penguin can now make changes in roughly the same period of time it takes the search engine to crawl and re-index a page.

What Else Is Changing?

While the speed of Penguin is the biggest change as it becomes part of the core algorithm, there are some other small tweaks to how it works.

Penguin is now more targeted, only penalizing the specific pages that break link guidelines. Penguin used to punish an entire site for containing pages with spammy link building practices, but now it will only devalue the individual offending pages.

Google is also making some changes to how it talks about Penguin in public. Or, as the company stated, “We’re not going to comment on future refreshes.”