Tag Archive for: Google SEO

Google is making some big changes to how it ranks results, aiming to deliver more personalized search results and surface more “first-hand knowledge.”

The search engine announced the changes earlier this month while spotlighting two specific updates that have recently come to users. 

Cathy Edwards, Vice President of Search at Google, says these updates will better connect humans with the topics and content that are most relevant to their interests and needs:

“Search has always been about connecting human curiosity with the incredible expanse of human wisdom on the net. These advancements will help users find the most helpful information just for them, no matter how specific their questions may be.”

Bringing First-Hand Knowledge To The Surface

Google has made adjustments to its ranking algorithm to show more first-person perspectives higher in search results. While the company didn’t tell us exactly how it tweaked the algorithm, Edwards emphasizes that it will help people find new individual experiences, advice, and opinions when searching. 

With this change, the company hopes to show fewer repetitive pieces of content that don’t bring new perspectives or opinions on the first pages of results.

The announcement says:

“As part of this work, we’ve also rolled out a series of ranking improvements to show more first-person perspectives in results, so it’s easier to find this content across Search.”

Follow Topics For More Curated Results

Google is giving you the ability to curate your own search results by following topics that are important to you. 

By following topics in search results, such as a favorite football team, style of restaurant, or genre of music, you can stay in touch with these topics naturally while you are searching. 

Follows not only impact what you see in typical search results but also help highlight important topics in Discover and other areas of Google.

You can see an example of how this can shape your search results in the images below, which show what results looked like before the update rolled out and after.

As with most changes to the search results, however, it is unclear exactly how this will affect optimization strategies going forward. We will know more as additional data comes in over the coming weeks.

Personalization Is The Future

Google has been increasingly customizing search results for users based on numerous factors, including location, age, gender, and other demographics. These latest updates continue this effort to ensure that the search results you see aren’t just the most relevant sites for anyone – they are the most relevant search results for you.

Google’s Search Liaison, Danny Sullivan, raised some eyebrows over the weekend by saying that “major changes” are coming to Google’s search results. 

The statement came during a live talk, where Sullivan reportedly told the crowd to “buckle up” because major changes were on the way.

As the public voice for Google’s Search team, Sullivan is uniquely positioned to speak on what the search engine’s developers are working on behind the scenes. For businesses, this means that he is one of the only people who can give advance notice about upcoming shifts to search results that could impact your online visibility and sales. 

What Did Sullivan Say?

Since it wasn’t livestreamed or recorded, there’s been some discussion about exactly what Sullivan told the crowd. Posts on X agree on a few details though. 

While attendees agree Sullivan specifically used the phrase “buckle up”, a few users provided longer versions of the quote that paint a slightly different picture. 

One person, Andy Simpson, says the entire quote was “There’s so much coming that I don’t want to say to buckle up because that makes you freak out because if you’re doing good stuff, it’s not going to be an issue for you.”

This is likely the case, as Sullivan has since clarified:

“I was talking about various things people have raised where they want to see our results improve, or where they think ‘sure, you fixed this but what about….’ And that these things all correspond to improvements we have in the works. That there’s so much coming that I don’t want to say buckle up, because those who are making good, people-first content should be fine. But that said, there’s a lot of improvements on the way.”

Either way, it is important for businesses to take note of these statements and watch their site’s search results performance for any signs of major shifts in the near future. 

Think using blogs to get to the top of the search engines is a thing of the past? Don’t be so quick to ditch your brand’s blog: a new study suggests that blog posts are the most common type of content found in the top 5 Google search results (excluding homepages).

Even with low-quality AI-generated blog content on the rise, BrightEdge says that blogs are the leading type of content returned by Google – a strong indication that blogs with well-crafted content are one of the strongest search engine optimization tools available to brands today. 

About The Study

For the study, BrightEdge analyzed results for a dataset of 10,000 keywords of varying intent across 10 specific industries:

  • Banking
  • Insurance
  • Retail
  • Software
  • Higher Education
  • Real Estate
  • Advertising and Marketing
  • Manufacturing
  • Travel and Hospitality
  • Industrial

Using data collected during August of this year, the study then analyzed the content types of 23,785 pages ranking in the top 10 search positions. 

While the leading type of page found in the top search results was the homepage, homepages were excluded because they are essentially the default page Google returns when it believes a site may be relevant but doesn’t know which specific page to recommend.

Once homepages were accounted for and excluded, the leading type of content in top search results was blog posts – accounting for 19% of the top 10 search results. When you narrow the focus to just the top 5 positions, that share climbs to 23%.

The Takeaway

Many brands have been moving away from traditional brand blogs because of a misguided notion that blogs were becoming irrelevant compared to more interactive or visual media like videos or user-generated content. This trend has only accelerated with the recent surge in lower-quality content pumped out by generative AI systems. 

As Jim Yu, founder and executive chairman of BrightEdge, says, however, well-maintained blogs are still an essential tool for raising the visibility of your brand and educating consumers:

“The future is not just AI – it’s AI and human symbiosis. AI can inform and assist, but human creativity, expertise and skill sets are necessary to add the voice and trust of your brand. Success lies in the fusion of AI and human expertise throughout any content creation process.”

Brands struggling to make progress in this area should re-evaluate their content and ensure their strategy is focused on delivering relevant, useful, and interesting information to their target market.

Typically when a site starts ranking worse for one keyword, the effect is also seen for several of the other keywords it ranks for. So what does it mean when a website only loses rankings for one keyword? According to Google’s Gary Illyes, there are a few reasons a site might experience this rare problem. 

In a recent Google SEO Office Hours episode, Illyes addressed the issue while answering a question from a site owner who had effectively disappeared from the search results for a specific keyword – despite ranking at the top of results consistently in the past. 

The Most Likely Culprit

Unfortunately, the most common cause of an issue like this is simply that competitors have outranked your website, according to Illyes:

“It’s really uncommon that you would completely lose rankings for just one keyword. Usually, you just get out-ranked by someone else in search results instead if you did indeed disappear for this one particular keyword.”

Other Potential Causes

If you believe the drop in rankings for a specific keyword is the result of something other than increased competition, Illyes recommends investigating whether the issue is isolated to a specific region or is part of a larger global problem.

“First, I would check if that’s the case globally. Ask some remote friends to search for that keyword and report back. If they do see your site, then it’s just a ‘glitch in the matrix.’”

Those without friends around the globe can effectively accomplish the same thing by using a VPN to change their search location.
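
If you’d rather script that check than recruit friends, the sketch below queries a rank-tracking API for the same keyword from several countries and reports whether your domain shows up. The endpoint, parameters, and response fields here are hypothetical placeholders – substitute whichever SERP or rank-tracking service you actually use.

```python
import requests

# Hypothetical SERP/rank-tracking API - the endpoint, parameters, and
# response fields below are placeholders, not a real service.
SERP_API = "https://serp-api.example/search"
API_KEY = "YOUR_API_KEY"

def site_ranks_for(keyword: str, domain: str, country: str) -> bool:
    """Return True if `domain` appears in the top results for `keyword` in `country`."""
    resp = requests.get(
        SERP_API,
        params={"q": keyword, "gl": country, "num": 50, "api_key": API_KEY},
        timeout=10,
    )
    results = resp.json().get("organic_results", [])
    return any(domain in result.get("link", "") for result in results)

# Check the same keyword from several locations to see if the drop is global.
for country in ("us", "gb", "de", "au", "in"):
    status = "found" if site_ranks_for("your keyword here", "example.com", country) else "missing"
    print(f"{country}: {status}")
```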

On the other hand, if your site is absent from results around the globe, it may be indicative of a bigger issue – potentially the result of changes to your website:

“If they don’t [find your website], then next I would go over my past actions to see if I did anything that might have caused it.”

Lastly, Gary Illyes offers a few other potential causes of a sudden ranking drop.

Technical issues such as problems with crawling or indexing can prevent your website from appearing in search results. 
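
A quick first step when investigating that possibility is confirming that robots.txt isn’t blocking Googlebot and that the page doesn’t carry a noindex directive. Here is a minimal sketch using Python’s standard robotparser module and the requests library; the example.com URLs are placeholders for your own site.

```python
import requests
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"      # placeholder domain
PAGE = f"{SITE}/landing-page/"        # placeholder page to check

# 1. Is Googlebot allowed to crawl the page according to robots.txt?
robots = RobotFileParser()
robots.set_url(f"{SITE}/robots.txt")
robots.read()
print("Crawlable by Googlebot:", robots.can_fetch("Googlebot", PAGE))

# 2. Does the page carry a noindex directive? (Crude check: looks for the
# keyword anywhere in the HTML and in the X-Robots-Tag response header.)
resp = requests.get(PAGE, timeout=10)
meta_noindex = "noindex" in resp.text.lower()
header_noindex = "noindex" in resp.headers.get("X-Robots-Tag", "").lower()
print("noindex directive found:", meta_noindex or header_noindex)
```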

Sudden changes to your backlink profile – either through mass disavowing of links or through the use of low-quality or spammy links – can also trigger issues with Google. If you are hit with a manual penalty for low-quality links, it is highly likely your site will stop ranking for at least one keyword (if not several).

To hear the full discussion, check out the video below:

Google has confirmed it is rolling out its latest broad core algorithm update, signifying yet another potential shake-up for the search engine’s results.

Google’s broad core algorithm updates are some of the most significant changes the search engine makes, especially compared to the smaller tweaks that happen multiple times a day. They can affect rankings across search engine results pages (SERPs) throughout Google’s entire platform.

As is usual with Google, the search company is being tight-lipped about specific details, only going so far as to confirm that the update is rolling out. It is also expected to take up to several weeks for the update’s full impact to become apparent.

With this in mind, it is wise for brands to take note and monitor their own search performance in the coming weeks.

What Can You Do?

Aside from always striving to provide the best online experience possible with your website, there are a few specific steps you can take to safeguard your site from updates like these:

  • Monitor site performance regularly to identify early signs of issues with your site (see the sketch after this list)
  • Create content geared to your audience’s needs and interests
  • Optimize your site’s performance (including speed, mobile-friendliness, and user experience) to ensure your site isn’t off-putting to visitors
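
On the first point, one lightweight way to monitor search performance is to pull daily clicks and impressions from the Search Console API. Below is a minimal sketch, assuming the site is verified in Search Console and a service account has been granted read access to the property; the credentials file name, dates, and site URL are placeholders.

```python
from google.oauth2 import service_account       # pip install google-auth
from googleapiclient.discovery import build     # pip install google-api-python-client

# Placeholder credentials file; the service account must be added as a
# user on the Search Console property for this to return data.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

# Pull clicks and impressions by day so sudden drops stand out.
report = service.searchanalytics().query(
    siteUrl="https://www.example.com/",
    body={"startDate": "2023-09-01", "endDate": "2023-09-28", "dimensions": ["date"]},
).execute()

for row in report.get("rows", []):
    print(row["keys"][0], "clicks:", row["clicks"], "impressions:", row["impressions"])
```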

TL;DR

Google has launched its latest broad core algorithm update, which could potentially affect rankings for search engine results pages. The update may take several weeks to have full impact, so brands are advised to monitor their search performance. To safeguard your site, monitor its performance regularly, create audience-specific content, and optimize for speed, mobile-friendliness, and user experience.

If you’re still unclear on how Google thinks about marketing agencies that offer negative SEO link-building services or link disavowal services, the latest comments from John Mueller should help clarify the company’s stance.

In a Twitter conversation with several marketing experts, Mueller clearly and definitively slammed companies offering these types of services, saying they are “just making stuff up and cashing in from those who don’t know better.”

This is particularly notable, as some have accused Google of being unclear about how it handles link disavowal through its tools.

The post that started it all came from Twitter user @RyanJones who said, “I’m still shocked at how many seos regularly disavow links. Why? Unless you spammed them or have a manual action you’re probably doing more harm than good.”

In response, one user began talking about negative SEO which caught the attention of Mueller. The user mentioned that “agencies know what kind of links hurt the website because they have been doing this for a long time. It’s only hard to down for very trusted sites. Even some agencies provide a money back guarantee as well. They will provide you examples as well with proper insights.”

In response, Mueller gave what is possibly his clearest statement on this type of “service” yet:

“That’s all made up & irrelevant. These agencies (both those creating, and those disavowing) are just making stuff up, and cashing in from those who don’t know better.”

Instead of spending time and effort on any of this, Mueller instead recommended something simple:

“Don’t waste your time on it; do things that build up your site instead.”

Google is encouraging brands to ensure content is properly dated in search engines by using multiple date indicators on each page. 

The recommendation came in the wake of an issue with Google News where the wrong dates were being shown.

In its response, Google’s Search Liaison, Danny Sullivan, emphasized that while many factors may have contributed to this specific situation, the lack of proper date signals made it difficult to show correct information in the search results.

“That page is a particular challenge since the main story lacks a visible date (it only has a time), and the page contains multiple stories which do contain full dates. Our guidance warns about this.”

To prevent situations like this from arising, Sullivan says it is important to use several signals to clarify the date content is published:

“Understand that ideally, the meta data alone would seem to some to be enough, and we’ll keep working to improve. But there are good reasons why we like multiple date signals present.”

Why Does This Matter?

It may not seem like a big deal for the wrong date to occasionally be shown alongside content in the search results. However, these errors can undermine your authority, lead to confusion, and create a poor user experience – all of which can lead to decreased page performance and even demotions in Google’s search results.

On the other hand, situations like this also highlight the need for Google to deliver more consistent ways to signal a page’s publishing date. 

For now, the best recommendation Google has is to use a scattershot approach for the best chance of having your page correctly dated:

“Google doesn’t depend on a single date factor because all factors can be prone to issues. That’s why our systems look at several factors to determine our best estimate of when a page was published or significantly updated.”
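
In practical terms, that means exposing the date in more than one place: a visible date near the headline, date-related meta tags, and datePublished/dateModified values in structured data. The sketch below audits a page for those signals; it assumes the requests and beautifulsoup4 packages, and the URL is a placeholder.

```python
import json

import requests
from bs4 import BeautifulSoup   # pip install beautifulsoup4

def audit_date_signals(url: str) -> None:
    """Print the publication-date signals a page exposes."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    # Visible <time> elements with machine-readable datetime attributes
    visible = [t["datetime"] for t in soup.find_all("time") if t.has_attr("datetime")]

    # Common date meta tags, e.g. Open Graph's article:published_time
    meta = [m["content"] for m in soup.find_all("meta")
            if m.get("property") in ("article:published_time", "article:modified_time")
            and m.has_attr("content")]

    # datePublished / dateModified inside JSON-LD structured data blocks
    structured = []
    for script in soup.find_all("script", type="application/ld+json"):
        try:
            data = json.loads(script.string or "")
        except json.JSONDecodeError:
            continue
        if isinstance(data, dict):
            structured += [data[k] for k in ("datePublished", "dateModified") if k in data]

    print("visible <time> dates:", visible)
    print("date meta tags:      ", meta)
    print("JSON-LD dates:       ", structured)

audit_date_signals("https://www.example.com/blog/some-post/")   # placeholder URL
```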

Today, Google revealed it is preparing a massive update called the Helpful Content Update that may be the biggest change to the search engine’s algorithm in years.

The update aims to filter out sites that have large amounts of content written solely for the search engine, without providing value to actual users.

Or, as Google simply put it in its announcement:

“The helpful content update aims to better reward content where visitors feel they’ve had a satisfying experience, while content that doesn’t meet a visitor’s expectations won’t perform as well.”

Here’s what we know about the update so far:

What Is The Google Helpful Content Update?

Philosophically, there is little about the helpful content update that is all that different from what Google has been working toward in the past.

The algorithm update aims to help users find the high-quality content that will be the most helpful to them. What sets it apart is how it aims to achieve this.

In this instance, Google plans to improve search results by targeting and removing what could be called “search engine-first content” – content written expressly for the purpose of boosting rankings without actually delivering quality information to readers.

While the algorithm will be applied to all Google search results when it rolls out, the company said four specific types of sites are most likely to be affected:

  • Online educational materials
  • Arts & entertainment
  • Shopping
  • Tech

Content in these niches seems to be most prone to being written specifically for search engines rather than humans, and Google hopes to improve the quality of results in these areas.

As a representative from Google told Search Engine Land’s Barry Schwartz:

“If you search for information about a new movie, you might have previously encountered articles that aggregated reviews from other sites without adding perspectives beyond what’s available elsewhere on the web. This isn’t very helpful if you’re expecting to read something new. With this update, you’ll see more results with unique information, so you’re more likely to read something you haven’t seen before.”

Is Your Site Safe?

Rather than provide a simple checklist of things companies can do to prepare their website, Google offered a series of questions that can be used to determine if you’re creating content for humans or search engines:

  • Do you have an existing or intended audience for your business or site that would find the content useful if they came directly to you? 
  • Does your content clearly demonstrate first-hand expertise and a depth of knowledge (for example, expertise that comes from having actually used a product or service, or visiting a place)?
  • Does your site have a primary purpose or focus?
  • After reading your content, will someone leave feeling they’ve learned enough about a topic to help achieve their goal?
  • Will someone reading your content leave feeling like they’ve had a satisfying experience?
  • Are you keeping in mind our guidance for core updates and for product reviews?

Additionally, the Google Search Central article provided a similar list of questions you can use to avoid search-engine first content in the future:

  • Is the content primarily to attract people from search engines, rather than made for humans?
  • Are you producing lots of content on different topics in hopes that some of it might perform well in search results?
  • Are you using extensive automation to produce content on many topics?
  • Are you mainly summarizing what others have to say without adding much value?
  • Are you writing about things simply because they seem trending and not because you’d write about them otherwise for your existing audience?
  • Does your content leave readers feeling like they need to search again to get better information from other sources?
  • Are you writing to a particular word count because you’ve heard or read that Google has a preferred word count? (No, we don’t).
  • Did you decide to enter some niche topic area without any real expertise, but instead mainly because you thought you’d get search traffic?
  • Does your content promise to answer a question that actually has no answer, such as suggesting there’s a release date for a product, movie, or TV show when one isn’t confirmed?

When Will It Arrive?

The helpful content update is due to roll out next week to all English-language search results in the U.S. The company plans to expand the update to other languages and countries sometime in the future.

In an update to the help documentation for Googlebot, the search engine’s crawling tool, Google explained it will only crawl the first 15 MB of any webpage. Anything after this initial 15 MB will not influence your webpage’s rankings.

As the Googlebot help document states:

“After the first 15 MB of the file, Googlebot stops crawling and only considers the first 15 MB of the file for indexing.

The file size limit is applied on the uncompressed data.”

Though this may initially raise concerns, since images and videos can easily exceed this size, the help document makes clear that media and other resources are typically exempt from the Googlebot limit:

“Any resources referenced in the HTML such as images, videos, CSS, and JavaScript are fetched separately.”

What This Means For Your Website

If you’ve been following the most commonly used best practices for web design and content management, this should leave your website largely unaffected. Specifically, the best practices you should be following include:

  • Keeping the most relevant SEO-related information relatively close to the start of any HTML file. 
  • Compressing images.
  • Avoiding encoding images or videos directly into the HTML (for example, as inline base64 data) when possible.
  • Keeping HTML files small – typically less than 100 KB (see the size-check sketch after this list).
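
For a quick sanity check against the 15 MB cap (and the roughly 100 KB best-practice target), here is a minimal sketch that fetches a page and reports the size of its uncompressed HTML; the URL is a placeholder.

```python
import requests

GOOGLEBOT_LIMIT = 15 * 1024 * 1024   # 15 MB, applied to uncompressed data
BEST_PRACTICE = 100 * 1024           # ~100 KB best-practice target

def check_html_size(url: str) -> None:
    # requests decompresses gzip/deflate responses automatically, so
    # len(resp.content) approximates the uncompressed size Googlebot evaluates.
    resp = requests.get(url, timeout=10)
    size = len(resp.content)
    print(f"{url}: {size / 1024:.1f} KB of HTML")
    if size > GOOGLEBOT_LIMIT:
        print("  Over Googlebot's 15 MB limit - content past that point is ignored")
    elif size > BEST_PRACTICE:
        print("  Larger than the ~100 KB best-practice target")

check_html_size("https://www.example.com/")   # placeholder URL
```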

Despite Google being very clear about its feelings on paying for SEO links (hint: it is not a fan), I still regularly come across stories of brands spending hundreds or even thousands of dollars on links that promise to increase their rankings.

Typically, these individuals have heard success stories from others who had recently bought a ton of SEO backlinks and saw their own site jump to the top of search results. Unfortunately, this is rarely the end of the story. 

Today, I wanted to highlight a more complete example of what happens when you pay for links and why.

The Full Story of Someone Who Spent $5,000 on SEO Links

In this instance, I came across someone who had spent thousands of dollars on links for SEO purposes through Search Engine Journal’s “Ask an SEO” column. In the most recent edition of this weekly article, a person named Marlin lays out their situation.

“I paid over $5,000 for SEO link building.”

From the outset, it is unclear if Marlin knew exactly what they had gotten into. While it is possible they directly purchased links from a website, there is also the potential that Marlin and their company put their trust in a questionable marketing agency that purchased or generated spammy links to “boost” rankings.

This is important because it is very common for online SEO packages to include “link building services” which are actually accomplished through link farms that will inevitably be identified and shut down. This is why it is crucial to know that the people handling your link-building efforts use proven, Google-approved strategies rather than cutting corners.

“At first, traffic was boosted.”

As promised, the initial result of buying links is frequently a quick spike in your search engine rankings. Even better, this payoff seems to come much more quickly than the rankings boosts seen from traditional link-building efforts. In some cases, you might even get a huge boost to your rankings within a week or two of paying for the service!

However, the story isn’t over.

“We then lost our rankings on those keywords and our traffic is gone!”

Despite the initially promising results, this is the inevitable conclusion of every story about paying for links.

In the best-case scenario, Google simply ignores your newly acquired low-quality links – putting you right back where you started. In some cases, depending on how widespread the link scheme appears to be, you can wind up even worse than when you began.

If Google believes you have a persistent habit of trying to manipulate search rankings, your site may receive a penalty that significantly impairs your rankings. In the worst cases, your site can be removed from search results entirely.

Why Paid Links Inevitably Fail

There is a very simple reason this story followed a predictable pattern: Google explicitly forbids any sort of “unnatural links” or link schemes. Additionally, the search engine has invested huge amounts of time and resources into identifying these artificial links.

At the same time, Google is locked into a game of whack-a-mole where new link sellers are popping up all the time – which is why their links may help your rankings for a very short time.

In SEO, shortcuts are rarely as great as they appear. If you’re looking for long-term, sustainable success, the only option is to roll up your sleeves and build links the old-fashioned way: by creating great content and building real relationships with other members of your industry.

It won’t be quick and it won’t be easy, but it will be worth it in the long run.