
Google’s much-talked-about ‘helpful content update’ is officially rolling out.

The company announced that it had begun implementing the new algorithm update on its Search Central Google Search ranking updates page.

As the update posted today says: “[Google] released the August 2022 helpful content update. The rollout could take up to two weeks to complete.”

What Is The Helpful Content Update?

In short, the helpful content update intends to make content written specifically for search engines (sometimes called “search engine-first content”) less prevalent in search results while increasing the presence of content that is most valuable to actual users.

Announced a little more than a week ago, the update introduces a sitewide signal, meaning content across your whole site can affect how all of its pages perform in search results. At the same time, Google has indicated that the update will impact online education, arts and entertainment, tech, and shopping sites more than others.

Early rumblings and statements from Google suggest this may be the biggest update to the search engine in years, and may radically shake up the search results users receive. 

Two-Week Rollout

As with most algorithm updates, the company is implementing the helpful content update gradually. Over the next two weeks, most sites will likely see fluctuations in search performance as the update rolls out, before rankings stabilize. It may also take longer for the full scope of the helpful content update to become apparent after the rollout is complete.

What To Do

With the update rolling out, brands hoping to make changes before the impact is felt may be cutting it too close to save their rankings. However, you can still remove any search engine-first content from your site to minimize the update’s impact.

Beyond that, there is little you can do now other than monitor your rankings over the next two weeks and beyond to track the impact in real-time.

Today, Google revealed it is preparing a massive update called the Helpful Content Update that may be the biggest change to the search engine’s algorithm in years.

The update aims to filter out sites with large amounts of content written solely for search engines, without providing value to actual users.

Or, as Google simply put it in its announcement:

“The helpful content update aims to better reward content where visitors feel they’ve had a satisfying experience, while content that doesn’t meet a visitor’s expectations won’t perform as well.”

Here’s what we know about the update so far:

What Is The Google Helpful Content Update?

Philosophically, the helpful content update is not all that different from what Google has been working toward for years.

The update aims to help users find the highest-quality, most helpful content. What sets it apart is how it intends to achieve this.

In this instance, Google plans to improve search results by targeting and demoting what could be called “search engine-first content” – content written expressly to boost rankings without actually delivering quality information to readers.

While the algorithm will be applied to all Google search results when it rolls out, the company said four specific types of sites are most likely to be affected:

  • Online educational materials
  • Arts & entertainment
  • Shopping
  • Tech

Content in these niches seems to be the most prone to being written specifically for search engines rather than humans, and Google hopes to improve the quality of results in these areas.

As a representative from Google told Search Engine Land’s Barry Schwartz:

“If you search for information about a new movie, you might have previously encountered articles that aggregated reviews from other sites without adding perspectives beyond what’s available elsewhere on the web. This isn’t very helpful if you’re expecting to read something new. With this update, you’ll see more results with unique information, so you’re more likely to read something you haven’t seen before.”

Is Your Site Safe?

Rather than provide a simple checklist of things companies can do to prepare their website, Google offered a series of questions that can be used to determine if you’re creating content for humans or search engines:

  • Do you have an existing or intended audience for your business or site that would find the content useful if they came directly to you? 
  • Does your content clearly demonstrate first-hand expertise and a depth of knowledge (for example, expertise that comes from having actually used a product or service, or visiting a place)?
  • Does your site have a primary purpose or focus?
  • After reading your content, will someone leave feeling they’ve learned enough about a topic to help achieve their goal?
  • Will someone reading your content leave feeling like they’ve had a satisfying experience?
  • Are you keeping in mind our guidance for core updates and for product reviews?

Additionally, the Google Search Central article provided a similar list of questions you can use to avoid creating search engine-first content in the future:

  • Is the content primarily to attract people from search engines, rather than made for humans?
  • Are you producing lots of content on different topics in hopes that some of it might perform well in search results?
  • Are you using extensive automation to produce content on many topics?
  • Are you mainly summarizing what others have to say without adding much value?
  • Are you writing about things simply because they seem trending and not because you’d write about them otherwise for your existing audience?
  • Does your content leave readers feeling like they need to search again to get better information from other sources?
  • Are you writing to a particular word count because you’ve heard or read that Google has a preferred word count? (No, we don’t).
  • Did you decide to enter some niche topic area without any real expertise, but instead mainly because you thought you’d get search traffic?
  • Does your content promise to answer a question that actually has no answer, such as suggesting there’s a release date for a product, movie, or TV show when one isn’t confirmed?

When Will It Arrive?

The helpful content update is due to roll out next week to all English-language search results in the U.S. The company plans to expand the update to other languages and countries sometime in the future.

In an update to the help documentation for Googlebot, the search engine’s crawling tool, Google explained that it only crawls the first 15 MB of any webpage. Anything beyond that initial 15 MB will not be considered for indexing and cannot influence your webpage’s rankings.

As the Googlebot help document states:

“After the first 15 MB of the file, Googlebot stops crawling and only considers the first 15 MB of the file for indexing.

The file size limit is applied on the uncompressed data.”

Though this may initially raise concerns, since images and videos can easily exceed this size, the help document makes clear that media and other resources are typically exempt from this Googlebot limit:

“Any resources referenced in the HTML such as images, videos, CSS, and JavaScript are fetched separately.”

What This Means For Your Website

If you’ve been following the most commonly used best practices for web design and content management, this should leave your website largely unaffected. Specifically, the best practices you should be following include:

  • Keeping the most relevant SEO-related information relatively close to the start of any HTML file. 
  • Compressing images.
  • Linking to images and videos as separate files rather than encoding them directly into the HTML when possible.
  • Keeping HTML files small – typically less than 100 KB (a quick way to check a page’s HTML size against Googlebot’s limit is sketched below).
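If you want to verify where a page stands relative to the 15 MB crawl limit Google describes, a short script can report the size of its uncompressed HTML. This is a minimal sketch, assuming Node 18+ (for the built-in fetch); the URL is a placeholder.

```typescript
// Report a page's uncompressed HTML size against Googlebot's 15 MB indexing limit.
const GOOGLEBOT_LIMIT_BYTES = 15 * 1024 * 1024;

async function checkHtmlSize(url: string): Promise<void> {
  const response = await fetch(url);              // Node 18+ ships a global fetch
  const html = await response.text();             // body is decompressed automatically
  const bytes = Buffer.byteLength(html, "utf8");  // the limit applies to uncompressed data

  console.log(`${url}: ${(bytes / 1024).toFixed(1)} KB of HTML`);
  if (bytes > GOOGLEBOT_LIMIT_BYTES) {
    console.warn("Content past the first 15 MB will not be considered for indexing.");
  }
}

checkHtmlSize("https://example.com/").catch(console.error);
```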

With internet speeds constantly increasing, smartphones becoming the primary way to get online, and attention spans shorter than ever, it is absolutely crucial that your website loads quickly. Visitors will not hesitate to click the ‘back’ button, and Google has gradually made loading times an increasingly important ranking signal.

At the same time, users have come to expect stylish, high-quality images from any website they visit. They don’t just want to find the best information. They want the best information in the most enjoyable package. 

This creates a catch-22 for website owners. Users want to see a page filled with great images, but they don’t want to wait for it. Unfortunately, high-quality pictures tend to slow down how quickly pages load.

Thankfully, there are ways to mitigate this by optimizing your images to make your web pages load as efficiently and quickly as possible – as Alan Kent, Google Developer Advocate, shares in a recent video:

The video gets pretty in-depth at times and leans into technical details, so we will try to collect the most important tips and info below:

Google’s 6 Tips For Optimizing Online Images

1. Eliminate Image Cumulative Layout Shift (CLS)

Don’t let the jargony name intimidate you. You have no doubt encountered CLS before, and it probably frustrated you.

CLS is when text or images move around as each component of a page loads. Because of this, you might see text that refuses to stay in place while you try to read it, images popping in where a link was visible a second before, or even an entirely different page opening because a link shifted to right where you were about to tap.

Though this issue can affect any type of content on a webpage, images are frequently a leading culprit because of the amount of space they fill on a page. 
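One common fix is to declare each image’s intrinsic dimensions up front so the browser can reserve the right amount of space before the file arrives. Below is a minimal TypeScript/DOM sketch; the file name, dimensions, and container selector are hypothetical.

```typescript
// Reserve layout space for an image before it loads so nearby content doesn't shift.
function addHeroImage(container: HTMLElement): void {
  const img = document.createElement("img");
  img.src = "/images/storefront.jpg"; // placeholder path
  img.alt = "Our storefront";

  // Explicit intrinsic dimensions let the browser compute the image's box
  // (and aspect ratio) before a single byte of the file has downloaded.
  img.width = 1200;
  img.height = 675;

  // Scale responsively without losing the reserved aspect ratio.
  img.style.maxWidth = "100%";
  img.style.height = "auto";

  container.appendChild(img);
}

addHeroImage(document.querySelector("main")!);
```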

2. Keep Your Images Only As Large As Needed

It can be tempting to upload images in the largest size possible, to guarantee every little detail will be included without pixelation or artifacting. Some web designers see this as “future-proofing” their site or ensuring the best quality no matter how large an image is shown. 

The problem is that this can be overkill. Even when an image is displayed at a much smaller size, the browser still has to download the full-size file and then scale it down for the display it is being shown on. Larger images take longer to download and render, which slows everything down.

The complication is that displays can range wildly in size and resolution – from tiny smartphones to gigantic monitors. That makes it hard to identify exactly when an image becomes “too large.” 

The easiest way to find out is to check the Opportunities section of the PageSpeed Insights report, under ‘Properly size images’. There you’ll see which images are larger than they need to be so you can replace them with properly sized alternatives.
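One practical way to serve images that are only as large as needed is to offer several pre-sized copies and let the browser choose via srcset. The sketch below assumes you have already exported the differently sized files; the file names and breakpoints are hypothetical.

```typescript
// Let the browser pick the smallest copy that still looks sharp on the visitor's screen.
const banner = document.createElement("img");
banner.src = "/images/banner-800.jpg"; // fallback for browsers that ignore srcset
banner.srcset = [
  "/images/banner-400.jpg 400w",
  "/images/banner-800.jpg 800w",
  "/images/banner-1600.jpg 1600w",
].join(", ");
// Describe how wide the image renders at different viewport widths so the
// browser can match a candidate from srcset to the actual display size.
banner.sizes = "(max-width: 600px) 100vw, 800px";
banner.alt = "Seasonal promotion banner";

document.body.appendChild(banner);
```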

3. Use The Best Image Format

Which file format you choose to save your images in might seem like a minor choice, but it can have major effects on loading speeds. At the same time, choosing the right image format isn’t always as simple as choosing the one which outputs the smallest file.

While formats like JPEG or WebP tend to deliver smaller file sizes from the same initial image, they typically do so with lossy compression, which subtly degrades image quality to minimize file size.

On the other hand, lossless formats like PNG preserve fine detail and maintain the original quality of an image, though this results in larger files.

In many cases, your visitors may not notice the difference between a PNG and a JPEG, making the smaller file the obvious choice. However, more complex or very large images may look noticeably worse in heavily compressed formats.

To identify images that may not be in the most efficient format for your site, check out the ‘serve images in next-gen formats’ section of the PageSpeed Insights report.
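If the report flags images that could use a next-gen format, converting them during your build or upload step is straightforward. The sketch below uses sharp, a popular Node image library that the article doesn’t mention and that is only one option among many; the file names and quality setting are placeholders.

```typescript
import sharp from "sharp";

// Convert a large PNG to lossy WebP, which usually produces a much smaller file.
async function toWebP(input: string, output: string): Promise<void> {
  await sharp(input)
    .webp({ quality: 80 }) // raise the quality if you can see compression artifacts
    .toFile(output);
}

toWebP("hero.png", "hero.webp").catch(console.error);
```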

4. Compress Images Properly

While file formats have a big impact on how large your image files are, most formats allow you to dictate just how much compression occurs. If you’d like, you can prioritize preserving detail while receiving a slightly larger file, or you can prioritize getting the smallest file at the cost of the image quality. 

To figure out what is best for your website, you can explore the ‘encode images efficiently’ section of the PageSpeed Insights report. Here, you’ll find details about images that may benefit from being compressed and how much this might shrink image files. 
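To see what a given compression level costs in practice, you can re-encode an image at a chosen quality setting and compare file sizes before deciding what to publish. This is again a sketch using the sharp library as an assumed tool; the paths and quality value are illustrative.

```typescript
import { stat } from "node:fs/promises";
import sharp from "sharp";

// Re-encode a JPEG at a lower quality setting and report the size savings.
async function recompress(input: string, output: string, quality = 75): Promise<void> {
  await sharp(input).jpeg({ quality, mozjpeg: true }).toFile(output);

  const [before, after] = await Promise.all([stat(input), stat(output)]);
  console.log(
    `${input}: ${(before.size / 1024).toFixed(0)} KB -> ${output}: ${(after.size / 1024).toFixed(0)} KB`
  );
}

recompress("team-photo.jpg", "team-photo-75.jpg").catch(console.error);
```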

5. Cache Images In The Browser

Caching is a process where browsers temporarily store images and other files from your website to speed up loading when a visitor moves to related pages or returns to your site.

To take advantage of this, however, you need to tell the browser how long it should keep these cached images. This is done through an HTTP response header (such as Cache-Control) containing guidance on how to handle cached files and images.

If you’re unsure whether you’ve properly configured this header, you can also find details about this in the PageSpeed Insights report, within the ‘serve static assets with an efficient cache policy’ section. 
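How you set that header depends on your server or CDN. As one illustration, here is a minimal sketch for a Node server using Express (an assumption on my part, not something the article specifies); the one-year lifetime only makes sense if image file names change whenever their content changes.

```typescript
import express from "express";

const app = express();

// Serve images with a long-lived cache policy so returning visitors reuse
// their local copies instead of downloading the files again.
app.use(
  "/images",
  express.static("public/images", {
    maxAge: "365d",  // sends Cache-Control: public, max-age=31536000
    immutable: true, // adds the immutable directive so browsers skip revalidation
  })
);

app.listen(3000);
```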

6. Correctly Sequence Image Downloads

Browsers can defer fetching parts of a page until they are actually needed – a practice called “lazy loading” – which lets them focus on the content you’re most likely to look at first. This is not always the best way to handle larger files like images and videos, though.

To get around this, Google recommends controlling the order in which certain parts of your page are downloaded and rendered by the browser.

Specifically, Google recommends using the following sequencing order:

  • “Hero Images” at the top of the page
  • Above-the-fold images
  • Images just below the fold

After this, Kent says most other images can be lazy-loaded without an issue. 

Again, you’ll be able to find an assessment of how efficiently you are loading images on your website within the PageSpeed Insights report, under ‘Defer offscreen images’.
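In markup terms, that usually means loading the hero and other above-the-fold images eagerly while marking everything further down as lazy. The sketch below uses the standard loading attribute plus the fetchpriority hint (the latter is my addition, not something Kent’s tips above mention); the CSS selectors are hypothetical.

```typescript
// Fetch the hero image first and defer images that start well below the fold.
const hero = document.querySelector<HTMLImageElement>("img.hero");
if (hero) {
  hero.loading = "eager";                     // never defer the hero image
  hero.setAttribute("fetchpriority", "high"); // hint the browser to fetch it early
}

document.querySelectorAll<HTMLImageElement>("img.below-fold").forEach((img) => {
  img.loading = "lazy"; // fetched only as the image approaches the viewport
});
```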

For more, be sure to watch the full 14-minute video above or explore more SEO news and tips here.

The way we use keywords for search engine optimization (SEO) has changed quite a bit since the early days of Google. Instead of stuffing pages with obvious keyword spam, SEO success is more about delivering content that is useful and interesting for your ideal customers. One thing that hasn’t changed during all that time, however, is the importance of keyword research.

Keyword research has similarly grown and evolved throughout the years, but attentive brands will know that keyword research has consistently been a huge factor in their online success since the creation of search engines.

What Is Keyword Research?

The idea behind keyword research has always been basically the same. The practice is all about identifying the actual keywords people are using to find your website and websites like yours.

With most modern tools, you can not only identify these keywords, but also assess their overall popularity, how difficult it would be to rank for these terms, and more.

Essentially, by looking at the terms people are already using to find you (and the popular search terms you’re missing out on), you learn what your customers are really looking for so you can deliver it.

With this information, you can develop strategies focused on reaching the most effective audiences for your brand.

Keyword research also lets you identify emerging opportunities, set important benchmarks for your SEO efforts, and measure the success of your optimization.

Lastly, keyword research gives you the chance to check your own assumptions against real-world data. Often, brands quickly discover their top keywords are entirely different from what they assumed.

How To Use Your Keywords

Once you’ve identified the most important keywords for your brand, it’s time to actually start targeting these terms.

In the dark ages of SEO, targeting keywords meant seeing how many times you could fit a word into a piece of text. Whether the rest of the content was relevant, well written, or just a string of gibberish was a secondary concern, at best.

The idea was that Google would think the page was full of great information about that topic and place it high in the search results.

Google’s systems have gotten exponentially more complex over the years. These days, the search engine uses machine learning to better understand the content it indexes and the intent behind search terms.

Pages can (theoretically) rank well for keywords they never use anywhere on the page, since Google can understand how the page is relevant to the topic behind those keywords.

Of course, it is still better to strategically place the keywords you are targeting throughout the pages on your site and the content you share. But the most important thing now is simply delivering the best resource for the keywords you want to rank for.

While this may seem like it has decreased the importance of keywords, that couldn’t be farther from the truth. These days, keyword insights help you spot new shifts in your industry, brainstorm the best content for your potential customers, and set the most relevant goals for your SEO efforts.

Despite Google being very clear about its feelings on paying for SEO links (hint: it is not a fan), I still regularly come across stories of brands spending hundreds or even thousands of dollars on links that promise to increase their rankings.

Typically, these individuals have heard success stories from others who had recently bought a ton of SEO backlinks and saw their own site jump to the top of search results. Unfortunately, this is rarely the end of the story. 

Today, I wanted to highlight a more complete example of what happens when you pay for links and why.

The Full Story of Someone Who Spent $5,000 on SEO Links

In this instance, I came across someone who had spent thousands of dollars on links for SEO purposes through Search Engine Journal’s “Ask an SEO” column. In the most recent edition of this weekly article, a person named Marlin lays out their situation.

“I paid over $5,000 for SEO link building.”

From the outset, it is unclear if Marlin knew exactly what they had gotten into. While it is possible they directly purchased links from a website, there is also the potential that Marlin and their company put their trust in a questionable marketing agency that purchased or generated spammy links to “boost” rankings.

This is important because it is very common for online SEO packages to include “link building services” which are actually accomplished through link farms that will inevitably be identified and shut down. This is why it is crucial to know that the people handling your link-building efforts use proven, Google-approved strategies rather than cutting corners.

“At first, traffic was boosted.”

As promised, the initial result of buying links is frequently a quick spike in your search engine rankings. Even better, this payoff seems to come much more quickly than the rankings boosts seen from traditional link-building efforts. In some cases, you might even get a huge boost to your rankings within a week or two of paying for the service!

However, the story isn’t over.

“We then lost our rankings on those keywords and our traffic is gone!”

Despite the initially promising results, this is the inevitable conclusion of every story about paying for links.

In the best-case scenario, Google simply ignores your newly acquired low-quality links – putting you right back where you started. In some cases, depending on how widespread the link scheme appears to be, you can wind up even worse than when you began.

If Google believes you have a persistent habit of trying to manipulate search rankings, your site may receive a penalty that significantly impairs your rankings. In the worst cases, your site can be removed from search results entirely.

Why Paid Links Inevitably Fail

There is a very simple reason this story followed a predictable pattern. Google explicitly forbids any sort of “unnatural links” or link schemes. Additionally, the search engine has invested huge amounts of time and resources to identify these artificial links.

At the same time, Google is locked into a game of whack-a-mole with new link sellers popping up all the time – which is why purchased links may help your rankings for a very short time before they are caught.

In SEO, shortcuts are rarely as great as they appear. If you’re looking for long-term, sustainable success, the only option is to roll up your sleeves and build links the old-fashioned way: by creating great content and building real relationships with other members of your industry.

It won’t be quick and it won’t be easy, but it will be worth it in the long run.

If you’re a business owner or operator, you’ve probably been told 100 times by 100 different people that you just HAVE to invest in Search Engine Optimization. Unfortunately, you’ve also likely never really heard why SEO is so important beyond broad mentions of “being found online” or that “everyone uses Google.”

Marketers and salespeople have a bad habit of talking about the power and benefits of optimization without explaining what sets it apart from other types of online marketing, how it impacts your ability to reach new markets, and why many SEO packages don’t cut it. 

So today, I wanted to do just that.

What Is Search Engine Optimization?

Before we can talk about what makes SEO special, we have to talk a bit about what it is.

In the simplest terms, search engine optimization is the name for a wide range of strategies and techniques used to increase your visibility on search engines. 

In the past, this could be boiled down to the phrase “making your website the top result on Google searches.” These days, search engines are much more complex and what might be the top result for one user might be completely different for another.

As such, SEO has evolved to focus more on overall visibility across Google’s many systems with the goal of attracting as many potential customers as possible to your site.

How SEO Works

For our purposes today, we aren’t going to go very in-depth discussing the numerous strategies or techniques used in SEO. Otherwise, we’d be here all day.

What matters for this discussion is understanding that these methods affect how Google sees and ranks your site. 

While some strategies are dedicated to helping Google understand the content that is on your site, others are intended to boost the overall value of your site. Combined, these approaches help ensure Google picks your site for relevant searches and gives you the best chance to attract website traffic.

Why SEO Is Essential in 2022

Google Is The Most Visited Site In The World

Marketers always like to say “everyone uses Google” to emphasize the importance of SEO (and they aren’t necessarily wrong), but what does that really mean?

It means that Google is a massive part of daily life for practically everyone around the globe, and can massively influence what information we see, who we do business with, and what products people buy.

To give you an idea of how much influence Google has compared to any other site online, the search engine sees more than 3x the traffic of the second most popular website – YouTube (which is also owned by Google).

The most popular site in the world NOT owned by Google – Facebook – sees less than a quarter of the traffic seen by Google.com.

No matter how you try to spin it, Google acts as the central hub to the internet for the vast majority of people out there. If you don’t play by their rules, you risk being disconnected from this hub and any potential traffic you might get.

Organic Search is Still The Main Driver of Traffic

When considering where to invest their marketing budget, many businesses find themselves asking the same question: “Why should I spend money on SEO, which is complicated and not guaranteed to pay off, when I could instead run ads that are guaranteed to appear above those search results?”

Organic search results get underestimated because ranking highly is rarely a sure thing – even for the biggest companies. Paid search ads, meanwhile, promise guaranteed placement with far less uncertainty.

Despite this, there is actually a very simple reason you should invest in organic search optimization.

Organic search results drive more than twice the traffic of the next leading traffic source – and more than 5x the traffic that paid ads send to websites.

At the end of the day, the majority of searches still end with a user clicking an organic link in the regular search results. So while it may seem riskier, investing in search engine optimization has the chance for much larger rewards.

Better SEO Means Better User Experience

Every brand wants its website to provide the best user experience possible. A positive user experience increases the likelihood of driving conversions, while negative user experiences can sour people on your company entirely.

So, it should come as good news that the majority of SEO practices are intended to improve user experience in a variety of ways including speeding up your site, making it easier to use, and improving accessibility.

By ensuring you are optimized for search engines, you are also investing in improving your site for the real potential customers who will soon be visiting.

SEO Is a Process That Is Always Changing

Companies looking to save some cash on SEO will have an easy time finding dozens of cheap SEO packages across the web. The problems with these packages are numerous, but the biggest red flag is the assumption that SEO is something you do once.

In reality, SEO is something that needs to be done regularly to have a real impact. 

When a site is left alone, Google assumes it is becoming outdated or irrelevant. No matter what industry you are in, there are always new products coming out, new information that can benefit your customers, and new ways to improve your site.

Additionally, Google itself is always changing. The company releases new guidelines, algorithm updates, and features for webmasters seemingly every day. Any cheap package deal is unable to take these updates into account and help your company stay ahead of the rapidly changing search results.

SEO Results Amplify With Time

Unlike almost any other form of marketing, search engine optimization is one of the few investments which tends to build on itself for greater and greater results.

As you optimize your website and create quality content to improve your search rankings, you also provide a more robust presence online. Your website becomes an even greater resource to potential customers. You start getting linked to by others in your industry. People start sharing your brand around social media. 

Ads may drive immediate results, but these tend to stabilize with time. Effective search engine optimization, on the other hand, pays increasing dividends the longer you invest in it.


The role search engines play in our lives will only continue to grow as people become more connected and expect information to always be at their fingertips. For all these reasons, it is imperative that companies invest in the best optimization practices possible if they want to continue reaching prospective customers in an increasingly digital world.

Due to the long-term impact of SEO, the best time to start optimizing your website was probably months or years ago. The second best time, however, is now.

Any small-to-medium-sized business owner or operator is all too aware that it often feels like the odds are stacked against them – especially when it comes to competing with larger companies on Google. 

It’s something Google rarely addresses outright, but it seems clear that big companies have several advantages which can make it hard to compete. This is why one person decided to ask Google Search Advocate John Mueller about the situation during a recent office-hours hangout chat.

As Mueller acknowledges, Google is well aware that big brands often enjoy natural competitive advantages. But he also had some advice for smaller brands trying to rank against massive ones: big sites face their own problems and limitations, which can give you a chance to gain the upper hand.

John Mueller’s Advice For Small Companies On Google

The original question posed to Mueller included two parts, but it was the second half that the Search Advocate decided to focus on. Specifically, he was asked:

“Do smaller organizations have a chance in competing with larger companies?”

From the outset, he says it’s a bit of a broader “philosophical” question, but he does his best to show how smaller companies have consistently been able to turn the tables on larger brands. For example, Mueller points to how many larger companies were so invested in Macromedia Flash that they stuck with it long after it became clear it was not helping their SEO. Meanwhile, smaller sites often knew better and were able to use this against their competition.

“One of the things that I’ve noticed over time is that in the beginning, a lot of large companies were, essentially, incompetent with regards to the web and they made terrible websites.

And their visibility in the search results was really bad.

And it was easy for small websites to get in and kind of like say, well, here’s my small website or my small bookstore, and suddenly your content is visible to a large amount of users.

And you can have that success moment early on.

But over time, as large companies also see the value of search and of the web overall, they’ve grown their websites.

They have really competent teams, they work really hard on making a fantastic web experience.

And that kind of means for smaller companies that it’s a lot harder to gain a foothold there, especially if there is a very competitive existing market out there.

And it’s less about large companies or small companies.

It’s really more about the competitive environment in general.”

While it is true that it can seem very difficult to compete with the seemingly unlimited resources of bigger brands, history has shown time and time again that bigger brands face their own challenges. 

As Mueller concludes:

“As a small company, you should probably focus more on your strengths and the weaknesses of the competitors and try to find an angle where you can shine, where other people don’t have the ability to shine as well.

Which could be specific kinds of content, or specific audiences or anything along those lines.

Kind of like how you would do that with a normal, physical business as well.”

In the end, smaller brands competing with big ones are much like David facing down Goliath; if they know how to use their strengths and talents to their advantage, they can overcome seemingly unbeatable challengers.

You can watch Mueller’s answer in the video below, starting around 38:14.

Most people these days understand the general idea of how search engines work. Search engines like Google send out automated bots to scan or “crawl” all the pages on a website, before using their algorithms to sort through which sites are best for specific search queries. 

What few outside Google knew until recently is that the search engine uses two different methods to crawl websites – one that specifically seeks out new content and another that revisits content already in its search index.

Google Search Advocate John Mueller revealed this recently during one of his regular Search Central SEO office-hours chats on January 7th.

During this session, an SEO professional asked Mueller about the behavior he had observed from Googlebot crawling his website.

Specifically, the user says Googlebot crawled his site daily back when he was publishing content frequently. Since publishing has slowed, he has noticed Googlebot crawling his website less often.

As it turns out, Mueller says this is quite normal and is the result of how Google approaches crawling web pages.

How Google Crawls New vs. Old Content

Mueller acknowledges there are several factors that contribute to how often Google crawls different pages on a website – including what type of pages they are, how new they are, and how Google understands your site:

“It’s not so much that we crawl a website, but we crawl individual pages of a website. And when it comes to crawling, we have two types of crawling roughly.

One is a discovery crawl where we try to discover new pages on your website. And the other is a refresh crawl where we update existing pages that we know about.”

These different types of crawling target different types of pages, so it is reasonable that they also occur more or less frequently depending on the type of content.

“So for the most part, for example, we would refresh crawl the homepage, I don’t know, once a day, or every couple of hours, or something like that.

And if we find new links on their home page then we’ll go off and crawl those with the discovery crawl as well. And because of that you will always see a mix of discover and refresh happening with regard to crawling. And you’ll see some baseline of crawling happening every day.

But if we recognize that individual pages change very rarely, then we realize we don’t have to crawl them all the time.”

The takeaway here is that Google adapts to your site according to your own publishing habits. Which type of crawling it is using or how frequently it is happening are not inherently good or bad indicators of your website’s health, and your focus should be (as always) on providing the smoothest online sales experience for your customers. 

Nonetheless, it is interesting to know that Google has made this adjustment to how it crawls content across the web and to speculate about how this might affect its ranking process.

To hear Mueller’s full response (including more details about why Google crawls some sites more often than others), check out the video below:

Google has confirmed that it is sometimes replacing page titles in search results with other copy it finds more relevant. As Danny Sullivan, public liaison for Google Search, explains:

“Last week, we introduced a new system of generating titles for web pages. Before this, titles might change based on the query issued. This generally will no longer happen with our new system. This is because we think our new system is producing titles that work better for documents overall, to describe what they are about, regardless of the particular query.”

In plain English, this means that Google is rewriting the title tags accompanying web pages in some search results – often replacing them with other text from your page. This is not the first time Google has adjusted how title tags are shown in search results, but it is definitely the most extensive rewriting the search engine has done.

According to Sullivan, the goal of this is to highlight the most relevant content for users and focus on content that users can “visually see”: 

“Also, while we’ve gone beyond HTML text to create titles for over a decade, our new system is making even more use of such text. In particular, we are making use of text that humans can visually see when they arrive at a web page. We consider the main visual title or headline shown on a page, content that site owners often place within <H1> tags, within other header tags, or which is made large and prominent through the use of style treatments.”

Does This Mean HTML Title Tags Don’t Matter?

If Google is going to replace the tags put on pages anyway, why should we even bother? For a few reasons.

Firstly, the title tags will still provide their traditional SEO value by helping the search engine understand your page.

Secondly, Google is not rewriting the majority of search result titles. According to Sullivan, Google will show the original HTML title tags in more than 80% of cases. The system will only revise title tags it believes are too long, stuffed with irrelevant keywords, or generic boilerplate.

“In some cases, we may add site names where that is seen as helpful. In other instances, when encountering an extremely long title, we might select the most relevant portion rather than starting at the beginning and truncating more useful parts.”

What This Means For You

Since there is no way of opting out of this system, there is nothing for brands to change moving forward. 

The biggest changes from this will instead be in reporting, where some pages may see increased or decreased click-through rates due to changed titles in search results. 

For more, read the full statement from Google and Danny Sullivan here.