Google has confirmed it is rolling out its latest broad core algorithm update, signifying yet another potential shake-up for the search engine’s results.
Google’s broad core algorithm updates are among the most significant changes made to the search engine, compared to the smaller updates that happen multiple times a day. They can affect rankings on search engine results pages (SERPs) across Google’s entire platform.
As is usual with Google, the search company is being tight-lipped about specific details, going only so far as to confirm the latest update. The update is also expected to take multiple weeks for its full impact to become apparent.
With this in mind, it is wise for brands to take note and monitor their own search performance in the coming weeks.
What Can You Do?
Aside from always striving to provide the best online experience possible with your website, there are a few specific steps you can take to safeguard your site from updates like these:
Monitor site performance regularly to identify early signs of issues with your site
Create content geared to your audience’s needs and interests
Optimize your site’s performance (including speed, mobile-friendliness, and user experience) to ensure your site isn’t off-putting to visitors
Google Confirms March Broad Core Algorithm Update Is Rolling Out (Taylor Ball, March 16, 2023)
If you’re still unclear on how Google thinks about marketing agencies that offer negative SEO linkbuilding services or link disavowal services, the latest comments from John Mueller should help clarify the company’s stance.
In a conversation that popped up on Twitter between Mueller and several marketing experts, Mueller clearly and definitively slammed companies offering these types of services by saying that they are “just making stuff up and cashing in from those who don’t know better.”
This is particularly notable, as some have accused Google of being unclear about its handling of link disavowal using its tools.
The post that started it all came from Twitter user @RyanJones who said, “I’m still shocked at how many seos regularly disavow links. Why? Unless you spammed them or have a manual action you’re probably doing more harm than good.”
In response, one user began talking about negative SEO which caught the attention of Mueller. The user mentioned that “agencies know what kind of links hurt the website because they have been doing this for a long time. It’s only hard to down for very trusted sites. Even some agencies provide a money back guarantee as well. They will provide you examples as well with proper insights.”
In response, Mueller gave what is possibly his clearest statement on this type of “service” yet:
“That’s all made up & irrelevant. These agencies (both those creating, and those disavowing) are just making stuff up, and cashing in from those who don’t know better.”
Instead of spending time and effort on any of this, Mueller instead recommended something simple:
“Don’t waste your time on it; do things that build up your site instead.”
John Mueller Makes It Clear How Google Feels About Negative SEO and Inappropriate Link Disavowal (Taylor Ball, February 2, 2023)
Google is encouraging brands to ensure content is properly dated in search engines by using multiple date indicators on each page.
The recommendation came in the wake of an issue with Google News where the wrong dates were being shown.
In his response, Google’s Search Liaison, Danny Sullivan, emphasized that while many factors may have contributed to this specific situation, the lack of proper date signals made it difficult to show correct info in the search results.
“That page is a particular challenge since the main story lacks a visible date (it only has a time), and the page contains multiple stories which do contain full dates. Our guidance warns about this.”
To prevent situations like this from arising, Sullivan says it is important to use several signals to clarify the date content was published:
“Understand that ideally, the meta data alone would seem to some to be enough, and we’ll keep working to improve. But there are good reasons why we like multiple date signals present.”
Why Does This Matter?
It may not seem like a big deal for the wrong date to occasionally be shown with content in the search results. However, these errors can undermine your authority, create confusion, and lead to a poor user experience. All of these can decrease page performance and even result in demotions in Google’s search results.
On the other hand, situations like this also highlight the need for Google to deliver more consistent ways to signal a page’s publishing date.
“Google doesn’t depend on a single date factor because all factors can be prone to issues. That’s why our systems look at several factors to determine our best estimate of when a page was published or significantly updated.”
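As a rough illustration of pairing multiple date signals, the snippet below emits both a human-readable byline and schema.org structured data for the same timestamp. The helper function, byline class, and article details are made up for this sketch; they are not code from Google or from the article above:

```python
import json
from datetime import datetime, timezone

def date_signals(published: datetime) -> str:
    """Build two of the date signals Sullivan recommends pairing:
    a visible date near the headline and schema.org structured data."""
    human = published.strftime("%B %d, %Y")
    structured = {
        "@context": "https://schema.org",
        "@type": "NewsArticle",
        # Full ISO-8601 timestamp, including the time zone.
        "datePublished": published.isoformat(),
    }
    return "\n".join([
        # 1. A visible date readers (and Google) can see on the page.
        f'<p class="byline">Published {human}</p>',
        # 2. Machine-readable structured data with the same date.
        f'<script type="application/ld+json">{json.dumps(structured)}</script>',
    ])

print(date_signals(datetime(2023, 1, 19, 20, 34, tzinfo=timezone.utc)))
```

The key point is that both signals come from one source value, so the visible date and the structured data can never disagree.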
Google Says To Use Multiple Signals To Get Dates Right In Search Results (Taylor Ball, January 19, 2023)
Today, Google revealed it is preparing a massive update called the Helpful Content Update that may be the biggest change to the search engine’s algorithm in years.
The update aims to filter out sites with large amounts of content written solely for the search engine, without providing value to actual users.
Or, as Google simply put it in its announcement:
“The helpful content update aims to better reward content where visitors feel they’ve had a satisfying experience, while content that doesn’t meet a visitor’s expectations won’t perform as well.”
Here’s what we know about the update so far:
What Is The Google Helpful Content Update?
Philosophically, little about the helpful content update is different from what Google has been working toward in the past. The algorithm update aims to help users find high-quality content that is genuinely helpful. What sets it apart is how it aims to achieve this.
In this instance, Google plans to improve search results by targeting and removing what could be called “search engine-first content” or content written expressly for the purpose of boosting rankings without actually delivering quality content to readers.
While the algorithm will be applied to all Google search results when it rolls out, the company said four specific types of sites are most likely to be affected:
Online educational materials
Arts & entertainment
Shopping
Tech-related content
Content in these niches seems most prone to being written specifically for search engines rather than humans, and Google hopes to improve the quality of results in these areas.
“If you search for information about a new movie, you might have previously encountered articles that aggregated reviews from other sites without adding perspectives beyond what’s available elsewhere on the web. This isn’t very helpful if you’re expecting to read something new. With this update, you’ll see more results with unique information, so you’re more likely to read something you haven’t seen before.”
Is Your Site Safe?
Rather than provide a simple checklist of things companies can do to prepare their website, Google offered a series of questions that can be used to determine if you’re creating content for humans or search engines:
Do you have an existing or intended audience for your business or site that would find the content useful if they came directly to you?
Does your content clearly demonstrate first-hand expertise and a depth of knowledge (for example, expertise that comes from having actually used a product or service, or visiting a place)?
Does your site have a primary purpose or focus?
After reading your content, will someone leave feeling they’ve learned enough about a topic to help achieve their goal?
Will someone reading your content leave feeling like they’ve had a satisfying experience?
Are you keeping in mind our guidance for core updates and for product reviews?
Additionally, the Google Search Central article provided a similar list of questions you can use to avoid search-engine first content in the future:
Is the content primarily to attract people from search engines, rather than made for humans?
Are you producing lots of content on different topics in hopes that some of it might perform well in search results?
Are you using extensive automation to produce content on many topics?
Are you mainly summarizing what others have to say without adding much value?
Are you writing about things simply because they seem trending and not because you’d write about them otherwise for your existing audience?
Does your content leave readers feeling like they need to search again to get better information from other sources?
Are you writing to a particular word count because you’ve heard or read that Google has a preferred word count? (No, we don’t).
Did you decide to enter some niche topic area without any real expertise, but instead mainly because you thought you’d get search traffic?
Does your content promise to answer a question that actually has no answer, such as suggesting there’s a release date for a product, movie, or TV show when one isn’t confirmed?
When Will It Arrive?
The helpful content update is due to roll out next week to all English-language search results in the U.S. The company plans to expand the update to other languages and countries sometime in the future.
In an update to the help documentation for Googlebot, the search engine’s crawling tool, Google explained that it will only crawl the first 15 MB of any webpage. Anything after this initial 15 MB will not influence your webpage’s rankings.
As the Googlebot help document states:
“After the first 15 MB of the file, Googlebot stops crawling and only considers the first 15 MB of the file for indexing.
The file size limit is applied on the uncompressed data.”
Though this may initially raise concerns, since images and videos can easily exceed this size, the help document makes clear that media and other separately fetched resources are typically exempt from this Googlebot limit.
What This Means For Your Website
If you’ve been following the most commonly used best practices for web design and content management, this should leave your website largely unaffected. Specifically, the best practices you should be following include:
Keeping the most relevant SEO-related information relatively close to the start of any HTML file.
Referencing images and videos as separate files rather than encoding them directly into the HTML when possible.
Keeping HTML files small – typically less than 100 KB.
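The practices above can be sanity-checked with a short script. This is an illustrative sketch only: the helper name is made up, and the 100 KB budget and "near the start" window are the rule-of-thumb figures from this article, not limits Google enforces:

```python
def check_html_budget(html: str, size_budget: int = 100 * 1024,
                      head_window: int = 15 * 1024) -> dict:
    """Rough checks inspired by the best practices above:
    - the total HTML stays small (here: under 100 KB),
    - key SEO tags appear near the start of the file."""
    raw = html.encode("utf-8")
    # Only inspect the opening slice, mimicking a crawler that
    # prioritizes what appears early in the document.
    head = raw[:head_window].decode("utf-8", errors="ignore")
    return {
        "size_bytes": len(raw),
        "under_budget": len(raw) <= size_budget,
        "title_near_start": "<title>" in head,
        "description_near_start": 'name="description"' in head,
    }

page = '<html><head><title>Widgets</title><meta name="description" content="Widget shop"></head><body>Hello</body></html>'
print(check_html_budget(page))
```

A real audit would fetch live pages and parse them properly, but even this simple check catches pages that bury their `<title>` behind megabytes of inlined assets.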
Despite Google being very clear about its feelings on paying for SEO links (hint: it is not a fan), I still regularly come across stories of brands spending hundreds or even thousands of dollars on links that promise to increase their rankings.
Typically, these individuals have heard success stories from others who had recently bought a ton of SEO backlinks and saw their own site jump to the top of search results. Unfortunately, this is rarely the end of the story.
Today, I wanted to highlight a more complete example of what happens when you pay for links and why.
The Full Story of Someone Who Spent $5,000 on SEO Links
In this instance, I came across someone who had spent thousands of dollars on links for SEO purposes through Search Engine Journal’s “Ask an SEO” column. In the most recent edition of this weekly article, a person named Marlin lays out their situation.
“I paid over $5,000 for SEO link building.”
From the outset, it is unclear if Marlin knew exactly what they had gotten into. While it is possible they directly purchased links from a website, there is also the potential that Marlin and their company put their trust in a questionable marketing agency that purchased or generated spammy links to “boost” rankings.
This is important because it is very common for online SEO packages to include “link building services” which are actually accomplished through link farms that will inevitably be identified and shut down. This is why it is crucial to know that the people handling your link-building efforts use proven, Google-approved strategies rather than cutting corners.
“At first, traffic was boosted.”
As promised, the initial result of buying links is frequently a quick spike in your search engine rankings. Even better, this payoff seems to come much more quickly than the rankings boosts seen from traditional link-building efforts. In some cases, you might even get a huge boost to your rankings within a week or two of paying for the service!
However, the story isn’t over.
“We then lost our rankings on those keywords and our traffic is gone!”
Despite the initially promising results, this is the inevitable conclusion of every story about paying for links.
In the best-case scenario, Google simply ignores your newly acquired low-quality links – putting you right back where you started. In some cases, depending on how widespread the link scheme appears to be, you can wind up even worse than when you began.
If Google believes you have a persistent habit of trying to manipulate search rankings, your site may receive a penalty that significantly impairs your rankings. In the worst cases, your site can be removed from search results entirely.
Why Paid Links Inevitably Fail
There is a very simple reason this story followed a predictable pattern. Google explicitly forbids any sort of “unnatural links” or link schemes. Additionally, the search engine has invested huge amounts of time and resources to identify these artificial links.
At the same time, Google is locked into a game of whack-a-mole where new link sellers are popping up all the time – which is why their links may help your rankings for a very short time.
In SEO, shortcuts are rarely as great as they appear. If you’re looking for long-term, sustainable success, the only option is to roll up your sleeves and build links the old-fashioned way: by creating great content and building real relationships with other members of your industry.
It won’t be quick and it won’t be easy, but it will be worth it in the long run.
What Happens When You Pay For SEO Links? (Taylor Ball, May 3, 2022)
Product pages may receive a temporary reduction in their visibility in Google search results if the product is listed as out of stock, according to Google’s Search Advocate John Mueller during the most recent Google Search Central SEO Office Hours session.
Surprisingly, though, this is not always the case.
As Mueller answered questions about how product stock affects rankings, he explained that Google has a few ways of handling out-of-stock product pages.
How Google Handles Out-of-Stock Products
Mueller says that, in most cases, Google treats out-of-stock listings like a soft 404 or an unavailable page:
“Out of stock – it’s possible. That’s kind of simplified like that. I think there are multiple things that come into play when it comes to products themselves in that they can be shown as a normal search result.
They can also be shown as an organic shopping result as well. If something is out of stock, I believe the organic shopping result might not be shown – I’m not 100% sure.
And when it comes to the normal search results, it can happen that when we see that something is out of stock, we will assume it’s more like a soft 404 error, where we will drop that URL from the search results as well.
Theoretically, it could affect the visibility in search if something goes out of stock.”
In some situations, though, Google will essentially override this decision and continue to show a page if it is considered particularly relevant for users.
For example, if the product page also includes helpful information about the product in general, it may still be worth keeping in search results despite the lack of stock.
As Mueller explains:
“It doesn’t have to be the case. In particular, if you have a lot of information about that product anyway on those pages, then that page can still be quite relevant for people who are searching for a specific product. So it’s not necessarily that something goes out of stock, and that page disappears from search.”
Out-of-Stock Products Don’t Hurt Your Entire Site
While it is true that listing one product as unavailable can keep that specific page from appearing in search results, Mueller is quick to reassure site owners that this should not impact the rest of the website:
“The other thing that’s also important to note here is that even if one product goes out of stock, the rest of the site’s rankings are not affected by that.
So even if we were to drop that one specific product because we think it’s more like a soft 404 page, then people searching for other products on the site, we would still show those normally. It’s not that there would be any kind of negative effect that swaps over into the other parts of the site.”
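Rather than leaving Google to infer a soft 404, a store can state stock status explicitly with schema.org `Offer` markup, which defines `InStock` and `OutOfStock` values for its `availability` property. The product name and price below are invented for illustration:

```python
import json

def offer_jsonld(name: str, price: str, in_stock: bool) -> str:
    """Build schema.org Product/Offer structured data whose
    `availability` value reflects the current stock status."""
    availability = ("https://schema.org/InStock" if in_stock
                    else "https://schema.org/OutOfStock")
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "offers": {
            "@type": "Offer",
            "price": price,
            "priceCurrency": "USD",
            "availability": availability,
        },
    }
    return json.dumps(data, indent=2)

# A sold-out product keeps its page (and its helpful content) while
# signaling the temporary lack of stock.
print(offer_jsonld("Example Widget", "19.99", in_stock=False))
```

When the product is restocked, regenerating the markup with `in_stock=True` flips the signal back without any other page changes.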
You can watch the entire discussion with Google’s John Mueller in a recording of the SEO Office Hours session below:
Most people these days understand the general idea of how search engines work. Search engines like Google send out automated bots to scan or “crawl” all the pages on a website, before using their algorithms to sort through which sites are best for specific search queries.
What few outside Google knew until recently was that the search engine has begun using two different methods to crawl websites: one that specifically seeks out new content and another that revisits content already in its search index.
Google Search Advocate John Mueller revealed this recently during one of his regular Search Central SEO office-hours chats on January 7th.
During this session, an SEO professional asked Mueller about the behavior he has observed from Googlebot crawling his website.
Specifically, the user said Googlebot previously crawled his site daily when it was frequently publishing content. Since publishing has slowed on the site, he has seen Googlebot crawling it less often.
As it turns out, Mueller says this is quite normal and is the result of how Google approaches crawling web pages.
How Google Crawls New vs. Old Content
Mueller acknowledges there are several factors that can contribute to how often Google crawls different pages on a website, including what type of pages they are, how new they are, and how Google understands your site.
“It’s not so much that we crawl a website, but we crawl individual pages of a website. And when it comes to crawling, we have two types of crawling roughly.
One is a discovery crawl where we try to discover new pages on your website. And the other is a refresh crawl where we update existing pages that we know about.”
These different types of crawling target different types of pages, so it is reasonable that they also occur more or less frequently depending on the type of content.
“So for the most part, for example, we would refresh crawl the homepage, I don’t know, once a day, or every couple of hours, or something like that.
And if we find new links on their home page then we’ll go off and crawl those with the discovery crawl as well. And because of that you will always see a mix of discover and refresh happening with regard to crawling. And you’ll see some baseline of crawling happening every day.
But if we recognize that individual pages change very rarely, then we realize we don’t have to crawl them all the time.”
The takeaway here is that Google adapts to your site according to your publishing habits. Neither the type of crawling being used nor how frequently it happens is an inherently good or bad indicator of your website’s health, and your focus should remain (as always) on providing the smoothest online experience for your customers.
Nonetheless, it is interesting to know that Google has made this adjustment to how it crawls content across the web and to speculate about how this might affect its ranking process.
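The behavior that prompted the question (crawl frequency dropping as publishing slows) can be modeled as a simple backoff loop. This is a toy illustration of the idea, not Google's actual scheduler; the doubling/halving policy and the bounds are assumptions:

```python
def next_refresh_interval(prev_interval_s: float, changed: bool,
                          min_s: float = 3600.0,
                          max_s: float = 30 * 86400.0) -> float:
    """Toy model of adaptive refresh crawling: revisit sooner when the
    page changed since the last crawl, back off when it did not.
    Bounds keep the interval between one hour and thirty days."""
    if changed:
        # Active page: check twice as often (down to the minimum).
        return max(min_s, prev_interval_s / 2)
    # Static page: check half as often (up to the maximum).
    return min(max_s, prev_interval_s * 2)

# A homepage that keeps changing converges toward hourly checks;
# a dormant archive page drifts toward the monthly ceiling.
print(next_refresh_interval(7200, changed=True))
print(next_refresh_interval(7200, changed=False))
```

Under this model, a site that stops publishing sees its refresh intervals stretch out exactly as the user described, without any penalty being involved.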
To hear Mueller’s full response (including more details about why Google crawls some sites more often than others), check out the video below:
Google Reveals It Has Two Ways To Crawl Web Pages (Taylor Ball, January 11, 2022)
If your site is offline for more than a couple of days you could be at risk of having your pages deindexed, according to Google Search Advocate John Mueller.
It should go without saying that the less downtime your website experiences, the better. Still, some downtime is unavoidable thanks to maintenance, updates, redesigns, and other issues which can be entirely out of your hands.
This inevitably raises the question of exactly how long is too long for your site to be offline. At what point does this begin to hurt your rankings?
After years of debate, we finally have an official answer from Google courtesy of John Mueller during the most recent Google Search Central SEO office hours session.
How Long is Too Long to Be Offline?
The topic arose when an SEO specialist named Aakash Singh asked Mueller what can be done to minimize the loss of rankings or search performance while his client’s website undergoes an expected week of downtime.
The bad news is that a week is simply too long for a site to be offline without negative side effects. In fact, Mueller says sites can start having pages deindexed after being down for just a few days.
John Mueller On How Site Downtime Impacts Rankings
Beginning his response, Mueller explains how Google “sees” sites that are experiencing downtime.
“For an outage of maybe a day or so, using a 503 result code is a great way to tell us that we should check back. But after a couple of days we think this is a permanent result code, and we think your pages are just gone, and we will drop them from the index.”
“And when the pages come back we will crawl them again and we will try to index them again. But it’s essentially during that time we will probably drop a lot of the pages from the website from our index, and there’s a pretty good chance that it’ll come back in a similar way but it’s not always guaranteed.”
The general message is that sites should minimize downtime, even when using the proper redirects or status codes.
Mueller does leave us with a suggestion for avoiding the worst fallout from downtime, but he still emphasizes the importance of getting a site back up as quickly as possible:
“… that could be something like setting up a static version of the website somewhere and just showing that to users for the time being. But especially if you’re doing this in a planned way I would try to find ways to reduce the outage to less than a day if at all possible.”
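Mueller's 503 advice can be sketched with Python's built-in `http.server`. This is an illustrative outline, not a production setup: the maintenance flag, the one-hour `Retry-After` hint, and the placeholder page are all assumptions for the example:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

MAINTENANCE = True  # flip to False once the site is back up

def maintenance_response():
    """Status and headers for planned downtime: 503 tells crawlers the
    outage is temporary; Retry-After hints when to check back (seconds)."""
    return 503, {"Retry-After": "3600", "Content-Type": "text/html"}

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if MAINTENANCE:
            status, headers = maintenance_response()
            self.send_response(status)
            for name, value in headers.items():
                self.send_header(name, value)
            self.end_headers()
            self.wfile.write(b"<h1>Down for maintenance</h1>")
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"<h1>Welcome back</h1>")

# To serve locally (blocking call):
# HTTPServer(("", 8080), MaintenanceHandler).serve_forever()
```

Per Mueller's comments, this only buys you about a day; past that point Google may start treating the 503 as permanent, so the goal is still to keep the outage as short as possible.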
To hear Mueller’s full explanation, check out the recording from the December 10th SEO office hours session below:
Google May Start Deindexing Your Website If It Goes Offline For More Than a Day (Taylor Ball, December 14, 2021)
A few weeks ago, Google teased that it planned to refine its PageSpeed Insights tool to make data “more intuitive” and easier to understand. Now, that update has arrived.
What Is The PageSpeed Insights Tool?
If you’re unfamiliar, the PageSpeed Insights tool from Google evaluates your web pages to provide suggestions to improve how quickly content loads.
The tool has been around in various forms since 2013, when it was a simple API webmasters could use to test their page speeds. Version 5, the most recent major update, arrived in 2018, though smaller updates like this week’s happen somewhat regularly.
The biggest focus of the new update is a change to the user interface to be more intuitive by “clearly differentiating between data derived from a synthetic environment and data collected from users in the field.”
To do this, Google has added dedicated sections for each type of data.
Where the tool used to include a label specifying which type of data you were viewing, Google has instead added information about what the data means for you and how it may be used to improve your performance.
Additionally, Google has shifted its emphasis to data collected from real users by moving field data to the top.
The Core Web Vitals assessment has also been expanded, with a label showing if your site has passed a Core Web Vitals assessment in the field and in-depth metrics from simulated environments.
Importantly, the PageSpeed Insights tool also includes details at the bottom of the page specifying how the data was collected in the field. This information includes:
Data collection period
Lastly, Google has removed the previously included screenshot of the page as it indexed your content, replacing it with a series of images displaying the full loading sequence.
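The same data shown in the web tool is available programmatically through the PageSpeed Insights v5 API, which is handy for the regular performance monitoring discussed above. The sketch below only builds the request URL; fetching and parsing the JSON response is left as a comment, and an API key is only needed at higher request volumes:

```python
from urllib.parse import urlencode

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url: str, strategy: str = "mobile") -> str:
    """Build a PageSpeed Insights v5 API request URL.
    `strategy` selects the simulated device: "mobile" or "desktop"."""
    return f"{PSI_ENDPOINT}?{urlencode({'url': page_url, 'strategy': strategy})}"

print(psi_request_url("https://example.com/"))
# Fetch with urllib.request.urlopen(...) and parse the JSON response;
# lab data is always returned, while field data appears only when
# Google has enough real-user samples for the page.
```

Scheduling a call like this for key pages gives you a simple baseline to spot regressions between manual checks of the web tool.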
Improved PageSpeed Insights Arrive At Google (Taylor Ball, November 16, 2021)