After nearly two weeks, Google has confirmed that its latest core update has finished rolling out. This means brands and sites that may have seen ranking volatility since late March can now begin assessing the impact.

After being officially announced on March 27, the March 2026 core update finished rolling out after 12 days and 4 hours, just short of Google’s initial estimate that the rollout could take up to two weeks to complete.

At the time that it started rolling out, the search engine said the update was “a regular update designed to better surface relevant, satisfying content for searchers from all types of sites.”

As is typical with core updates, there was no warning or further information about what to expect from the rollout. This left many site owners on the edge of their seats, anxious to see whether the update would have a sizable impact on their site performance.

Google notified the public that the update was completed via its Search Status Dashboard yesterday, April 8th, 2026. This means that sites can start accurately measuring the effect of the update and planning for future optimization. 

Were You Impacted By The Core Update?

If you were significantly impacted by the update, you should prepare for a lengthy recovery. Fallout from core updates is often long-lasting, with only gradual improvements until another major core update is released. That said, there are still steps you can take to start recovering now if the core update has led to a drop in site performance. 

In particular, Google recommends assessing your website content to ensure you are delivering quality content that provides value to real internet users, while avoiding clickbait or spammy over-optimization practices. To help with this, Google encourages those affected to review its guidelines for people-first content.

At the same time, there may not be anything specific to fix. Google has consistently emphasized that a drop in rankings after a core update may not indicate there is anything major wrong with your site. 

As the company said as far back as 2019, “We know those with sites that experience drops will be looking for a fix, and we want to ensure they don’t try to fix the wrong things. Moreover, there might not be anything to fix at all.”

In many cases, the only way to know if you should make changes to your marketing strategies is to conduct in-depth analysis and do an honest self-assessment of your content creation approach.

Google has confirmed it has started a “small and narrow test” using AI to rewrite site headlines in search, without any notification to users or website managers.

The confirmation raised eyebrows, as it used strikingly similar language to what the search company used last year when it confirmed it was rewriting headlines in its Discover feed, a month before making it an official feature.

Google Confirms Rewriting Headlines With AI

According to a report from The Verge, Google has been rewriting headlines in search for several months. Notably, many of the rewrites produced misleading or outright unrelated headlines. For example, researchers noted an instance where the headline “I used the ‘cheat on everything’ AI tool and it didn’t help me cheat on anything” was shortened to the less-descriptive “‘Cheat on everything’ AI tool.” In another case, it gave an article the headline “Copilot Changes: Marketing Teams at it Again” despite that language never being used in the article. 

Sean Hollister described the practice as similar to “a bookstore ripping the covers off the books it puts on display and changing the titles.”

While the AI rewrites seemed to be used most frequently on news sites, The Verge confirmed Google has changed headlines on other types of websites as well. 

None of these changes came with any notice or disclosure that the headline users were seeing differed from the original.

In a statement, Google said it aims to use AI to “identify content on a page that would be a useful and relevant title to a users’ query” and to improve “matching titles to users’ queries and facilitating engagement with web content.”

A Repeating Pattern

While it is not uncommon for Google to test features like this on a limited number of sites, Matt Southern from Search Engine Journal noted that Google’s confirmation to The Verge was eerily similar to how the company addressed using AI to rewrite headlines in Discover. 

In December of last year, the company acknowledged it was using AI-generated headlines in a “small UI experiment for a subset of Discover users.”

By January, the company announced this was officially a feature for articles appearing in Discover. 

Differences With How Google Previously Rewrote Titles

This is not the first time Google has changed headlines appearing in search. In fact, one study found that more than three-quarters of title tags were changed when they appeared in search results. 

However, the new test is unique for the way it is using AI to generate entirely new headlines. In the past, Google would rewrite titles and headlines by pulling from content directly on the related page. 

With this new test, Google is moving to create titles entirely from scratch without necessarily using phrasing on webpages, risking creating misleading or unrelated headlines. 

This could cause major issues for some publishers, as users will get frustrated and mistrust sites they believe are using misleading or “clickbait” headlines. 

For more, read the full article from The Verge (requires a subscription) or Matt Southern’s coverage from Search Engine Journal.

A new analysis indicates that AI tools are now generating enough sessions to be equivalent to more than half of search engine volume – highlighting the surge in artificial intelligence usage in recent times. 

According to data from Graphite.io CEO Ethan Smith, AI tools drive more than 45 billion monthly sessions worldwide, equivalent to roughly 56% of search engine volume. 

The study analyzed usage from both web traffic and mobile apps going to major artificial intelligence tools including ChatGPT, Gemini, Perplexity, Grok, and Claude.

While desktop usage accounted for a significant amount of LLM usage, the report indicates that mobile tools have been the driving force behind the major rise in AI usage.

Notably, the study found that the increased usage of AI didn’t necessarily come at the cost of traditional search engine usage. Instead, the combined use of online search and LLM tools rose 26% since 2023. 

What The Report Says

The report specifically reviewed usage of the five largest AI tools available (ChatGPT, Gemini, Perplexity, Grok, and Claude), and compared them against the largest search engines. 

It found that all the artificial intelligence platforms combined generated approximately 45 billion monthly sessions worldwide, including 5.4 billion sessions in the U.S. each month. 

Of all AI usage, 83% came from mobile apps. In the U.S., mobile apps drove 75% of artificial intelligence use. 

While other AI platforms have shown increased usage in recent months, the study shows that ChatGPT still leads the pack by a wide margin, driving 89% of global LLM usage. 

These findings are notable because this is one of the only studies to compare online search with LLM usage across both desktop sessions and mobile apps. This leads the report to suggest that most comparisons between AI use and online search usage underestimate AI use by 4-5x. 

At the same time, the findings suggest that artificial intelligence and digital search are not necessarily in competition with each other. While search may be losing some use to LLMs, Smith argues that the rise in overall usage indicates both search and artificial intelligence may be essential for users. Rather than AI superseding search, brands need to invest in both to maintain visibility.

For more, read the full report here.

While a growing number of U.S. consumers are using TikTok for search, a new survey suggests the popular social network may not be as strong of a challenger to Google as previously believed. 

A new survey from Adobe Express found that the number of people using TikTok for search has grown compared to a 2024 survey from the same company. However, the study found that fewer young users say they prefer TikTok to Google’s search engine. 

The Study

The report comes from an Adobe Express survey published earlier this month and conducted in January 2026. It surveyed over 800 consumers and 200 small businesses in the U.S. about their search habits across various platforms including Google, TikTok, and ChatGPT. 

It found that 49% of consumers report using TikTok as a search engine, an 8 point increase from 2024. However, the most notable findings were among Gen Z users. 

Gen Z and TikTok as a Search Engine

While much has been made about the number of Gen Z users favoring TikTok over Google, the study shows that number is actually falling. 

Among Gen Z users who were surveyed, those who said they were more likely to turn to TikTok for a search over Google fell from 8% in 2024 to 4% in 2026. 

This isn’t to say Gen Z is using TikTok less for search, though. In fact, 65% of Gen Z users said they use TikTok as a search engine, and 25% said they found it effective for finding information. It’s just that they don’t necessarily prefer TikTok’s search tools over Google’s. 

Instead, it seems that Gen Z is adopting a multiplatform approach to search – using the platform they feel is best or most convenient for specific searches. 

ChatGPT Shows Growth as a Search Engine

While the number of people who prefer TikTok for search over Google fell, the survey suggests more users of every age group are turning to ChatGPT for search over Google. 

According to the survey, 14% of users say they are more likely to use ChatGPT for search than Google. This held true across age groups, with 12% of Gen Z, 15% of millennials, 15% of Gen X, and 14% of baby boomers saying the same.

What This Means

When a significant number of younger users started reporting using TikTok over Google, it caught the notice of many brands and marketers. However, it appears the situation isn’t as simple as “TikTok will be the next big search engine.” Instead, consumers seem to rely on a variety of search platforms, with ChatGPT quickly growing into a significant player in search.

It is unclear whether the sale of TikTok’s U.S. operations or changes to the platform contributed to the decrease in those who favor the platform. 

For brands looking to supplement their search marketing in the face of falling organic search traffic from Google, the answer seems to be investing in multiple platforms and ensuring they are getting picked up by AI tools – especially ChatGPT. That said, it will still likely be quite some time before any single platform dethrones Google as the biggest search engine.

For more, read the full report from Adobe Express here.

Crawling and indexing issues are among the most damaging SEO problems a site can have. Not only do they hurt your rankings, making your business and products less visible in search results, but they can completely prevent pages or entire sections of your site from being properly added to Google’s search index.

Now, two of Google’s most well-known representatives have shed light on the two biggest crawling issues the search engine regularly encounters.

In a recent Search Off the Record podcast, Gary Illyes and Martin Splitt went into detail on the biggest crawling challenges Google faces in 2025, including the two biggest issues Google sees. 

According to Illyes and Splitt, faceted navigation and action parameters account for approximately 75% of crawling issues that Google encounters. 

Both issues create crawling problems that can overload your server, slow your site down significantly, and create infinite crawling loops. If this happens, Google’s crawlers can expend a massive amount of energy and server bandwidth that can bring your website to a screeching halt and even make your site entirely inaccessible in some cases. 

The Two Biggest Crawling Issues 

Gary Illyes says the two biggest issues account for 75% of crawl challenges.

  • Faceted Navigation – This accounts for 50% of issues according to Google. This is a navigation strategy (typically used on e-commerce sites) to allow users to filter and navigate items based on specific details like price, manufacturer, or size. The issue is that this system can generate a seemingly endless number of URL patterns if it creates a URL for every single combination of filters. Without careful management, this can lead to Google crawlers expending crawl budgets on URLs with negligible search value, duplicate content issues, and slower site performance. 
  • Action Parameters – These account for 25% of challenges. These are URL parameters used to trigger or track specific user actions, such as adding an item to a cart or saving an item to a wishlist. Importantly, these sorts of parameters don’t tend to meaningfully change page content, creating widespread duplicate content issues on your site. 
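To illustrate the scale of the faceted navigation problem, here is a quick sketch (the filter names and values are hypothetical, chosen only for illustration) showing how even a handful of filters multiplies into dozens of distinct crawlable URLs:

```python
from itertools import product

# Hypothetical e-commerce filters -- the names and values are illustrative only.
filters = {
    "color": ["red", "blue", "green"],
    "size": ["s", "m", "l", "xl"],
    "brand": ["acme", "globex"],
    "sort": ["price_asc", "price_desc"],
}

# If the site generates a URL for every combination of filter values,
# each combination becomes a separate page for Google to crawl.
urls = [
    "/shoes?" + "&".join(f"{k}={v}" for k, v in zip(filters, combo))
    for combo in product(*filters.values())
]

print(len(urls))  # 3 * 4 * 2 * 2 = 48 URLs from just four small filters
```

Add a price range, a material filter, or pagination, and the count grows multiplicatively, which is why unmanaged faceted navigation can consume an entire crawl budget on near-duplicate pages.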

Additionally, Illyes mentioned a few other, less common crawling challenges:

  • Irrelevant Parameters – These account for 10% of issues. These problems pop up when crawlers notice strings of parameters (typically used to track session ID numbers or UTM parameters) attached to content that Google’s systems deem irrelevant to the actual navigation or page content. 
  • WordPress Plugins or Widgets – Approximately 5% of crawling issues come from WordPress plugins and similar tools on sites using comparable CMSs. In some cases, these widgets may modify URLs for event tracking. Google can struggle to understand when this happens because there is no established system or pattern that these tools follow. 
  • “Weird Stuff” – Lastly, Illyes attributed approximately 2% of problems to rare technical issues that pop up. In the podcast, he cited times when URLs may be double-encoded. This means that when the crawler decodes the URL, it is still left with an unusable encoded string instead of a functional URL. 
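The double-encoding problem Illyes describes is easy to reproduce with Python’s standard library (the path here is just an example):

```python
from urllib.parse import quote, unquote

# A URL path containing a space, percent-encoded once...
once = quote("/docs/my page")   # -> "/docs/my%20page"

# ...then accidentally encoded a second time. The "%" itself gets escaped.
twice = quote(once)             # -> "/docs/my%2520page"

# A crawler that decodes the URL a single time is left holding an
# encoded string rather than a usable path.
print(unquote(twice))           # "/docs/my%20page" -- still encoded
print(unquote(unquote(twice)))  # "/docs/my page"  -- needs a second pass
```

This usually happens when one layer of a site (a plugin, a redirect, a CDN rule) encodes a URL that was already encoded upstream.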

In the discussion, Illyes and Splitt go more into detail about these issues, what causes them to arise, and how to prevent them. For more, listen to the full episode here.

Since AI overviews have taken over search results and reduced the number of clicks going to traditional search results, a debate has emerged within the SEO community. While one half of the community insists that the rise of AI calls for an entirely new approach to website optimization, others have cautioned that there is no need to reinvent the wheel. 

Now, two leading figures from Google have chimed in with a lengthy discussion on AI, how it is impacting search, and whether you need to be doing anything new to improve your visibility in AI overviews. 

Do You Need To Be Doing Anything New To Optimize For AI Search?

In a recent episode of the Search Off The Record podcast, Google’s Danny Sullivan and John Mueller spoke about AI and how it has changed search (and how it hasn’t). While they concede that it feels like search has changed significantly over the past couple years, the rise of artificial intelligence in search doesn’t actually call for any changes to how you optimize your website. 

Instead, they say that AI overviews use many of the same signals traditional search has always used.

To start the discussion, John Mueller asked:

“So everything kind of around AI, or is this really a new thing? It feels like these fads come and go. Is AI a fad? How do you think?”

To which, Danny Sullivan said:

“Oh gosh, my favorite thing is that we should be calling it LMNOPEO because there’s just so many acronyms for it. It’s GEO for generative engine optimization or AEO for answer engine optimization and AIEO. I don’t know. There’s so many different names for it.

I used to write about SEO and search. I did that for like 20 years. And part of me is just so relieved. I don’t have to do that aspect of it anymore to try to keep up with everything that people are wondering about.

And on the other hand, you still have to kind of keep up on it because we still try to explain to people what’s going on. And I think the good news is like, There’s not a lot you actually really need to be worrying about.

It’s understandable. I think people keep having these questions, right? I mean, you see search formats changing, you see all sorts of things happening and you wonder, well, is there something new I should be doing? Totally get that.

And remember, we, John and I and others, we all came together because we had this blog post we did in May, which we’ll drop a link to or we’ll point you to somehow to it, but it was… we were getting asked again and again, well, what should we be doing? What should we be thinking about?

And we all put our heads together and we talked with the engineers and everything else. So we came up with nothing really that different.”

Google’s Systems Prioritize The Best Content For Humans

According to Danny Sullivan, the reason you don’t need new strategies to optimize for overviews is that Google’s AI systems aim to surface the best content for human users – the same as Google’s traditional search systems. 

As Sullivan explained:

“And when it comes to all of our ranking systems, it’s about how are we trying to reward content that we think is great for people, that it was written for human beings in mind, not written for search algorithms, not written for LLMs, not written for LMNO, PEO, whatever you want to call it.

It’s that everything we do and all the things that we tailor and all the things that we try to improve, it’s all about how do we reward content that human beings find satisfying and say, that was what I was looking for, that’s what I needed. So if all of our systems are lining up with that, it’s that thing about you’re going to be ahead of it if you’re already doing that.

To whereas the more you’re trying to… Optimize or GEO or whatever you think it is for a specific kind of system, the more you’re potentially going to get away from the main goal, especially if those systems improve and get better, then you’re kind of having to shift and play a lot of catch up.

So, you know, we’re going to talk about some of that stuff here with the big caveat, we’re only talking about Google, right? That’s who we work for. So we don’t say what, anybody else’s AI search, chat search, whatever you want to kind of deal with and kind of go with it from there. But we’ll talk about how we look at things and how it works.”

Optimizing For Humans Is The Key To Google Success

While Sullivan’s comments are limited to Google’s own AI systems, they make it clear that optimizing for anything or anyone other than the end user is a mistake. Instead, it is crucial to do everything possible to deliver the best content and most value for real human visitors to your site. 

This makes sense in the grand scheme of things. Ultimately, Google’s goal is to provide the best experience for real human users, and websites optimized primarily for artificial systems aren’t likely to win many of them over. If you keep your goals aligned with providing the most value and the best experience for human users, you’re better positioned to be the type of site Google wants to highlight. 

For more from Danny Sullivan’s discussion with John Mueller, listen to the full Search Off The Record episode here.

LLMs.txt files, an increasingly popular method to improve visibility in AI search, may not be as effective as previously thought.

After reviewing over 300,000 domains, a new analysis from SE Ranking found that LLMs.txt files have no noticeable effect on visibility in major LLMs. 

What Are LLMs.txt Files?

LLMs.txt files are files uploaded alongside your website with the intention of helping AI tools understand your domain and more easily navigate or index your site. In recent months, these .txt files have begun to catch on as a relatively easy way to optimize your site for major AI tools.
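For illustration, a minimal llms.txt following the commonly proposed format (a Markdown file served at the site root) might look like the sketch below; the site name and links are hypothetical:

```markdown
# Example Store

> A hypothetical e-commerce site selling handmade goods.

## Docs

- [Shipping policy](https://example.com/shipping.md): delivery windows and costs
- [Product catalog](https://example.com/products.md): the full list of items for sale
```

The idea is to give an LLM a single, curated entry point to a site’s most important pages, rather than leaving it to infer structure from crawling.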

They are similar to robots.txt files, which are widely used to optimize websites for search engines like Google or Bing. 

While robots.txt files are well-established as an effective and simple SEO tool, it has been unclear whether any major LLM has begun using LLMs.txt files when indexing websites. 

Slow Adoption

As a new optimization strategy, LLMs.txt files have seen relatively slow adoption. According to SE Ranking’s data, just 10.13% of domains crawled for the study had implemented these files as an optimization strategy. 

Part of the reason for the slow adoption of these files is the lack of research or clear data showing their impact.

Notably, the sites that were most likely to use these files were mid-tier websites likely looking for any edge against high-traffic competition. 

Data Says LLMs.txt Files Are Not Tied To Visibility

For SE Ranking’s study, a team of researchers analyzed how frequently domains were cited across responses from several popular LLMs. 

Ultimately, their data was unable to provide any evidence that the strategy had any real effect, even when using alternate research models to compare.

While the study doesn’t outright discourage people from implementing LLMs.txt files, the researchers conclude fairly bluntly that LLMs.txt “doesn’t seem to directly impact AI citation frequency. At least not yet.”

Robots.txt Is Still Used – Even By AI

While LLMs.txt files had been gradually gaining popularity, the reality is that none of the major AI tools have indicated they use these files when crawling or indexing websites. 

Aside from limited data suggesting GPTBot sometimes fetches LLMs.txt files, there is no evidence of any major LLM using or recommending implementing these files. 

Instead, OpenAI encourages websites to focus on robots.txt files, and Google seemingly defers to these files when crawling or indexing pages. 
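For reference, AI crawlers can be addressed in an ordinary robots.txt using their published user-agent tokens (GPTBot is OpenAI’s documented crawler token, Google-Extended is Google’s token for controlling AI training use; the paths below are purely illustrative):

```
# Allow OpenAI's crawler everywhere
User-agent: GPTBot
Allow: /

# Keep a hypothetical private section out of Google's AI training corpus
User-agent: Google-Extended
Disallow: /private/
```

In other words, the crawl-control mechanism that already exists for search engines is the one the major AI providers actually honor today.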

For more about the analysis and how it was conducted, read SE Ranking’s full report here.

Google VP of Search, Liz Reid, recently spoke with The Wall Street Journal in a revealing interview focusing on the company’s approach to content as AI becomes more deeply integrated in both its search engine and society at large. 

Google Strives To Give Users What They Want

Liz Reid made it clear that the content Google decides to surface in both AI overviews and traditional search results is shaped heavily by user feedback:

“…we do have to respond to who users want to hear from, right? Like, we are in the business of both giving them high quality information, but information that they seek out. And so we have over time adjusted our ranking to surface more of this content in response to what we’ve heard from users.

…You see it from users, right? Like we do everything from user research to we run an experiment. And so you take feedback from what you hear, from research about what users want, you then test it out, and then you see how users actually act. And then based on how users act, the system then starts to learn and adjust as well.”

This reflects how Google doesn’t just look for “the best content” but instead ranks “the best content for its users”. It is a small but important distinction which can have significant implications for what content gets seen on the search engine. Smart businesses and marketers will keep this in mind when creating online content. Don’t just make great content. Make great content that people want to engage with. 

Is AI Content Automatically Bad?

When discussing the idea of high-quality content, Reid takes a moment to address AI content. Specifically, she indicates that the search engine is essentially neutral about AI-generated content. While the search engine strives not to show “slop,” she suggests that AI content can rank alongside human-made content, so long as it passes the same quality standards:

“Now, AI generated content doesn’t necessarily equal spam.

But oftentimes when people are referring to it, they’re referring to the spam version of it, right? Or the phrase AI slop, right? This content that feels extremely low value across, okay? And we really want to make an effort that that doesn’t surface.”

People Want Unique Perspectives

When discussing what type of content users are most likely to click on, Reid emphasized that people don’t want surface level takes or superficial content. They want content that has depth and offers a unique perspective. This is especially true when it comes to what content people click on in AI overviews. 

As Reid said:

“But what we see is people want content from that human perspective. They want that sense of like, what’s the unique thing you bring to it, okay? And actually what we see on what people click on, on AI Overviews, is content that is richer and deeper, okay?

That surface-level AI generated content, people don’t want that because if they click on that, they don’t actually learn that much more than they previously got. They don’t trust the result anymore.

So what we see with AI Overviews is that we surface these sites and get fewer what we call bounce clicks. A bounce click is like you click on your site, Yeah, I didn’t want that, and you go back.

AI Overviews gives some content, and then we get to surface deeper, richer content, and we’ll look to continue to do that over time so that we really do get that creator content and not the AI generated.”

Even AI Tools Look For Human Perspectives

Throughout the interview, Reid makes it clear that users are looking for content that stands out from the noise. They want content with unique, strong perspectives and human experiences. While AI can help turn these perspectives into content people can enjoy, they ultimately require human guidance and insight to rise above all the other content online. 

The full interview has a lot of interesting insights from someone with deep knowledge of how Google search works and how it is advancing into the AI age.

Google’s Gary Illyes recently explained that Google’s search engine treats AI-generated images essentially the same as any other images and does not penalize sites for using AI images.

In a Q&A with interviewer Kenichi Suzuki and shared by Search Engine Journal, Illyes explained that AI-generated images have no direct impact on SEO or online rankings. 

Instead, he suggested that any effect on rankings from AI-generated images would be brought on by technical issues. He suggested that brands may even see increased traffic if they use AI to create unique images. 

How Does Google Handle AI-Generated Content?

Google has largely been trying to take a nuanced approach to how it handles content made with AI. While the company has encouraged those who use AI-generated text content to ensure it is reviewed by humans, they have also taken steps to derank low-quality AI content.

At the same time, Google has not directly addressed how it handles AI-generated images.

About 10 minutes into the recent interview, Illyes was asked if Google would punish a site if some of their images were made with AI:

“Say if there’s a content that the content itself is legit, the sentences are legit, but also there are a lot of images which are relevant to the content itself, but all of them – let’s say all of them are generated by AI. Will the content or the overall site, is it going to be penalized or not?”

In response, Illyes emphasized that AI-generated images don’t affect SEO in any direct way.

“No, no. So AI generated image doesn’t impact the SEO. Not direct.

So obviously when you put images on your site, you will have to sacrifice some resources to those images… But otherwise you are not going to, I don’t think that you’re going to see any negative impact from that.

If anything, you might get some traffic out of image search or video search or whatever, but otherwise it should just be fine.”

In other words, the only major SEO consideration when using AI-generated images is ensuring they are small enough and properly optimized to load quickly. 

While brands should weigh other potential issues with AI-generated images, such as how their audience will respond, Illyes’s comments make it clear that Google won’t penalize a site simply for using AI to create graphics or pictures.

While AI overviews upend much of how we look for information online, marketers have been split on how to respond. Some say that traditional SEO is all that is necessary to get your site cited by Google’s AI overviews, while others have been arguing that a new “SEO for AI” is needed. Now, Google has weighed in. 

During a talk at Search Central Live, Google’s Gary Illyes told attendees that AI search tools don’t mean marketers need a new type of optimization, and that standard SEO practices are all that is needed to be included in Google’s AI overviews and AI Mode. 

While we were not present at the event, Google Search Advocate Kenichi Suzuki shared a detailed overview of what Gary Illyes discussed, including three main focus areas: 

  1. AI uses traditional SEO infrastructure and signals.
  2. Content quality matters, but so does authenticity.
  3. Google has used AI in its traditional search for a long time.

How AI Uses SEO

Illyes emphasized that Google’s AI tools rely on the same basic systems and infrastructure used elsewhere by Google, including relying on the same search signals and indexing approach. 

As Suzuki says:

“[Illyes] explicitly stated that there is no need for a new acronym or a separate discipline. The core principles of creating helpful, reliable, people-first content remain the foundation for visibility in all of search formats.”

Authenticity Matters

Gary Illyes said that while Google does not punish sites that publish content made with AI, it watches for signs of abuse, including sites that churn out tons of low-quality AI content or pages with deceptive information like fake author personas or AI-generated images presented as real. 

Suzuki summed up Illyes’s statements, saying:

“Search Quality Raters are instructed to give the lowest possible rating to any content that is deceptive. This includes creating fake author personas with AI-gen images or churning out content that simply rehashes information from other sources without adding unique value or experience.”

Google Has Been Using AI For a Long Time

Throughout his presentation, Gary repeatedly emphasized that Google’s use of AI goes back years before the current surge in generative AI tools. Specifically, Illyes pointed to Google’s MUM system as a form of predictive AI used to understand the intent behind queries.

While the introduction of MUM did cause some shifts in how we approach SEO in general, it did not call for an entirely new optimization discipline, just as new generative AI tools do not require a new “SEO for AI”. 

The Takeaway

While AI is undeniably making us change some aspects of search engine optimization, it doesn’t call for your business to adopt “GEO” or “AI SEO” or any other separate approaches to optimization. 

Instead, it is essential that you adapt your current SEO strategies, focus on providing content that provides real value to readers, and develop strategies to cement your authentic authority in your field.