Tag Archive for: search engine indexing

One of the most frustrating aspects of search engine optimization is the time it takes to see results. In some cases, you can see changes start to appear in Google’s search results in just a few hours. In others, you can spend weeks waiting for new content to be indexed with no indication of when Google will get around to your pages.

In a recent AskGooglebot session, Google’s John Mueller said this huge variation in the time it takes for pages to be indexed is to be expected for a number of reasons. However, he also provided some tips for speeding up the process so you can start seeing the fruits of your labor as soon as possible.

Why Indexing Can Take So Long

In most cases, Mueller says sites that produce consistently high quality content should expect to see their new pages get indexed within a few hours to a week. In some situations, though, even high quality pages can take longer to be indexed due to a variety of factors.

Technical issues can pop up that delay Google’s ability to spot your new pages or prevent indexing entirely. Additionally, there is always the chance that Google’s systems are simply tied up elsewhere and need time to get to your new content.

Why Google May Not Index Your Page

It is important to note that Google does not index everything. In fact, there are plenty of reasons the search engine might not index your new content.

For starters, you can just tell Google not to index a page or your entire site. It might be that you want to prioritize another version of your site or that your site isn’t ready yet. 
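For a single page, the usual mechanism is a robots meta tag (or the equivalent X-Robots-Tag HTTP header). A minimal snippet looks like this:

```html
<!-- Tells search engines not to add this page to their index -->
<meta name="robots" content="noindex">

<!-- Or address Google's crawler specifically -->
<meta name="googlebot" content="noindex">
```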

The search engine also excludes content that doesn’t bring sufficient value. This includes duplicate content, malicious or spammy pages, and websites which mirror other existing sites.

How To Speed Up Indexing

Thankfully, Mueller says there are ways to help speed up indexing your content.

  • Prevent server overloading by ensuring your server can handle the traffic coming to it. This ensures Google can reach your site in a timely manner.
  • Use prominent internal links to help Google’s systems navigate your site and understand which pages are most important (see the sketch after this list).
  • Avoid unnecessary URLs so your site stays well organized and new content is easy for Google to spot.
  • Publish consistently high quality content that provides real value for users. The more important Google thinks your site is for people online, the higher the priority your new pages will get for indexing and ranking.
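To illustrate the internal-linking point above, the idea is simply that new pages should be reachable through plain, crawlable links from prominent spots such as your homepage or navigation. The URLs below are invented for the example:

```html
<!-- Hypothetical example: a new post linked from a prominent, crawlable spot -->
<nav>
  <a href="/blog/">Blog</a>
  <a href="/blog/new-announcement/">Our Newest Announcement</a>
</nav>
```

Standard anchor links like these are the ones Googlebot follows most reliably; links that only exist inside JavaScript click handlers are easier for crawlers to miss.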

For more about how Google indexes web pages and how to speed up the process, check out the full AskGooglebot video below:

Bing has announced that its search engine crawler, Bingbot, will be going evergreen over the next few months by adopting the Chromium-based Edge browser to render webpages.

Essentially, this means it will be able to crawl, render, and properly index more of your content, more closely to how actual users see it.

As Bing says in its announcement:

By adopting Microsoft Edge, Bingbot will now render all web pages using the same underlying web platform technology already used today by Googlebot, Google Chrome, and other Chromium-based browsers. This will make it easy for developers to ensure their web sites and their Content Management System work across all these solutions without having to spend time investigating each solution in depth.

The additional upside is that this mirrors steps recently taken by Google, which suggests it may become easier to optimize for both search engines without separate work for each platform.

Thanks to its high level of adaptability, JavaScript (JS) has been in use in some shape or form for more than 20 years and remains one of the most popular programming languages used to build websites.

However, Google’s Martin Splitt, a webmaster trends analyst, recently suggested that webmasters begin moving away from the language for content they want crawled, indexed, and ranked as quickly as possible.

In an SEO Mythbusting video exploring the topic of web performance and search engine optimization, Splitt and Ada Rose Cannon of Samsung found themselves talking about JavaScript.

Specifically, they discussed how using too much JS can drag down a site’s performance and potentially drag it down in Google’s search index.

How JavaScript Holds Content Back

One of the biggest issues with overusing JS arises when sites publish content on a daily basis.

Google uses a two-pass indexing process to help verify content before it is added to the search index. In the case of a JavaScript-heavy page, Google first processes the non-JS elements, such as the raw HTML and CSS. Then the page gets put into a queue, and the rest of the content is rendered and crawled as processing resources become available.

This means JavaScript-heavy pages may not be completely crawled and indexed for up to a week after being published.

For time-sensitive information, this can be the difference between being on the cutting-edge and getting left behind.

What You Can Do Instead

Splitt offers a few different techniques developers can use to ensure their site is being efficiently crawled and indexed as new content is published.

One way to get around the issue is to use dynamic rendering, which provides Google with a static, pre-rendered version of your page – saving Google the time and effort of rendering the page itself.
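As a rough sketch of the pattern, assuming a Node/Express-style server and a hypothetical prerender() helper that produces a static HTML snapshot (for example via a headless browser or a prerendering service), the routing logic might look something like this:

```typescript
import express from "express";

const app = express();

// Hypothetical helper: returns a fully rendered, static HTML snapshot of the
// page (in practice produced by a headless browser or prerendering service).
async function prerender(url: string): Promise<string> {
  return `<html><!-- pre-rendered snapshot of ${url} --></html>`;
}

// Rough list of crawler user-agent substrings to check against.
const BOT_AGENTS = ["googlebot", "bingbot"];

app.get("*", async (req, res, next) => {
  const userAgent = (req.headers["user-agent"] ?? "").toLowerCase();

  if (BOT_AGENTS.some((bot) => userAgent.includes(bot))) {
    // Crawlers receive the static, pre-rendered HTML...
    res.send(await prerender(req.originalUrl));
  } else {
    // ...while regular visitors get the normal JavaScript-driven page.
    next();
  }
});

app.listen(3000);
```

The specifics of bot detection and snapshot generation vary by stack; the point is simply that crawlers receive ready-made HTML while regular visitors still get the normal client-side experience.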

The best course of action, though, would be to simply rely primarily on HTML and CSS for time-sensitive content.
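As a loose illustration of that advice (the markup is invented for the example), the time-sensitive details sit directly in the HTML that Google sees on its first pass, with JavaScript reserved for non-critical enhancements:

```html
<!-- Hypothetical example: the announcement itself is plain HTML, indexable on
     the first pass, while JavaScript only adds non-essential extras. -->
<article>
  <h1>Product Launch: Pre-Orders Open Today</h1>
  <p>Orders placed before 9 a.m. EST ship first.</p>
</article>

<!-- Comments, related posts, and other extras can load via JS afterwards. -->
<script src="/js/enhancements.js" defer></script>
```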

Splitt takes time to explain that JavaScript is not inherently bad for your SEO or search rankings. Once they are indexed, JS-heavy sites “rank just fine.” The issue is ensuring content is crawled and indexed as quickly and efficiently as possible, so you can always be on the cutting edge.

The discussion gets pretty technical, but you can view the entire discussion in the full video below: