
Bing has announced that its search engine crawler, Bingbot, will be going evergreen over the next few months by adopting the Chromium-based Edge browser to render webpages.

Essentially, this means it will be able to crawl, render, and properly index more of your content, more closely to how actual users see it.

As Bing says in its announcement:

By adopting Microsoft Edge, Bingbot will now render all web pages using the same underlying web platform technology already used today by Googlebot, Google Chrome, and other Chromium-based browsers. This will make it easy for developers to ensure their web sites and their Content Management System work across all these solutions without having to spend time investigating each solution in depth.

The additional upside is that this mirrors steps recently taken by Google, which suggests it may become easier to optimize for both search engines without specific steps for each platform.

Google has announced they will be rolling out a broad update to their core search algorithm starting later today. 

While such updates are a regular part of maintaining and improving the company’s search engine, Google has typically been reluctant to give advance notice before an update rolls out. In some cases, they have even been unwilling to address algorithm updates in-depth after their implementation.

This is only the second time the search engine has announced a broad core algorithm update ahead of time, suggesting they are being more proactive in communicating with webmasters. 

Google’s Danny Sullivan says the update should start very soon and will take up to a few days to complete. 

The company’s announcement didn’t add any new guidance or recommendations for managing your site during and after the rollout of this update, but Google did recommend reviewing the existing guidelines for core updates:

  • Widely notable effects are to be expected, which can include drops or gains in search rankings.
  • Core updates are “broad” in the sense that they don’t target anything specific. Rather, they’re designed to improve Google’s systems overall.
  • Pages that drop in rankings aren’t being penalized; they’re being reassessed against other web content that has been published since the last update.
  • Focusing on providing the best possible content is the top recommended way to deal with the impact of a core algorithm update.
  • Broad core updates happen every few months. Sites might not recover from one update until the next one rolls out.
  • Improvements do not guarantee recovery. However, choosing not to implement any improvements will virtually guarantee no recovery.

New research from Yext and Forbes reinforces just how important it is to keep the information about your business in search engine results accurate and up-to-date.

The findings from more than 500 US consumers indicate that people automatically assume only half of the information they see in search results is accurate. Additionally, those consumers hold brands responsible for any inaccurate information about them, even when it appears outside of their official channels.

The study also revealed a few more bits of interesting information:

  • 57% of respondents say they bypass search and visit a brand’s official website first because they believe the information there will be more complete and accurate.
  • 50% of consumers regularly turn to third-party sites and apps to find information about brands.
  • 48% of those surveyed said a brand’s website is their most trusted source of information.
  • 47% say they are more likely to trust a third-party site over a brand’s website.
  • 20% of current and new customers trust social media to deliver accurate brand information.
  • 28% of consumers avoid buying a brand’s product after seeing inaccurate information.

Marc Ferrentino, Chief Strategy Officer of Yext, elaborated on the findings, saying:

“Our research shows that regardless of where they search for information, people expect the answers they find to be consistent and accurate — and they hold brands responsible to ensure this is the case.

… there is a significant opportunity for businesses to differentiate themselves from their competition through verification on and off of their own websites.”

You can download the full report here.

Shelling out extra money on search engine optimization (SEO) can be a scary move for any company, but a new study suggests it pays off. 

Findings from a survey of business owners found that those who pay above the average for SEO services were more likely to be satisfied compared to those who paid less. 

According to the data from Backlinko, the average amount small businesses in America pay for SEO services is $497.16 per month. 

However, those that spent more than $500 per month were 53.3% more likely to say they were “extremely satisfied” compared to those who spent less than that. 

What Business Owners Say About SEO

The responses come from a larger study conducted by Backlinko, which included surveying 1,200 business owners across America about SEO-related issues. 

It found that most small businesses see online optimization largely as a way to drive referrals and reviews, as well as improve Google search performance.

Through these goals, they also see SEO as a way to bring in new customers and increase sales. 

Unfortunately, many of these businesses aren’t getting what they are expecting out of their SEO providers.

Just 30% of business owners said they would recommend their current service provider. 

Disappointed With SEO

The biggest issues were simply that businesses were not satisfied with the results, and that some search engine optimization providers delivered poor customer service or responsiveness.

Part of this may be that some businesses choose to work with freelancers. The results showed that those who worked with agencies were more likely to be satisfied than those who hired freelancers. 

More likely, though, many SEO agencies and service providers are overselling what their low-level services can accomplish. While any search engine optimization is better than none, many promise that low-cost options will lead to major gains.

Based on the findings, businesses are more likely to be satisfied by diving in and truly investing in their online optimization, rather than only dipping a toe in the water.

Thanks to its high level of adaptability, JavaScript (JS) has been in use in some shape or form for more than 20 years and remains one of the most popular programming languages for building websites.

However, Google’s Martin Splitt, a webmaster trends analyst, recently suggested that webmasters move away from the language for time-sensitive content if they want it indexed as quickly as possible.

In an SEO Mythbusting video exploring the topic of web performance and search engine optimization, Splitt and Ada Rose Cannon of Samsung found themselves talking about JavaScript.

Specifically, they discussed how using too much JS can drag down a site’s performance and potentially drag them down in Google’s search index.

How JavaScript Holds Content Back

One of the biggest issues with overusing JS arises when sites publish content on a daily basis.

Google uses a two-pass indexing process to help verify content before it is added to the search index. In the case of a JavaScript-heavy page, Google first renders the non-JS elements like HTML and CSS. Then, the page gets put into a queue for more advanced crawling to render the rest of the content as processing resources are available.
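The first, non-JS pass can be pictured as a plain markup parse: anything a script would inject into the page simply isn’t there yet. Here is a minimal Python sketch of that idea, using the standard library’s `html.parser` and a made-up example page where the headline is static HTML but the story body is injected by JavaScript:

```python
from html.parser import HTMLParser

# Hypothetical article page: the headline is plain HTML,
# but the story body is injected by JavaScript at load time.
PAGE = """
<html><body>
  <h1>Breaking News Headline</h1>
  <div id="story"></div>
  <script>
    document.getElementById('story').textContent = 'Full story text...';
  </script>
</body></html>
"""

class TextExtractor(HTMLParser):
    """Collects text the way a non-JS first pass would:
    markup only -- scripts are never executed."""
    def __init__(self):
        super().__init__()
        self.in_script = False
        self.text = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.in_script = True

    def handle_endtag(self, tag):
        if tag == "script":
            self.in_script = False

    def handle_data(self, data):
        if not self.in_script and data.strip():
            self.text.append(data.strip())

extractor = TextExtractor()
extractor.feed(PAGE)
# The headline is visible; the JS-injected story text is not.
print(extractor.text)
```

The static headline is extracted, while the script-injected story text never appears — roughly the gap the second, deferred rendering pass exists to close.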

This means JavaScript-heavy pages may not be completely crawled and indexed for up to a week after being published.

For time-sensitive information, this can be the difference between being on the cutting-edge and getting left behind.

What You Can Do Instead

Splitt offers a few different techniques developers can use to ensure their site is being efficiently crawled and indexed as new content is published.

One way to get around the issue would be to use dynamic rendering, which provides Google with a static rendered version of your page – saving them the time and effort of rendering and crawling the page themselves.
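In practice, dynamic rendering usually comes down to inspecting the incoming request’s user-agent and serving a pre-rendered static snapshot to known crawlers while human visitors get the normal JS-driven page. A minimal sketch of that decision, with hypothetical bot tokens and placeholder response names:

```python
# Hypothetical crawler tokens; a real deployment would match the
# user-agent strings of the crawlers it cares about.
BOT_TOKENS = ("googlebot", "bingbot")

def choose_response(user_agent: str) -> str:
    """Decide which version of the page to serve based on the user-agent:
    a pre-rendered static snapshot for known crawlers, the regular
    client-rendered JS app for everyone else."""
    ua = user_agent.lower()
    if any(token in ua for token in BOT_TOKENS):
        return "prerendered-html"   # static snapshot, no JS execution needed
    return "client-rendered-app"    # normal JS-driven page for human visitors

print(choose_response("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # prerendered-html
print(choose_response("Mozilla/5.0 (Windows NT 10.0) Chrome/74"))  # client-rendered-app
```

The same routing logic would typically live in server middleware or at a CDN edge, in front of a headless-browser prerenderer.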

The best course of action, though, would be to simply rely primarily on HTML and CSS for time-sensitive content.

Splitt takes time to explain that JavaScript is not inherently bad for your SEO or search rankings. Once they are indexed, JS-heavy sites “rank just fine.” The issue is ensuring content is crawled and indexed as quickly and efficiently as possible, so you can always be on the cutting edge.

The discussion gets pretty technical, but you can watch it in full in the video below:

This week, Google announced it will add new websites to its mobile-first index by default beginning July 1. However, older sites that have yet to be added to the mobile-first index will remain exempt until they are updated to be mobile-friendly.

In the announcement, Google explained that “mobile-first indexing will be enabled by default for all new, previously unknown to Google Search, websites starting July 1, 2019. It’s fantastic to see that new websites are now generally showing users – and search engines – the same content on both mobile and desktop devices.”

While new sites will be moved to the mobile-first index, older sites which have not been added will not be migrated over yet.

“For older websites, we’ll continue monitoring and evaluating pages for their readiness for mobile first indexing and will notify them through Search Console once they’re seen as being ready,” the announcement said.

No Notifications

Google has been notifying site owners when their site has been migrated to the mobile-first index through Search Console notifications. However, this will not be the case for new sites that are added to the index by default.

“Since the default state for new websites will be mobile-first indexing, there’s no need to send a notification,” Google stated.

What is the mobile-first index?

Google’s mobile-first index is the search engine’s primary way of cataloging sites across the internet. Launched a few years ago, the mobile-first index analyzes the mobile version of a page first and uses that information to rank web pages. Although it started small, the index has become Google’s primary search index, with more than 50% of what Google indexes being added to the mobile-first index.

The news adds even more motivation for new site creators and business owners to ensure they provide a smooth experience with the same content on both desktop and mobile when the site is launched. Not only will many of your customers likely visit your site through mobile devices, but how mobile-friendly your site is will directly affect your search engine ranking.

Often, businesses think of SEO and online advertising as being entirely separate. They may feel like they need to choose one or the other. However, a new study from WordStream shows that most experts agree that SEO and advertising work best together, not apart.

The new data published in WordStream’s report on the online advertising landscape in 2019 reveals that more than three-quarters (79%) of online advertisers are also incorporating SEO within their marketing strategies.

Even more, digital advertisers ranked SEO as the leading marketing channel aside from advertising for growing their business.

The full breakdown of responses is as follows:

Outside of digital advertising, what other marketing channels are you using to grow your business in 2019?

  • SEO – 79%
  • Email marketing – 66%
  • Content marketing – 60%
  • Word of mouth marketing – 47%
  • Direct mail – 32%
  • Event marketing – 26%
  • Guerrilla marketing – 9%
  • Affinity marketing – 6%
  • Telemarketing – 4%
  • Other – 1%

As WordStream explains, the findings show that while advertisers may prioritize paid search for bringing in immediate revenue, they also recognize the importance of fostering a long-term strategy for bringing in new potential customers:

“Like content marketing, SEO can be an extremely valuable long-term strategy when done effectively. Kudos to those surveyed for recognizing the importance of balancing short-term results with a long-term strategy for sustainable growth!”

The report includes a number of other interesting tidbits about the current state of online advertising, including the discovery that nearly half of advertisers are increasing their Google search ads budgets this year.

To read the full report, click here.

A lot of people have come to think of search engine optimization and content marketing as separate strategies these days, but Google’s John Mueller wants to remind webmasters that both are intrinsically linked. Without great content, even the most well-optimized sites won’t rank as high as they should.

The discussion was brought up during a recent Google Webmaster Central hangout where one site owner asked about improving rankings for his site.

Specifically, he explained that there were no technical issues that he could find using Google’s tools and wasn’t sure what else he could do to improve performance.

Here’s the question that was asked:

“There are zero issues on our website according to Search Console. We’re providing fast performance in mobile and great UX. I’m not sure what to do to improve rankings.”

Mueller responded by explaining that it is important to not forget about the other half of the equation. Just focusing on the technical details won’t always lead to high rankings because the content on the site still needs to be relevant and engaging for users.

The best way to approach the issue, in Mueller’s opinion, is to ask what issues users might be having with your products or services and what questions they might ask. Then, use content to provide clear and easily available answers to these questions.

In addition to these issues, Mueller noted that some industries have much stronger competition for rankings than others. If you are in one of these niches, you may still struggle to rank as well as you’d like against competition which has been maintaining an informative and well-designed site for longer.

You can read or watch Mueller’s answer in full below, starting at 32:29 in the video:

“This is always kind of a tricky situation where you’re working on your website for a while, then sometimes you focus on a lot of the technical details and forget about the bigger picture.

So what I would recommend doing here is taking your website and the queries that you’re looking [to rank] for, and going to one of the webmaster forums.

It could be our webmaster forum, there are lots of other webmaster forums out there where webmasters and SEOs hang out. And sometimes they’ll be able to look at your website and quickly pull out a bunch of issues. Things that you could be focusing on as well.

Sometimes that’s not so easy, but I think having more people look at your website and give you advice, and being open to that advice, I think that’s an important aspect here.

Another thing to keep in mind is that just because something is technically correct doesn’t mean that it’s relevant to users in the search results. That doesn’t mean that it will rank high.

So if you clean up your website, and you fix all of the issues, for example, if your website contains lots of terrible content then it still won’t rank that high.

So you need to, on the one hand, understand which of these technical issues are actually critical for your website to have fixed.

And, on the other hand, you really need to focus on the user aspect as well to find what are issues that users are having, and how can my website help solve those issues. Or help answer those questions.”

Not long ago, it seemed like every business website had a “Testimonials” page filled with reviews and references from past customers or fellow members of their industry. If you have a keen eye, though, you might have noticed these pages slowly falling out of use in favor of posting your Google, Yelp, and other online reviews directly on your site.

The practice has led to some confusion, as many experts claimed putting your own online reviews from across the web on your site could be potentially dangerous for search engine optimization. There have even been suggestions it could lead to Google penalties.

Now, you can breathe easy and share your online reviews with pride, as Google webmaster trends analyst John Mueller has confirmed that it is totally fine to highlight your reviews on your company website – with one exception.

While posting your reviews on your website is acceptable, Mueller warns that you cannot use review structured data on these reviews.

As Mueller explained on Twitter:

“From a Google SEO point of view, I don’t see a problem with that. I imagine the original is more likely to rank for that text, but if you use that to provide context, that’s fine (it shouldn’t be marked up with structured data though).”

Mueller then went on to explain that review structured data is intended for reviews “directly produced by your site” and using them on third-party reviews on your own site would go against Google’s guidelines.
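To make the distinction concrete, a review collected directly on your own site is the case where review markup is allowed. As a rough illustration, such a first-party review might carry schema.org `Review` markup like the following JSON-LD, sketched here in Python — the business, author, and rating values are all hypothetical:

```python
import json

# Sketch of review structured data for a review collected directly on
# your own site -- the case Google's guidelines permit. Field names
# follow schema.org's Review type; the values are made up.
first_party_review = {
    "@context": "https://schema.org",
    "@type": "Review",
    "itemReviewed": {"@type": "LocalBusiness", "name": "Example Plumbing Co."},
    "author": {"@type": "Person", "name": "Jane Customer"},
    "reviewRating": {"@type": "Rating", "ratingValue": "5"},
    "reviewBody": "Fast, friendly service.",
}

# Reviews copied over from Google, Yelp, or other third-party platforms
# can be shown as plain text on the page, but should NOT be wrapped in
# this kind of markup.
print(json.dumps(first_party_review, indent=2))
```

On a live page, the serialized object would sit inside a `<script type="application/ld+json">` tag; the key point is that it is only attached to reviews your site itself produced.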

Google Algorithm

After much ado, Google has rolled out its latest big algorithm update, called the “Speed Update.” And, once again, there has been little to no effect on the search results we see every day.

This is the latest in a pattern of big announcements of search algorithm updates that seem to fizzle out into nothing. It would be reasonable for many to stop caring and assume they don’t really need to worry about all these algorithm updates.

They would be wrong.

Why Google’s Algorithm Updates Matter

The other trend running through Google’s latest algorithm updates is that they have been almost universally focused on usability across devices. In other words, Google cares about how users perceive your website. Is it out of date? Slow? Impossible to read on a smartphone?

Of course, Google’s interest here isn’t entirely altruistic. They have made their name by delivering the best search results possible. If they allowed low-quality sites to dominate the rankings, they wouldn’t be doing their job very well.

As a business, you also aren’t doing a very good job representing yourself if you aren’t living up to most of Google’s latest standards. People will be put off if your copy is outdated, or your site is too slow or buggy to use. This is the biggest reason you should really care. Google’s standards are (largely) the same as your potential customers’ standards.

Bringing It All Together

If you are only falling short on one of the things above, you might be able to get away with it. Some people may give you a pass for a sluggish website. It might not matter much if your copy is a year or two old if it is still relevant and accurate. Desktop users won’t even know if your site isn’t mobile-friendly. Taken together, though, it paints a really bad picture.

This is essentially how Google’s algorithm functions. There are literally hundreds of factors or signals that affect how sites are ranked. A single new search signal isn’t likely to have a big impact. Neglecting several search signals will likely have serious consequences for your rankings.

The best way to think of Google’s search algorithm is by comparing it to a test. Missing one or two answers is fine. But, the more questions you get wrong, the worse your score is. In the end, it is always best to strive to ensure your site isn’t just meeting basic standards, but is designed and optimized to perform as well as possible. This way, you’ll satisfy anyone who comes to your site, and you won’t have to worry about updating every time Google launches a new algorithm update.