Posts

A lot of people have come to think of search engine optimization and content marketing as separate strategies, but Google’s John Mueller wants to remind webmasters that the two are intrinsically linked. Without great content, even the most well-optimized sites won’t rank as high as they should.

The discussion was brought up during a recent Google Webmaster Central hangout where one site owner asked about improving rankings for his site.

Specifically, he explained that there were no technical issues he could find using Google’s tools and that he wasn’t sure what else he could do to improve performance.

Here’s the question that was asked:

“There are zero issues on our website according to Search Console. We’re providing fast performance in mobile and great UX. I’m not sure what to do to improve rankings.”

Mueller responded by explaining that it is important not to forget the other half of the equation. Focusing solely on technical details won’t always lead to high rankings, because the content on the site still needs to be relevant and engaging for users.

The best way to approach the issue, in Mueller’s opinion, is to ask what issues users might be having with your products or services and what questions they might ask. Then, use content to provide clear and easily available answers to these questions.

In addition to these issues, Mueller noted that some industries have much stronger competition for rankings than others. If you are in one of these niches, you may still struggle to rank as well as you’d like against competitors that have been maintaining informative, well-designed sites for longer.

You can read or watch Mueller’s answer in full below, starting at 32:29 in the video:

“This is always kind of a tricky situation where you’re working on your website for a while, then sometimes you focus on a lot of the technical details and forget about the bigger picture.

So what I would recommend doing here is taking your website and the queries that you’re looking [to rank] for, and going to one of the webmaster forums.

It could be our webmaster forum, there are lots of other webmaster forums out there where webmasters and SEOs hang out. And sometimes they’ll be able to look at your website and quickly pull out a bunch of issues. Things that you could be focusing on as well.

Sometimes that’s not so easy, but I think having more people look at your website and give you advice, and being open to that advice, I think that’s an important aspect here.

Another thing to keep in mind is that just because something is technically correct doesn’t mean that it’s relevant to users in the search results. That doesn’t mean that it will rank high.

So if you clean up your website, and you fix all of the issues, for example, if your website contains lots of terrible content then it still won’t rank that high.

So you need to, on the one hand, understand which of these technical issues are actually critical for your website to have fixed.

And, on the other hand, you really need to focus on the user aspect as well to find what are issues that users are having, and how can my website help solve those issues. Or help answer those questions.”

If you operate a website that frequently creates or changes pages – such as an e-retail or publishing site – you’ve probably noticed it can take Google a while to reflect your new content in search results.

This has led to widespread speculation about just how frequently Google indexes pages and why it seems like some types of websites get indexed more frequently than others.

In a recent Q&A video, Google’s John Mueller took the time to answer this directly. He explains how Google’s indexing bots prioritize specific types of pages that are more “important” and limit excessive stress on servers. But, in typical Google fashion, he isn’t giving away everything.

The question posed was:

“How often does Google re-index a website? It seems like it’s much less often than it used to be. We add or remove pages from our site, and it’s weeks before those changes are reflected in Google Search.”

Mueller starts by explaining that Google takes its time to crawl the entirety of a website, noting that if it were to continuously crawl entire sites in short periods of time it would lead to unnecessary strain on the server. Because of this, Googlebot actually has a limit on the number of pages it can crawl every day.

Instead, Googlebot focuses on pages that should be crawled more frequently, like home pages or high-level category pages. These pages may get crawled every few days or even more often, but it sounds like less-important pages (individual blog posts, for example) can take considerably longer to get crawled.

You can watch Mueller’s response below or read the quoted statement underneath.

“Looking at the whole website all at once, or even within a short period of time, can cause a significant load on a website. Googlebot tries to be polite and is limited to a certain number of pages every day. This number is automatically adjusted as we better recognize the limits of a website. Looking at portions of a website means that we have to prioritize how we crawl.

“So how does this work? In general, Googlebot tries to crawl important pages more frequently to make sure that the most critical pages are covered. Often this will be a website’s home page or maybe higher-level category pages. New content is often mentioned and linked from there, so it’s a great place for us to start. We’ll re-crawl these pages frequently, maybe every few days, maybe even much more frequently depending on the website.”
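One modest, practical takeaway from that point about discovery: if the home page is recrawled every few days, surfacing new URLs there gives Googlebot an early path to them. A minimal illustrative sketch (the URLs below are placeholders, not anything from the video):

```html
<!-- Home pages are recrawled frequently, so linking new content from them
     helps Googlebot discover it sooner. All URLs below are placeholders. -->
<section>
  <h2>Latest from the blog</h2>
  <ul>
    <li><a href="/blog/new-post/">A newly published article</a></li>
    <li><a href="/products/new-widget/">A just-added product page</a></li>
  </ul>
</section>
```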


With Google’s extensive personalization of search results, it has become harder and harder to tell when changes to Google’s algorithms cause a major shakeup. That hasn’t stopped people from guessing a major algorithm shift has occurred whenever they notice significant changes in how sites are performing across the board.

This happened last week when many major authorities in SEO speculated Google unleashed a major algorithm update. Of course, Google won’t confirm that any major changes happened, but Webmaster Trends Analyst for Google, John Mueller, did take the time to remind everyone “we make changes almost every day.”

Google’s Gary Illyes took the stance even further, tweeting “we have 3 updates in a day average. I think it’s pretty safe to assume there was one recently…”

The truth is, the days of the major Google algorithms like Penguin and Panda upending the search world overnight are largely over. Instead, Google has shifted to a model of constant evolution, tweaking and changing things perpetually.

When there is a new important algorithm, such as recent mobile-friendliness algorithms, the company tends to warn businesses ahead of time. Even then, these recent algorithm updates have been benign, only affecting a small number of websites.

The best plan isn’t to keep constant watch for unannounced shifts and react after the fact. Instead, take a proactive stance by making sure your site follows all of Google’s latest best practices and provides value to searchers. If you do that, you should weather any changes Google throws your way.

Yesterday, we reported that a significant number of websites had been hit with Google penalties over the weekend for “unnatural outbound links.” Since then, Google has clarified that the manual penalties issued this weekend were specifically related to bloggers giving links to websites in exchange for free products or services.

Google had issued a warning a few weeks ago urging bloggers to disclose free product reviews and to nofollow the links in those posts. Now, it has taken action against sites that ignored the warning.

In the warning, Google told bloggers to “nofollow the link, if you decide to link to the company’s site, the company’s social media accounts, an online merchant’s page that sells the product, a review service’s page featuring reviews of the product or the company’s mobile app in an app store.”

As Barry Schwartz reports, John Mueller from Google explained the penalties in several threads on the Google support forums, pointing people to Google’s recently published guidance, “Best practices for bloggers reviewing free products they receive from companies.” In one comment, Mueller went on to say:

In particular, if a post was made because of a free product (or free service, or just paid, etc.), then any links placed there because of that need to have a rel=nofollow attached to them. This includes links to the product itself, any sales pages (such as on Amazon), affiliate links, social media profiles, etc. that are associated with that post. Additionally, I imagine your readers would also appreciate it if those posts were labeled appropriately. It’s fine to keep these kinds of posts up, sometimes there’s a lot of useful information in them! However, the links in those posts specifically need to be modified so that they don’t pass PageRank (by using the rel=nofollow).

Once these links are cleaned up appropriately, feel free to submit a reconsideration request, so that the webspam team can double-check and remove the manual action.
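To make that concrete, here is a rough sketch of what a cleaned-up review post might look like; the links, product, and disclosure wording are invented for illustration and are not taken from Google’s guidance:

```html
<!-- Every link placed because of the free product carries rel="nofollow"
     so it does not pass PageRank. All URLs below are placeholders. -->
<p>
  You can read more on the
  <a href="https://example.com/widget" rel="nofollow">manufacturer's site</a>
  or buy the widget through this
  <a href="https://www.amazon.com/dp/EXAMPLE123" rel="nofollow">Amazon listing</a>.
</p>

<!-- Labeling the post itself, as Mueller suggests readers would appreciate -->
<p><em>Disclosure: the manufacturer provided this product free of charge for review.</em></p>
```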

If you are a blogger or company that has participated in an agreement to exchange free products for reviews, be sure to check your Google Search Console messages to see if you’ve been hit by the latest round of manual penalties.


It has been clear for some time now that neglecting to have a mobile-friendly site can hurt your Google rankings, particularly in mobile search results. However, some have been wondering if the reverse is also true: does neglecting desktop-friendly design have a similar negative impact on your desktop rankings in Google?

Well, last Friday Google’s John Mueller clarified the situation in a Google Hangout, saying you do not need a “desktop-friendly” site in order to rank well on desktop. The only caveat is that your mobile site must still render properly on desktop.

John Mueller said that you need to “make sure that desktop users can still see some of your content, if it is formatted in a way that works best for mobile, that’s perfectly fine.”

“You definitely do not need a specific desktop website in addition to a mobile website,” Mueller added.

If your business depends on desktop traffic and conversions to properly reach your market, it is still highly important to provide a pleasing experience when users come to your site. For that reason, I’d hesitate to suggest going all-in on a mobile-leaning design with extra-large buttons and minimal navigation.

The most reliable strategy is to use a technique such as responsive design to provide a great experience no matter what device users are coming from. If that isn’t an option, it may still be best to keep operating separate sites for mobile and desktop so you don’t wind up losing customers just because of the device they happen to be using.
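As a rough sketch of the responsive approach, a single page can declare a flexible viewport and adapt its layout with a CSS media query; the 768px breakpoint below is an arbitrary example, not a Google recommendation:

```html
<!-- One site serving all devices: a flexible viewport plus a media query.
     The 768px breakpoint is an arbitrary example value. -->
<head>
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    .nav { display: flex; gap: 1rem; }
    /* Stack the navigation vertically on narrow (mobile) screens */
    @media (max-width: 768px) {
      .nav { flex-direction: column; }
    }
  </style>
</head>
```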

You can see the full video below, or jump to 12:50 in the video to get straight to Mueller’s answer.

The recently announced Google Panda algorithm update raised eyebrows for several reasons. Of course, any Google algorithm news is worthy of attention, but this specific update was unique in several ways that had SEOs and webmasters wondering what the deal was. Finally, Google has given some insight into why Panda 4.2 is so different from past algorithm updates.

There’s still not much information about why there was such a long lull between algorithm updates – over 10 months – but Google’s John Mueller did recently provide some answers as to why this update is rolling out significantly slower than normal.

In a Google Hangout session between Mueller and webmasters, John explained the rollout is taking several months instead of the usual few days or weeks due to “technical reasons.” He also explicitly said the long rollout isn’t specifically intended to “confuse people,” as some have suggested.

Both the SEM Post and Search Engine Roundtable transcribed Mueller’s comments on Panda:

This [Panda rollout] is actually pretty much a similar update to before. For technical reasons we are rolling it out a bit slower. It is not that we are trying to confuse people with this. It is really just for technical reasons.

So, it is not that we are crawling slowly. We are crawling and indexing normal, and we are using that content as well to recognize higher quality and lower quality sites. But we are rolling out this information in a little bit more slower way. Mostly for technical reasons.

It is not like we are making this process slower by design, it is really an internal issue on our side.

Webmasters have expressed frustration with the long rollout because it is taking much longer than normal to see results from the algorithm, and Mueller’s comments only provide a small window into how the algorithm is functioning.

Here is the video, from the start of the conversation:

Google’s upcoming mobile-friendly algorithm has webmasters panicking as the deadline fast approaches. As always, when there is fear there is also plenty of misinformation.

In particular, there is one myth going around which is stirring up quite a bit of trouble.

Google has attempted to be clear that its new mobile algorithm will only demote pages that don’t pass a mobile-friendliness test in mobile search results. Unfortunately, that message is being misconstrued.


As Barry Schwartz shared, emails are going around proclaiming Google will be completely removing sites from search results if they don’t live up to the mobile standard. Not demoted, but completely de-listed and removed from Google.

The rumor was noticed when Ashley Berman Hale, an active personality in the Google Webmaster Help Channels, posted an email she recently received with the title “Google Removing Your Site From Search Results This Month!”

The copy of the email then goes on to say, “Did you know Google will demote or delist you from their search results if you don’t have a mobile friendly site by April 21st?”

Now, the mobile algorithm on the horizon is certainly controversial among webmasters, but there is no need to be spreading outright lies. Google’s initial announcement of the algorithm was relatively vague, but they have been working hard to make sure webmasters’ questions were getting answered. It also didn’t take long for many of the experts from Google to chime in and clear the air.

Google’s Gary Illyes quickly dismissed the rumor in a response on Twitter.

Google’s John Mueller also posted a short statement on Google+ to make matters perfectly clear:

It’s great to get people motivated to make their website mobile-friendly, but we’re not going to be removing sites from search just because they’re not mobile-friendly. You can test your pages & reach our documentation (including some simple tweaks that might work for your CMS too) at http://g.co/mobilefriendly

Hopefully this settles the matter once and for all. Google’s algorithm WILL demote your site on mobile search results, but it WILL NOT affect you on desktop search results or completely remove you from the listings.

Google has been emphasizing the importance of mobile design and usability over the past year and now the search giant has added mobile usability reports to Webmaster Tools. Many believe this could be a sign that Google may be making mobile usability a ranking factor sooner rather than later.

The tool is intended to show whether your mobile site has any of the common usability issues that degrade a user’s mobile browsing experience.

Currently, the tool includes specific errors for Flash content on mobile (which can also trigger a warning next to your site in mobile search results), a missing viewport meta-tag on mobile pages, fonts too small to read comfortably on mobile, fixed-width viewports, content not sized to the viewport, and clickable links and buttons spaced too closely together.
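Several of those errors map to small markup fixes. The snippet below is a generic sketch; the specific values (a 16px base font, 48px tap targets) are common rules of thumb rather than thresholds Google publishes in the tool:

```html
<!-- Sketch of fixes for common mobile usability errors; values are rules of thumb. -->
<head>
  <!-- "Missing viewport meta-tag" / "fixed-width viewport": declare a flexible
       viewport rather than a hard-coded width such as content="width=1024" -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    /* Small fonts: a base size around 16px stays readable on phones */
    body { font-size: 16px; }
    /* Content not sized to viewport: keep media within the screen width */
    img { max-width: 100%; height: auto; }
    /* Touch elements too close: give tap targets size and breathing room */
    a.button { display: inline-block; min-height: 48px; min-width: 48px; margin: 8px; }
  </style>
</head>
```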

John Mueller from Google’s Webmaster Trends Analyst team based in Zurich said they “strongly recommend you take a look at these issues in Webmaster Tools.”

Of course, Mueller could simply be encouraging this because it improves user experience, but there is strong evidence to suggest Google will eventually make mobile user experience a ranking signal within search engine algorithms.

You can see an example of the reports below:

Mobile Usability Reports

Much has been made of the announcement that Google would factor switching from HTTP to HTTPS into its ranking algorithm. Despite the initial announcement clearly stating that the factor would be lightweight, the possibility of a relatively easy rankings boost drove lots of people to make the switch immediately.

In the aftermath, studies from analytics groups such as SearchMetrics have suggested that any effect switching URLs might have is largely unnoticeable. Now, Google’s John Mueller has basically admitted that the signal is currently too lightweight to have any noticeable effect, though that may change at some point in the future.

At 22 minutes and 21 seconds in a recent video hangout, Mueller explained that HTTPS is a ranking signal, but only a “very lightweight signal,” with no immediate plans to change that.

Jennifer Slegg was the first to report Mueller’s statement and transcribed it:

I wouldn’t expect any visible change when you move from http to https, just from that change, just from SEO reasons. That kind of ranking effect is very small and very subtle. It’s not something where you will see a rise in rankings just from going to https

I think that in the long run, it is definitely a good idea, and we might make that factor stronger at some point, maybe years in the future, but at the moment you won’t see any magical SEO advantage from doing that.

That said, anytime you make significant changes in your site, change the site’s URLs, you are definitely going to see some fluctuations in the short term. So you’ll likely see some drop or some changes as we recrawl and reindex everything. In the long run, it will settle down to about the same place, it won’t settle down to some place that’s like a point higher or something like that.
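For anyone making the move anyway, the short-term fluctuation Mueller describes is typically managed by redirecting every HTTP URL to its HTTPS counterpart at the server level and updating canonicals to match. A minimal sketch of the on-page half (example.com is a placeholder):

```html
<!-- On each migrated page, declare the HTTPS URL as canonical so signals
     consolidate onto the new address while Google recrawls. Server-side
     301 redirects from http:// to https:// (not shown) do the main work. -->
<head>
  <link rel="canonical" href="https://www.example.com/some-page/">
</head>
```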

You can see the video below:


There was a time not too long ago when every SEO professional felt confident proclaiming that Authorship was the future of search, but it appears those predictions couldn’t have been more wrong.

When Google was pushing Authorship as part of its search system, the company frequently repeated that authorship information would help users identify more trustworthy sources and improve the quality of results. In the end, it was little more than a picture and a name next to content, and it was often ignored by users.

That shortcoming was reflected in the confirmation by Google’s John Mueller that authorship information will be stripped out of search results entirely. In the statement, Mueller explains:

“Unfortunately, we’ve also observed that this information isn’t as useful to our users as we’d hoped, and can even distract from those results. With this in mind, we’ve made the difficult decision to stop showing Authorship in search results.”

If we are being honest, the vast majority of Google users probably won’t even notice a difference, and site owners shouldn’t be too concerned, since Authorship didn’t help increase traffic to pages. But the change has received considerable attention from the online marketing community because Authorship seemed like a simple, common-sense way to improve listings. In the long run, however, it just didn’t work.

Mueller did clarify that Google will continue focusing on Schema.org structured markup, saying: “This markup helps all search engines better understand the content and context of pages on the web, and they’ll continue to use it to show rich snippets in search results.”
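Schema.org markup can be expressed in a few formats; as a minimal JSON-LD sketch of the kind of structured data Mueller is referring to, with every value a placeholder:

```html
<!-- Minimal Schema.org Article markup in JSON-LD; every value is a placeholder.
     Search engines can use structured data like this to show rich snippets. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example Article Title",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2014-08-28"
}
</script>
```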