Tag Archive for: John Mueller

Google has been emphasizing the importance of mobile design and usability over the past year and now the search giant has added mobile usability reports to Webmaster Tools. Many believe this could be a sign that Google may be making mobile usability a ranking factor sooner rather than later.

The tool is intended to show whether your mobile site has any of the common usability issues that degrade a user’s mobile browsing experience.

Currently, the tool includes specific errors for Flash content shown on mobile (which can also trigger a warning in mobile search results for your site), a missing viewport meta tag on mobile pages, fonts too small to read comfortably on mobile, fixed-width viewports, content not sized to the viewport, and clickable links and buttons spaced too closely together.
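For reference, the missing-viewport error is typically resolved by declaring a viewport in the page's head. A minimal sketch of a responsive setup (the values shown are common conventions, not anything Google mandates):

```html
<head>
  <!-- Tells mobile browsers to render at device width instead of a zoomed-out desktop layout -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    /* Avoid fixed pixel widths so content sizes itself to the viewport */
    body { max-width: 100%; font-size: 16px; }
  </style>
</head>
```

A page without this tag is usually rendered at a desktop width and scaled down, which produces the tiny-font and content-sizing errors the report flags.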

John Mueller from Google’s Webmaster Trends Analyst team based in Zurich said they “strongly recommend you take a look at these issues in Webmaster Tools.”

Of course, Mueller could simply be encouraging this because it improves user experience, but there is strong evidence to suggest Google will eventually make mobile user experience a ranking signal within search engine algorithms.

You can see an example of the reports below:

Mobile Usability Reports

Much has been made out of the announcement that Google would include switching from HTTP to HTTPS in their ranking algorithm. Despite clearly stating that the factor would be lightweight in the initial announcement, the possibility of a relatively easy rankings boost drove lots of people to make the switch immediately.

In the aftermath, studies from analytics groups such as SearchMetrics have suggested that any effect the switch might have is largely unnoticeable. Now, Google’s John Mueller has basically admitted that the signal is currently too lightweight to have any noticeable effect, though that may change at some point in the future.

At 22 minutes and 21 seconds into a recent video hangout, Mueller explained that HTTPS is a ranking signal, but only a “very lightweight signal,” and there are no immediate plans to change that.

Jennifer Slegg was the first to report Mueller’s statement and transcribed it:

I wouldn’t expect any visible change when you move from http to https, just from that change, just from SEO reasons. That kind of ranking effect is very small and very subtle. It’s not something where you will see a rise in rankings just from going to https.

I think that in the long run, it is definitely a good idea, and we might make that factor stronger at some point, maybe years in the future, but at the moment you won’t see any magical SEO advantage from doing that.

That said, anytime you make significant changes in your site, change the site’s URLs, you are definitely going to see some fluctuations in the short term. So you’ll likely see some drop or some changes as we recrawl and reindex everything. In the long run, it will settle down to about the same place, it won’t settle down to some place that’s like a point higher or something like that.
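The recrawl-and-reindex process Mueller describes is usually triggered by site-wide 301 redirects from the HTTP URLs to their HTTPS equivalents. A hedged sketch, assuming an Apache server with mod_rewrite enabled (other servers have their own equivalents):

```apache
# Redirect every HTTP request to the same path over HTTPS with a 301
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
```

Using a permanent (301) redirect rather than a temporary (302) one is what signals to crawlers that the HTTPS URLs are the new canonical versions.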

You can see the video below:

Google Authorship

There was a time not too long ago when every SEO professional felt confident proclaiming that Authorship was the future of search, but it appears the predictions couldn’t have been much more incorrect.

When Google was pushing Authorship as a part of their search system, it frequently repeated that authorship information would help users identify more trustworthy sources and improve the quality of results. In the end, it was ultimately little more than a picture and name next to content and was often ignored by users.

This problem was reflected in the confirmation by Google’s John Mueller that authorship information will be entirely stripped out of search results. In the statement, Mueller explains:

“Unfortunately, we’ve also observed that this information isn’t as useful to our users as we’d hoped, and can even distract from those results. With this in mind, we’ve made the difficult decision to stop showing Authorship in search results.”

If we are being honest, the vast majority of Google users probably won’t even notice a difference and site owners shouldn’t be too concerned since Authorship didn’t help increase traffic to pages. But it has received considerable attention from the online marketing community because it seemed like a common sense and simple way to improve listings. In the long run however, it just didn’t work.

Mueller did clarify that Google will continue focusing on Schema.org structured markup, saying: “This markup helps all search engines better understand the content and context of pages on the web, and they’ll continue to use it to show rich snippets in search results.”

Duplicate content has been an important topic for webmasters for years. It should be absolutely no secret by now that duplicate content is generally dangerous to a site and usually offers no value, but there are occasional reasons for duplicate content to exist.

Of course, there are very real risks with hosting a significant amount of duplicate content, but often the fear is larger than the actual risk of penalties – so long as you aren’t taking advantage and purposely posting excessive duplicate content.

Google’s John Mueller puts the risk of using duplicate content in the best context. According to John, there are two real issues with duplicate content.

The first issue is that Google’s algorithms typically automatically choose one URL to show for specific content in search, and sometimes you don’t get to choose. The only way you can effectively let Google know your preference is by using redirects or canonical tags, and that isn’t foolproof.
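As an illustration, signaling your preferred URL for duplicate pages usually means adding a canonical link element to the head of each duplicate (the URLs here are hypothetical examples):

```html
<!-- On https://example.com/product?ref=newsletter, point search engines
     at the preferred version of this content -->
<link rel="canonical" href="https://example.com/product">
```

As Mueller notes, this is a hint rather than a directive, which is why it isn’t foolproof — Google’s algorithms can still choose a different URL to show.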

Secondly, if you are hosting a ton of duplicate content, it can make the crawling process overwhelming for the server, which will keep new content from being noticed as quickly as it should be.

Still, John said that in most cases, “reasonable amounts of duplication […] with a strong server” is not a huge problem, as “most users won’t notice the choice of URL and crawling can still be sufficient.”

Every week we try to keep you updated with all the SEM news from around the web, but the Google I/O event this week was packed with so much information there was bound to be some stuff we didn’t get to cover. Today, we’re going to cover all the latest news from I/O and everywhere else. We’ll start with Google, but there is also some interesting Facebook news to discuss further down.

Android Reaches 1 Billion Active Users Per Month

To open the annual I/O developers conference, Google rattled off an impressive list of statistics, as large tech companies tend to do. Of these statistics, there is one that was noteworthy and shows just how prominent smartphones have become throughout the world.

At last year’s conference, Sundar Pichai told the audience that Android had around 530 million active users per month. Over the past year, that number has almost doubled, surpassing one billion active users every month.

Other interesting stats include:

  • Android users send 20 billion texts per day.
  • They take 93 million selfies per day.
  • Android users take 1.5 trillion steps each day.
  • They check their phones 100 billion times per day.
  • Android tablets represent over 60 percent of all tablets shipped.

If you’re wondering what everyone is doing on their phones, you can be assured that it involves apps. comScore released a report this week showing that over half of all digital media time is now spent using apps, though it’s unclear whether more time is spent on Angry Birds or The Wall Street Journal’s news app.


Google Will Remove Author Images From Search Results

danny-goodwin-google-authorship-pic

Google is continuing to push an emphasis on authorship authority, but don’t expect to continue seeing author images in your search results for much longer. In the next few days, Google will be stripping the author images and Google+ circle counts from desktop and mobile search results in an attempt to streamline search result appearances. In the announcement on Google+, John Mueller said:

“We’ve been doing lots of work to clean up the visual design of our search results, in particular creating a better mobile experience and a more consistent design across devices,” he wrote. “As a part of this, we’re simplifying the way authorship is shown in mobile and desktop search results, removing the profile photo and circle count.”

Danny Goodwin from Search Engine Watch shared an example showing what the results looked like before (shown above) and what they will look like in the very near future (below).

john-resig-google-authorship-pic

Google is Testing Their Domain Registry Service

Google has expanded into seemingly every facet of online activity, but up until now they have left domain registration to other service providers. That won’t stay the case for much longer as Google recently announced they will be inviting a limited number of people to test their new service called … wait for it… Google Domains.

As the shockingly creative name suggests, Google Domains will let users search, find, purchase, and transfer the domain or domains best representing their business. The service is still being built, which is part of why you shouldn’t expect it to come out of testing in the near future. But the service could potentially make the act of creating a website and establishing a company presence online much more approachable for the 55% of businesses that still don’t have websites.

Google My Business Comes To iOS and Android


Google announced Google My Business earlier this month, and on Wednesday the official apps for the service came out on both Android and Apple smartphones. The apps offer a unified interface that will make it easier to manage your brand’s online presence on the go. App features include:

  • Edit the business listing by changing hours, description, etc.
  • View managers of the page but not manage them
  • Post to Google+
  • Add photos and update cover and profile photos
  • View local insights and analytics
  • Change business pages and accounts

Watch Time Now Influences Facebook News Feed Video Rankings

Facebook is constantly working on its News Feed algorithm and the latest update is intended to improve the quality of videos being served to users. The new video ranking takes note of how long users watch a video for and uses this information to influence content ranking. This metric will be combined with other long-standing metrics such as likes, comments, and shares.

In the announcement, Facebook said, “In our early tests, this improvement resulted in more people watching more videos that are relevant to them.” The announcement also noted that twice as many people watch videos on Facebook compared to the numbers from six months ago.

Wait, So Facebook is Still Popular With Teens?

You’ve undoubtedly heard the rumors, studies, and proclamations that Facebook is losing traction with teens like a car driving off a cliff. From what everyone wants to believe, all the young kids are moving to the burgeoning messaging apps to escape the prying eyes of their parents and lame extended family. Even President Obama has commented that kids “don’t use Facebook anymore.”

The problem with all this is, the numbers don’t entirely support that conclusion and two reports from the past week confirm that Facebook is still the most popular social site for the demographic by far.


First, Forrester Research released a report showing that more than 75% of US online youth use Facebook at least once a month. Their survey polled 4,517 internet users between the ages of 12 and 17 and found that Facebook has twice as many users as Pinterest, Tumblr, or Snapchat, and more than Instagram and WhatsApp combined.


The second study, from the college and K-12 education review site Niche, found similar results from a survey of 7,000 teenage users. Specifically, 87% still use Facebook “occasionally” and 61% use it daily (including the 47% who use it “a few times a day”).

There is no escaping a Google penalty without following the rules and doing the hard work to clear your name. It is old news that if you have suffered a penalty on your site and you move to a new domain, redirecting the URLs to that new domain keeps the penalty around thanks to the redirects.

However, some SEOs may attempt to escape the penalty by moving their site to a new domain without redirecting the URLs. That way, theoretically, Google can’t follow you and apply the penalty to your moved site too. Unfortunately, Google’s John Mueller has quashed that notion pretty thoroughly.

In a recent Google Webmaster Hangout, Mueller explained that even if you move the site and don’t redirect, Google may still find you and apply the penalty again. Roughly 23 minutes into the video below, John answers Barry Schwartz’s question on the topic by explaining that if you just copy and paste the pages onto a new site without many changes, Google may still be able to pick up on the site move. Even if you do not set up 301 redirects or use the change address tool in Google Webmaster Tools, Google could potentially still know you moved domains and pass along the penalty.

John does explicitly explain in the video that Google doesn’t rely on their signals alone to pass along the penalty. But, if they receive signs that you are trying to hide the same site elsewhere under a new URL without fixing the core issues with your site, they will investigate and likely apply the penalty.

In the end, you will have to do the hard work you are avoiding to ever get rid of the penalty completely. It may sometimes be better to completely tear down a site and start from scratch to fix the issues that earned you the penalty to begin with, but you can’t just run away from it forever.

You can view the video below or here.

Google's John Mueller Courtesy of Google+


Recently I discussed a common issue sites have where a misplaced noindex tag on the front page of a site can keep search engines from crawling or indexing your site. It happens all the time, but it isn’t the only reason your site might not be crawled. The good news is there is little to no long term damage done to your site or your SEO, according to a recent statement from Google’s John Mueller.

Barry Schwartz noticed Mueller had responded to a question on the Google Webmaster Help forums from an employee of a company that had accidentally blocked Googlebot from crawling and indexing their site. In John Mueller’s words:

From our point of view, once we’re able to recrawl and reprocess your URLs, they’ll re-appear in our search results. There’s generally no long-term damage caused by an outage like this, but it might take a bit of time for things to get back to “normal” again (with the caveat that our algorithms change over time, so the current “normal” may not be the same state as it was before).

So don’t worry too much if you discover your site has been having problems with crawling or indexing. What matters is how quickly you respond and fix the problem. Once the issue is solved, everything should return to relatively normal. Of course, as Mueller mentions, you might not return to your exact same state, because these things are always fluctuating.

 

Stop Sign

Source: Sharon Pruitt

Sometimes the source of the problem is so glaringly simple that you would never consider it. This is the case of many webmasters frustrated with their sites not being indexed or ranked by search engines. While there are numerous more technical reasons search engines might refuse to index your page, a surprising amount of time the problem is caused by you telling the search engine not to index your site with a noindex tag.

This is frequently overlooked, but it can put a complete halt to your site’s rankings and visibility. Thankfully, it is also very easy to fix. The biggest hassle is actually finding the noindex tag, as it can be hard to spot when redirects are involved. You can use an HTTP header checker tool to inspect the response before the page redirects.
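For context, the tag in question is a one-line robots meta directive, and the same instruction can also arrive as an X-Robots-Tag HTTP response header — which is why checking headers before a redirect fires matters. A sketch of both forms:

```html
<!-- In the page's <head>: tells search engines not to index this page -->
<meta name="robots" content="noindex">

<!-- The equivalent instruction can also be sent as an HTTP response header,
     where it never appears in the page source at all:
     X-Robots-Tag: noindex -->
```

If neither the page source nor the response headers contain a noindex directive, the indexing problem lies elsewhere.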

Don’t be embarrassed if this small mistake has been keeping you down. As Barry Schwartz mentions on SEO Roundtable, there have been large Fortune 500 companies with these same problems. John Mueller also recently ran into someone with a noindex on their homepage. He noticed a thread in the Google Webmaster Help forums where a site owner had been working to fix his problem all day with the help of the other forum members. John explained the problem wasn’t nearly as complex as everyone else had suggested. It was much more obvious:

It looks like a lot of your pages had a noindex robots meta tag on them for a while and dropped out because of that. In the meantime, that meta tag is gone, so if you can keep it out, you should be good to go :).

When you encounter a problem with your site ranking or being indexed, it is always best to start with the most obvious possible causes before going to the bigger and more difficult mistakes. While we all like to think we wouldn’t make such a simple mistake, we all also let the small things slip by.

This Monday, site owners looking for advice will have the opportunity to have their websites briefly reviewed by Google, as John Mueller announced on Google+. The short site reviews will take place November 18th at 10am EDT and will last one hour. Search Engine Land suggests the event will be led by Mueller, though no one is quite sure what format the event will take.

To have your site reviewed, you have to add the site to this Google Moderator page. Then, if Google has the time and chooses your site, it will be reviewed live this upcoming Monday via Google+ Hangouts.

You can also RSVP for the event by going to this page and adding it to your calendar.

John’s statement explained the event, saying:

For this hangout, we’ll review sites that are submitted via the moderator page and give a short comment on where you might want to focus your efforts, assuming there are any issues from Google’s point of view :).

There is a misconception amongst a small few that Google only wants the absolute best websites and doesn’t index websites they think aren’t worth their time or space in their index. In reality, this is far from the truth.

Google is always indexing content and they index pretty much anything they can find. Supposedly, the only thing they don’t index is spam.

SEO Roundtable pointed out that Google’s John Mueller commented in a Google Webmaster Help thread recently saying “unless the content is primarily spam (eg spun / rewritten / scraped content), we’d try to at least have it indexed.”

He was responding to a question about a site not being fully indexed over a prolonged period of time, which he believes is the result of a bug, though he won’t have any definite answers until it is shown to the indexing team.

Before anyone gets up in arms, that statement is a little misleading on the subject of spam. Everyone knows Google still indexes their fair share of spam, and in some cases it even gets ranked. Mueller’s comments instead show how Google tries to avoid adding spam to their index, but it is obvious that they don’t succeed in keeping all of the junk out.

Getting indexed isn’t the same as ranking, but to have any chance of being ranked you have to be indexed.