
What’s the best way to rank highly right now, according to Google? Most SEO professionals would say one of two things: creating a quality site will get your site ranked highly, and quality content is the most powerful way to improve the quality and value of your site.

According to Ryan Moulton, a software engineer at Google who Barry Schwartz from SEO Roundtable implies works in the search area, high quality content doesn’t necessarily work like that.

The assumption is that the “high quality” content Google favors is the most accurate and informative text available. But Moulton says that assumption misunderstands, or forgets about, actual usefulness.

He was defending Google in a Hacker News thread asking why Google ranks some sites highly even though their content isn’t entirely accurate and is, in some people’s eyes, low quality. He explains that some sources may be the most accurate, but they are often far too high-minded for the average searcher.

He states, “there’s a balance between popularity and quality that we try to be very careful with. Ranking isn’t entirely one or the other. It doesn’t help to give people a better page if they aren’t going to click on it anyways.”

Ryan then continues with an example:

Suppose you search for something like [pinched nerve ibuprofen]. The top two results currently are mayoclinic.com and answers.yahoo.com.

Almost anyone would agree that the mayoclinic result is higher quality. It’s written by professional physicians at a world renowned institution. However, getting the answer to your question requires reading a lot of text. You have to be comfortable with words like “Nonsteroidal anti-inflammatory drugs,” which a lot of people aren’t. Half of people aren’t literate enough to read their prescription drug labels: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1831578/

The answer on yahoo answers is provided by “auntcookie84.” I have no idea who she is, whether she’s qualified to provide this information, or whether the information is correct. However, I have no trouble whatsoever reading what she wrote, regardless of how literate I am.

Google has to balance many factors in their search results, and the simple fact is most searchers aren’t looking for comprehensive scientific explanations for most of their problems. They want the most relevant information for their problem in terms they can understand.

It should be noted that Google does surface these academic sources in other areas of its search, but if you are writing content for the main search results, it needs to be accessible to your audience. An SEO news source can get away with technical language to an extent, because its readers have likely already built a vocabulary for the topic.

However, if you are offering a service or attempting to educate the general public about your field, you need to use terms they can easily understand without a dictionary and address their needs head-on.
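If you want a rough, numerical sanity check on how accessible your copy is, readability formulas can help. The article doesn’t mention any particular method, but here is a minimal Python sketch of the classic Flesch Reading Ease score, using a crude vowel-group syllable counter; higher scores roughly mean easier reading:

```python
# Rough sketch (not from the article) of one way to gauge how accessible
# a piece of copy is: the Flesch Reading Ease score. Scores around 60+
# generally indicate text most adults can read comfortably. The syllable
# counter below is a crude vowel-group heuristic, so treat the output as
# a ballpark figure, not a precise measurement.
import re

def count_syllables(word: str) -> int:
    """Approximate syllables by counting groups of consecutive vowels."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease(text: str) -> float:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

plain = "Take ibuprofen with food. It can ease the pain of a pinched nerve."
clinical = ("Nonsteroidal anti-inflammatory drugs may alleviate discomfort "
            "associated with cervical radiculopathy.")
print(round(flesch_reading_ease(plain), 1))     # scores noticeably higher
print(round(flesch_reading_ease(clinical), 1))  # scores much lower (harder to read)
```

The plain-language sentence about ibuprofen scores far higher than the clinical phrasing, mirroring the Yahoo Answers versus Mayo Clinic contrast above.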

There is still certainly a place for more extensive content. For instance, the Mayo Clinic and WebMD still rank higher than Yahoo Answers for most medical searches, simply because they are more reliable.

It has always been a little unclear how Google handles its international market. We know they have engineers across the world, but anyone who has tried to search from outside the US knows the results can seem like what Americans would have seen five years ago: a few good options mixed with a lot of spam. That’s a bit of hyperbole, but Matt Cutts says we can expect those results to keep getting better.

According to Cutts’ recent Webmaster Help video, Google fights spam globally using algorithms and manual actions taken by Google employees covering more than 40 regions and languages around the world. They also try to ensure all of their algorithms work in every language, rather than just English.

SEO Roundtable points out that you could see this international attention when Panda originally rolled out: at first it only affected English queries, but it was released for other languages soon after. With Penguin’s release, however, all countries saw the update on the same day.

Matt Cutts did concede that English-language queries in Google receive more attention, which has always been fairly obvious and understandable. There are far more searchers using English, and it is the native language of most of the company’s engineers.

Image Courtesy of Martin Pettitt

It has been well over a month since Penguin 2.0 was unleashed upon the world, and the search industry is still reeling from the algorithm update aimed at link profiles, low-quality backlinks, and over-optimized anchor text.

The average estimate says Penguin 2.0 affected over 2 percent of all English queries. That doesn’t sound like much, but when SEO Roundtable took a poll in late May, over half of their readers said they had been hit by the changes.

First, it should be said that some portion of those may have been affected by a separate algorithm update released shortly before the new version of Penguin, but that update was aimed at typically spammy sectors like payday loans and pornography.

The majority of those saying they were affected by Penguin, however, were most likely correct about their recent drop in rankings or loss of traffic. Either that, or far too many respondents were misreading their data or somehow unaware that their payday loan site might be targeted by Google. Let’s assume that’s not the case, because it sounds highly unlikely.

But, time has passed since Penguin came out. I’ve seen at least 10 articles detailing how to recover from Penguin, and numerous others focused on all the areas Penguin targeted. We should all be getting back to normal, right?

According to the recent SEO Roundtable poll on the topic, that is not the case. Over 60 percent of respondents said they haven’t recovered from the algorithm update, and only 7.5 percent said they have fully recovered.

What does this mean? Well, the respondents are clearly SEO-informed people who keep up to date with the latest blogs, since they responded to a poll on one of the more reputable sites covering the issue. One major issue is that full recovery from Penguin isn’t possible for many of those affected until the next refresh. It is hard to know when that refresh will happen; it may not be until the next update is announced.

The other issue is simply that the articles telling SEOs how to recover from Penguin range from completely valid to “how to try to cheat the new system,” which can be confusing for inexperienced or uninformed SEOs. The best suggestion for solving this problem is paying close attention to which sites you are reading and always taking the more conservative advice.

A WebmasterWorld thread from roughly a month ago brings up an interesting question for us SEO professionals. While we focus on the algorithms we know about, such as Penguin or Panda, it has long been suggested that Google could also be using different ranking factors depending on the industry a site fits within. In other words, sites for roofing companies would be reviewed and ranked according to different standards than sites for tech companies.

Well, Matt Cutts, the head of Google’s webspam team and a trusted engineer, took to that thread to dispel the rumors. He doesn’t deny that Google has “looked at topic-specific ranking.” Instead, he says scaling was the issue. In his answer, Cutts explains, “We have looked at topic-specific ranking. The problem is it’s not scalable. There’s a limited amount of that stuff going on — you might have a very spammy area, where you say, do some different scoring.”

He continued, “What we’re doing better is figuring out who the authorities are in a given category, like health. If we can figure that out, those sites can rank higher.”

While Google says they aren’t using different algorithms for different industries, it has been announced that Google uses Subject Specific Authority Ranking, which helps authorities on a given topic be identified as the most reputable sources on that subject.

Of course, looking at the comments on SEO Roundtable, which reported on the WebmasterWorld thread, it is clear many don’t necessarily believe Cutts’ statement. Some say they have “always seen a difference in industry types,” while others argue that different industries necessitate different ranking factors or algorithms because of the resources available to them. For example, industrial companies don’t tend to run blogs, so creating new content through blogging shouldn’t be rewarded as heavily as it is in topics like health and tech, where new information is constantly coming out.

For now, all we have to go on is Cutts’ word and our own experiences. Do you think Google is using different algorithms depending on industry? Do you think they should be?

There is a misconception amongst a small few that Google only wants the absolute best websites and won’t index sites it thinks aren’t worth the time or space in its index. In reality, this is far from the truth.

Google is always indexing content and they index pretty much anything they can find. Supposedly, the only thing they don’t index is spam.

SEO Roundtable pointed out that Google’s John Mueller commented in a Google Webmaster Help thread recently saying “unless the content is primarily spam (eg spun / rewritten / scraped content), we’d try to at least have it indexed.”

He was responding to a question about a site not being fully indexed over a prolonged period of time, which he believes is the result of a bug, though he won’t have any definite answers until it is shown to the indexing team.

Before anyone gets up in arms, that statement is a little misleading on the subject of spam. Everyone knows Google still indexes its fair share of spam, and in some cases spammy pages even rank. Mueller’s comments instead show that Google tries to avoid adding spam to its index, but it is obvious they don’t succeed in keeping all of the junk out.

Getting indexed isn’t the same as ranking, but to have any chance of being ranked you have to be indexed.

After Google launched the disavow links tool in mid-October, there were a fair number of doubters and people claiming it doesn’t work. Dixon Jones from Majestic SEO, a well-known and respected member of the SEO community, posted in a Google+ thread that he used it for a site and it worked to remove a manual link penalty.

SEO Roundtable has the exact posts he made explaining the process and what happened. For Jones, the manual penalty was removed fairly efficiently, but those wanting to use it for a Penguin link penalty may need to wait for a new refresh.

The best option is of course to try to avoid link penalties or bad links, but the world isn’t perfect. If you’ve dealt with the root issue which caused the penalty, the disavow links tool may just be the solution you need.
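For anyone who hasn’t used it, the disavow links tool takes a plain text (.txt) file listing the links you want Google to ignore, one URL or domain per line, with lines starting with # treated as comments. The domains below are made up purely for illustration; the format itself is what Google documents:

```
# Spammy directory links we asked the webmaster to remove, with no response
http://spam.example.com/stuff/comments.html
http://spam.example.com/stuff/paid-links.html

# Disavow every link from this entire domain
domain:shadyseo.example.net
```

Once the file is uploaded through the disavow tool in Webmaster Tools, Google treats the listed links as a strong suggestion to ignore them when assessing your site.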