Tag Archive for: matt cutts

Image Courtesy of Wikipedia Commons


Penguin 2.0 only affected 2.3% of search queries, but judging by the online response you would think it did much more. Beyond all of the worrying before the release, there have been tons of comments about the first-hand effects many seem to be dealing with in the post-Penguin 2.0 web. Those stung by the new Penguin algorithm have even accused Google of releasing the update only to increase their profitability.

Matt Cutts, head of Google’s Webspam team, used his recent Webmaster Chat video to attack that idea head on. The main question he was asked was what aspect of Google updates he thinks the SEO industry doesn’t understand. While Cutts expresses concern about the number of people who don’t get the difference between algorithm updates and data refreshes, his main focus is the notion that Google is hurting web owners to improve their profits.

Most notably, the algorithm updates simply aren’t profitable. Google saw revenue decrease after almost all of their recent updates, but Cutts says money isn’t the focus. Google is aiming to improve the quality of the internet experience, especially search. While site owners using questionable methods are upset, most searchers will hopefully feel the updates have improved their experience, which will keep them coming back to Google.

As for the misunderstanding between algorithm updates and data refreshes, Cutts has expanded on the problem elsewhere. The biggest difference is that an algorithm update changes how the system works, while a data refresh only changes the information the system is using or seeing.

Cutts was also asked which aspect of SEO we are spending too much time on, which led him to one of the main practices Penguin focuses on: link building. Too many SEOs still put too much faith in that single practice, even as it is being overtaken by other areas that more directly affect the quality of users’ experiences, such as creating compelling content. Instead, Cutts urges SEOs to pay more attention to design and speed, emphasizing the need to create the best web experience possible.

Cutts’ video is below, but the message is that Google is going to keep growing and evolving, whether you like it or not. If you listen to what Google tells you about handling your SEO, you may have to give up some old habits, but you’ll spend much less time worrying about the next algorithm update.

Well, the big event that the SEO community has been talking about for weeks has finally hit and everything is… mostly the same, unless you run sites known for spammy practices like porn or gambling. Two days ago, Google started rolling out Penguin 2.0. By Matt Cutts’ estimate, 2.3 percent of English-U.S. queries were affected.

While 2.3 percent of searches doesn’t sound like a lot, in reality that means thousands of websites being hit with penalties and sudden drops in the rankings. But if you’ve been keeping up with Google’s best practices, chances are you are safe.

Nonetheless, in SEO it is always best to stay informed about these types of updates, and Penguin 2.0 does change the way Google handles search a bit. To fill everyone in on all the details, Search Engine Journal’s John Rampton and Murray Newlands made a YouTube video covering everything you could want to know about Penguin 2.0.

Oh, and if you’ve been wondering why it’s called Penguin 2.0: Cutts explains that while this is the fourth Penguin-related launch Google has done, it gets the 2.0 name because it is an updated algorithm, not just a data refresh.

Google is always fighting to maintain diversity on their search engine results pages (SERPs). Over time it has proven difficult to walk the line between offering searchers the content they want in easily browsable form and keeping the big established sites from completely dominating the results.

Matt Cutts, head of Google’s Webspam team, recently used one of his YouTube videos to talk about how Google is managing this, and highlight an upcoming change that will hopefully keep you from getting pages full of essentially the same results. No one wants to see eight results from Yelp when they are looking for a restaurant review.

The change Google is making is aimed at making it harder for multiple results from the same domain name to rank for the same terms. Basically, once you’ve seen three or four results from a domain, even over the spread of a few results pages, it will become increasingly harder for any more pages from that domain to rank.

If you don’t quite get what this means, it is easier to understand in context. In the video, Cutts walks us through the history of Google’s domain result diversity efforts. The video also shows how Google tries to bring you the most authoritative and reputable search results without allowing bigger brands to form monopolies on the results.

You can see the full breakdown of the domain diversity history at Search Engine Land or in Cutts’ video, but basically, when Google started out there were no restrictions on the number of results per domain. It quickly became apparent that this system didn’t work, because you would get page upon page of results from the single highest-ranked domain. Then came different forms of “host clustering,” which prevented more than two results per domain from being shown in the search results, but this was easily worked around by spammers.

More recently, Google has used a sort of tiered system: the first SERPs for a term are as diverse as possible, allowing only a few results from the same domains, but as you progress into the later result pages, more and more results from repeat domains are allowed. Now, Google is tightening the belt and making it harder for those repeat domains to appear even on the later SERPs.
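The host-clustering idea described above can be sketched as a simple re-ranking pass that caps how many results a single domain may hold. This is an illustrative toy in Python, not Google's actual algorithm; the function name `diversify` and the cap value are invented for the example.

```python
from collections import defaultdict

def diversify(ranked_results, max_per_domain=4):
    """Re-rank a result list so no domain exceeds max_per_domain entries.

    ranked_results: list of (domain, title) tuples, best first.
    Results beyond the cap are demoted to the back of the list,
    rather than removed, mimicking the 'harder to rank' behavior.
    """
    seen = defaultdict(int)
    kept, demoted = [], []
    for domain, title in ranked_results:
        if seen[domain] < max_per_domain:
            seen[domain] += 1
            kept.append((domain, title))
        else:
            demoted.append((domain, title))
    return kept + demoted

results = [("yelp.com", "Review 1"), ("yelp.com", "Review 2"),
           ("yelp.com", "Review 3"), ("chowhound.com", "Thread")]
print(diversify(results, max_per_domain=2))
# chowhound.com now appears before the third Yelp result
```

A real implementation would also need to recognize that subdomains and country variants belong to the same site, which is part of what made early host clustering easy to game.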

The Short Cutts

Are you familiar with Matt Cutts, the head of the Google Webspam team, and his YouTube videos? I share them here frequently, but even the ones I write about are just a selection of his best. Since he started making the short informative videos in 2009, Cutts has made over five hundred of them.

Five hundred videos is a lot to sort through, and YouTube isn’t the best at helping you navigate large numbers of videos, so Cutts’ videos were starting to get a bit jumbled. That’s why the online marketing company Click Consult created The Short Cutts, a site that organizes all of the Cutts videos into an easily usable resource for any SEO question you may have.

For anyone not already aware of Cutts’ YouTube posts, they all follow the same pattern: a Google user asks a question about a topic, and Cutts answers it as well as he can within two or three minutes. Some question the usefulness of the videos because Cutts often can’t go into depth in that short time, but anyone can understand how important it is to hear answers to common SEO questions straight from his mouth, even if they are a little vague.

Possibly the best part of The Short Cutts is the way it displays each video above two blocks of text that can give you a quick answer. The first block is the question Cutts was asked, and the second gives a quick “yes or no” type answer that can resolve many of the simpler issues.

Another day, another Matt Cutts Google Webmaster Help video to talk about. This recent one deals with how SEO professionals pay close attention to any new Google patent remotely related to search or search quality, then speculate until some come to believe very incorrect ideas about how Google operates.

Cutts was asked what SEO misconception he would most “like to put to rest,” and you could almost see the relief in his eyes as he began explaining that patents aren’t necessarily put into practice.

“Just because a patent issues that has somebody’s name on it or someone who works at search quality or someone who works at Google, that doesn’t necessarily mean we are using that patent at that moment,” Cutts explained. “Sometimes you will see speculation Google had a patent where they mentioned using the length of time that a domain was registered. That doesn’t mean that we are necessarily doing that, it just means that mechanism is patented.”

Basically, SEO professionals, especially bloggers and writers, have a habit of speculating based on patents they see have been filed. That speculation can grow into tips and suggestions about how to run your website, all stemming from a patent that isn’t in use, and it comes together to create widespread misinformation.

For example, consider the speculation that comes every time Apple files patents for future phones. While Apple has recently had trouble with physical prototypes leaking in various ways, in the past it kept its secrets well guarded, and the speculation based on its patents was often outlandish and frequently completely wrong.

That doesn’t mean you can’t learn from and make predictions based on patents, especially if you see indicators that one has been implemented, but it is important to take every patent with a grain of salt. While Google has patented these mechanisms, unless you see evidence they are in use, they probably aren’t worth getting worked up over.

While quality SEO is a complex, time-consuming job, there are many types of SEO that any site owner can do. There are also a lot of basic mistakes that site owners regularly make while trying to optimize their own page.

To help prevent these easily corrected mistakes, Matt Cutts, head of Google’s Webspam team, devoted one of his recent YouTube videos (which you can watch below) to identifying the five most basic SEO mistakes anyone can make.

1) Not Making Your Site Crawlable – According to Cutts, the most common mistake “by volume” is simply not making your site crawlable by Google, or not even having a domain to begin with.

The way Google learns about sites is through web “crawlers” that index pages by following links. If you don’t provide links allowing Google’s bots to find your site, it won’t know what is there. If you can’t reach content by clicking normal links on the page in a text browser, it might as well not exist to Google.
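The “reachable by clicking normal links” test can be illustrated with a tiny Python sketch using the standard library’s `html.parser`: it collects only plain `<a href>` targets, roughly what a text browser (or a crawler that ignores JavaScript) can discover. The sample HTML and the class name `LinkExtractor` are made up for the example.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href targets from ordinary <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

# A plain link next to a JavaScript-only "link"
page = '<a href="/menu">Menu</a> <span onclick="showHours()">Hours</span>'
parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # only /menu is discoverable; the onclick span is not
```

Running something like this over your own pages is a quick way to see your site the way a link-following crawler does: anything missing from the extracted list may as well not exist.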

2) Not Using Words People Are Searching For – Google also tries to connect people with the most relevant information for the exact search they used. If someone searches “how high is Mount Everest,” they will be connected with a site using those exact words on a page before they will be suggested a page using just “Mount Everest elevation.”

My favorite example Cutts uses is a restaurant’s website, mainly because many restaurants have very minimal websites that are badly in need of optimization and a bit of a design overhaul. When people look for a restaurant to eat at, they search for a few things: mainly the location, menu, and hours. If the page lists those in plain text, Google will index that information and direct more people to the site than to sites with PDF menus or no information at all.

3) Focusing On Link Building – One of the biggest buzzwords in SEO is link building. It is one of the oldest strategies, and Google’s constant algorithm changes keep it in the news, but it may actually be dragging you down.

When people think link building, they cut themselves off from many other ideas and marketing options that can boost a site just as much. Cutts suggests focusing instead on general marketing. If you make your website more well-known and respected within your community, you will attract real people, which brings organic links that search engines respect much more.

4) Bad Titles and Descriptions – Many people neglect their titles and descriptions, assuming they will either be automatically filled in or won’t matter in the long run. If your website says “untitled” in the title bar, it will also say “untitled” in a bookmarks folder as well as in actual search results. Now ask yourself: would you click on a website without a title?

Similarly, page descriptions are often blank or copied and pasted straight from the page with no context. Your description should entice people to click on your page and show that you have the answer to the question they are searching for. If people can build entire followings around 140-character tweets, you should be able to make someone want to click your page with a 160-character description.
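These two mistakes are easy to check for mechanically. The Python helper below is a hypothetical sketch (the name `audit_meta` is invented, and the hard 160-character cap is an assumption mirroring the rough snippet length mentioned above; the length Google actually displays varies):

```python
def audit_meta(title, description, max_desc=160):
    """Flag the basic title/description mistakes described above."""
    problems = []
    if not title or title.strip().lower() == "untitled":
        problems.append("missing or placeholder <title>")
    if not description:
        problems.append("empty meta description")
    elif len(description) > max_desc:
        problems.append(f"description is {len(description)} chars; "
                        f"aim for {max_desc} or fewer")
    return problems

print(audit_meta("untitled", ""))
print(audit_meta("Joe's Diner | Menu & Hours",
                 "Family diner in Wichita. See our menu, hours, and location."))
```

A script like this run across every page of a site catches the “untitled” pages and blank descriptions long before they show up in search results.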

5) Not Using Webmaster Resources – This problem can only be born out of ignorance or laziness. There are countless SEO resources available, and most of them are free. The best resources anyone can turn to are the Webmaster Tools and Guidelines that Google offers, but you shouldn’t stick to just those either. There are blogs, webinars, videos, and forums all happy to teach you SEO; you just have to use them. If you’re reading this, however, you probably don’t have this problem.

Conclusion

The most common SEO problems, according to Cutts, are also the simplest problems imaginable. There are resources available to help you fix all your basic SEO problems, and you’ll learn more and get better by finding them and practicing. If you’re still figuring out how to make your site crawlable, you have a long way to go, but if you keep working at it, you’ll be an SEO pro eventually.

Help!

Google has been getting some bad press lately surrounding their penalty notices. The notices are notoriously vague, and the topic came to the surface after the BBC received an “unnatural link” warning last month due to links pointing to a single page on its site, and Mozilla was notified of a “manual” penalty this week because Google identified a single page of spam on their site.

In both cases, the penalties were applied only to the individual pages in question, but that information wasn’t included in the notices, which makes for obvious concern. These cases also pinpoint one of the biggest issues with issuing notices without specifically identifying the problem for site owners: with millions of pages of content, trying to identify the problem pages is a needle-in-a-haystack situation.

Many have been concerned about the ambiguous notices, and Google has said they will work to improve their transparency, but what do you do if you get a notice that says you have been penalized but doesn’t tell you exactly where the problem is? Matt Cutts, head of Google’s web spam team, says you should start at Google’s webmaster help forum.

If help can’t be found in the webmaster help forums, Cutts says filing a reconsideration request could result in more information and possibly advice, though he concedes, “we don’t have the resources to have a one-on-one conversation with every single webmaster.”

This is notable because many believed in the past that filing a reconsideration request after a penalty was a one-time attempt to restore your site’s name. Many speculated that Google would not be keen on reviewing repeated requests, and that site owners should only file for reconsideration once they are sure they have solved the issues. According to Cutts, this doesn’t seem to be the case.

Telling site owners to turn to a forum or file requests in the hope of being given extra information doesn’t seem like very consistent advice for overcoming a penalty. Luckily, there are other ways to investigate what part of your site is causing the problems. Danny Sullivan works through some of them at Search Engine Land.

Many website owners and SEOs have seen it happen. Your website is getting going, and Google is responding to your content with decent initial rankings. Everything seems fine, then gradually your ranking starts plummeting with no explanation.

You could spend time every day checking your rankings, watching for this to happen, but that is a waste of time, as Search Engine Journal explains. Checking rankings isn’t an income-generating activity, and your time is simply better spent elsewhere, like creating content or networking.

So then what is there to do about this mystery fall in the rankings? First, we have to understand what is happening, which Matt Cutts so helpfully explains in one of his latest YouTube videos.

Cutts uses an analogy of an earthquake to get to the heart of what is occurring. When an earthquake hits, the news about it is pretty broad. We know where it happened, but not many more details. Similarly, when content is posted, Google’s initial read of it is pretty wide. It is a best guess about where your content should rank.

As time goes by after an earthquake, we learn more and more. You will find out how much damage is caused, how many people died, how many aftershocks there were, and much more. As Google learns more about your content, it adjusts rankings. It contextualizes your content within the broader scope and repositions as needed.

So what can be done if you see your site drop in the rankings like this? Change up your practices. Most likely, your content appears to be quality at first, but Google is gradually peeling back the facade and seeing what your website really is, and it doesn’t like it.

Last week the internet felt tremors very similar to the shock waves unleashed by Google’s Panda updates, but something was different this time. Google didn’t announce or confirm the update, and they say they won’t confirm any updates in the future.

At this point, it is widely assumed that the small shakeup last week was the Panda update that Google’s web spam guru Matt Cutts said, at SMX West early last week, would be coming soon. But, as Search Engine Land reports, Cutts also said that Google’s Panda updates would no longer be unveiled in big monthly changes. From now on, Panda’s changes will occur gradually.

The shift from big abrupt changes to a more fluid update system means that sites hit for low-quality content may not be able to diagnose their issue as easily. Site owners can’t look at their Analytics and see a big drop correlated with a confirmed update around that time period. However, Danny Goodwin says it may mean a faster recovery.

Site owners who have done their proper due diligence will no longer have to wait for the next update to roll around to see if Google has viewed their work favorably.

Google confirmed 24 Panda updates, and the 25th is believed to have occurred late last week, but from now on there won’t be any big announcements or confirmations. Like everything else at Google, the web spam algorithms will change constantly over time rather than transforming abruptly.

Any time Google’s Penguin or Panda updates are mentioned, site owners and bloggers alike work themselves into a mini frenzy about the possibility that their totally legitimate website might have been penalized. It’s warranted, in a way, because a few innocent bystanders have been affected, but largely Google is policing those breaking the rules.

Meanwhile, bloggers have tended to downplay just how much rule breaking there is. Black hat SEO is treated as a fringe issue when in reality it is huge. Writers tend to focus on a small slice of black hat SEO in which competitors use shady links and other tactics to bring your site down, and that is incredibly rare. Google considers all explicit spam to be black hat, and by that definition, black hat SEO is the most pervasive type of SEO around.

It is also the type of spam Google spends most of their time fighting. Matt Cutts, Google’s webspam team leader, recently took to YouTube to answer a question about how many notifications Google sends out to website owners, revealing that 90% of Google’s manual penalties still target blatant spam pages.

Google sends out hundreds of thousands of notifications each month, but the chances of a typical SEO or website owner seeing one are rare. There is a chance, though. The other 10% of notifications focus on problems that novices, or SEOs who have fallen out of the loop, may have gotten sucked into, such as link buying, link selling, or even hacking notifications.