Tag Archive for: Google

While quality SEO is a complex, time-consuming job, there are many types of SEO that any site owner can do. There are also a lot of basic mistakes that site owners regularly make while trying to optimize their own page.

To help prevent these easily corrected mistakes, Matt Cutts, the head of Google’s Webspam team, devoted one of his recent YouTube videos (which you can watch below) to identifying the five most basic SEO mistakes anyone can make.

1) Not Making Your Site Crawlable – According to Cutts, the most common mistake “by volume” is simply failing to make your site crawlable by Google, or not even having a domain to begin with.

The way Google learns about sites is through web “crawlers” that index pages by following links. If you don’t provide links allowing Google’s bots to find your site, it won’t know what is there. If you can’t reach content by clicking normal links on the page in a text browser, it might as well not exist to Google.
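The “text browser” test above can be sketched in code. The snippet below is a minimal illustration (the sample page and its links are hypothetical): it collects only the `href` targets that a plain HTML parse can see, which is roughly what a crawler following normal links gets — anything reachable only through scripts or forms is invisible to it.

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects href targets from plain <a> tags -- the same links a
    text browser (and a link-following crawler) can discover."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical page: two plain links, plus content a script would generate.
sample_page = """
<html><body>
  <a href="/menu">Menu</a>
  <a href="/hours">Hours</a>
  <script>/* links built here are invisible to a plain crawl */</script>
</body></html>
"""

collector = LinkCollector()
collector.feed(sample_page)
print(collector.links)  # ['/menu', '/hours']
```

If a page on your site never shows up in a list like this when you crawl from your homepage, Google likely can’t find it either.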

2) Not Using Words People Are Searching For – Google also tries to connect people with the most relevant information for the exact search they used. If someone searches “how high is Mount Everest,” they will be connected with a site using those exact words on a page before they will be suggested a page using just “Mount Everest elevation.”

My favorite example Cutts uses of this is a restaurant’s website, mainly because it seems many restaurants have very minimal websites that are badly in need of optimization and a bit of a design overhaul. When people look for a restaurant to eat at, they search for a few specific things: mainly the location, menu, and hours. If the page lists those in plain text, Google will index that information and direct more people to the site than it will to sites with PDF menus or no information at all.

3) Focusing On Link Building – One of the biggest buzzwords in SEO is link building. It is one of the oldest strategies, and Google’s frequent algorithm tweaks keep it in the news regularly, but it may actually be dragging you down.

When people fixate on link building, they cut off many other ideas and marketing options that can boost a site just as much. Cutts suggests focusing instead on general marketing. If you make your website more well-known and respected within your community, you will attract real people, which brings organic links that search engines respect far more.

4) Bad Titles and Descriptions – Many people neglect their titles and descriptions assuming they will either be automatically filled in, or won’t matter in the long run. If your website says “untitled” in the title bar, it will also say “untitled” in a bookmarks folder as well as actual search results. Now ask yourself, would you click on a website without a title?

Similarly, the descriptions for webpages are often left blank or copied and pasted straight from the page with no context. Your description should entice people to click on your page and show that you have the answer to the question they are searching for. If people can build entire followings around 140-character tweets, you should be able to make someone want to click your page with a 160-character description.
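A title-and-description check like the one described above is easy to automate. The sketch below (plain Python, with a hypothetical sample page) pulls the `<title>` text and the meta description out of a page and flags the exact problems mentioned here: placeholder titles, empty descriptions, and descriptions past the ~160-character mark that search results typically display.

```python
from html.parser import HTMLParser

class SnippetAudit(HTMLParser):
    """Extracts the <title> text and meta description from a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "description":
                self.description = d.get("content", "") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# Hypothetical page exhibiting both problems at once.
page = ('<html><head><title>Untitled</title>'
        '<meta name="description" content=""></head></html>')

audit = SnippetAudit()
audit.feed(page)

problems = []
if not audit.title or audit.title.lower() == "untitled":
    problems.append("missing or placeholder title")
if not audit.description:
    problems.append("empty meta description")
elif len(audit.description) > 160:
    problems.append("description exceeds ~160 characters")
print(problems)
```

Run against your own pages, an empty `problems` list is the goal.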

5) Not Using Webmaster Resources – This problem can only be born out of ignorance or laziness. There are countless SEO resources available out there, and most of them are free. The best resources anyone can turn to are the Webmaster Tools and Guidelines that Google offers, but you shouldn’t stick to just those either. There are blogs, webinars, videos, and forums all happy to teach you SEO; you just have to use them. If you’re reading this, however, you probably don’t have this problem.

Conclusion

The most common SEO problems, according to Cutts, are also the simplest problems imaginable. There are resources available that will help you fix all of your basic SEO problems, and you’ll learn more and get better by finding them and practicing. If you’re still working out how to make your site crawlable, you have a long way to go, but if you keep at it, you’ll be an SEO pro eventually.

AdWords scripts offer a great opportunity to personalize your campaigns, but they have their flaws. For instance, you’ll need to write the code yourself and their output logs are not very user friendly. Frederick Vallaeys has some in-depth, expert suggestions for frustrated scripts users to get more out of their campaigns and overcome these flaws at Search Engine Land.

Though you probably don’t need to worry about how to make scripts work for the largest of AdWords accounts, his advice on making scripts accessible even if you don’t know how to write code is particularly valuable information. Check it out if you are already using, or are thinking of using, AdWords scripts.

Help!

Google has been getting some bad press lately surrounding their penalty notices. Their notices are notoriously vague, and the issue came to a head after the BBC received an “unnatural link” warning last month due to links pointing to a single page on the site, and Mozilla was notified of a “manual” penalty this week because Google identified a single page of spam on their site.

In both of those cases, the penalties were only applied to the individual pages in question, but that information wasn’t included in the notices, which makes for obvious concern. These cases also pinpoint one of the biggest issues with issuing notices without specifically identifying the problem for the site owners. With millions of pages of content, trying to identify the problem pages would be a needle-in-a-haystack situation.

Many have been concerned about the ambiguous notices, and Google has said they will work to improve their transparency, but what do you do if you get a notice that says you have been penalized but doesn’t tell you exactly where the problem is? Matt Cutts, head of Google’s web spam team, says you should start at Google’s webmaster help forum.

If help can’t be found in the webmaster help forums, Cutts says filing a reconsideration request could result in being given more information and possibly advice, though he concedes “we don’t have the resources to have a one-on-one conversation with every single webmaster.”

This is notable, because many believed in the past that filing a reconsideration request after a penalty was a one-time attempt to restore your site’s name. Many speculated that Google would not be keen on reviewing repeated requests, and that site owners should only file for reconsideration once they are sure they have solved the issues. According to Cutts, this doesn’t seem to be the case.

Telling site owners to turn to a forum or file requests that might yield extra information doesn’t seem like very concrete advice for trying to overcome a penalty. Luckily, there are some other solutions for investigating what part of your site is causing all the problems. Danny Sullivan works through some other ways you can try to diagnose your site at Search Engine Land.

Image Courtesy of Martin Pettitt


The entire SEO community is bracing itself. A new Google Penguin update could arrive any time, and it is looking like it will be quite a big deal. Supposedly it will be much more brutal than the already merciless update that came last April.

Judging from what we already know about Penguin, there are some ways to prepare yourself and all of your sites to make sure you don’t get hit by the first wave of penalties. Plus, if you follow these suggestions from Marcela De Vivo, you’ll be improving your SEO all around.

  1. Monthly Link Audits – Knowledge is power, and audits give you a lot of knowledge. Start with the backlinks and get a baseline. Find out how many high-quality and low-quality links you have. Where are these links pointing? If there are spammy links, work to have them removed. You can choose from a huge selection of audit tools to make the process easy, and you will always know how your link profile is doing.
  2. Anchor Density – A popular way to try to cheat search engines is cranking up anchor density for money terms, and Penguin already penalizes those that do it too much. There is a good chance they will get stricter on their anchor density guidelines, so it is important to keep an eye out. You want to be under 15% for the money term. Any higher is risking penalties when the new Penguin update arrives.
  3. Link Ratios – Links are all about finding the right balance. Google talks about Earned vs. Unearned links, and when they do that they mean Images vs Mentions or Text, Sitewide Ratios, Deep Link Profiles, etc. De Vivo breaks down the categories a little more, but the main idea is to keep a good balance between them all.
  4. Use Your Webmaster Tools – For every site owner who thinks this is obvious, there is another who doesn’t know what Webmaster Tools is or how to monitor it. This is the best line between you and Google, and watching the links Google displays in your account can help identify problematic links as well as keep you informed as to how they are affecting your rankings. There are numerous problems that Webmaster Tools can alert you to; you just have to look.
  5. Don’t Do Spammy Link Building – This one is the most obvious of all, but it seems no amount of telling site owners to keep away from this practice will ever stop the problem. If something sounds too good to be true in SEO, IT IS. If you can’t identify spammy links, don’t do the work yourself. Google will penalize you if it hasn’t already, and the money you spent on those links will be wasted.
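The anchor-density check in point 2 amounts to simple counting. The sketch below shows one way to do it (the backlink profile and the money term are hypothetical, and the 15% line is the rule of thumb quoted above, not a figure Google publishes): tally each anchor text’s share of the total profile, then compare the money term’s share against the threshold.

```python
from collections import Counter

def anchor_density(anchor_texts):
    """Returns each anchor text's share of the total backlink profile."""
    counts = Counter(text.lower() for text in anchor_texts)
    total = sum(counts.values())
    return {text: count / total for text, count in counts.items()}

# Hypothetical backlink profile, as exported from an audit tool.
backlinks = (
    ["cheap widgets"] * 4            # money-term anchors
    + ["Acme Widgets"] * 10          # brand anchors
    + ["https://acme.example"] * 4   # naked URLs
    + ["click here"] * 2             # generic anchors
)

density = anchor_density(backlinks)
money_term = "cheap widgets"
print(f"{money_term}: {density[money_term]:.0%}")  # 20% -- over the 15% line
```

Here the money term sits at 20% of the profile, above the suggested ceiling, which is the signal to dilute it with more branded and natural anchors.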

Google Penguin isn’t the bad guy, nor is it the authoritarian figure not letting anyone have fun. Google’s spam fighting efforts are keeping our browsing running smoothly, and the “innocent” people affected by these changes are participating in questionable tactics. Read Google’s best practices, and follow them. If you are taking proper care of your site and following Google’s rules, the new Penguin update won’t feel nearly as scary.

You may have already noticed ads with a company’s number of Google+ followers noted at the bottom of them. This is a new feature from AdWords Enhanced Campaigns and one that you, like I did, might be wondering about. Does it really make that much of a difference how many followers you have? Does it make a consumer more likely to click on your ad? According to Google, yes.

Frederic Lardinois reports for TechCrunch that ads with the follower count ‘annotations’ receive a 5 to 10-percent bump in CTR over regular ads. A large number of followers would likely lend a little more credibility to an ad, but those companies with thousands or millions of followers likely already have that credibility through name recognition.

And this new feature isn’t available to just anyone with an AdWords account. You’ll not only need a “significant number of followers”, but you also need “recent, high-quality posts”. The whole thing sounds a little subjective, but it may be worth putting the time in to build up your Google+ page to get the boost in CTR.

Image Courtesy of Wikipedia Commons


With all of the different ways Google can penalize you these days, it is easy to get confused about what you need to do to fix your mistakes. Between Penguin, Panda, Unnatural Link Penalties, and Manual Penalties, there are more ways to get in trouble than ever.

Google’s increasing strictness is far from a bad thing, but it is also getting increasingly complex which makes for confusion when trying to bounce back from a mistake.

Marie Haynes knows just how confusing it can be. She has been working in SEO and writing for SEOMoz for years, but even she got confused when trying to help someone with what she thought was a Penguin-related penalty. She then saw another respected writer make a similar mistake in a recent article, confusing unnatural links penalties with Panda.

It seems we need to go to the root of these issues and break down what each of these penalties is and how they differ from one another.

The Penguin Algorithm arrived last April as an algorithm change aimed at fighting webspam, which explains its initial name, “The Webspam Algorithm.” It mainly targeted sites participating in link schemes and other questionable linking practices, though it also looked for indications of keyword stuffing.

The Penguin Algorithm isn’t to be confused with an Unnatural Link Penalty. The main difference is that Unnatural Links Penalties are taken against you manually rather than by an automated algorithm. Google mainly applies these penalties when it believes a site is attempting to manipulate search engine results through the creation of links. The real question is what causes Google to investigate your site.

It is widely believed that filing a spam report will flag a site for manual review, but others have guessed that Google monitors more cutthroat niches such as “payday loans” or casino sites and consistently checks them for unnatural links. Thanks to Google’s secrecy, we may never know exactly what prompts Google to manually examine a site.

So what is the main difference between Penguin and Unnatural Links Penalties? It really comes down to how algorithms act compared to penalties applied by a living, breathing person. An algorithm views all sites the same and takes effect almost immediately: all sites hit by an algorithmic penalty will see the damage within a day of the algorithm update. Manual penalties, on the other hand, are being placed against sites at all times, and can be appealed more easily than an algorithmic penalty.

You can always recover from any of these penalties with effort, as Marie Haynes shows in her article, but you have to clean up your page and your methods. SEOs can’t get away with participating in link schemes or engaging in other black hat techniques anymore, and there is no way left to cheat the search engines.

Here’s a theoretical scenario: You’ve been hit with a manual penalty from Google. You take all the time and effort it takes to complete a link audit and remove all the bad links you’ve accumulated, and made sure your link profile doesn’t look questionable to Google’s eyes. You resubmit, but even after weeks your website is still flat-lining. What the heck?

As it turns out, that link audit and resubmission process was only half the battle. Google does use over 200 different signals to determine ranking, but links are still the heavyweights in the arena. Now think back to all those unnatural links you just removed. Often, those “bad” links were some of the most powerful in your profile, and you don’t have anything healthy replacing them.

I have some bad news. If you got hit with a manual penalty, you most likely used questionable or downright spammy methods to climb the rankings before, and that doesn’t cut it anymore. There is a way to recover, but it basically takes restarting your SEO process to get your site back in the rankings, and this time you can’t take shortcuts.

Search Engine Watch suggests a four-step process to get your sites ranking again, but if you loved the spammy old ways of the web, these steps may seem counter-intuitive or just boring and difficult. If you feel that way, unfortunately, there aren’t many other options, and there will be fewer the more refined Google gets. Chuck Price put it best when he said, “adhering to the webmaster guidelines is no longer a ‘suggested’ course of action, it is required.”

The four-step process will help you clean house on all the remnants of less savory SEO methods, and make your site look as clean and reputable as it should. Don’t try to toe the line again or take advantage of any loophole you find. You only really get one chance to come back after a manual penalty. If you get hurt again, it will be nearly impossible to fix everything.

There is a misconception amongst a small few that Google only wants the absolute best websites, and that it doesn’t index websites it thinks aren’t worth its time or space in its index. In reality, this is far from the truth.

Google is always indexing content and they index pretty much anything they can find. Supposedly, the only thing they don’t index is spam.

SEO Roundtable pointed out that Google’s John Mueller commented in a Google Webmaster Help thread recently saying “unless the content is primarily spam (eg spun / rewritten / scraped content), we’d try to at least have it indexed.”

He was responding to a question about a site not being fully indexed over a prolonged period of time, which he believes is the result of a bug, though he didn’t have any definite answers until it could be shown to the indexing team.

Before anyone gets up in arms, that statement is a little misleading on the subject of spam. Everyone knows Google still indexes its fair share of spam, and in some cases spam pages even get ranked. Mueller’s comments instead show how Google tries to avoid adding spam to its index, but it is obvious that it doesn’t succeed in keeping all of the junk out.

Getting indexed isn’t the same as ranking, but to have any chance of being ranked you have to be indexed.

Bing Ads is the clear runner-up to AdWords in the search engine advertising game, but Bing has seen a way to set itself apart and give users something AdWords does not. Recently, Bing jumped on AdWords’ introduction of ‘Enhanced Campaigns’ and, more importantly, the vocal concerns of some users. The general manager of the search network, David Pann, announced that Bing would not be bundling mobile, desktop and tablet advertising together and would give users the flexibility to control their own campaigns.

Not only is this a clever step by Bing to promote itself while putting down Google, but it also gives advertisers an alternative to ‘Enhanced Campaigns’. There’s never been much of a difference between Bing and AdWords, but now you can trade one for the other based on your preferences. Of course, one still comes with a significantly larger audience.

Read more about Bing’s recent announcements, including some planned changes and the future direction of the product, at Search Engine Land.

Many website owners and SEOs have seen it happen. Your website is getting going, and Google is responding to your content with decent initial rankings. Everything seems fine, then gradually your ranking starts plummeting with no explanation.

You could spend time every day checking your rankings, watching for this to happen, but that is a waste of time, as Search Engine Journal explains. Checking rankings isn’t an income-generating activity, and your time is simply better spent elsewhere, like creating content or networking.

So then what is there to do about this mystery fall in the rankings? First, we have to understand what is happening, which Matt Cutts so helpfully explains in one of his latest YouTube videos.

Cutts uses an analogy of an earthquake to get to the heart of what is occurring. When an earthquake hits, the news about it is pretty broad. We know where it happened, but not many more details. Similarly, when content is posted, Google’s initial read of it is pretty wide. It is a best guess about where your content should rank.

As time goes by after an earthquake, we learn more and more. You will find out how much damage is caused, how many people died, how many aftershocks there were, and much more. As Google learns more about your content, it adjusts rankings. It contextualizes your content within the broader scope and repositions as needed.

So what can be done if you see your site drop in the rankings like this? Change up your practices. Most likely, your content appears to be quality at first, but Google is gradually peeling back the facade and seeing what your website really is, and it doesn’t like it.