Posts

While on the surface, creating content is about sharing important information with the public, we'd all be lying if we said we didn't hope to drive as much traffic as possible to our sites thanks to some great blog post or infographic. It isn't easy: for a startup, getting over 100,000 views on a page takes a lot of luck, but it also takes a lot of work to make quality content.

There are no magic tricks for making content that will bring exponentially more visitors to your site, and one post that gets that many eyes on it doesn't mean readers will necessarily keep coming back. But such a post can tell us a lot about what people are looking for on the web and what counts as great quality.

Stephen Kenwright works at Branded3, which recently hit the coveted 100,000-pageview benchmark, and he wrote about what he learned from that short-term success over at SEOmoz. You can learn a lot from their isolated case and the tips Kenwright offers.

Image Courtesy of Wikipedia Commons

With all of the different ways Google can penalize you these days, it is easy to get confused about what you need to do to fix your mistakes. Between Penguin, Panda, Unnatural Link Penalties, and Manual Penalties, there are more ways to get in trouble than ever.

Google's increasing strictness is far from a bad thing, but the system is also getting increasingly complex, which makes for confusion when trying to bounce back from a mistake.

Marie Haynes knows just how confusing it can be. She has been working in SEO and writing for SEOmoz for years, but even she got confused when trying to help someone with what she thought was a Penguin-related penalty. She then saw another respected writer make the same mistake in a recent article by confusing unnatural links penalties with Panda.

It seems we need to go to the root of these issues and break down what each of these penalties is and how they differ from one another.

The Penguin Algorithm came about last April as an algorithm update aimed at fighting webspam, which explains its initial title, "The Webspam Algorithm." It mainly targeted sites participating in link schemes and other questionable linking practices, though it also looked for indications of keyword stuffing.

The Penguin Algorithm isn't to be confused with an Unnatural Links Penalty. The main difference is that Unnatural Links Penalties are taken against you manually rather than by an automated algorithm. Google mainly applies these penalties when it believes a site is attempting to manipulate search engine results through the creation of links. The real question is what causes Google to investigate your site in the first place.

It is widely believed that filing a spam report will flag a site for manual review, but others have guessed that Google monitors more cutthroat niches such as “payday loans” or casino sites and consistently manually checks for unnatural links. Thanks to Google’s secrecy, we may never know exactly what makes Google personally examine a site.

So what is the main difference between Penguin and Unnatural Links Penalties? It really comes down to how an algorithm acts compared to a penalty applied by a living, breathing person. An algorithm views all sites the same and takes effect almost immediately: every site hit by an algorithmic penalty sees the damage within a day of the algorithm update. Manual penalties, on the other hand, can be placed against sites at any time, and they can be appealed more easily than an algorithmic penalty.

You can recover from any of these penalties with effort, as Marie Haynes shows in her article, but you have to clean up your page and your methods. SEOs can't get away with participating in link schemes or engaging in other black hat techniques anymore; there is no longer any way to cheat the search engines.

Not a Google Panda

Look at the most recent SEO article you can find about Google Panda. You can even look at some I’ve written. In general, the mood among those articles is not positive. Whatever positive changes for users that Panda offered, it drastically changed how SEO is run, and well, people don’t tend to react well to change.

However, in all the hubbub about the negative impact Google’s changes may have had on smaller businesses, we forgot that Google Panda did make some very important changes that made their search engine perform markedly better.

Ruth Burr, head of SEO at SEOmoz, didn't forget this, because she is a constant user of Google search. I won't repeat her anecdote here, but she recalls a time when using Google could easily lead you to vapid, unhelpful websites trying to hide that their "articles" were really just ads for their own businesses.

The biggest point she raises is very true. Google's goal is not to "foster small or local business growth in the U.S. and abroad." While there are ways for local or small businesses to take advantage of search engines, Google's main aim is simply to provide the best search engine performance possible. There's little denying Panda was a step in the right direction in that regard.

If you aren’t convinced of Panda’s positive features, or just want to see more pictures of cute pandas, check out Burr’s article. She makes some strong points.

It's hard to keep up with Google's constant adjustments, and AuthorRank is a future feature that isn't as well understood as it should be. Its history dates back to August of 2005, when Google filed a patent for "Agent Rank."

This patent covered ranking "agents" and using the public's reception of the content they create to determine their rank. Basically, the more popular agents with positive responses would rank higher than less authoritative "agents."

After the patent, "Agent Rank" disappeared for a while, until 2011, when Eric Schmidt made references to identifying agents in order to improve search quality. A month later, Google filed a patent for what is assumed to have become Google+, which acts as a digital signature system for identification. That signature can be tied to content, and that content can be ranked. Hello, AuthorRank.

It has yet to be officially implemented, but there have been rumors all year that AuthorRank is under development, and AJ Kohn has stated it could completely change the search engine game. It would act as a factor in PageRank, ranking high-quality content higher.
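The idea of AuthorRank acting as a factor in PageRank can be sketched with a toy formula. Everything below is an assumption made up for illustration — the function name, the weight, and the score scale are invented, and Google has never published how (or whether) such a signal would be computed:

```python
# Purely illustrative: Google has never published an AuthorRank formula.
# Both the weighting and the shape of this function are assumptions.

def adjusted_score(page_score, author_authority, weight=0.3):
    """Blend a page's base ranking score with a hypothetical author signal.

    page_score and author_authority are assumed to lie in [0, 1];
    weight (an assumption) controls how much the author signal matters.
    """
    return page_score * (1 + weight * author_authority)

# Two otherwise identical pages: one by a well-received author, one by an
# unknown author. The well-received author's page ends up scored higher.
strong = adjusted_score(0.5, 0.9)
unknown = adjusted_score(0.5, 0.1)
print(round(strong, 3), round(unknown, 3))
```

The point of the sketch is only that author identity becomes one multiplier among many, not a replacement for page-level signals.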

Mike Arnesen at SEOmoz says it's not a matter of "if Google rolls out AuthorRank, but when." He also has some great suggestions for how to prepare for when AuthorRank arrives. I highly suggest reading his extensive article, because I strongly agree with the idea that AuthorRank will be here sooner rather than later.

With Google’s recent focus on social media, and the natural concept that people want to see quality content in their results, it is just a matter of time before AuthorRank is a serious concern to the SEO industry.


Google's made some very big changes in the past couple of weeks, and they're affecting more sites than previously expected. In what way? Depending on how their SEO has been done, some sites are dropping a few positions, while others are dropping by multiple pages.

Google has a ton of different tools available inside the Google interface. You can check all the pages Google has indexed on a single site, you can look through specific title tags, and with the "link:" command you can see links to a particular page or site.
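The commands described above are typed straight into the Google search box. A few examples (with example.com standing in for any real domain):

```
site:example.com       pages Google has indexed on the domain
intitle:"seo tips"     pages with that phrase in the title tag
link:example.com       a sample of pages linking to the domain
```

Operators can also be combined, such as `site:example.com intitle:"seo"` to restrict a title search to one domain.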

However, this command is by no means the main tool you should use to research backlinks. There are several holes in it, and Google itself advises taking its results with a grain of salt. SEOmoz has a great post about several misconceptions surrounding this command.

So how do you get a decent report on backlinks? There's no perfect tool, but the two I'd recommend are Google Webmaster Tools and Yahoo Site Explorer. Both give far more information than the "link:" command and a much better picture of just what kind of backlinks a site has. Which, as we all know, is indicative of the quality of a page's SEO.