Less than a year ago, Google unleashed an update called the “webspam algorithm” that seemed innocuous at first, until experts began to notice how widespread its effects were. The impact of the update was so large that Google eventually gave it an official name more in line with their other major update, Panda. The “webspam algorithm” became Penguin.

The original name for the update was an accurate description of what it did: it aimed to demote sites violating Google’s Webmaster Guidelines, specifically sites full of webspam. These sites used manipulative tactics to improve their search engine rankings, but some innocent sites were affected, and more have been affected by each subsequent update to Penguin.

These “black hat” methods, such as keyword stuffing, cloaking, participating in link schemes, and purposefully using duplicate content, have been around for as long as SEO has existed (pretty much as long as the internet has been widely used). Penguin sought to finally deal with the spammers, but with it came a new set of rules for SEO.

Pratik Dholakiya has collected these rules into “The Definitive Guide To Penguin Friendly SEO” which explains which methods have been shunned and what new techniques are favorable for SEOs.

If you were actively using black hat techniques, you won’t find new ways to keep spamming here, but for any SEO looking to legitimately improve their search performance with good content and practices, this list will help steer you away from bad methods.

Receiving an email from Google saying they have noticed unnatural links associated with your website is never a good thing. The best outcome still involves losing organic search traffic from Google for at least some time, and the email means you have urgent work to do.

Search Engine Watch writer Chuck Price has seen webmasters respond to manual penalties, and many of them actually make their problems worse, especially when they panic. Webmasters who panic when they receive a manual penalty tend to rush into the first fix they can think of, such as filing reconsideration requests before actually fixing the underlying problem.

If you’ve received an email about unnatural links or manual penalties, take a minute to breathe. There is no reason to panic. You have to do some work to identify the issues and fix your site’s linking problems, but panicking isn’t going to get that done any faster, and it might blow your one chance to get your links reconsidered by Google in the near future.

Once you’re calm, it is time to get to work. Price has suggestions for how to get your page back in order no matter how many bad links you have. However, if you have been actively building thousands of unnatural links, you will have to make huge efforts to make up for the spamming.

On January 15th, Facebook announced they will be launching their own search engine, called Facebook Graph Search. That’s right, the little white bar at the top of your page now has an actual purpose. The search engine relies strongly on “likes” and other relevant Facebook information such as page popularity and location signals.

While this new change could lead to some interesting ways of finding businesses close to you (for example, Facebook claims you will be able to search for things like “Italian restaurants that my friends have been to”), it also has business owners and SEO experts wondering how to take advantage of Facebook’s search. As Matt McGee found out, however, Facebook was already ahead of everyone.

Facebook released tips for how to make sure your business gets found in Graph Search. Releasing these tips helps Facebook as much as it helps everyone else, because it encourages businesses to position themselves to appear in the search while simultaneously populating the search engine with data.

The most important things to know are that Graph Search is only available to a few users right now, that it will offer local search from the very start, and that the search results are compiled from information created and shared by businesses, especially through their pages.

That doesn’t mean you have to have a page for your business, however. It will help enormously, but businesses will also show up so long as customers or visitors have tagged them as a “place”.

We will have to wait to see just how Facebook’s Graph Search works once it is unveiled for the wider public. It could become another gimmicky feature of Facebook that many don’t use, the way the Foursquare-style ability to check into locations is treated at the moment. But it is possible Facebook’s new search engine could become a useful tool for finding local businesses.

Getting your posts out to the masses is one of the hardest parts of writing online. Just publishing them isn’t even close to enough. The good news is, it has never been easier to share on the internet thanks to social media.

Promoting your content is absolutely a part of getting it in front of other people, and you have to do the promoting yourself. The easiest way to think about it is to ask yourself “why should anyone care about my content, if I don’t care enough to promote it?”

There is no shame in pushing your content in front of the eyes of others. Of course, there is a line where pushing it onto others can be a turn-off, but it’s way better than toiling in obscurity.

Jordan Kasteler, columnist for Search Engine Land, has nine different methods he uses to promote his content online. It may seem strange to put your work in front of others at first, but it’s the only way to get your content seen. In a world where content marketing is a huge part of SEO, it’s important to get people sharing your work.

With all of the changes Google made in the past year, it is easy to get mixed up as to which changes affected which areas of a site’s SEO, and what was penalized by which algorithm update. Combine that with a disavow links tool which most don’t seem to understand, and it is a wonder anyone can keep up with Google’s updates.

Pratik Dholakiya, writer for Search Engine Journal, recognized how confusing this all must be, and sought to explain which types of updates affected what, as well as all of the misconceptions surrounding these updates. He breaks them down into three basic types of updates, and each focused on different aspects of SEO.

EMD Algorithm Update – The September update targeted sites with exact match domains (EMDs), or sites named after keywords instead of brands. This change didn’t so much penalize most affected sites as remove a special boost they had been receiving due to the name of the website.

The only people really penalized by the update were those who had over-optimized their site around the keyword. There is also a misconception the EMD updates were Panda or Penguin related, but Matt Cutts has put that idea to rest.


Panda Updates – The main area the Panda updates looked at was your on-site content. Google was trying to weed out low-quality or duplicate content, and they’ve been churning out constant new versions of Panda all year.

Penguin Updates – Despite the close association with Panda, Google’s Penguin updates are actually their own beast, formerly known as the webspam algorithm update. They are targeting all of the spammy sites out there, and unless you’re a spammer, the only penalties you may have seen from these updates were from links.

If you have seen any penalties from these updates, Dholakiya explains how to help fix the problems. The Disavow Links tool can help with that, especially if you’ve seen penalties from the Penguin updates, but it isn’t a magic solution.
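To make the cleanup step concrete: the disavow file Google accepts is a plain text file with one URL or one "domain:" entry per line, and lines beginning with "#" treated as comments. The helper function and the example domains below are hypothetical, just a sketch of how you might assemble such a file from links you have flagged.

```python
# Sketch: build a Google disavow file from flagged domains and URLs.
# The file format (one URL or "domain:" entry per line, "#" comments)
# is Google's; the helper and the example inputs are made up.

def build_disavow_file(domains, urls):
    """Return disavow-file text covering whole domains and single URLs."""
    lines = ["# Disavow file generated during penalty cleanup"]
    lines += ["domain:" + d for d in sorted(domains)]
    lines += sorted(urls)
    return "\n".join(lines) + "\n"

content = build_disavow_file(
    domains={"spammy-directory.example", "paid-links.example"},
    urls={"http://blog.example/comment-spam-page.html"},
)
print(content)
```

Disavowing a whole domain is usually safer than listing individual URLs when a site links to you from many spammy pages.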

SEO as a whole can be split into two different categories: on-page and off-page optimization techniques. On-page optimization is focused on everything you can do to boost your rankings directly on your webpage. Off-page SEO concerns aspects that function elsewhere, like backlink management.

Some might argue that on-page optimization has been weakened by Google updates that have sought to weed out pages using methods like keyword stuffing. While this is partly true, it does not fully discredit on-page methods.

You can still take advantage of proper keyword usage, titles, and URL management, but as Matt Cutts puts it, “there’s diminishing returns.” Christian Arno from Search Engine Journal explains what still works in on-page optimization, and while some of the old techniques have been cut down, the most effective techniques are still tried and true.

Bloggers are always talking about untapped methods of raising click-through rates and positions in rankings. They aren’t always as untapped as the writers make them seem, but the advice offered within their articles is usually solid. That’s the case with Chris Silver Smith’s list of semantic markups that can be added to pages.

Semantic markups are a way to increase the odds that information from a site will be highlighted on search engine result pages through the use of rich snippets. This increases visibility, and helps gain attention and click-throughs.
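As a rough illustration of what this looks like in practice, here is a short Python sketch that serializes schema.org rating data as a JSON-LD script tag, one common way of embedding semantic markup. The product name and rating figures are invented; the vocabulary ("@type", "AggregateRating", and so on) is schema.org's.

```python
import json

# Sketch: schema.org markup for a product rating, serialized as JSON-LD.
# The product details are hypothetical; the vocabulary is schema.org's.
snippet = {
    "@context": "http://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.5",
        "reviewCount": "27",
    },
}
markup = '<script type="application/ld+json">%s</script>' % json.dumps(snippet)
print(markup)
```

Markup like this is what gives search engines the structured data they need to show star ratings and similar details in a rich snippet.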

While semantic markups most likely won’t directly improve your rankings, there is quite a bit of evidence that they do increase click-through rates, because customers are more attracted to your listing. The average increase is supposedly 15 percent.

A rise in click-throughs can improve rankings over time, because click-throughs do help determine rankings, so that alone is a great reason to start adding semantic markups to your websites.

Semantic markups might not be the esoteric idea Smith presents them as, but they will undoubtedly help almost any page that adds them. They optimize for all of the most popular search engines and will benefit your site overall.

MajesticSEO is a tool that is as well-known as it is respected. It is great for site audits and research, but it also has an important use in the current state of SEO where it has become clear that your site can be hurt externally, through bad links.

MajesticSEO can help you diagnose bad links, and possibly a bad linking campaign. It is entirely possible the bad links dragging you down were not accidental, but were instead built by unscrupulous site owners or SEOs.

Search Engine Journal writer Irish Wonder walks you through identifying and understanding bad links with the use of this great SEO tool. It isn’t a perfect means of diagnosing your problem, but it can help point you in the right direction if you can’t pinpoint the exact issue.
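One signal that backlink tools help surface is over-optimized anchor text: natural link profiles rarely repeat the same exact-match phrase over and over. The sketch below is not MajesticSEO’s API, just a hypothetical illustration of that check run over a made-up backlink export.

```python
from collections import Counter

# Sketch: flag suspiciously repetitive anchor text in a backlink export.
# Not MajesticSEO's API -- just an illustration of one bad-link signal.
# All the links and anchors below are made up.
backlinks = [
    ("http://site-a.example/post", "cheap blue widgets"),
    ("http://site-b.example/page", "cheap blue widgets"),
    ("http://site-c.example/blog", "cheap blue widgets"),
    ("http://site-d.example/news", "Example Widgets Inc."),
]

anchor_counts = Counter(anchor for _, anchor in backlinks)
total = len(backlinks)
# Flag any anchor accounting for more than half of all links, a
# concentration that natural link profiles rarely show.
suspicious = [a for a, n in anchor_counts.items() if n / total > 0.5]
print(suspicious)
```

A real audit would weigh many more signals (linking-domain quality, link velocity, and so on), but repetitive commercial anchors are one of the clearest red flags.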

When trying to pump content out for a blog, it is easy to become focused on resharing news or tips essential to the community, especially with SEO. The problem is that SEO changes so quickly, most of these posts go out of date very quickly. This is why every blog needs a good amount of “evergreen content”.

Evergreen content is the term for any posts or articles on your blog that will always be relevant to your content. Sujan Patel from Search Engine Journal uses an example to show the distinction.

If you are running an SEO blog, an article about the latest Penguin update won’t be relevant a week or two later when the next update appears. However, a post like “What is SEO?” will always be important, especially for any new readers you gain. The definition of SEO isn’t going to change, and the overall idea of the industry stays largely consistent, though you may need to update the article every few years.

Evergreen content is always up-to-date and will always be a primary interest for your readers. For blog managers, it offers more effective content that can be re-run later with the same impact it originally had. For readers, it is helpful because new readers are always looking for basic information.

I like to think of it like Wikipedia information. Wikipedia articles tend to consist of factual information without touching too much on “best practices” or other time sensitive issues. When someone accesses a Wikipedia article, they want a basic explanation of what something is and why it is important. If you can convey that in an article, you have the recipe for great evergreen content.

SEO has changed so much that it is hard to predict what will be best for 2013. In the past year alone, Google issued so many changes that change itself has become pretty much a constant. Trying to pinpoint where we will be a year from now almost feels impossible.

Paul Bruemmer from Search Engine Land, however, believes he knows how to keep up with everything for the next year with just a few tactics that can help guide you. Some are timeless, such as always keeping up with the best SEO practices, specifically starting with the Google 2012 Search Engine Optimization Starter Guide. When in doubt, Google usually has an answer for any SEO practice you should be focusing on.

Social media is, of course, also going to be a strong driving force in SEO for the next year, as there are no signs of it losing popularity. Even with the ever-changing hierarchy of sites, Facebook, Twitter, and YouTube are solid constants that can be leveraged to keep in touch with consumers.

Falling behind in SEO is dangerous for your career and your clients, and any good SEO knows to keep up. Bruemmer’s suggestions are by no means comprehensive, but if you want to make sure you’re in a good place for the coming year, the article is a great start.