Any time Google’s Penguin or Panda updates are mentioned, site owners and bloggers alike work themselves into a mini frenzy about the possibility that their totally legitimate website might have been penalized. It’s warranted, in a way, because a few innocent bystanders have been affected, but largely Google is policing those breaking the rules.

Meanwhile, bloggers have tended to downplay just how much rule breaking there is. Black hat SEO is treated as a fringe problem when in reality it is a huge one. Writers tend to focus on one small corner of black hat SEO, in which competitors use shady links and other tactics to drag your site down, and that is incredibly rare. Google considers all explicit spam to be black hat, and by that definition, black hat SEO is the most pervasive type of SEO around.

It is also the type of spam Google spends most of its time fighting. Matt Cutts, the head of Google's webspam team, recently took to YouTube to answer a question about how many notifications Google sends out to website owners, and according to him, 90% of Google's manual penalties still go to blatant spam pages.

Google sends out hundreds of thousands of notifications each month, but the chances of the average SEO or website owner seeing one are slim. There is a chance, though. The other 10% of notifications cover problems that novices, or SEOs who have fallen out of the loop, may have been pulled into, such as buying links, selling links, or even getting hacked.

New businesses have a lot to manage in a short period of time if they hope to be sustainable, and one of the most important marketing tools they can use is SEO. Establishing your company online is a huge step towards establishing your business in your community, and the only way to get popular online is to have a website that shows up in the search rankings.

In the past, creating a respectable website with good content would have been enough to land you on the search results page, but the internet is now an incredibly competitive arena. To get your website on the front page, you have to create great content while also managing a number of ranking signals that Google and Bing use to rank websites.

These signals are read by “bots” or “spiders” that crawl web pages and index all of their internal information; the indexed pages are then sorted by algorithms that decide which pages rank where. There are seemingly countless signals, and it can be overwhelming when you are just getting started.

Startups trying to understand SEO often feel completely confused by the barrage of technical information out there, but there are some basic steps you can take to get started. Sujan Patel has five rules you can apply to running your website that will help any fledgling business firmly ground its online presence.


Website owners and SEOs have to budget their time wisely. There are a billion different ways you can try to gather traffic, but some are more effective than others. Of course, anyone who preaches that they have a quick way to get visitors is probably pushing questionable or outright terrible practices that won't actually work, but there are also methods out there that underperform because they have become outdated or simply misunderstand the field.

Sujan Patel put together a list at Search Engine Journal of seven SEO tasks that waste precious time. Some of these tasks are harmless but have no actual value. Checking your site traffic every day can be tempting, especially for new site owners. There is a legitimate thrill to seeing people begin to trickle onto your content, and the number of visitors is a helpful metric to keep an eye on, but checking traffic every day focuses too much on individual visitors and not on the overall trend. Trends in traffic numbers give you much more useful information than watching every single visitor arrive on your page.
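
If you want to watch trends rather than daily numbers, a trailing average over a week or so is usually enough. Here is a minimal sketch in Python; the visit counts are hypothetical stand-ins for whatever your analytics tool exports:

```python
# Sketch: read a trend out of daily visit counts instead of reacting to each day's number.
# The data below is made up; in practice you would export it from your analytics tool.
daily_visits = [120, 95, 143, 110, 160, 98, 130, 155, 170, 149, 180, 165, 190, 210]

def moving_average(values, window=7):
    """Return the trailing moving average for each day once a full window exists."""
    averages = []
    for i in range(window - 1, len(values)):
        window_values = values[i - window + 1 : i + 1]
        averages.append(sum(window_values) / window)
    return averages

trend = moving_average(daily_visits)
print(trend)  # rising averages mean growing traffic, regardless of any single day's dip
```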

Some of the other tactics Patel points out are downright frowned upon by the SEO community, and the search engines are trying to put a stop to them. Buying backlink packages was never anything more than a scheme to push sites to the top of the rankings without adding any actual value. It was a loophole that many took advantage of, but it has no real worth, and Google's algorithm updates have made it very clear that the practice isn't tolerated anymore.

Monitoring keyword density, unlike the previous two, actually used to be fairly useful, but it has no place in the current SEO climate. Keyword density was never quite as important as some made it seem, but for a period Google's system did favor sites with a reasonable number of keywords in the content. That preference is pretty much gone now, and the more advanced search engines favor natural-sounding content over overstuffed, robotic-sounding paragraphs.
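
For reference, keyword density is nothing more than a simple ratio, occurrences of the keyword divided by total word count, which is part of why chasing a "perfect" number adds so little today. A rough sketch in Python (the sample sentence is made up):

```python
import re

def keyword_density(text, keyword):
    """Rough keyword density: occurrences of the keyword divided by total words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words)

sample = "Our SEO guide explains SEO basics without stuffing the word SEO everywhere."
print(f"{keyword_density(sample, 'SEO'):.1%}")  # 25.0% -- far too dense to read naturally
```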

Patel lists even more tasks that drain your time without giving anything back. It is easy to be tempted by shortcuts to high rankings, or to fall out of touch with the constantly changing SEO world if you let it happen. The best way to know where to focus your energy is to keep up to date with everything happening in SEO and to look for practices that offer long-term, sustainable growth for your site.

Link building is still considered a staple of SEO, despite what some bloggers may say. Yes, Google has clamped down on those using questionable links or outright spam to try to boost their rankings, but if you have been building a quality link profile, you likely never had problems with any of the countless Penguin updates.

For new sites, however, understanding where and how to begin building a link profile can be a bit confusing. The most important tip for building up links is to start broad. While links tightly connected to keywords have a much bigger effect on rankings, they only improve your rankings for very specific searches.

Instead, you should be trying to create broader relevance. This lifts your rankings across all keyword combinations rather than just a few specific ones. Once you start seeing broader improvement, you can see which keyword combinations are doing the best and which ones need your focus.

Peter van der Graaf explains how to begin your hunt for a better link profile over at Search Engine Watch, covering how to identify quality link partners and how to shift from a broad link profile to specific, keyword-focused links once the time is right.

Building a backlink profile is considered a staple of SEO techniques, but eventually you may have to do some cleaning up, especially now that Google has introduced multiple algorithms to clamp down on the use of low-quality links.

If you've seen a sudden drop in traffic or rankings lately, it is likely you were hit by one of these algorithms. You may have received a notification of a penalty, but unless it was a manual action, it is highly likely you got no warning that the changes hit you. Either way, one path toward repairing the drop in traffic is to prune your backlinks, removing the low-quality links pointing to your site.

Cleaning up your links is neither fast nor easy. It takes time and patience, but with effort you can restore your site's health. You can't just go in and cut out random links hoping to solve the issue. Attacking the problem too broadly could make things worse, and pruning backlinks is considered a last-ditch effort, according to SEO.com: “You should exhaust all of your other efforts like updating your content, building higher quality links and producing good content to promote and engage users before you consider removing bad links.”

After you have tried all of these methods and determined whether your website was hit by a penalty or an algorithm update, you can create a strategy for fixing your backlinks. Neither problem can be fixed automatically. If you received a manual penalty, you will have to do everything you can to fix the issue identified and submit a reconsideration request. Algorithm updates, on the other hand, require changing your methods and waiting to see positive growth for your site.

If you are ready to put in the work and time to try to properly repair your site, and you’ve already tried everything else, then it is time to really get your hands dirty. SEO.com has a full tutorial for cleaning up backlinks, and it walks you through every step, including suggesting tools for analyzing backlinks.

Last week, Matt Cutts responded to a question he receives fairly regularly concerning the PageRank feature in the Google toolbar. Specifically, why haven’t they removed it? It is apparent that many believe that the PageRank feature is “widely used by link sellers as a link grading system.”

There is, of course, some truth to this. While spammers do take advantage of the PageRank system, Cutts says it is still relevant to many others. “There are a lot of SEOs and people in search who look at the PageRank toolbar, but there are a ton of regular users as well.” Apparently, many internet users see the PageRank feature as an indicator of reputability, and Google doesn't plan on forcing them to stop.

That doesn't mean PageRank is here to stay forever. While Google plans to keep supporting it as long as it is relevant to their users, it is telling that Chrome doesn't have the PageRank feature built in. Now IE 10 is doing away with add-ons, meaning Google's toolbar will no longer work with the browser.

Considering that Internet Explorer was the only browser supporting the Google toolbar, it is highly likely the PageRank feature, as well as the toolbar as a whole, will fade away before long. As Matt Cutts puts it, “the writing is on the wall” that the new iteration of IE could be the end of PageRank, but we will have to wait and see.

After the big shift to content-focused SEO this year, a lot of the talk has been about the technical tricks experts can use to chase higher rankings behind the scenes. Everyone talks about how important content is, but many are still more distracted by the ways they can mathematically manipulate that content to tailor it to Google's algorithms.

What too many are missing is that now the best way to tailor to Google is to turn your focus towards what consumers and visitors want.

The truth is, the top sites online have been doing this for years, because the most popular sites are the ones that provide quality content. Smaller SEOs seem to have trouble accepting this for two reasons. The first is that it is hard to quantify how to make effective content. There isn't necessarily a magic formula for the best blog, even for search engines.

Search engines run on algorithms, and it is an SEO's job to adapt or even build a site to best fit those algorithms' needs. However, trying to take advantage of those algorithms has led more and more people to use questionable practices to try to “trick” Google into giving higher rankings to sub-par content. That led Google to introduce the Penguin and Panda updates, so that low-quality sites had a much harder time making their way to the top.

The other reason SEOs often have trouble understanding that great content has ALWAYS been important is the competitive nature of website rankings and business in general. Excellent content alone has never been enough, and never will be, because there is plenty of behind-the-scenes work that has to be done for great content to ever be noticed. The trick is finding the line between being competitive and slipping into more questionable practices.

There are thousands of pages' worth of articles on how to tackle all of that behind-the-scenes SEO. When it comes to lessons on how to actually make the great quality content your visitors and the search engines want to see, there is a lot less to work with. Rebecca Garland, in an article for One Extra Pixel, gives some great pointers on how to improve the quality of your content while also favoring the current search engine climate.

When business owners finally decide to invest in SEO, they are often uninformed or confused about a lot of the basics of the industry. It isn't surprising, considering how complex and ever-changing SEO is. While explaining all of SEO to a client or business owner is impossible, Nick Stamoulis thinks a few key ideas can give people new to the industry a better understanding of what we do.

SEO is Long Term

One of the most common misconceptions about SEO and the internet as a whole is that there is some magic way to dominate search results or gain visitors overnight. There are a select few cases of websites that have sprung up over the span of a couple months, but those are rare, and there were other factors contributing to their quick success.

SEO is a long-term process that builds on itself over time. It can take months just to see what kind of effect your SEO strategy is having on your site. For example, content creation and marketing are huge parts of the current SEO field, and SEO companies pump out content steadily through the work week. Most of this content goes unnoticed, while an occasional article gets some attention, but in the end it all contributes positively to the site's SEO strategy and SERP placement.

No one wants to wait to see positive results, but some things you just can’t force.

Always Put Visitors Before Search Engines

Good SEO relies on creating a good user experience. No marketing campaign in the world will raise an objectively bad website out of the ether, because people won’t return to a site, or even stay on the page long enough to matter, if the site doesn’t work well or have interesting information.

The types of people who put all of their focus on what search engine algorithms want are the type of people who try to take advantage of every loophole and questionable strategy they can find. It might even work for a while, but eventually a new algorithm will identify what they are doing and, as Liz Lemon would say, shut it down.

Conclusion

Stamoulis covers two more ideas in his article that he feels are important for business owners to understand, but these two points identify the biggest misunderstandings the uninitiated have. If you're a business owner trying to get into SEO, ask yourself why you want to start now. If you want to dominate the rankings so you can make tons more money tomorrow, you are barking up the wrong tree (is there even a right tree for that?). But if you are trying to make your already reputable product or brand more available to the masses over time, SEO can help.

If Eric Schmidt's book, “The New Digital Age”, is to be believed, Google's authorship markup is going to play a huge role in search engine results pages before long. Granted, as Search Engine Watch points out, Schmidt has a “talk first, think later” habit that has produced some great, though not always reliable, soundbites, but the fact that this appears in his upcoming book, rather than a random interview, lends it quite a bit of credibility.

The Wall Street Journal published some excerpts from the book, and one in particular has caught the eye of SEO professionals.

“Within search results, information tied to verified online profiles will be ranked higher than content without such verification, which will result in most users naturally clicking on the top (verified) results. The true cost of remaining anonymous, then, might be irrelevance.”

Google introduced their authorship markup in 2011, stating at the time that they were “looking closely at ways this markup could help us highlight authors and rank search results,” but since then it has faded into the background in many ways. Google's plans for the future bring it very much back onto the table. Schmidt's comment makes it very clear that Google wants to use Google+ as a verification device. On one hand, it would be one of the best weapons against spammers imaginable. On the other, do we really want a future where we are forced to be on Google+ just so people can find our websites?
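
For context, the authorship markup itself is simple: a link from your content to a Google+ profile tagged with rel="author", with the profile linking back to the site to complete the verification. A minimal illustration, using a placeholder profile ID rather than a real account, might look like this:

```html
<!-- On the article page: tie the byline to the author's Google+ profile. -->
<!-- The profile ID below is a placeholder, not a real account. -->
<a rel="author" href="https://plus.google.com/112345678901234567890">Jane Author</a>
```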

There is more than enough talk out there about negative SEO and how to prevent it or fight back against it, but Matt Cutts says the number of people actually attempting negative SEO is extremely low. He explains that Google designs their algorithms to avoid penalizing innocent sites, and now that Google has added the Disavow Links tool to their repertoire, it is very easy to shut down “black hat” SEO if it does happen to you.

Cutts, the head of the Google webspam team, took to YouTube to answer the huge number of questions he has received about negative SEO and to further explain the Disavow Links tool, clearing up any misconceptions. Cutts doesn't think negative SEO should be a concern for the vast majority of website owners, unless you are in an extremely competitive sphere. “There's a lot of people who talk about negative SEO, but very few people who actually try it, and fewer still who actually succeed,” he said.
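
For anyone who does end up needing it, the Disavow Links tool accepts a plain text file uploaded through Google's webmaster tools: one URL or domain per line, a "domain:" prefix to disavow an entire domain, and "#" for comments. A minimal sketch of the format, using made-up example domains:

```
# Asked spammy-directory.example to remove these links on 6/1/2013; no response.
domain:spammy-directory.example
# A single paid link we could not get taken down.
http://low-quality-blog.example/post/buy-cheap-links
```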