Recently, Google updated the link schemes web page that gives examples of what Google considers to be spammy backlinks. The additions are pretty notable as article marketing or guest posting campaigns with keyword rich anchor text have been included. Advertorials with paid links and links with optimized anchor text in press releases or articles were also added.

With all the new additions, it can be hard to keep up to date with what Google is labeling spammy backlinks or backlink schemes. But, Free-SEO-News’ recent newsletter simply and efficiently lays out the 11 things that Google doesn’t like to see in backlink campaigns.

  1. Paid Links – Buying or selling links that pass PageRank has been frowned upon for a long time. This includes exchanging money for links or posts that contain links, sending ‘free’ products in exchange for favors or links, or directly exchanging services for links. It’s pretty simple: buying links in any way will get you in trouble.
  2. Excessive Link Exchanges – While exchanging links with other relevant websites in your industry is absolutely normal, overusing those links or cross-linking to irrelevant topics is a big sign of unnatural linking. Simple common sense will keep you out of trouble: just don’t try to trick the system.
  3. Large-Scale Article Marketing or Guest Posting Campaigns – As with the last scheme, posting your articles and guest posts on other websites is perfectly normal. However, doing it in bulk or posting the same article to numerous websites will look like blogspam to Google. Likewise, if you do guest posts just to get keyword-rich backlinks, you will see similar penalties. Only publish on other websites when it makes sense and offers value.
  4. Automated Programs or Services to Create Backlinks – There are tons of ads for tools and services that promise hundreds or thousands of backlinks for a low price and very little work. While they may do what they say, Google also easily spots these tools and won’t hesitate to ban a site using them.
  5. Text Ads That Pass PageRank – If you’re running a text ad on another website, make sure it uses the rel=nofollow attribute; otherwise it looks like a manipulative backlink.
  6. Advertorials That Include Links That Pass PageRank – If you pay for an article or ad, always use the rel=nofollow attribute. Simply put, a paid link won’t do you any good without the attribute, and it can do a lot of damage.
  7. Links with Optimized Anchor Text in Articles or Press Releases – Stuffing articles and press releases with optimized anchor text has been a strategy for a long time, but Google has shut it down recently. If your page has a link every four to five words, you’re probably looking at some penalties.
  8. Links From Low Quality Directories or Bookmark Sites – Submitting your site to hundreds of internet directories is an utter waste of time. Most links won’t ever get you a single visitor and won’t help your rankings. Instead, only focus on directories that realistically could get you visitors.
  9. Widely Distributed Links in the Footers of Various Websites – Another older trick Google has put the squash on is stuffing footers with keyword-rich links to other websites. These links are almost always paid and are an obvious sign of a link scheme.
  10. Links Embedded in Widgets – It isn’t uncommon for widget developers to offer free widgets that contain links to other sites. It also isn’t uncommon for these developers to reach out to site owners and offer to advertise through these widgets. However, Google hates these links and considers them a scheme. I’d suggest against it, but if you do advertise through these widgets, use the nofollow attribute.
  11. Forum Comments With Optimized Links in the Post – It is very easy to get a tool that automatically posts to forums and includes links to websites. It is a pretty blatant form of spam that won’t get any actual visibility on the forums, and the links are more likely to get you banned than to draw a single visitor.
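Several of the schemes above (the text ads, advertorials, and widgets) come down to the same technical fix: adding the rel=nofollow attribute to paid or embedded links so they don’t pass PageRank. A minimal sketch of what that looks like in markup, with a placeholder URL and anchor text:

```html
<!-- A paid text ad, advertorial, or widget link: rel="nofollow"
     tells Google not to pass PageRank through this link -->
<a href="http://example.com/" rel="nofollow">Example Sponsor</a>

<!-- The same link without the attribute passes PageRank,
     which Google treats as a link scheme when the link is paid -->
<a href="http://example.com/">Example Sponsor</a>
```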

There’s a pretty obvious underlying trend in all of these tactics that Google fights. They all attempt to create artificial links, usually in bulk. Google can tell the quality of a link, and all of these schemes are easily identifiable. Instead, focus on building legitimate quality links, and use respected tools such as SEOprofiler. It will take longer, but your site will do much better.

SEO Magnifying Glass

Source: Flickr

Search engine optimization (SEO) isn’t the easiest thing to get into, even though it is one of the most important things you can learn when starting an online business or building a website for your company. It isn’t that SEO is too difficult for most to learn, it is simply that most people in the industry have been working in it for so long that even the basic guides often come out overly complicated.

SEO is extremely important for bringing in new customers and being found online. In basic terms, SEO is alerting search engines to the existence of your site and telling them what it’s about. This way, search engines can rank the quality of sites and decide where you belong in the results. Of course, the higher you are in the search results, the more people will come to your site.

Daily SEO Tip categorizes SEO into four basic parts: keywords, content, links, and relevance. If you understand each of these components, you are well on your way to setting up your search engine optimization.

Keywords

Keywords act as the basic main ingredients of your website. The number of keywords you have, their relevance, and how often you use them all play a role in how a search engine determines your site’s quality.

  • Make sure all keywords you use are directly related to your service, brand, or product. Keep them specific to what you do, not just the broad industry you work in.
  • There is a practice called keyword stuffing that can get you into a lot of trouble: overusing keywords in order to trick search engines. But search engines are very smart and will quickly see that you’re using words out of context or unnecessarily.

Content

Search engines are basically rating your website, and content is the main thing they are judging. The engines want to show searchers sites with valuable information. That doesn’t mean the content should be selling to the user; it should offer something of real value, such as informative videos, up-to-date news, or helpful tutorials. Content like this establishes you as an expert in your field and raises your site’s reputability with search engines.

Linking

Ratings are partially decided based on how many inbound links a website has. They serve essentially as arrows directing the search engines to your site. It also follows the theory that if people are linking to your site, there must be something of value there. It also shows that you aren’t an isolated spammy site in the internet ether, which is why you should also include links on any social media sites (aside from simply helping visitors find your business).

Relevance

Relevance is less of a concrete component of SEO, but it is relevant in every facet of the work. Search engines spend the majority of their time fighting spam, and irrelevant content, keywords, or links are a huge red flag that a site may not be reputable. Search engines assume webpages deal with specific topics, be it news, jewelry, or a Buffy the Vampire Slayer fanpage. By keeping your content relevant to your topic, search engines know you are focused, professional and informative.

Conclusion

If you can get a handle on these four basic ideas, you will have a solid grasp on how SEO functions and how you can get your site showing up on search engines, bringing in new visitors and potential customers. SEO can be a broad, complicated topic, but the basics tend to stay the same. Follow these principles, and you’ll be able to figure out the rest.

Google is great for searching for quick and concise information, such as what you might find on Wikipedia or IMDB. If you have a simple question with an objective answer, the biggest search engine is the perfect tool. But, as anyone who has tried to do actual research for academics or work will tell you, Google is not so great with providing lots of in-depth content.

You might find news, or maybe a couple of books and articles on Google Scholar, but the main search results on Google can be limiting. You are getting results from the people with the best short answers to your questions. Now Google is trying to change that, as it estimates roughly 10% of searchers are looking for more comprehensive information.

Google has announced that over the next few days they will be rolling out “in-depth articles” within the main search results. Now, when searching for broader topics that warrant more information such as stem cell research or abstract topics such as happiness or love, Google will feature a block of results in the middle of the page like below.

In-Depth Articles Screenshot

Source: The Official Google Search Blog

This is yet another way Google is forming results intuitively tailored to fit the type of information you are searching for. As a Google spokesperson told Search Engine Land, “Our goal is to surface the best in-depth articles from the entire web. In general, our algorithms are looking for the highest quality in-depth articles, and if that’s on a local newspaper website or a personal blog, we’d like to surface it.”

It has always been a little unclear how Google handles its international market. We know they have engineers across the world, but anyone who has tried to search from outside the US knows the results can seem like what Americans would have seen five years ago: a few good options mixed with a lot of spam. That’s a bit of hyperbole, but Matt Cutts says we can expect international results to keep improving.

According to Cutts’ recent Webmaster Help video, Google does fight spam globally using algorithms and manual actions taken by Google employees stationed in over 40 different regions and languages around the world. In addition, they also try to ensure all of their algorithms will work in all languages, rather than just English.

SEO Roundtable points out that you could see this international attention to Google’s algorithms when Panda originally rolled out: at first it only affected English queries, but it was released for other languages soon after. With Penguin’s release, however, all countries saw the rollout on the same day.

Matt Cutts did concede that English-language queries in Google receive more attention, which has always been fairly obvious and understandable: there are far more English-language searchers, and English is the native language of the majority of the company’s engineers.

If you’ve ever worked with PPC, you know how important “landing pages” can be. Google partially decides where a paid ad will appear and how much each click costs based on the quality of the landing page that ad leads to. Similarly, SEO professionals surely know all about “optimized pages” and how Google analyzes them for the SERPs. However, Stoney deGeyter from Search Engine Land says we should stop thinking of landing pages and optimized landing pages as different things. Now is the time for an optimized landing page.

SEO and PPC have always worked very closely, and in this case they overlap to the point where keeping them separate is doing a disservice to you and your site. Landing pages need to be optimized and optimized pages need to be respectable landing pages. Merging the two concepts into one idea simply makes sense.

So what does an optimized landing page look like? They simply change the intent of the optimization towards conversions. While SEO optimized pages are intended to rank highly, they can and should be performing the additional purpose of getting users to perform whatever action you desire such as purchasing a good or service or signing up for an e-mail mailing list. To do this, you just need a few things.

  • Compelling, Keyword-Focused Title Tag – The title tag is probably the most important 8-10 words you will write when optimizing your page. Not only does it need to be keyword-focused, it also needs to be interesting enough for searchers to choose your link over the others in the search results. Anyone can do one or the other, but achieving both at the same time is tricky.
  • Well-Written Description – Meta descriptions may not be important for rankings, but that shouldn’t diminish their importance for SEO. The description displays in the search results and gets people to click through to your page, so it is essential for proper search engine optimization. It is also a great place for a strong call-to-action for the searcher.
  • Keyword Focused Headline – Headlines are the first thing users see when hitting your landing page from a search engine, so it is important for the keyword to be relevant if not similar to what was listed on the results page. It should also be wrapped in an H1 tag for proper optimization. Proper heading and sub-heading use helps search engines and browsers alike to determine what type of content you are offering and decide if they will stay on the page. Make yours compelling.
  • Topically Focused Content Concentrating on Benefits – For anyone to stay on your page, you need to keep your content on topic and interesting. Wandering off on tangents or not getting to the point will lose your visitors. Your content can be long, but it must also be trimmed of all excess. Not only that, but the value of your content should be readily available. Customers want to know what they will be getting from the content. Being positive and focusing on real tangible benefits will keep readers and consumers interested.
  • Keep Your Content Scannable – Even long content needs to be scannable so users can find what they want without hassle. Even interested visitors might not care about everything on your page. Keep your pages cleanly laid out, and clearly divide your content with sub-headlines that show the users where they want to look. White space and line spacing can be especially important to overall readability.
  • Call-to-Action – Without a call-to-action, there isn’t even a reason to have a landing page. Each page should have a goal that comes with a desired action or results that you want each visitor to take. The landing page should be a first step, not the only one. The only way to accomplish this is by clearly showing users what you want them to do. Whether you want them to share your content, sign up, or purchase, make it obvious.
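The on-page elements in the list above map to a handful of HTML tags. A minimal sketch of what an optimized landing page’s title, meta description, and headline might look like (the product, keyword, and copy here are placeholders, not from the article):

```html
<head>
  <!-- Compelling, keyword-focused title tag: roughly 8-10 words -->
  <title>Handmade Leather Wallets - Free Shipping on Every Order</title>
  <!-- Meta description: not a ranking factor, but it is the
       search-result pitch, so it should carry a call-to-action -->
  <meta name="description"
        content="Shop durable, hand-stitched leather wallets. Order today for free shipping.">
</head>
<body>
  <!-- Keyword-focused headline wrapped in an H1, echoing what
       the searcher saw on the results page -->
  <h1>Handmade Leather Wallets Built to Last</h1>
</body>
```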

There are some other small aspects deGeyter says these pages need, but the ones listed are by far the most essential. Optimized landing pages combine the best of both worlds when it comes to SEO and PPC. They accomplish two missions while saving stress and effort. SEO and PPC have their unique focuses and functions, but sometimes they work best when working together.

By now, the hacker craze of the ’90s and early 2000s has died down quite a bit. Most people don’t worry about hackers all that much, as long as they use solid anti-virus software and keep their router protected. Big businesses may have to worry about Anonymous’ hijinks, but the average person doesn’t tend to concern themselves with the issue. At first glance, hacking doesn’t seem like much of an issue for SEO, either.

But hackers can actually do your site some damage, and can even get it dropped entirely from the Google search index. Sites get blacklisted when hackers inject malicious code onto their servers, as Google seeks to protect searchers’ computers from being compromised.

While Google doesn’t immediately drop sites from their index, being blacklisted leads to a complete drop in organic traffic and can be a crisis for SEO. Blacklisting starts as a warning to searchers that a site may be compromised, and few will continue past that alarm.

This has become a rather significant problem for Google. To provide broad support for the increasing number of webmasters dealing with compromised servers, Google has launched the ‘Webmasters Help for Hacked Sites’ support center. It gives detailed information on how to clean and repair your server and prevent your site from being dropped entirely from the Google index.

If you think this sort of hacking isn’t a big deal, check out the charts below. They show just how frequent this type of malicious activity has become. It isn’t just banks and large corporations dealing with it. Small businesses are just as at risk as international franchises. The most common form of attack is an automated set of processes that indiscriminately discover and exploit vulnerabilities on servers, which are often left completely unprotected.

Search Engine Journal recently explored the issue in more depth, unpacking why it is such a large concern to Google and webmasters alike. Compromised sites can destroy a search engine’s credibility just as they can your own, so the problem has to be taken very seriously.

Timer

Source: WikiCommons

Everyone working in SEO knows that Google has a multitude of factors they use to determine the order of search engine results, and the majority of these ranking factors are based on either the content of the webpage or signs of authenticity or reputability. That was the case for the longest time, but since 2010, Google has made significant shifts towards a focus on usability, and the harbinger of this change was the inclusion of website speed to ranking factors.

The problem is, website speed and other usability issues aren’t exactly objectively defined. What exactly is a slow-loading site? What is the cutoff? No one has gotten a definitive answer from Google, but in June Matt Cutts explicitly stated that slow-loading sites, especially on mobile platforms, will begin seeing search rank penalties soon.

Obviously these changes are good for searchers. Searchers want sites that load quickly, offer quality user experience, and deliver great content. And, the emphasis on speed is certainly highlighted on mobile platforms where on-the-go users are likely to go back to the results if the site takes too long for their liking. The issue we face as search optimization professionals is trying to figure out exactly what Google is measuring and how that information is being used.

Matt Peters from Moz, with the help of Zoompf, decided to cut through Google’s intentionally vague information to figure out exactly how site speed affects rankings. They can’t definitively rule out a causal link between site speed and rankings, given the number of other algorithmic ranking factors that complicate the study. But their results showed little to no correlation between page load time and ranking.

I wouldn’t take this information as gospel, but it does suggest that loading time isn’t a huge factor in long-tail searches and doesn’t need to be worried about too much. If your site loads quickly enough to please the people coming to it, it will likely pass Google’s expectations as well.

DuckDuckGo Logo

Internet security and privacy have been at the forefront of many people’s minds with the recent headlines about the NSA keeping data on the public’s online activity, and the issue has had subtle effects on search engines. We’ve seen a small group of searchers migrating to search engines with stricter privacy policies. Of course, those truly outraged by the NSA news would expect a pretty large shift; so far the change has been slow, but it is picking up momentum.

More and more people are learning about how Google actually decides which results to show you, as an individual, and many are a little concerned. While Google sees the decision to collect data on users as an attempt to individually tailor results, a few raise their eyebrows at the idea that a search engine and huge corporation is keeping fairly detailed tabs on the internet activities of users. The internet comes with an assumption that our activity is at least fairly private, though that notion is getting chipped away at daily. But, there is still the widespread assumption that our e-mails or simple search habits are our business alone, an assumption that is also being proved wrong.

These privacy issues have a fair number of people looking for search engines that keep searches completely anonymous and don’t run data collection processes. The most notable option people seem to be moving to is DuckDuckGo.com, a search engine whose privacy policy states it will not retain any personal information or share that information with other sites. The search engine’s traffic has risen to close to 2 million searches per day since the NSA scandal broke.

There are numerous debates surrounding these issues. Political discourse focuses on the legality and ethical aspects of the government and large corporations working together to collect information on every citizen of the United States (other companies included in the NSA story include Yahoo, Facebook, and Microsoft). But, as SEO professionals, the bigger question is the ethical and practical reality of individually tailored results which rely entirely on data collection.

If you’ve ever taken a look at the ads on the edges of websites, you’ve probably noticed that the ads are loosely based on your personal information. The ads reflect your gender, age, location, and sometimes loose search histories. The ads you are shown are chosen based on information your computer relays to almost every site you access. Google acts the same way, but they collect this data and combine it extended data of your search history to deliver search results they believe are more relevant to you.

There is a practicality to this. We all have finely tuned personal tastes, and innately we want search engines to show us exactly what we want with the first search result, every time. While poll responses say that the majority of people don’t want personalized search results, our online actions belie our true desire for efficient search. The best way to deliver that is to gather data and use it to fine-tune results. On a broad scale, we don’t want results for a grocery store in Los Angeles when we are physically in Oklahoma. On a smaller scale, we don’t want Google showing us sites we never visit when our favorite resource for a topic is a few results down the page.

In this respect, the move towards search engines like DuckDuckGo is actually a step back. These privacy-focused search engines are essentially acting how Google used to. They use no personal information, and simply try to show the best results for a specific search. It is a trade of privacy for functionality, and this could possibly explain the slow uptake or migration to these types of search engines. But, people are moving.

The longer the NSA story stays in the news, the more searches DuckDuckGo receives, and this could have a significant effect on the search market in the future. The question is: do we want to sacrifice personal privacy and assumed online anonymity for searches that match our lives? Andrew Lazaunikas recently wrote an article on the debate for Search Engine Journal. He admits DuckDuckGo delivers excellent, unbiased results, but in the end, “when I want to know the best pizza place or car dealer in my area, the local results that Google and Bing shows are superior.”

Lazaunikas isn’t deterred by the aspect, and notes, “I can still get the information I need from DuckDuckGo by modifying my search.” He ends his statement by vowing to use DuckDuckGo more in the future, but the question is whether the public at large will follow. For the moment, it seems as though most people prefer quick easy searches and familiarity to trying out these new search engines.

After two fairly explicit warnings about advertorials this year, Google has added advertorials to their webmaster guidelines, as well as other popular spammy linking techniques in the Link Schemes help document.

Google Continues To Downplay Links

The biggest change is the removal of the entire first paragraph of the help article, which addressed how incoming links influence rankings. According to Search Engine Journal, the removed paragraph read:

Your site’s ranking in Google search results is partly based on analysis of those sites that link to you. The quantity, quality, and relevance of links influences your ranking. The sites that link to you can provide context about the subject matter of your site, and can indicate its quality and popularity.

Links have been steadily falling out of favor throughout the past few years, and it appears we are finally reaching a tipping point for Google’s reduction of linking’s role in search algorithms. Or, as Google has been advising, high-quality sites matter much more than links of any quality.

Keyword-Rich/Optimized Anchor Text Links

Google also tackled heavily optimized anchor text in press releases that are distributed across other sites. The technique enjoyed a quick rise in highly competitive markets, and Google appears to finally be putting the squash on the practice. Google did note that guest posting is still a popular practice that can be valuable when done correctly; however, sites that accept guest posts have been using nofollow or non-optimized URL links to avoid issues.

Advertorials

And of course, the final change is the addition of advertorials as an example of unnatural links that violate Google guidelines.

Advertorials or native advertising where payment is received for articles that include links that pass PageRank.

Google has been making swift changes to linking policy and practice, so changes like this will likely keep occurring. Links can still be a strong weapon in your SEO strategy, but you have to tread carefully, and they perhaps shouldn’t be your highest priority when optimizing.

Site audits can be ugly work. Nothing is more disastrous to a client-SEO relationship than informing someone of everything wrong with their site too harshly. Clients have spent time and money having a site created that they think works well for their business, and then we audit the page and everything under the hood and have to break the news that their site is sick or badly put together.

The process is similar to playing “website mechanic,” as Stuntdubl SEO put it. We offer diagnostic information and recommendations that are absolutely critical to keeping a site relevant and valuable to organic search, but many people don’t want to hear how bad a shape their car, or website, is in. Not only will it be costly to fix, but they’ve developed a sentimental attachment to the site they have.

To be able to break the news in the best way possible, we have to be as prepared and informed as possible, which means running extensive auditing and answering a lot of questions. This also means understanding all of the tools at your disposal so that you can get the best answers in the fastest way possible.

Todd Malicoat took 50 of the most important questions for site audits, and identified the best tool available for answering every inquiry. Not only will this speed up your data collection and auditing, it will make you more prepared to create a better site and communicate properly with the clients you are working with.