If Eric Schmidt’s book, “The New Digital Age,” is to be believed, Google’s authorship markup is going to play a huge role in search engine result pages before long. Granted, as Search Engine Watch points out, Schmidt has a “talk first, think later” habit that has produced some great, though not always reliable, soundbites. But the fact that this appears in his upcoming book, rather than a random interview, lends it quite a bit of credibility.

The Wall Street Journal published some excerpts from the book, and one passage in particular has caught the eye of SEO professionals.

“Within search results, information tied to verified online profiles will be ranked higher than content without such verification, which will result in most users naturally clicking on the top (verified) results. The true cost of remaining anonymous, then, might be irrelevance.”

Google introduced their authorship markup in 2011, and stated at the time that they were “looking closely at ways this markup could help us highlight authors and rank search results,” but since then it has faded into the background in many ways. Google’s plans for the future bring it very much back onto the table. Schmidt’s comment makes it clear that Google wants to use Google+ as a verification device. On one hand, it would be one of the most effective weapons against spammers imaginable. On the other, do we really want a future where we are forced to be on Google+ just so people can find our websites?
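For context, the authorship markup Google described in 2011 tied a page to a Google+ profile with a simple rel="author" link. A minimal sketch (the profile URL and author name below are placeholders):

```html
<!-- In the article body: link the byline to the author's Google+ profile -->
<a href="https://plus.google.com/00000000000000000000?rel=author">Jane Author</a>

<!-- Or, equivalently, in the page head: -->
<link rel="author" href="https://plus.google.com/00000000000000000000"/>
```

For verification to complete, the Google+ profile also has to link back to the site from its “Contributor to” section.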

There is more than enough talk out there about negative SEO, and how to prevent it or fight back against it, but Matt Cutts says the actual number of people trying to use negative SEO is extremely low. He explains that Google designs their algorithms to avoid penalizing innocent sites, and now that Google has added the Disavow Links tool to their repertoire, it is fairly easy to shut down “black hat” attacks if one does happen to you.

Cutts, the head of the Google Webspam team, took to YouTube to answer the huge number of questions he has received about negative SEO, and to further explain the Disavow Links tool, clearing up any misconceptions. Cutts doesn’t think negative SEO should be a concern for the vast majority of website owners, unless they are in extremely competitive spheres. “There’s a lot of people who talk about negative SEO, but very few people who actually try it, and fewer still who actually succeed,” he said.

The amount of talk about SEO coming from blogs and experts helps make SEO one of the more discussed aspects of the internet behind the scenes. You won’t see search engine optimization coming up on the news, but just one search can lead to dozens of resources filled with writers offering their opinions and ideas.

In many ways, this is great because it keeps the community up to date with continuous changes, and delivers a wealth of free knowledge to anyone trying to get involved. However, it also creates an echo chamber where misconceptions run rampant, and there is always a need to clear up the bad information out there.

This time around, it was Eric Ward over at Search Engine Land who took it upon himself to dispel the rumors and lies surrounding linking. Links are a hugely important part of SEO, and many don’t understand exactly how they are used and evaluated. Add to this the never-ending changes to search rank signals, and bad ideas grow into monsters.

Many of these bad ideas come in the form of absolute statements, such as “anchor text will stop being used as a ranking signal altogether” in the next year. Google has done work to spot people misusing anchor text, especially text attached to purchased links, which can say anything the buyer wants. But, as with most Google changes, they haven’t devalued the practice altogether; they have only tried to punish those who abuse it.

As Ward puts it, “Are you really going to tell me that if the Library Of Congress site links to Consumer Reports magazine’s site using the words “Consumer Product Reviews” that this would be a useless signal? No way.”

Another preposterous claim is that linking will no longer be the most important ranking signal, dethroned by social media signals. This ignores the number of Google searches performed without being signed in. Social media is just one of the many signals Google already considers, and relying on a single user-generated signal to return results to that one user doesn’t make any sense.

The reason social signals will never be the primary signal for search engines is, quite simply, that people like to do some things anonymously. They don’t want questions about body hygiene, marital issues, or personal problems being associated with their Facebook accounts.

While linking may not be the clear-cut MVP it once was for SERPs, claiming that it is going away altogether doesn’t make any sense. It is this type of misinformation that leads to confused clients and well-intentioned but misinformed bloggers spreading the information far and wide.

One of the most common criticisms hurled at SEO is that it manipulates sites based on what Google or Bing want rather than what users would like to see. Many perceive this as a conflict between SEO and good user experience, almost as if SEO were antagonistic to an enjoyable website visit.

On the surface, this assumption makes sense, as SEOs do tend to get wrapped up in pleasing algorithms rather than people, but good SEO and quality user experience don’t have to be mutually exclusive. Focusing on both aspects of a web page often creates great sites that perform well in search engines, and many SEO practices actually benefit the user.

Sitemaps, for example, are an essential part of SEO strategy. Search engines do limited crawls, so many sites do not have all of their pages indexed. By maintaining a well-organized, up-to-date sitemap, as well as simple navigation, you make sure the search engines index the pages most important to you.
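A minimal sitemap following the standard sitemaps.org XML protocol looks something like this (the URLs and dates below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/services</loc>
    <lastmod>2013-01-10</lastmod>
  </url>
</urlset>
```

Once the file is in place, you can point the search engines at it through Google Webmaster Tools and Bing Webmaster Tools, or with a Sitemap: line in your robots.txt.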


These sitemaps and navigation systems have the added bonus of making it easier for users to find their way around a site. Nobody enjoys having to scour a website for a specific page, and a well-designed navigation system quickly erases that issue.

Keyword based SEO practices also help both parties, as long as you keep your readers in mind while optimizing your text. All text-based content should be easy to read, but search engines rely on keywords in the blocks of text to understand what your site is about.

The problem is that this leads some SEOs to start placing the keyword every other word, which drives readers crazy. The general rules are to include the keyword in the title, headline tag, and body content, but no more than once each in the headline tag and title. You can use it a few times in your content as needed, but it is important not to overdo it. In fact, including the keyword too many times could actually hurt your site.
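As a rough sketch of those rules, a page targeting the (hypothetical) keyword “organic dog food” might look like:

```html
<head>
  <!-- Keyword appears once in the title -->
  <title>Organic Dog Food: A Buyer's Guide</title>
</head>
<body>
  <!-- Keyword appears once in the headline tag -->
  <h1>How to Choose an Organic Dog Food</h1>
  <!-- A few natural uses in the body, not every other word -->
  <p>Switching to organic dog food is easier when you know
     which labels to trust...</p>
  <p>Compare ingredient lists rather than marketing claims...</p>
</body>
```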

Sujan Patel over at Search Engine Journal has even more ways you can combine SEO and a user-experience focus to make websites that keep both the search engines and your visitors happy.

Yep, it’s time again for a post about content marketing! It looks like there will be plenty of these throughout the next year as content marketing stays on the tip of everyone’s tongue when talking about SEO or digital marketing.

But pumping out quality content continuously takes a lot of time and effort, which can be difficult for a site or marketing team to maintain over the long run. Most get burnt out, and ideas for new content stop coming as quickly. If you’re having trouble coming up with new things to talk about and ways to present your content, Sujan Patel has some suggested formats at Search Engine Journal which might help you get started.

  • List Posts – You’ve almost certainly seen lists before, unless you stay away from almost all forms of media and information. If that is the case, thanks for reading this before picking up a newspaper or looking at “the cutest 25 cats sitting on things”. Yes, lists are a super common choice for bloggers and writers of all kinds. They are easy to write, and they tend to be shared more than most blog posts.
  • Interviews – Interviews have also always been popular for media, and they benefit SEO for the same reasons. When you land an interview with a subject, you automatically gain exposure to that figure’s followers and draw traffic to your own content. Interviews are also fairly easy. Make sure you understand the technology you would be using to record them, like audio recorders, cameras, etc., then all you have to do is start asking anyone you would be interested in interviewing. You’ll get a bite faster than you know.
  • Reviews – If you have writer’s block when it comes to coming up with topics, reviews are a great way to keep content coming regularly while keeping it interesting for your viewers. Try to be objective and fair with your reviews, and use specific details to keep others from thinking you are just attacking other writers and creators.
  • Link Round-ups – Similarly to reviews, this is a go-to for those who can’t figure out what to talk about. Gathering collections of links has the upsides of collecting resources you might use on your own, while also earning goodwill for other creators’ content you are sharing.

Obviously, the best way to get traffic to come to your site is to just offer quality content filled blog posts informing peers in the industry. These formats shouldn’t replace the standard blog post, but when you are at a total loss for topics, these formats are handy to have in your back pocket.

The breakout star in SEO so far this year appears to be content marketing. It was much discussed in 2013, but with Google’s penalties and algorithm updates, it will only become more important as the year progresses.

Of course, just as with any SEO tactic, content marketing has its risks. Google has shown that even using the best practices too much can still lead to penalties, and the Penguin and Panda updates have made it clear that you have to put good thought into any campaign you are going to run. Algorithmic updates by their nature don’t have room for leniency.

If you want to keep succeeding with SEO, you have to follow the rules to the letter. Of course, this is complicated by Google’s reluctance to give hard rules for SEO. From what we know, moderation is really the key to content marketing and optimization.

Adam Mason, writer for Search Engine Journal and SEO manager at Zazzle Media, argues that the best way to approach content marketing optimization is to learn its history and know what has changed in the past few years. If you want to know how to push your content marketing campaigns without being hurt by penalties, his article covers just about everything you would want to know.


With the new year come new reports about the way we searched the internet in 2012. The Search Agency posted their “State of Paid Search Report” for the last quarter of the previous year, and while it is mostly confirming what other reports have already shown, there were a few notable discoveries.

Every report of the past few years has found that searches from smartphones and tablets are growing significantly, now claiming 25 percent of all search clicks. What wasn’t expected was The Search Agency’s finding that the growth in mobile searching does not come at the expense of desktop browsing.

“Desktop computer searches remained level from Q3 2012 to Q4 2012, while mobile experienced an increase in search share. This demonstrates the industry’s steady growth and good health,” states their report.

Greg Sterling has the full breakdown from the report over at Search Engine Land, but the facts are that mobile browsing has no intention of slowing its growth, and tablets are at least partially responsible for the continued interest.

Less than a year ago, Google unleashed an update called the “webspam algorithm” that seemed innocuous at first, until experts began to notice how widespread its effects were. The impact of the update was so large, Google eventually gave it an official name more in line with their other update, Panda. The “webspam algorithm” became Penguin.

The original name was an accurate description of what the update did: it aimed to demote sites violating Google’s Webmaster Guidelines, specifically sites full of webspam. These sites used manipulation to improve their search rankings, but some innocent sites were affected, and more have been affected by each subsequent update to Penguin.

These “black hat” methods, such as keyword stuffing, cloaking, participating in link schemes, and purposefully using duplicate content, have been around for as long as SEO has existed (pretty much as long as the internet has been widely used). Penguin sought to finally deal with the spammers, but with it came a new set of rules for SEO.

Pratik Dholakiya has collected these rules into “The Definitive Guide To Penguin Friendly SEO” which explains which methods have been shunned and what new techniques are favorable for SEOs.

If you were actively using black hat techniques, you won’t find new ones to continue spamming in a different way, but for any SEO looking to legitimately improve their search performance with good content and practices, this list will help steer readers away from any bad methods.

Receiving an email from Google saying that they have noticed unnatural links associated with your website is never a good thing. The best outcome still involves losing organic search traffic from Google for at least some time, and the email means you have urgent work to do.

Search Engine Watch writer Chuck Price has seen webmasters respond to manual penalties in ways that actually make their problems worse, especially when they panic. Webmasters who panic after receiving a manual penalty tend to fly into a manic state, doing the first thing they can think of to get the penalty lifted, such as filing reconsideration requests before actually fixing the problem.

If you’ve received an email about unnatural links or manual penalties, take a minute to breathe. There is no reason to panic. You have to do some work to identify the issues and fix your site’s linking problems, but panicking isn’t going to get that done any faster, and it might blow your one chance to get your links reconsidered by Google in the near future.

Once you’re calm, it is time to get to work. Price has suggestions for how to get your page back in order no matter how many bad links you have. However, if you have been actively building thousands of unnatural links, you will have to make huge efforts to make up for the spamming.
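That cleanup usually ends with a disavow file. The format Google accepts is a plain text file with one URL or domain per line; lines starting with # are comments, and the domain: prefix disavows an entire site. The entries below are placeholders:

```text
# Contacted the site owner twice, no response
domain:spammy-directory.example.com

# Individual pages we could not get taken down
http://link-farm.example.net/widgets/page1.html
http://link-farm.example.net/widgets/page2.html
```

The file is then uploaded through the Disavow Links tool in Google Webmaster Tools, ideally alongside a reconsideration request documenting your cleanup efforts.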

On January 15th, Facebook announced they will be launching their own search engine, called Facebook Graph Search. That’s right, the little white bar at the top of your page now has an actual purpose. The search engine relies heavily on “likes” and other relevant Facebook information, such as page popularity and location signals.

This new change could lead to some interesting methods of finding businesses close to you (for example, Facebook claims you will be able to search for things like “Italian restaurants that my friends have been to”), but it also has business owners and SEO experts wondering how to take advantage of Facebook’s search. As Matt McGee found out, however, Facebook was already ahead of everyone.

They released tips for making sure your business gets found in Graph Search. Releasing these tips helps Facebook as much as it helps business owners, because it positions businesses to appear in the search while simultaneously populating Facebook’s search engine.

The most important things to know are that Graph Search is only available to a select few right now, that it will offer local search from the very start, and that the search results are compiled from information created and shared by businesses, especially through their pages.

That doesn’t mean you have to have a page for your business, however. It helps enormously, but businesses will also show up as long as customers or visitors have tagged them as a “place”.

We will have to wait to see just how Facebook’s Graph Search works once it is unveiled to the wider public. It could become another gimmicky feature that many don’t use, like the Foursquare-style ability to check into locations is treated at the moment. But it is possible Facebook’s new search engine could become a useful tool for finding local businesses.