Longtail SEO is becoming the dominant way for article marketers to succeed in the results pages, as well as to strengthen brand visibility and awareness. For most marketers, it is the most effective approach.

In the past, the problem for many has been deciding whether to invest energy, time and money into marketing a single primary keyword, which might receive a high volume of searches every month, or to focus on a longtail keyphrase. The longtail keyphrase might only get a small number of search queries every month, but it gives the business a realistic shot at the top ranking, which receives the most traffic.

Trying to focus on a single keyword puts you at a disadvantage. It may get queried more often than your longtail phrase, but the market will be so crowded that you would be lucky to land on the third page of results. When most traffic goes to the very first result, being on the third page isn’t going to get you many visitors.

Longtail phrases, on the other hand, earn you a much higher ranking on SERPs for less popular related queries, which will net you more traffic overall. As Justin Arnold, writer for The Mightier Pen, puts it, you have to choose between theoretical popularity and actual sales traffic.

Choosing a longtail phrase is too big a subject to cover here, but the main idea is to claim a corner of the market. People are searching with more specific queries, so marketing a longtail phrase for your specific area of the market puts you in a good position to actually make some sales.

 

I recently wrote about the release of Google’s Disavow Links tool, but there are some more questions popping up that need answering. So, let’s cover a little bit more about the tool.

First off, the tool does not take effect immediately. This is one of many reasons Google suggests publishers first try to remove questionable links by working with the site owners hosting them, or with any companies they may have purchased links through.

Instead of disavowing the links immediately, “it can take weeks for that to go into effect,” said Matt Cutts, head of Google’s web spam team, during a keynote at the Pubcon conference. Google has also reserved the right not to use submissions it feels are questionable.

It is important to be accurate when making your file to submit to Google. Because of the delay in processing the file, it may take another few weeks to “reavow” links you didn’t mean to discount.

Once you have submitted a file to Google, you can download it, change it, and then resubmit.

The tool is mainly designed for site owners affected by the Penguin Update, which focused on hitting sites that may have purchased links or gained them through spamming. Previously, Google simply ignored bad links; now they count as a negative mark against the site.

This change prompted fear in parts of the SEO industry that competitors would build bad links pointing at a rival’s site, a tactic known as “negative SEO.” This tool helps ensure negative SEO is not a worry by allowing you to disavow any links of that kind.

Danny Sullivan from Search Engine Land has even more information about the tool, and Matt Cutts has a 10-minute video answering questions.

 

The latest research from the Interactive Advertising Bureau and PricewaterhouseCoopers, which examines the first half of 2012, finds that the biggest contributor to online advertising revenue in the U.S. continues to be spending on Search Marketing.

Accounting for 48% of all interactive advertising in the first half of the year (which implies a total market of roughly $17 billion), search ads brought in $8.1 billion, a figure 19% higher than during the same period of 2011.

Performance pricing, usually cost-per-click, remains the dominant pricing model and has continued to get stronger.

For graphs of the data, visit Pamela Parker’s write up over at Search Engine Land.

 

If you’ve ever received a notification from Google about a manual spam action based on “unnatural links” pointing to your webpage, Google has a new tool for you.

Links are one of the best-known factors Google uses to order search results; Google examines the links between sites to decide which pages are reputable. As you probably know, this is the foundation of PageRank, one of the most well-known “signals” Google uses to order search results. Google is concerned about spammers trying to take advantage of PageRank, and it often has to take manual action against them.
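To make the idea concrete, here is a toy sketch of the PageRank concept in Python. It only illustrates the underlying principle (rank flows along links, so well-linked pages score higher); it is not Google's actual implementation, and the function name and example graph are made up for this post.

    # Toy PageRank sketch: every page must appear as a key in the links dict.
    # Each page splits its score evenly among the pages it links to;
    # the damping factor models a "random surfer" who sometimes jumps anywhere.
    def pagerank(links, damping=0.85, iterations=50):
        pages = list(links)
        n = len(pages)
        rank = {page: 1.0 / n for page in pages}
        for _ in range(iterations):
            new_rank = {page: (1.0 - damping) / n for page in pages}
            for page, outlinks in links.items():
                if not outlinks:
                    continue  # dangling pages simply drop their share (a simplification)
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            rank = new_rank
        return rank

    # Example: pages A and C both link to B, so B ends up with the highest score.
    print(pagerank({"A": ["B"], "B": ["C"], "C": ["B"]}))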

The notification you may have received in Webmaster Tools about those unnatural links suggests you got caught up in linkspam, which is the use of paid links, link exchanges, and other tactics like those. The best response to the message is to get as many of those low-quality links pointing to your site removed as possible. This keeps Google off your back and will improve the reputation of your site as a whole.

If you can’t manage to get rid of all of the links for some reason, Google’s new tool can help you out. The Disavow Links page lets you submit URLs you would like Google to ignore when assessing your site, and the “domain:” keyword lets you disavow links from every page on a specific site.
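For illustration, a disavow file is just a plain text list with one entry per line: lines starting with # are comments, the domain: prefix covers an entire site, and everything else is treated as an individual URL. The domains below are placeholders, not real examples:

    # Contacted the site owner and asked for these links to be removed, no response
    domain:spamdomain1.example
    # A single page we could not get taken down
    http://www.spamdomain2.example/page-with-paid-link.html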

Each website is allowed one disavow file, and the file is shared among the site owners through Webmaster Tools.

If you need assistance finding bad links pointing to your site, the “Links to Your Site” feature in Webmaster Tools is a good place to start your search.

Google’s Webmaster Central Blog included quick answers to a few likely questions in its announcement of the tool, noting that most sites will not need to use the feature at all unless they’ve received a notification.

 

Now as much as ever, the web design industry and the SEO industry are intertwined. The question that arises anytime a business industry and a creative industry become so connected is whether the business side limits the creative side or not.

Most in the web design industry will agree that SEO shouldn’t limit web designers at all. SEO is important, but limiting art isn’t necessary.

One of the most important concerns shared by web designers and SEO professionals is load time. Lots of designers want to build elaborate headers, but these lead to slow load times. There are sites where load times matter less: a portfolio site should have plenty of quality graphics showcasing your work, and in those instances SEO isn’t the priority.

For commercial websites however, fast load times are essential because customers will go elsewhere rather than wait.

For those who think standard navigation practices limit their artistic license, consider this: the job of a web designer isn’t just to create an aesthetically pleasing site, but to make one that is also functional and user-friendly. Breadcrumbs and easily accessible navigation systems make users happy, and they let visitors see all of the well-designed areas of the site.

Overall, if you aren’t designing overly flashy sites, SEO shouldn’t be limiting your abilities as a designer. The latest SEO practices rely on quality content, and the designer’s job is to deliver this content in a good-looking package. If anything, SEO guidelines will help you understand how to create a site your viewers will like.

For some more pros and cons of the relationship between SEO and web design, Rean Jean Uehara has a great article at 1stwebdesigner.

 

New research from Compete.com suggests that being the first result on a SERP can make a huge difference compared to being second.

The analysis comes from “tens of millions” of consumer-generated search engine results pages from the last quarter of 2011, and it produced some really interesting findings: 85 percent of all listings shown are organic, with only 15 percent being paid search listings.

Out of the organic listings, 53 percent of clicks go to the very first result, the second result sees only 15 percent, and every result below that gets even less.

Analysts from Compete.com conclude that, since the vast majority of listings on a SERP are organic and the majority of clicks go to the first listing, it is imperative that brands’ strategies include constantly monitoring results due to the ongoing evolution of search engine algorithms.

The paid results are also getting a large number of clicks. Most notably, ads at the top of the page perform very well, capturing anywhere from 59 percent down to 9 percent of all paid-result clicks depending on position. Ads along the right-hand side of the page, however, get at most 4 percent of paid-result clicks.

Overall, if you want your page to get attention, it is important to get your listings into the top positions. For graphs and analysis of the results, read Miranda Miller’s article at Search Engine Watch.

 

While we’ve been talking about how to optimize content quite a bit, there really are no guidelines out there for the broader questions you should be asking when going through the process of optimization. Jenny Halasz from Search Engine Land realized this and created a flow chart for the optimization process, complete with the questions you should be asking yourself.

Optimization Flow Chart

“What is the page about?” – This is a really simple question, and if you can’t answer it, you probably shouldn’t be building the site. For your page to have any value, you have to know what it is about, obviously.

“What is the purpose of this page?” – Are you trying to create a blog post? Or maybe a sales pitch? How about a press release? No matter what the purpose is, you certainly need to have one, and be able to identify it while working on the page. Thinking about this beforehand will help you put your content into context.

“How long will this content remain relevant?” – Educational pieces stay relevant until newer information comes along. Depending on the field, this could be years or just a few months. Product pitches, on the other hand, stay relevant until your next line is due to be released, which could be as much as a year or two away. Either way, adapt your content to the time frame in which it will remain relevant.

“What makes sense for optimization?” – The previous questions should be considered when creating the page, but now we’re at optimizing the site for search. Are the keywords you’re using relevant? How are you handling linking? Make sure you actually consider these factors rather than “going through the motions.”

The flow chart and questions should help you focus your process to reflect your client’s needs. Every step needs to be planned, and every question should be answered. If you’re optimizing right, the answers should come to mind pretty quickly.

 

I’ve talked a lot about how important it is to try to think like your customers. It’s always important to find out what people are thinking, what questions they are asking, and so on, but I haven’t offered any specific ways to accomplish this. Today I have one method for finding out what questions people are asking about the topics important to you.

Justin Arnold from The Mightier Pen suggests using Twitter because it offers real-time feedback on what people are talking and thinking about relative to your keywords.

Of course, this is pretty common knowledge, but what people don’t realize is Twitter has some key features built into its search engine that really benefit the person looking for questions people are asking.

Finding out what questions people are asking is as simple as adding a space and a question mark after a query. Suppose you are writing about painting. You can search ‘painting’, but you will probably get a lot of extraneous posts that aren’t of interest to you. If you search ‘painting ?’ however, Twitter filters your results to only include tweets with questions.

Now, the problem we face is that Twitter is used pretty heavily for promotion. Don’t you wish you could filter out any tweet containing links to avoid all of the ads? Well, you can. Just add ‘-filter:links’ to your searches to do away with all of the promotions. What you have now is a list of questions users are asking about a topic in real time.
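If you want to script this, here is a minimal Python sketch that assembles such a query and turns it into a Twitter search URL. The topic and the twitter.com/search URL pattern are illustrative assumptions for this example, not part of any official API:

    # Minimal sketch: build a Twitter search query that returns only question
    # tweets about a topic and filters out tweets containing links.
    from urllib.parse import quote

    topic = "painting"                   # placeholder topic
    query = topic + " ? -filter:links"   # question mark plus the link filter
    print("https://twitter.com/search?q=" + quote(query))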

This is just one way to try to get into the minds of your audience. Trying to gain some perspective is always important when creating content.

 

The overlap between SEO and content strategy often ends up turning content creation into a marketing ploy, and little more. The blogs cite industry folks and data, and offer tips that are either glaringly obvious or recycled to the point of redundancy.

Guillaume Bouchard from Search Engine Watch has another idea for content creation. Think about what people want, not what “works” within the market. What works changes as fast as the industry can, while what people want stays relatively consistent. Long term success comes from reading what your visitors want.

For SEO professionals, the 70/20/10 model offers a simple framework for content creation.

The 70/20/10 model goes like this:

  • 70 percent of content should be low-risk
  • 20 percent should try to improve on what already works
  • 10 percent should be high-risk experimentation

The model comes from Coca-Cola and transfers to SEO pretty easily: link baiting covers the low-risk 70 percent, optimizing and capitalizing on newer trends in the market covers the 20 percent spent improving on what already works, and that leaves 10 percent for experimentation.

70 Percent: The Link Bait – Link baiting certainly has its pros and cons, but for this model just think of it as content made with a purpose. It informs audiences, communicates complicated ideas, and establishes your reputation as an expert, which helps build your brand in the industry. It acts as the mainstay of your content: always available, but it can’t be all you have.

20 Percent: Optimize and Sharpen – For optimizing, look at which content is doing the best and what people are saying about it. Try to improve upon what is working and reinvigorate old debates with new information. Stay aware of trends and ideas in your industry, and react to them with content. This type of content creation keeps you tuned to the changes in your industry and keeps you relevant, which will always come through to your audience.

10 Percent: Proactive and Reactive Experimentation – Time to have some fun. Experimentation requires really understanding your audience, and being confident enough to have an opinion. Think about fashion trendsetters. They see what is popular now, and act on their impulses in response. Content creation experimentation is all about seeing what is popular in the field, and making new content that people have never seen before.

This model isn’t something to treat as set in stone, but it will help keep you relevant and interesting. Those are two things audiences always want.

 

It’s hard to keep up with Google’s constant adjustments, and AuthorRank is a coming feature that isn’t as well understood as it probably should be. Its history dates back to August of 2005, when Google filed a patent for “Agent Rank”.

This patent described ranking “agents” and using the public reception of the content they create to determine their rank. Basically, agents whose content was popular and drew positive responses would rank higher than less authoritative agents.

After the patent, “Agent Rank” disappeared for a while, until in 2011 Eric Schmidt made references to identifying agents in order to improve search quality. A month later, Google filed a patent for what is assumed to have become Google+, which acts as a digital signature system for identification that can be tied to content. And that content can be ranked. Hello, AuthorRank.

It has yet to be officially implemented, but there have been rumors all year that AuthorRank is under development, and AJ Kohn has said it could completely change the search engine game. It would act as a factor in PageRank, helping high-quality content rank higher.

Mike Arnesen at SEOmoz says it’s not a matter of “if Google rolls out AuthorRank, but when.” He also has some great suggestions of how to be prepared for when AuthorRank arrives. I highly suggest reading his extensive article, because I agree strongly with the idea AuthorRank will be here sooner rather than later.

With Google’s recent focus on social media, and the natural concept that people want to see quality content in their results, it is just a matter of time before AuthorRank is a serious concern to the SEO industry.