Things go in and out of favor in SEO all the time, but some things never seem to change. While content creation is the new buzzword and link building is enduring a big facelift, on-page optimization stands tall as a piece of tradition that has survived largely intact for years. Of course, there have been some tweaks, but on-page optimization remains one of the easiest methods of site optimization to implement and understand.

On-page optimization is also special within SEO in that we know with fair certainty the best ways to handle this type of optimization. Everything else may be up for some level of debate, but on-page SEO can be distilled down to a simple checklist, such as the one CanuckSEO recently shared. This checklist works with any CMS or page editing method you want to use; all you have to do is make sure you follow through with every step, in order. (A rough audit sketch follows the list below.)

  1. Page Title – Each and every page needs a unique and informative page title, and every title should be less than 70 characters. Okay, the 70-character limit is a guesstimate that many will question, but the point is you need to distill every page into a short and sweet title. Using generic or repetitive titles won’t get anyone, including search engines, interested in what you’re offering.
  2. Keyword Importance – When adding tags or keywords, try to keep them arranged left to right in descending importance. Search engines read tags and keywords as a hierarchy, so it is best to show them the most important ones first.
  3. META Descriptions – Each page also needs its own description. Search engines look at these descriptions to understand what your page is essentially about, and searchers are generally hesitant to click on sites that show no description in the search results. Let people know about the content you want them to see.
  4. H1 Headline – Every page needs a headline, just like newspaper articles and your old school papers. Each page only needs one, but without a headline the page is just woefully incomplete.
  5. Body Text – Write your body text as you think others would like to read it, but also be sure to include your keywords. Don’t go crazy and force the keywords in the text in ways that don’t make sense, but make sure you’re keeping your message targeted and using the words you want people to search for. Moderation is definitely key here.
  6. Image SEO – This is where most people get lazy. Images without names, or with strings of letters and numbers for names, don’t carry the weight that images with full titles, alt text, and descriptions do. Keeping content organized and fully tagged is better for search engines, so don’t skimp.
  7. On Page Internal Links – Links are scrutinized by Google just as much as every other page element, if not more, so keep your on-page links fully readable and relevant. Trying to shoehorn links in just to squeeze them onto the page will only lower your overall relevancy, so keep links well focused.
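To make the checklist concrete, here is a minimal audit sketch in Python, assuming the requests and beautifulsoup4 packages are installed; the URL is a hypothetical placeholder, the thresholds mirror the checklist above, and the script is a rough starting point rather than a definitive tool.

```python
# Rough on-page audit sketch (assumes: pip install requests beautifulsoup4).
# The URL below is a hypothetical placeholder; thresholds mirror the checklist above.
import requests
from bs4 import BeautifulSoup

def audit_page(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # 1. Page title: present, unique to the page, and reasonably short.
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    print(f"Title ({len(title)} chars): {title or 'MISSING'}")
    if len(title) > 70:
        print("  -> consider trimming the title to under ~70 characters")

    # 3. META description: present and written for searchers.
    desc_tag = soup.find("meta", attrs={"name": "description"})
    desc = (desc_tag.get("content") or "").strip() if desc_tag else ""
    print(f"Meta description ({len(desc)} chars): {desc or 'MISSING'}")

    # 4. H1 headline: ideally exactly one per page.
    print(f"H1 count: {len(soup.find_all('h1'))} (ideally exactly one)")

    # 6. Image SEO: flag images missing alt text.
    missing_alt = [img.get("src", "?") for img in soup.find_all("img") if not img.get("alt")]
    print(f"Images missing alt text: {len(missing_alt)}")

audit_page("http://www.example.com/")  # hypothetical URL
```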

Most importantly, the key to good on-page SEO is consistency and organization without resorting to using the same titles or headers for every page. Search engines can tell when SEOs slack on their on-page work, so don’t let yourself fall into that trap. Keeping your pages, images, and links well organized will benefit your work as much as it will help how the search engines see you.

There are more than a few lists of the most important rules to follow in SEO, and to their credit, they all largely say the same things. This is good for site owners and SEOs getting started, but what do you do when you’ve checked off every one of those standard entries? Is your site perfect? Is there nothing left to improve? Of course not.

Your site’s SEO can always be improved upon, and some things left off the more popular lists can still hurt you terribly. Bill Slawski created his own list of SEO rules that features suggestions you might not have seen before if you stick to just the biggest websites available.

Slawski’s suggestions address slightly more technical issues than most lists will give you, and many of them seem trivial until you understand how picky Google’s crawlers and indexers are. For example, site architecture doesn’t seem that important so long as it is organized in some way, but in reality there are very specific ways you should have your site set up. Having more than one web address that search engine crawlers can use to reach your site, for instance, can end up frustrating Google’s bots, and you may even end up with a message in Google Webmaster Tools telling you to cut it out.
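As a quick illustration of the single-address point, here is a small Python sketch, assuming the requests package and a hypothetical site reachable under several hostnames, that checks whether the alternate addresses 301-redirect to one canonical address; it is a rough diagnostic under those assumptions, not a definitive view of how Google sees the site.

```python
# Check whether alternate web addresses collapse to a single canonical address.
# Sketch only; the hostnames below are hypothetical placeholders.
import requests

CANONICAL = "https://www.example.com/"
ALTERNATES = [
    "http://example.com/",
    "http://www.example.com/",
    "https://example.com/",
]

for url in ALTERNATES:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    redirected = bool(resp.history)  # non-empty if at least one redirect happened
    status = "OK" if resp.url == CANONICAL else "NOT canonical"
    print(f"{url} -> {resp.url} ({'redirected' if redirected else 'no redirect'}, {status})")
```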

Another common site architecture mistake for commerce sites is creating different product pages for all manner of tiny variances. Some will create individual pages for different sizes and different colors, which only creates a mess for your visitors and Google’s crawlers alike. Keeping your site’s architecture as streamlined and efficient as your needs allow is always important, and unnecessary bulk doesn’t attract the search engines.

On-site SEO is also a widespread problem for many site owners, and that is never more obvious than when you see pages that don’t have unique titles. Titles are supposed to describe a page and explain what is featured on it. Consider it like the title of a book. Would you look for a book with no title? Would libraries be able to organize those books? In this case, searchers are wary of any site that doesn’t make every effort to tell them what it offers before they click through to the page, and search engines are the librarians unable to sort your mess without titles on your pages. Don’t upset the librarian.

Of course, even for Slawski, one of the most common problems is simply that people create sites that are too slow for current standards. It may look nice, but visitors are impatient and won’t hesitate to hit the back button if your page isn’t loading quickly. This is even more true for mobile users who are on the go and don’t want to wait for their content. The slower your page loads, the more prospective visitors you’ve lost.
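Page speed is easy to spot-check before reaching for a full profiling tool. The sketch below, assuming the requests package and a hypothetical URL, simply times how long the server takes to answer; it says nothing about rendering or mobile networks, so treat it as a first look rather than a real performance audit.

```python
# Quick spot-check of server response time; not a substitute for full page-load profiling.
import requests

url = "http://www.example.com/"  # hypothetical URL
resp = requests.get(url, timeout=10)

# .elapsed measures the time from sending the request to receiving the response headers.
print(f"{url}: {resp.elapsed.total_seconds():.2f}s to first response, "
      f"{len(resp.content) / 1024:.0f} KB of HTML")
```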

Those were some of Bill Slawski’s most important rules for SEO. What are the rules you always keep in mind while working on a site?

The Short Cutts

Are you familiar with Matt Cutts, the head of the Google Webspam team, and his YouTube videos? I share them here frequently, but even the ones I write about are just a selection of some of his best. Since he started making the short, informative videos in 2009, Cutts has made over five hundred of them.

Five hundred videos are a lot to sort through, and YouTube isn’t the best at helping you navigate large collections, so Cutts’ videos were starting to get a bit jumbled. That’s why the online marketing company Click Consult created The Short Cutts, a site that organizes all of the Cutts videos into an easily usable resource for any SEO questions you may have.

For anyone not already aware of Cutts’ YouTube posts, they all follow the same pattern. A Google user asks a question about a topic, and Cutts answers it as well as he can within a short two or maybe three minutes. Some question the usefulness of the videos because Cutts often can’t go into depth within the short time limit, but I think anyone can understand how important it is to hear information and answers to common SEO questions straight from his mouth, even if it is a little vague.

Possibly the best part of The Short Cutts is the way it displays each video above two blocks of text that may help give you a quick answer. The first block is the question Cutts is asked, and the second gives a quick “yes or no” style answer that can resolve many of the simpler issues at a glance.

While we all like to believe our blogs carry weight and share important information with the masses of internet users out there, the truth is the majority of blogs are white noise in a field so congested that few actually rise above the static and build a reputation and brand image for themselves.

So how do those select few succeed while the others flounder? The top blogs and content-based websites all do two things that the majority of other content creators don’t do. They produce great content, and they market their content to reach out to the public.

That seems like such an easy plan. While the first part is a combination of talent and dedication, the marketing side is entirely teachable. The problem is, most don’t actually know what great content looks like, at least when it comes time to gauge their own work.

The foundation of great content is almost always writing ability. You may not be the best writer at the start, but over time you can refine your voice and motivation for writing, and before long, you will be much better. But being able to write well doesn’t mean you’re automatically creating great content. Data is what raises competent writing to the level of great content.

Bloggers can write formally, but the blogging medium is largely used for subjective sharing. People don’t look for boring press releases when they search blogs. They are looking for one person to share their experience and information on a topic in a way that hopefully cuts past the normal politics that make up other advertising formats. The problem is, subjective information isn’t very useful unless you back it up with real quantitative information. It just isn’t very believable without stats and data to prove your point.

Just throwing objective data into a blog post won’t make your mediocre content great, however. You have to know how to use the data within your post and build your argument around it. Chris Warden gives some examples of blogs that do just that, as well as explaining more about how you can improve your content with objective data, all at Search Engine Journal.

These days, everyone has an app. Apple has over 800,000 apps in their store, and Android is close behind. Search for anything you need an app for, and there is little chance you won’t find an option delivering the solution, quite possibly even for free.

With that many apps out there, making one of your own carries more than a few risks. How do you attract users? How do you find a market not already covered? How do you improve on the already available options? You’re trying to get people to flock to your application when, according to Noupe, over 60 percent of apps in Apple’s store have never been downloaded a single time.

The truth is, getting your app in front of others’ eyes requires creating a quality product, then optimizing the heck out of it. App stores work just like search engines, and there is plenty of App Store Optimization to be done.

However, just as with SEO, simply optimizing a bad product isn’t going to get you far. There are numerous concerns you must address before you even get to the optimization stage if you want your app to stand a chance. New Relic, an analytics service, recently released a new product specifically for apps, and they accompanied the release with an infographic any app designer would be smart to keep around for their next project.


Have you received an unnatural link penalty from Google? Are you worried about getting one? Or maybe you are just curious what constitutes an unnatural link. The answers you find are often woefully incomplete, or contradict other reputable sources.

It can sometimes feel like every major SEO news source has its own exact definition of unnatural links, and sometimes they aren’t even internally consistent. The problem only gets worse as these varying definitions are then interpreted by other writers trying to offer tips on how to recover from the penalties many have received.

If we can’t agree on a single definition of unnatural links, how are we supposed to agree on a united way to deal with the penalties? All the confusion does is lead many site owners trying to get their sites back on track down yet another wrong path.

Well, Search Engine Journal’s Pratik Dholakiya undertook the mammoth task of condensing all the information anyone could ever need to know about unnatural links, and the penalties that come from them, into one informative article. From the basics of how unnatural link penalties became a huge problem for the SEO community and a single definition for unnatural links, all the way to the tips many professionals haven’t been sharing, everything you need is there.

Another day, another Matt Cutts Google Webmaster Help video to talk about. This recent one deals with how SEO professionals pay close attention to any new Google patent that is remotely related to search or search quality, and then speculate until some come to believe very incorrect ideas about how Google operates.

Cutts was asked what SEO misconception he would most “like to put to rest,” and you could almost see the relief in his eyes as he began explaining that patents aren’t necessarily put into practice.

“Just because a patent issues that has somebody’s name on it or someone who works at search quality or someone who works at Google, that doesn’t necessarily mean we are using that patent at that moment,” Cutts explained. “Sometimes you will see speculation Google had a patent where they mentioned using the length of time that a domain was registered. That doesn’t mean that we are necessarily doing that, it just means that mechanism is patented.”

Basically, SEO professionals, especially bloggers and writers, have a habit of speculating based on patents they see have been filed. That speculation can grow into tips and suggestions about how to run your website, all stemming from a patent that isn’t even in use, and it adds up to some widespread misinformation.

For example, consider the speculation that comes every time Apple files patents for future phones. While they’ve recently had trouble with physical prototypes leaking in various ways, in the past Apple kept their secrets well guarded, and the speculation based on their patents was often outlandish, and at best completely wrong.

That doesn’t mean you can’t learn and make predictions based on patents, especially if you see indicators that one has been implemented, but it is important to take every patent with a grain of salt. While Google has patented these mechanisms, unless you see evidence they are in use, they probably aren’t worth getting worked up over.

While quality SEO is a complex, time-consuming job, there are many types of SEO that any site owner can do. There are also a lot of basic mistakes that site owners regularly make while trying to optimize their own page.

To help prevent these easily corrected mistakes, Matt Cutts, the head of Google’s Webspam team, devoted one of his recent YouTube videos (which you can watch below) to identifying the five most basic SEO mistakes anyone can make.

1) Not Making Your Site Crawlable – According to Cutts, the most common mistake “by volume” is simply not making your site crawlable by Google, or not even having a domain to begin with.

The way Google learns about sites is through web “crawlers” that index pages by following links. If you don’t provide links allowing Google’s bots to find your site, it won’t know what is there. If you can’t reach content by clicking normal links on the page in a text browser, it might as well not exist to Google.
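A first sanity check on crawlability is confirming that your robots.txt isn’t blocking the pages you care about. Here is a minimal sketch using Python’s standard-library robots.txt parser, with a hypothetical site and paths; it only covers the robots.txt side of the question, not whether crawlers can actually reach your pages through links.

```python
# Minimal robots.txt sanity check using only the standard library.
# The site and paths below are hypothetical placeholders.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()

for path in ["/", "/menu.html", "/private/admin"]:
    allowed = rp.can_fetch("Googlebot", "http://www.example.com" + path)
    print(f"{path}: {'crawlable' if allowed else 'blocked by robots.txt'}")
```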

2) Not Using Words People Are Searching For – Google also tries to connect people with the most relevant information for the exact search they used. If someone searches “how high is Mount Everest,” they will be connected with a site using those exact words on a page before they will be suggested a page using just “Mount Everest elevation.”

My favorite example Cutts uses for this is a restaurant’s website, mainly because it seems many restaurants have very minimal websites that are badly in need of optimization and a bit of a design overhaul. When people look for a restaurant to eat at, they search for a few things, mainly the location, menu, and hours. If the page lists those in plain text, Google will index that information and direct more people to the site than to those with PDF menus or no information at all.

3) Focusing On Link Building – One of the biggest buzzwords in SEO is link building. It is one of the oldest strategies, and Google’s constant algorithm tweaks keep it in the news regularly, but it may actually be dragging you down.

When people think link building, they cut off many other ideas and marketing options that would equally boost their site. Cutts suggests focusing instead on general marketing. If you make your website more well-known and respected within your community, you will attract real people, which brings organic links that are much more respected by the search engines.

4) Bad Titles and Descriptions – Many people neglect their titles and descriptions assuming they will either be automatically filled in, or won’t matter in the long run. If your website says “untitled” in the title bar, it will also say “untitled” in a bookmarks folder as well as actual search results. Now ask yourself, would you click on a website without a title?

Similarly, the descriptions for webpages are often blank or copied and pasted straight from the page with no context. Your description should entice people to click on your page, as well as show that you have the answer to the question they are searching for. If people can build entire followings around 140-character tweets, you should be able to make someone want to click your page with a 160-character description.

5) Not Using Webmaster Resources – This problem can only be born out of ignorance or laziness. There are countless SEO resources available out there, and most of them are free. The best resources anyone can turn to are the Webmaster Tools and Guidelines that Google offers, but you shouldn’t just stick to those either. There are blogs, webinars, videos, and forums all happy to teach you SEO; you just have to use them. If you’re reading this, however, you probably don’t have this problem.

Conclusion

The most common SEO problems, according to Cutts, are also the simplest problems imaginable. There are resources available that will help you fix all your basic SEO problems, and you’ll learn more and get better by finding them and practicing. If you’re still trying to learn how to make your site crawlable, you have a long way to go, but if you just keep working at it, you’ll be an SEO pro eventually.

Help!

Google has been getting some bad press lately surrounding their penalty notices. The notices are notoriously vague, and the issue came to a head after the BBC received an “unnatural link” warning last month due to links pointing to a single page on its site, and Mozilla was notified of a “manual” penalty this week because Google identified a single page of spam on its site.

In both of those cases, the penalties were only applied to the individual pages in question, but that information wasn’t included in the notices, which makes for obvious concern. These cases also pinpoint one of the biggest issues with issuing notices without specifically identifying the problem for the site owners. With millions of pages of content, trying to identify the problem pages is a needle-in-a-haystack situation.

Many have been concerned about the ambiguous notices, and Google has said they will work to improve their transparency, but what do you do if you get a notice that says you have been penalized but doesn’t tell you exactly where the problem is? Matt Cutts, head of Google’s web spam team, says you should start at Google’s webmaster help forum.

If help can’t be found in the webmaster help forums, Cutts says filing a reconsideration request could result in being given more information and possibly advice, though he concedes “we don’t have the resources to have a one-on-one conversation with every single webmaster.”

This is notable, because many believed in the past that filing a reconsideration request after a penalty was a one-time attempt to restore your site’s name. Many speculated that Google would not be keen on reviewing repeated requests, and that you should only file for reconsideration once you are sure you have solved the issues. According to Cutts, this doesn’t seem to be the case.

Telling site owners to turn to a forum or file requests in the hope of being given extra information doesn’t seem like very solid advice for trying to overcome a penalty. Luckily, there are some other ways of investigating what part of your site is causing all the problems. Danny Sullivan works through several of them at Search Engine Land.

When things go wrong with an SEO campaign, it puts everyone involved in a tricky position. The first step is obviously to figure out what happened and who is responsible in order to fix the problem, but pointing out who is responsible for failure can hurt egos and business relationships if not handled right.

The most problematic situation is when a client is at fault, which is indeed possible. “The customer is always right” may be a good philosophy to live by in many cases, but it isn’t actually all that true when it comes to implementation. This is especially true when you are working with someone not all that informed about SEO.

Some SEOs will try to cut out the client, but that hurts the campaign as well. Instead, the best option is making sure to educate clients about the process in order to avoid issues, though that obviously can’t keep all problems from popping up. If one does arise, it is your job to talk the issue through with your client. While it may be their fault for not following through on a responsibility, it is equally likely you are also responsible due to a failure of communication.

Amanda DiSilvestro suggested a few ways clients can end up bringing down an SEO campaign, as well as how search engines and SEOs themselves can derail your progress. The most common issues for clients include:

  • Failing to Change – Many times, SEOs will suggest on-page changes to optimize a website, often meaning tweaking content to include keywords or editing a meta tag. Clients are often very protective of their content, however, and sometimes ignore these suggestions. In this case, the SEO has done their job, but if the client isn’t willing to cooperate, there is little the expert can do.
  • Failing to Plan as a Group – When SEOs aren’t confident in their client’s understanding of optimization, they sometimes begin to ignore the client altogether. But even if a client doesn’t want to be very hands-on with the campaign, they almost certainly had goals in mind when they hired the pros, and those goals should be included in the plan for optimization. If a client tries to avoid being part of the SEO process, including reading the regular reports, there will be a schism between the SEO expert and the company, which will likely splinter the campaign and weaken it.
  • Giving Up Too Early – Too many potential clients come to SEO agencies wanting quick fixes. No matter how earnestly you try to explain that optimization is a slow process, if the client doesn’t comprehend how long it will actually take, they are likely to get frustrated and shut the whole thing down before they really have a chance to reap the rewards. There is little SEOs can do here except communicate clearly about time estimates and the benchmarks you expect to hit, or simply refuse clients who won’t accept that there is no way to reach the number one spot on Google overnight.

Now, we all know clients aren’t always the problem. In fact, it is usually the professional that ends up torpedoing the whole campaign. SEO firms and experts have the power in the campaign, and it is a tough balancing act to get everything on a site working as well as it can to impress the search engines. There are endless reasons a campaign may not work, but unfortunately the most common all stem from just plain bad practices.

  • Going Black Hat – It seems everyone writing about SEO knows how blatantly terrible an idea black hat practices are, yet there is a never-ending supply of “optimization” services that use keyword stuffing, duplicate content, cloaking, shady link building, and several other bad practices that Google already knows to look out for. Sure, these services might get a site good rankings initially, but it won’t be long at all before it sinks under the weight of penalties.
  • Poor Communication – Just as was said above, even when the client is at fault, the SEO is sometimes responsible for not explaining the process or keeping the client in the loop. SEO work is a partnership, no matter how independent you may be. The client relies on you to inform them about this unique field and help them make informed decisions. If you aren’t communicating and they make a mistake, it is your fault. Similarly, if you make a decision without consulting the company you are working with and they don’t like it, you have no excuse.
  • Laziness – When it all comes down to it, a lot of SEO is maintaining and tweaking things to make a site as efficient as possible at signaling to search engines. Experts can get lazy too, but when a site starts underperforming because you haven’t been paying it the attention it deserves, there is no one to blame but yourself. The solution to this one is obvious. Drink a coffee, get up, and do the work clients are expecting of you.

While these categories cover many mistakes made in SEO, there are also innocent problems like misreading a market or simply putting your faith in the wrong type of campaign.

No one likes having the finger pointed at them when things fall apart, but it is important to honestly assess who is responsible for the faults.

A bruised ego may sting for a little while, but if you or the client can put that aside and focus on the good of the site, you can use the understanding gained about what went wrong to repair SEO mistakes and bad habits. With those lessons under your belt, soon your site will be performing as you would like it to.