
SEO experts are always happy to tell you how to improve your website, and maybe get some more conversions while you’re at it, but you don’t tend to hear much about what people are doing wrong. Maybe the SEO community is more positive than I’ve ever noticed, but we tend to prefer telling you what you can do better to telling you how you’re messing up.

Well today we’re going to change that, with some help from Inessa Bokhan. Sometimes it is just easier to tell people what not to do, and quickly put an end to these bad practices. She chose 17 of the most common mistakes website owners have been making for years, and I’m highlighting the worst offenders here.

One of the worst crimes you can commit as a site manager or content creator is ignoring your readers. It is so common for a blog post to go up and the author to simply vanish afterward, having moved on to new ground, even when readers are asking questions in the comments. Why would you just leave them hanging?

Creating content isn’t the whole process. We create content because Google likes it, yes, but you should also just be trying to attract real people with interesting information and a great site. Once you have those people on your site, you should be trying to keep them around as much as possible, and the best way to do that is simply interacting with them. Answer their questions, cement your reputation, and help foster a dialogue.

Another “sin” which personally drives me crazy is requiring registration when it isn’t necessary. There are so many times I’ve been asked to register just to read a random article, look at a picture, or leave a comment. The ability to register through Facebook or Twitter eases this problem, as it doesn’t feel like such an invasion of privacy, but why would any website owner expect me to hand over my private information just to see their content?

Some website owners just can’t turn off their “sell” switch, and “hide” advertising throughout their content. This can come in many forms, such as misleading links that make you think you are on your way to a nice, concise article, only for you to end up being offered a webinar, e-book, or even paid consulting.

As Bokhan points out, misleading links won’t even help if you have a pay-per-click campaign; your audience will just leave. There are also those who simply break up their content with ads for those types of resources. This is a better solution than misdirection, but it is a personal annoyance of mine to have my train of thought derailed by irrelevant paragraphs, formatted just like the rest of the content, that are suddenly selling me a product.

These all lead me to the biggest mistake any website can make: lying to your customers. On the web, your customers make you or break you. Google is refined enough now to identify when you are lying to your customers, and your customers will catch on too. The best case scenario is that customers see through your lies immediately, and you go nowhere. The worst case is that you temporarily fool them, are found out, and your reputation is destroyed across social media and forums.

Every business should be putting their customers above all else, and this is especially true on the internet where one bad customer interaction can lead to a fiasco.

Here’s a theoretical scenario: You’ve been hit with a manual penalty from Google. You put in all the time and effort it takes to complete a link audit, remove all the bad links you’ve accumulated, and make sure your link profile doesn’t look questionable in Google’s eyes. You resubmit, but even after weeks your website is still flat-lining. What the heck?

As it turns out, that link audit and resubmission process was only half the battle. Google does use over 200 different signals to determine ranking, but links are still the heavyweights in the arena. Now think back to all those unnatural links you just removed. Often, those “bad” links were some of the most powerful in your profile, and you don’t have anything healthy replacing them.

I have some bad news. If you got hit with a manual penalty, you most likely used questionable or downright spammy methods to climb the rankings before, and that doesn’t cut it anymore. There is a way to recover, but it basically requires restarting your SEO process to get your site back in the rankings, and this time you can’t take shortcuts.
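One mechanical piece of that cleanup is worth showing. Bad links that can’t be removed by hand typically end up in a disavow file for Google’s Link Disavow Tool (which comes up again later in this piece). Here is a minimal sketch of generating one; the domains and URL are hypothetical stand-ins for whatever a real audit surfaces:

```python
# Sketch: write a disavow.txt in the plain-text format Google's
# Link Disavow Tool accepts ("#" comments, "domain:" prefixes, bare URLs).
# The entries below are hypothetical; a real link audit produces them.
bad_domains = ["spammy-directory.example", "paid-links.example"]
bad_urls = ["http://blog.example/comment-spam-page.html"]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# Links we could not get removed manually\n")
    for domain in bad_domains:
        f.write(f"domain:{domain}\n")  # disavows every link from this domain
    for url in bad_urls:
        f.write(f"{url}\n")  # disavows one specific URL
```

The file gets uploaded through Webmaster Tools, and disavowing is a last resort after genuine removal requests, not a substitute for them.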

Search Engine Watch suggests a four-step process to get your site ranking again, but if you loved the spammy old ways of the web, these steps may seem counter-intuitive or just boring and difficult. Unfortunately, if you feel that way, there aren’t many other options, and there will be fewer the more refined Google gets. Chuck Price put it best when he said, “adhering to the webmaster guidelines is no longer a ‘suggested’ course of action, it is required.”

The four-step process will help you clean house on all the remnants of less savory SEO methods, and make your site look as clean and reputable as it should. Don’t try to toe the line again or take advantage of any loophole you find. You only really get one chance to come back after a manual penalty. If you get hurt again, it will be nearly impossible to fix everything.

There is a misconception among a small few that Google only wants the absolute best websites, and that it won’t index sites it thinks aren’t worth its time or space in its index. In reality, this is far from the truth.

Google is always indexing content and they index pretty much anything they can find. Supposedly, the only thing they don’t index is spam.

SEO Roundtable pointed out that Google’s John Mueller commented in a Google Webmaster Help thread recently saying “unless the content is primarily spam (eg spun / rewritten / scraped content), we’d try to at least have it indexed.”

He was responding to a question about a site not being fully indexed over a prolonged period of time, which he believes is the result of a bug, though he won’t have any definite answers until it has been shown to the indexing team.

Before anyone gets up in arms, that statement is a little misleading where spam is concerned. Everyone knows Google still indexes its fair share of spam, and in some cases those pages even get ranked. Mueller’s comments instead show how Google tries to avoid adding spam to its index, but it is obvious that it doesn’t succeed in keeping all of the junk out.

Getting indexed isn’t the same as ranking, but to have any chance of being ranked you have to be indexed.

Many website owners and SEOs have seen it happen. Your website is gaining traction, and Google is responding to your content with decent initial rankings. Everything seems fine, then your rankings gradually start plummeting with no explanation.

You could spend time every day checking your rankings, watching for this to happen, but that is a waste of time, as Search Engine Journal explains. Checking rankings isn’t an income-generating activity, and your time is simply better spent elsewhere, like creating content or networking.

So then what is there to do about this mystery fall in the rankings? First, we have to understand what is happening, which Matt Cutts so helpfully explains in one of his latest YouTube videos.

Cutts uses an analogy of an earthquake to get to the heart of what is occurring. When an earthquake hits, the news about it is pretty broad. We know where it happened, but not many more details. Similarly, when content is posted, Google’s initial read of it is pretty wide. It is a best guess about where your content should rank.

As time goes by after an earthquake, we learn more and more. You find out how much damage was caused, how many people died, how many aftershocks there were, and much more. As Google learns more about your content, it adjusts rankings. It contextualizes your content within the broader scope and repositions as needed.

So what can be done if you see your site drop in the rankings like this? Change up your practices. Most likely, your content appears to be quality at first, but Google is gradually peeling back the facade and seeing what your website really is, and it doesn’t like what it finds.

Years ago, all a local business had to do was build a lot of links and their business would show up on the first results page. SERPs have gotten much more competitive in that time, and Google has introduced a strict local algorithm, so now local SEO has become a unique sector that is often more difficult to implement than almost any other online marketing strategy.

You can always hire a company to take care of all of your SEO needs, but if you have a tight budget and are willing to get your hands dirty, there are steps you can take to try to get onto the coveted first page of local results, called the 7 Pack. You’ll recognize the 7 Pack as the listing of businesses directly under the map of the area. Search Engine Journal set out a five step plan to improve your local business rankings.

The first step is checking to see if your target keywords actually trigger the local algorithm. Usually, including key phrase combinations such as the city plus your most important keyword will connect with the local search results, but sometimes this doesn’t work. If that is the case, your SEO strategy should be less localized, as Google doesn’t register your service as part of its local algorithm.
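Enumerating those test searches by hand gets tedious, so here is a minimal sketch that builds every city-plus-keyword combination to check; the keywords and cities are placeholders for your own terms:

```python
# Sketch: generate the city + keyword combinations to search manually,
# noting for each one whether the 7 Pack shows up in the results.
keywords = ["plumber", "emergency plumbing", "water heater repair"]
cities = ["Tulsa", "Tulsa OK", "Broken Arrow"]

queries = [f"{keyword} {city}" for keyword in keywords for city in cities]
for query in queries:
    print(query)  # run each search and record whether the local pack appears
```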

One of the biggest tricks for local businesses is knowing where to establish your company online. Google’s algorithm always gives preference to businesses located within the city limits being searched, often called the “centroid” bias. This means Google will rank businesses located closer to the heart of the city higher than those on the outskirts if all other factors are equal.

For businesses located in suburbs or just outside of city limits, this poses a big question. Most want to capitalize on the bigger market of the closest city rather than the small market in their own town, but trying to rank in a metropolitan area when you aren’t physically established within that boundary is an incredibly difficult task. You have to decide whether you want to fight to get into the rankings for the city, possibly achieving only the second or third page of the local listings, or aim to corner the market in your town and rank first every time for a smaller audience.

Deciding that move usually requires determining how competitive your niche is, and even businesses already well situated in a metropolitan market will be rewarded for investigating. The quickest way to find out how competitive your market is starts with taking the #1 ranking in the 7 Pack and searching for all of its information exactly as it is displayed, wrapped in quotation marks. The result count gives you a rough estimate of how many directories and citations you will need to outrank the top listing in the 7 Pack. You can do the same for the lowest ranking. Your results obviously have to outdo the lowest-ranked business in the 7 Pack to overtake it, so exploring both ends will give you an idea of just how tall the SEO mountain you have to climb is.
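As a minimal sketch of that exact-match search, here is a helper that wraps each piece of a listing in quotation marks; the business details are hypothetical, and in practice you would copy them exactly as the 7 Pack displays them:

```python
# Sketch: build the quoted, exact-match query described above.
def exact_match_query(name: str, address: str, phone: str) -> str:
    """Wrap each piece of the listing in quotes for an exact-match search."""
    return " ".join(f'"{part}"' for part in (name, address, phone))

# Hypothetical listing details; use the #1 (and lowest) 7 Pack entries.
print(exact_match_query("Acme Plumbing", "123 Main St, Tulsa, OK", "(918) 555-0100"))
# -> "Acme Plumbing" "123 Main St, Tulsa, OK" "(918) 555-0100"
# The result count for this query approximates that listing's citations.
```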

Once you’ve done your research, you can actually begin working on your local SEO, and the process will be much easier thanks to informed decisions only possible through understanding your local online market. Search Engine Journal’s last two steps can get you going on improving your local site’s ranking, but nothing happens overnight. Local SEO is competitive and time-consuming, but without it you are falling behind the times.

Source: Hannes Grobe

Compared to Panda’s regular changes, Google’s Penguin algorithm has been relatively static. Since its first introduction in April of last year, Penguin has only been refreshed twice, but there is an update coming soon and it appears this “next generation” of Penguin will have a major impact.

The first big Penguin update took the SEO world by surprise. It was originally referred to as the Web Spam Algorithm Update, and impacted over 3 percent of English searches. This change has the possibility of affecting just as many pages.

When the original update came out by surprise, many SEO experts and website owners claimed they were penalized unfairly, though the more digging the community did, the more it appeared that many of those sites had been using questionable tactics. A very small number of site owners were likely hurt unjustly, but the majority were simply erring on the wrong side of the line.

The possibility of having your website penalized has many in the community concerned about the new Penguin update, and while we don’t know too much about what the update holds for us, plenty of SEO writers are making predictions and suggestions to try to keep innocent site owners safe.

Search Engine Watch analyzed the types of data Google has been gathering and what the company has learned from the past year of spam filtering and through tools like the Link Disavow Tool.

Google’s new update is likely to be a more efficient, intelligent, and thorough algorithm to fight spam. As always, the best way to be sure you will be safe when Penguin rolls around is to be following Google’s Webmaster Guidelines and best practices for SEO. If you think you may be in the gray area, you can use Search Engine Watch’s analysis to see how to judge your site before Google judges it for you.

Source: Search Engine Watch

While there are always new, complicated, and exciting things happening in SEO to talk about, it is always good to get back to the basics occasionally and discuss what makes a great foundation for the fancier aspects of SEO.

Carolyn Shelby, Director of SEO for the Chicago Tribune and 435 Digital, emphasized the need to not neglect the basics of SEO at the Introduction to SEO session at SES New York. She told the crowd, “skipping the basics and spending all your time and money on social and ‘fancy stuff’ is the same as skipping brushing your teeth and showering, but buying white strips and wearing expensive cologne.”

That session was aimed at newcomers, but her words are just as relevant to seasoned SEO experts. Just as your morning routine should always include brushing your teeth, your SEO strategy should always pay proper attention to the basics.

Getting back to the basics starts at the very top, with establishing exactly what SEO is. SEO aims to do two things: create an enjoyable user experience, and communicate with search engines so that they will view your site as valuable to users. Many forget those ideas in favor of trying to cheat the search engines, but those actions are to SEO what stealing is to shopping. You may get the end product you wanted, but if you use questionable or illegal means, you will just as likely be penalized.

Once you understand what you should be aiming to achieve through SEO, you have to understand what the search engines are actually looking for and not looking for.

Search engines judge websites based on a variety of criteria, but the most reliable factors all center on user experience, reputation, and the content you give to users. Relevant content and a seamless user experience establish your website’s value, even if you’re just starting out. If you make sure those two elements are consistently worth the time of visitors, you will gradually build a reputation through authorship, and you will see your site getting closer to the top of the rankings.

If you please users with your website and content, generally you will also please the search engines’ most basic wants. However, if you focus on the broad idea of what search engines are looking for and try to cheat your way to the top, you will instead be surprised to see the search engines penalizing you. Keyword stuffing and purchased links may have worked in the past, but Google knows how to spot them, and you will be cut from the SERPs before you know it.

Those guiding principles will get you a long way in SEO, but there is always more to do. I only covered three of the eight topics Search Engine Watch discusses in their article about the basics of SEO, and their comment section shows there is even more that could be included in just the most basic elements of SEO. Start with making your website worthwhile to visitors, then expand your SEO repertoire, and you will see positive results.

Mobile optimization has fallen out of popularity a little bit as the new responsive design trend makes a secondary mobile website unnecessary. Of course, there are many businesses that have opted to have a specific mobile website, but there is no denying that responsive design is gradually merging mobile and desktop optimization.

What responsive design doesn’t negate is the possible need for an app. There are over 600,000 apps in the Apple App Store alone, and more businesses are deciding to create an app for their products every day.

What many don’t realize is that apps require optimization just like websites. With the huge number of apps out there, you can’t simply get your app approved and expect to see a huge number of people downloading it.

Over the past few weeks, there has been a discussion about ASO (App Store Optimization) stemming from a Techcrunch article claiming ASO is the new SEO. We use apps more every day, relying on them for weather, news, entertainment, shopping, and organization, but I was initially skeptical as to whether ASO would ever achieve some sort of dominance.

Then I started considering my tablet usage throughout each day. I check a number of news sources including CNN and Vice, skim through the more lighthearted Buzzfeed and Cracked, and often browse Reddit. The only one of those activities I don’t do in an app is reading news from Vice, simply because there isn’t a tablet app to use; I have checked more than once to see if one existed (there is one for the iPhone, however).

The thing is, I use these apps regularly in the mornings and evenings when I’m away from work. For more casual viewers, these apps may not be used enough to justify the space they take up. Most of the apps I acquire either serve a distinct purpose, or allow me quicker access to content I would normally have to open in a web browser. The only apps I download without already being familiar with the company are tools.

None of this is to say apps don’t have their purpose, or that optimization shouldn’t be an important part of creating, managing, and promoting an app. However, there are many markets where apps largely serve to make frequent visitors’ interaction with your content more efficient, and they won’t reach as many uninitiated consumers as apps in other markets would.

If you decide an app is an important product to release to the public, however, ASO is practically required to keep your app from going nowhere. There are simple steps you can take, such as clearly advertising the app on your website and sharing it on social media, but you can also do keyword research and find out what people are searching for.

While ASO certainly has its place, the debate over whether it will be the “new” SEO seems kind of silly to me. We may reach a point where it is important for every company to have an app, though I don’t think we are quite there. Even then, ASO will only be a small portion of what we do. SEO applies to every business online, and I don’t see it going away any time soon.

When you write about SEO regularly, it is easy to get caught up in the things that are changing and shifting, and we often forget about the old standards of SEO and how they might fit into the new climate.

If you take a look, you will see there haven’t been many articles about the importance of quality title tags in the past months or even the past year, even though it is one of the most powerful elements on a page. The title tag alone can tell a search engine your relevance to a topic or search term, distinguish your page to searchers, and even draw in visitors, all in a single line.

Crafting a great title is deceptively difficult. It would seem creating a single line statement of the purpose of your page should be quick and simple, but crafting one that will make your page alluring to both search engines and customers alike is a complicated trick.

First, you need to match the recommended guidelines, and good luck finding a consistent set. I have seen anywhere from 50 to 70 characters suggested as the maximum you should include in a title, but so long as you are around 60 characters there shouldn’t be much of a concern. Going over risks having the terrible ellipsis trailing your truncated title.

Of course, there is no evidence Google doesn’t see all the text in your title, even when it is obscured by the “…”, but why waste the text? Searchers won’t get the entire topic you are addressing, and the extra 15 characters a search engine sees likely won’t help you. Doing something like trying to stuff keywords in after the ellipsis would actually hurt you.
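As a minimal sketch of staying inside that budget, here is a quick check that flags titles at risk of truncation; the 60-character limit is the rough middle of the 50-to-70 range above, and the sample titles are hypothetical:

```python
# Sketch: flag title tags that risk having the ellipsis trail them.
TITLE_BUDGET = 60  # rough middle of the 50-70 character guidelines

def check_title(title: str) -> str:
    if len(title) <= TITLE_BUDGET:
        return f"OK ({len(title)} chars)"
    shown = title[:TITLE_BUDGET - 1]
    return f"Risks truncation at {len(title)} chars; may display as: {shown}…"

print(check_title("Acme Plumbing | 24-Hour Emergency Plumber in Tulsa, OK"))
print(check_title("Acme Plumbing | Emergency Plumbing, Drain Cleaning, and Water Heater Repair in Tulsa, OK"))
```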

Once you’ve met the common guidelines, you run into a problem. Everyone wants a simple formula that will work every time, and one simply doesn’t exist. Every website is different, and making a title tag that is right for your brand depends on your message and what you want to emphasize.

An amazing amount of information can be encoded in 60 characters. You can tell searchers the product or brand name, descriptors, price, and many other aspects of your page in one sentence with very careful word choice. For products, you want to fit in as many hard facts as you can in that small space. Search Engine Journal suggests product name, number, size, color, and unique features could all be included in the title, while with blog posts you want to clearly tell searchers what question or topic you will be addressing.
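Here is a minimal sketch of assembling those product facts in priority order, dropping trailing pieces once the character budget from the earlier check would be blown; the parts list is hypothetical:

```python
# Sketch: build a product title from hard facts, most important first,
# stopping before the ~60-character budget is exceeded.
def build_title(parts: list[str], budget: int = 60, sep: str = ", ") -> str:
    title = ""
    for part in parts:
        candidate = part if not title else title + sep + part
        if len(candidate) > budget:
            break  # the remaining, lower-priority facts don't fit
        title = candidate
    return title

print(build_title(["Acme ProGrip Gloves", "Model PG-200", "Large", "Black", "Cut-Resistant"]))
# -> Acme ProGrip Gloves, Model PG-200, Large, Black  (47 chars)
```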

Just because there isn’t a magic formula for titles, doesn’t mean you shouldn’t be concerned with them. A weak title tag will get your pages ignored by everyone that sees your listing, while a quality one will stop casual browsers and show them exactly what they were looking for. Stand out and make your titles fantastic.

Despite everything that has changed in SEO over the years, keywords have always maintained their importance. A good SEO campaign can only be built on a foundation of the right keywords. No matter how great the rest of your strategy is, it will be weakened by the wrong keywords, because they simply don’t have the potential for return that others do.

Selecting the right keywords can be an arduous task though. You have to gather data, and then analyze the massive amounts of information so that projections on returns can be made. Gathering all that data isn’t quick, and that means it is expensive.

Startups with limited budgets or without access to paid SEO tools just don’t have the resources to do the type of expansive data gathering that quality keyword selection requires. Or so it used to be. There are several free tools out there which can often do huge amounts of keyword research for you.

A single one of these tools may not be able to do the heavy lifting that the expensive, top-of-the-line programs offer, but by implementing a few of the free tools into your workflow you can cover almost all of the ground one expensive program would.
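As a minimal sketch of that workflow, here is one way to merge keyword exports from several free tools into a single prioritized list; the file names and the keyword/volume column layout are hypothetical, so adjust them to whatever each tool actually exports:

```python
# Sketch: combine keyword CSV exports from a few free tools, keeping the
# highest volume estimate seen for each keyword.
import csv
from collections import defaultdict

volumes: dict[str, int] = defaultdict(int)
for path in ["tool_a_export.csv", "tool_b_export.csv", "tool_c_export.csv"]:
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            keyword = row["keyword"].strip().lower()
            volumes[keyword] = max(volumes[keyword], int(row["volume"]))

# Highest-volume candidates first; projection work starts from this list.
for keyword, volume in sorted(volumes.items(), key=lambda kv: kv[1], reverse=True)[:20]:
    print(f"{volume:>8}  {keyword}")
```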

It will always take time and effort to analyze data for projections on return for specific keywords, but it is worth the effort. With just the three free tools Marc Purtell suggests over at Search Engine Journal, you will find you can more efficiently make informed decisions about your keyword selection, and soon you’ll be on your way to a better SEO campaign.