AdWords scripts offer a great opportunity to personalize your campaigns, but they have their flaws. For instance, you’ll need to write the code yourself, and their output logs are not very user friendly. Over at Search Engine Land, Frederick Vallaeys has some in-depth, expert suggestions to help frustrated script users overcome these flaws and get more out of their campaigns.

Though you probably don’t need to worry about how to make scripts work for the largest of AdWords accounts, his advice on making scripts accessible even if you don’t know how to write code is particularly valuable information. Check it out if you are already using, or are thinking of using, AdWords scripts.

What if I told you there was a simple five-step process you can use to create great quality logos? Seems too good to be true? It kind of is. There are no shortcuts to great logos, because you always have to put in the work at every step, but if your problems stem from not knowing where to start rather than simply skimping on the effort, Martin Christie’s five-step process may be just what you need.

Every designer might have their own workflow, but if you don’t have one in place you are sacrificing efficiency and, most likely, quality. Christie’s process starts where every good design should, with a design brief, and walks you through every step all the way up to the presentation. He simplifies it into the image below, but to get the full idea of the process, you should see it in his own words over at Design Instruct.

Five Step Design Process


Google has been getting some bad press lately surrounding its penalty notices. The notices are notoriously vague, and the issue came to a head after the BBC received an “unnatural link” warning last month over links pointing to a single page on its site, and Mozilla was notified of a “manual” penalty this week because Google identified a single page of spam on its site.

In both of those cases, the penalties were applied only to the individual pages in question, but that information wasn’t included in the notices, which makes for obvious concern. These cases also pinpoint one of the biggest issues with issuing notices that don’t specifically identify the problem for site owners: on a site with millions of pages of content, trying to find the problem pages is a needle-in-a-haystack situation.

Many have been concerned about the ambiguous notices, and Google has said they will work to improve their transparency, but what do you do if you get a notice that says you have been penalized but doesn’t tell you exactly where the problem is? Matt Cutts, head of Google’s web spam team, says you should start at Google’s webmaster help forum.

If help can’t be found in the webmaster help forums, Cutts says filing a reconsideration request could result in being given more information and possibly advice, though he concedes “we don’t have the resources to have a one-on-one conversation with every single webmaster.”

This is notable, because many believed in the past that filing a reconsideration request after a penalty was a one-time attempt to restore your site’s name. The speculation was that Google would not be keen on reviewing repeated requests, so site owners should only file for reconsideration once they were sure they had solved the issues. According to Cutts, this doesn’t seem to be the case.

Telling site owners to turn to a forum or file requests where they might be given extra information doesn’t seem like very consistent advice for trying to overcome a penalty. Luckily, there are some other solutions for investigating what part of your site is causing all the problems. Danny Sullivan works through some other ways you can try to diagnose your site at Search Engine Land.

If you read many blogs, it is easy to notice how rampant content scraping is. For the lucky few out there who haven’t run into it yet, content scraping is stealing content from a site to display it on someone else’s blog, usually with AdSense ads to make money off of your hard work.

Thankfully for all the bloggers out there, experienced coders have been fighting off these content scrapers for years, and they are happy to share their latest tricks for keeping their content from appearing on other sites. Granted, this battle is similar to the ongoing battles against copyright infringers and hackers: while these solutions may work for the moment, scammers and scrapers are already at work finding a way around the defenses.

Nonetheless, it is better to put up a fight than to give up when it comes to these content bandits. Jean-Baptiste Jung, co-founder of Cats Who Code, has offered snippets you can use in WordPress to fend off exactly these types of content thieves, each addressing a different scraping method.

One common way scrapers steal content is by displaying your blog within a frame on their page, with the ads in another frame so that they will always be shown, and thus earn the scrapers money. Jung’s first snippet breaks out of these frames so that your blog covers the entire window, effectively blocking the scraper site from being seen.

The single most frequent scraping method is to simply take your RSS feed and display it on another site, which also lets scrapers take advantage of your original (or paid-for) images while your server pays for the bandwidth. To solve this problem, Jung disables hotlinking to images, so that every time someone tries to use your pictures on their site, visitors instead see an image announcing that the content was stolen from your website. It is pretty entertaining to see the results he shared from one such website.

Source: Cats Who Code
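Jung’s actual fix lives in the WordPress/server configuration, but the underlying idea is a simple referrer check. Here is a minimal sketch of that logic in Python — the function name, domain, and image filenames are all hypothetical, purely to illustrate how hotlink blocking decides which image to serve:

```python
from typing import Optional
from urllib.parse import urlparse

def serve_image(referer: Optional[str], own_domain: str = "example.com") -> str:
    """Decide which image to serve based on the HTTP Referer header.

    Requests with no referer (direct visits, some proxies) are let
    through; requests referred from another domain get a "stolen
    content" placeholder instead of the real image.
    """
    if not referer:
        return "real-image.jpg"
    host = urlparse(referer).hostname or ""
    if host == own_domain or host.endswith("." + own_domain):
        return "real-image.jpg"
    return "stolen-content-notice.jpg"
```

A scraper embedding your image URL in their page sends their own domain as the referer, so their visitors see the placeholder while your own readers are unaffected.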

Obviously, most content scrapers are using tools that do all the work for them, and these tools normally steal the title as well as the content of your post. The solution here is a simple snippet that automatically adds a link to your post titles pointing back to the original post.
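Jung implements this as a WordPress filter; the same idea, sketched language-agnostically in Python with a hypothetical feed-item dict, is just wrapping the title in its own permalink before the feed is generated:

```python
def link_title(title: str, permalink: str) -> str:
    """Wrap a post title in a link back to the original post."""
    return f'<a href="{permalink}">{title}</a>'

def tag_feed_item(item: dict) -> dict:
    """Return a copy of an RSS item with its title linked back to the
    source, so scraper tools that republish the feed verbatim end up
    crediting (and linking to) the original post."""
    tagged = dict(item)
    tagged["title"] = link_title(item["title"], item["link"])
    return tagged
```

The original item is left untouched; only the outgoing feed carries the backlink.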

To get the snippets, you’ll have to head over to Jung’s article, which also offers a couple more solutions to content thieves. If you haven’t been bothered by scrapers yet, you are either very lucky or not paying enough attention. The bandits may eventually figure out how to thwart these defenses, but at least your content will be safe for a while.

When things go wrong with an SEO campaign, it puts everyone involved in a tricky position. The first step is obviously to figure out what happened and who is responsible in order to fix the problem, but pointing out who is responsible for failure can hurt egos and business relationships if not handled right.

The most problematic situation is when a client is at fault, which is indeed possible. “The customer is always right” may be a good philosophy to live by in many cases, but it isn’t all that true when it comes to implementation, especially when you are working with someone not all that informed about SEO.

Some SEOs will try to cut out the client, but that hurts the campaign as well. Instead, the best option is making sure to educate clients about the process in order to avoid issues, though that obviously can’t keep all problems from popping up. If one does arise, it is your job to talk the issue through with your client. While it may be their fault for not following through on a responsibility, it is equally likely you are also responsible due to a failure of communication.

Amanda DiSilvestro suggested a few ways clients can end up bringing down an SEO campaign, as well as how search engines and SEOs themselves can derail your progress. The most common client issues include:

  • Failing to Change – Many times, SEOs will suggest on-page changes to optimize a website, often tweaking content to include keywords or editing a meta tag. Clients are often very protective of their content, however, and sometimes ignore these suggestions. In this case, the SEO has done their job, but if the client isn’t willing to cooperate, there is little the expert can do.
  • Failing to Plan as a Group – When SEOs aren’t confident in their client’s understanding of optimization, they sometimes begin to ignore the client altogether. But even if a client doesn’t want to be very hands-on with the campaign, they almost certainly had goals in mind when they hired the pros, and those goals should be included in the plan for optimization. If a client tries to avoid being a part of the SEO process, including reading the regular reports, a schism opens between the SEO expert and the company, which will likely splinter and weaken the campaign.
  • Giving Up Too Early – Too many potential clients come to SEO agencies wanting quick fixes. No matter how earnestly you try to explain that optimization is a slow process, if the client doesn’t comprehend how long it will actually take, they are likely to get frustrated and shut the whole thing down before they’ve really had a chance to reap the rewards. There is little SEOs can do here except communicate clearly about time estimates and the benchmarks you expect to hit, or simply turn away clients who refuse to understand there is no way to get to the number one spot on Google overnight.

Now, we all know clients aren’t always the problem. In fact, it is usually the professional that ends up torpedoing the whole campaign. SEO firms and experts have the power in the campaign, and it is a tough balancing act to get everything on a site working as well as it can to impress the search engines. There are endless reasons a campaign may not work, but unfortunately the most common all stem from just plain bad practices.

  • Going Black Hat – It seems everyone writing about SEO knows how blatantly terrible an idea black hat practices are, yet there is a never-ending supply of “optimization” services that use keyword stuffing, duplicate content, cloaking, shady link building, and several other bad practices that Google already knows to look out for. Sure, these services might get a site good rankings initially, but it won’t be long before the site sinks under the weight of penalties.
  • Poor Communication – As noted above, even when the client is at fault, the SEO is sometimes responsible for not explaining the process or keeping the client in the loop. SEO work is a partnership, no matter how independent you may be. The client relies on you to inform them about this unique field and help them make informed decisions. If you aren’t communicating and they make a mistake, it is your fault. Similarly, if you make a decision without consulting the company you are working with and they don’t like it, you have no excuse.
  • Laziness – When it all comes down to it, a lot of SEO is maintaining and tweaking things to make a site as effective as possible at signaling to search engines. Experts can get lazy too, but when a site starts underperforming because you haven’t been paying it the attention it deserves, there is no one to blame but yourself. The solution to this one is obvious: drink a coffee, get up, and do the work clients are expecting of you.

While these categories cover many of the mistakes made in SEO, there are also innocent problems, like misreading a market or simply putting your faith in the wrong type of campaign.

No one likes having the finger pointed at them when things fall apart, but it is important to honestly assess who is responsible for the faults.

A bruised ego may sting for a little while, but if you or the client can put that aside and focus on the good of the site, you can use what you’ve learned about what went wrong to repair SEO mistakes and bad habits. With those lessons under your belt, soon your site will be performing the way you’d like it to.

Running a competition through your Facebook page can be an effective way to build your audience and enhance brand recognition. But that’s only if you do it correctly. Neville Luff posted a list of concerns at Business2Community that you need to be aware of to make sure you get the most benefit out of your Facebook contest.

Have you read through Facebook’s terms and policies and page guidelines? Probably not, but not doing so could lead to Facebook shutting down your contest. For example, requiring a like or share to win is frowned upon. And you must use an app for your promotion.

Now, if you follow Facebook’s rules, your contest won’t be embarrassingly shut down, but will you get the most out of it? Be sure you promote it properly. Too many times, a business assumes simply having a contest will attract attention, but if you’re going to go to the effort of a giveaway, go the extra mile to make sure as many people as possible know about it. You need a plan in place to promote your contest, as well as one to actually execute it.

Finally, give away something relevant to your company. We all love Apple products, but are you really getting more customers because you gave away an iPhone? If your prize gives the winner access to your services, you are building your customer base, and those who register will actually be users interested in what you do.

Image Courtesy of Martin Pettitt

The entire SEO community is bracing itself. A new Google Penguin update should be here any time, and it is looking like it will be quite a big deal. Supposedly it will be much more brutal than the already merciless update that came last April.

Judging from what we already know about Penguin, there are some ways to prepare yourself and all of your sites to make sure you don’t get hit by the first wave of penalties. Plus, if you follow these suggestions from Marcela De Vivo, you’ll be improving your SEO all around.

  1. Monthly Link Audits – Knowledge is power, and audits give you a lot of knowledge. Start with the backlinks and get a baseline: find out how many high quality and low quality links you have, and where those links are pointing. If there are spammy links, work to have them removed. You can choose from a huge selection of audit tools to make the process easy, and you will always know how your link profile is doing.
  2. Anchor Density – A popular way to try to cheat search engines is cranking up anchor density for money terms, and Penguin already penalizes those that do it too much. There is a good chance they will get stricter on their anchor density guidelines, so it is important to keep an eye out. You want to be under 15% for the money term. Any higher is risking penalties when the new Penguin update arrives.
  3. Link Ratios – Links are all about finding the right balance. Google talks about Earned vs. Unearned links, and when they do that they mean Images vs Mentions or Text, Sitewide Ratios, Deep Link Profiles, etc. De Vivo breaks down the categories a little more, but the main idea is to keep a good balance between them all.
  4. Use Your Webmaster Tools – For every site owner who thinks this is obvious, there is another who doesn’t know what Webmaster Tools is or how to monitor it. It is the best line of communication between you and Google, and watching the links Google displays in your account can help you identify problematic links and stay informed about how they are affecting your rankings. There are numerous problems Webmaster Tools can alert you to; you just have to look.
  5. Don’t Do Spammy Link Building – This one is the most obvious of all, but it seems no amount of telling site owners to stay away from this practice will ever stop the problem. If something sounds too good to be true in SEO, IT IS. If you can’t tell which links are spammy, don’t do the link building yourself. Google will penalize you if it hasn’t already, and the money you spent on those links will be wasted.
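The 15-percent guideline in point 2 is easy to check yourself. Here is a minimal sketch in Python — my own simplification, since it counts only exact, case-insensitive matches of the money term, while real audit tools also weight partial-match anchors:

```python
def anchor_density(anchors: list, money_term: str) -> float:
    """Percentage of backlink anchor texts that exactly match the
    money term, case-insensitively."""
    if not anchors:
        return 0.0
    term = money_term.strip().lower()
    hits = sum(1 for a in anchors if a.strip().lower() == term)
    return 100.0 * hits / len(anchors)

def over_threshold(anchors: list, money_term: str, limit: float = 15.0) -> bool:
    """Flag a money term whose anchor density exceeds the ~15% guideline."""
    return anchor_density(anchors, money_term) > limit
```

Run it against the anchor-text export from whichever audit tool you use; anything flagged is a candidate for diluting with branded or natural anchors before the update lands.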

Google Penguin isn’t the bad guy, nor is it an authoritarian figure out to stop anyone from having fun. Google’s spam fighting efforts keep our browsing running smoothly, and the “innocent” people affected by these changes are usually participating in questionable tactics. Read Google’s best practices, and follow them. If you are taking proper care of your site and following Google’s rules, the new Penguin update won’t feel nearly as scary.

You may have already noticed ads with a company’s number of Google+ followers noted at the bottom of them. This is a new feature from AdWords Enhanced Campaigns and one that you, like I did, might be wondering about. Does it really make that much of a difference how many followers you have? Does it make a consumer more likely to click on your ad? According to Google, yes.

Frederic Lardinois reports for TechCrunch that ads with the follower count ‘annotations’ receive a 5- to 10-percent higher CTR than regular ads. A large number of followers would likely lend a little more credibility to an ad, but companies with thousands or millions of followers likely already have that credibility through name recognition.

And this new feature isn’t available to just anyone with an AdWords account. You’ll not only need a “significant number of followers”, but you also need “recent, high-quality posts”. The whole thing sounds a little subjective, but it may be worth putting in the time to build up your Google+ page to get the boost in CTR.

Everyone hopes their next website design job is going to be a big project. Steady income for months and a full website redesign are much more attractive than improving a few small aspects for a company before looking for another client. But sometimes pitching a full redesign could lose you a client rather than win you long-term work.

Companies today have tight budgets and fierce competition, so many businesses in the current economic climate are much more interested in revamping what they have than building from the ground up. Sure, there are times when a full redesign is necessary, but often it isn’t the best choice for your prospective client.

Henry Waterfall-Allen suggests 9 reasons you may not want to pitch a full redesign when you sit down with a prospective client. It may not be as fun to work with an existing design, but spotting what the client actually needs may win you a long-term client.

The most alluring aspect of not going with a redesign is the lowered development and promotion costs. Obviously tight budgets are a large reason companies are looking for the most bang for their buck, and it is entirely possible to increase conversions and revenue through a website without tearing out the existing page.

With lower costs comes less risk, but the opportunity for similar rewards. Instead of trying to revamp everything, you are aiming for immediate results. Designing a single page or tweaking elements of an existing design sees results much faster than taking the time for a full redesign.

The smaller scale also allows you to target demographics more precisely with a single page. You don’t have the weight of trying to draw in an entire audience. Instead, you are refining and targeting what already exists to create an instant boost to revenue.

More than anything, having the honesty to tell a prospective client that they don’t need all the work other designers are trying to sell them will go a long way and make you stand out from the crowd. Showing clients a more focused way to spend their money, with the possibility of much sooner revenue increases, will win over many, while being willing to pass on the big redesign in favor of being the best fit for the job will keep clients coming back to you.

Of course, there will always be times when a redesign is the only option. You can’t tweak a bad website and hope to come out with a good page. Being able to spot the times when a redesign is needed and when a smaller project will benefit everyone will make you a valuable asset to clients, and will win you longer term jobs.

Image Courtesy of Wikipedia Commons

With all of the different ways Google can penalize you these days, it is easy to get confused about what you need to do to fix your mistakes. Between Penguin, Panda, Unnatural Link Penalties, and Manual Penalties, there are more ways to get in trouble than ever.

Google’s increasing strictness is far from a bad thing, but it is also getting increasingly complex which makes for confusion when trying to bounce back from a mistake.

Marie Haynes knows just how confusing it can be. She has been working in SEO and writing for SEOMoz for years, but even she got confused when trying to help someone with what she thought was a Penguin-related penalty. She then saw another respected writer make a similar mistake in a recent article, confusing unnatural links penalties with Panda.

It seems we need to go to the root of these issues and break down what each of these penalties is and how they differ from one another.

The Penguin Algorithm came about last April as an algorithm change aimed at fighting webspam, which explains its initial title, “The Webspam Algorithm.” It mainly targeted sites participating in link schemes and other questionable linking practices, though it also looked for indications of keyword stuffing.

The Penguin Algorithm isn’t to be confused with an Unnatural Links Penalty. The main difference is that Unnatural Links Penalties are taken against you manually rather than by an automated algorithm. Google mainly applies these penalties when it believes a site is attempting to manipulate search engine results through the creation of links. The real question is what causes Google to investigate your site.

It is widely believed that filing a spam report will flag a site for manual review, but others have guessed that Google monitors more cutthroat niches such as “payday loans” or casino sites and consistently manually checks for unnatural links. Thanks to Google’s secrecy, we may never know exactly what makes Google personally examine a site.

So what is the main difference between Penguin and Unnatural Links Penalties? It really comes down to how algorithms act compared to penalties applied by a living, breathing person. Algorithms view all sites the same and take effect almost immediately: all sites hit by an algorithmic penalty will see the damage within a day of the algorithm update. Manual penalties, on the other hand, are being placed against sites at all times, and can be appealed more easily than an algorithmic penalty.

You can always recover from any of these penalties with effort, as Marie Haynes shows in her article, but you have to clean up your page and your methods. SEOs can’t get away with participating in link schemes or engaging in other black hat techniques anymore; there is no longer any way to cheat the search engines.