Tag Archive for: Google

One of the biggest mistakes you can make in SEO is optimizing your website at the expense of the audience. Over-optimization may get people onto your page, but when visitors are met with a wall of content on a poorly designed page, they leave and you lose the sale.

With Google’s latest emphasis on usability, over-optimizing may actually hurt your search engine rankings anyway. Jay Taylor recently shared some tips to make sure you are creating websites that customers and search engines alike will love. If you want to get people to come to your page and stay, follow these important rules.

  1. Understand Your Customer – First and foremost, the internet is now more about the user than it ever has been. Google includes aspects of usability such as speed, design, and content in their algorithms as strong indicators of quality sites. Not only that, but obviously you should be trying to appeal to your actual customers, not just your own tastes. The way you perceive your brand may not be the same as how your customers understand your product, so you want to find out why they choose you over your competitors. Once you know that, you know what to play up when introducing potential customers to your brand.
  2. Websites Don’t Have To Be Beautiful – That’s a bit of an overstatement, but it is far more important for a website to be usable and interesting to your target audience than it is to have a website that looks like a work of art. Use visuals that appeal to your customers in a way that solidifies your credibility. You want your website to look professional and be extremely usable, not unapproachably artistic.
  3. Create Great Content With a Purpose – The days of creating content stuffed with keywords solely to appease search engines are long gone. Your content should have a purpose for your reader and be aimed at actually informing your audience rather than rambling with specific words to attract crawlers. Poor grammar, needlessly complex vocabulary, and awkwardly mechanical text turn people off, and can lose you customers. Instead, make sure your content has a purpose, offers value to your customer, and inspires action of some sort.
  4. Provide Easy-to-Use Navigation – User-friendly navigation is essential to allowing your customers to quickly and easily find what they’re looking for on your site, but it also allows search engines to more effectively index your site. There should be navigation in the header and footer so that customers always have access to it, and you might consider a drop-down menu in the top navigation if you have a lot of pages.
  5. Measure and Improve – Keep track of your key performance indicators such as conversions, contacts from the website, and possibly purchases to see how any new changes may be affecting your performance. You should also be using Google Analytics to watch where your customers are coming from, and what may be causing you to lose conversions.
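The "measure and improve" step above can be sketched in a few lines. This is a hypothetical example, not tied to any particular analytics tool; the visit and conversion numbers are made up for illustration.

```python
# Hypothetical sketch: comparing a key performance indicator (conversion
# rate) before and after a site change. The counts below are invented
# examples, as if pulled from a weekly analytics export.

def conversion_rate(conversions, visits):
    """Conversions divided by visits, as a percentage."""
    if visits == 0:
        return 0.0
    return 100.0 * conversions / visits

# Made-up weekly numbers: the week before and after a redesign
before = conversion_rate(conversions=42, visits=2100)
after = conversion_rate(conversions=63, visits=2250)

print(f"Before: {before:.2f}%  After: {after:.2f}%  Change: {after - before:+.2f}%")
```

Tracking even a single number like this, week over week, makes it obvious whether a change helped or hurt, instead of guessing.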


Penguin 2.0 only affected 2.3% of search queries, but judging by the response online, you would think it did much more. Even setting aside all of the worrying before the release, there have been tons of comments about the first-hand effects many seem to be dealing with in the post-Penguin 2.0 web. Those spurned by the new Penguin algorithm have even accused Google of only releasing the update to increase their profitability.

Matt Cutts, head of Google’s Webspam team, used his recent Webmaster Chat video to attack that idea head on. The main question he was asked is what aspect of Google updates Cutts thinks the SEO industry doesn’t understand. While Matt expresses concern about the number of people who don’t get the difference between algorithm updates and data refreshes, Cutts’ main focus is the notion that Google is hurting web owners to improve its profits.

Most notably, the algorithm updates simply aren’t profitable. Google has seen revenue decrease after almost all of its recent updates, but Cutts says money isn’t the focus. Google is aiming to improve the quality of the internet experience, especially search. While site owners using questionable methods are upset, most searchers will hopefully feel that the updates have improved their experience, which will keep them coming back and using Google.

As far as the misunderstandings between algorithm updates and data refreshes, Cutts has expanded on the problem elsewhere. The biggest difference is that an algorithm update changes how the system works, while a data refresh leaves the algorithm untouched and only changes the information the system is using or seeing.
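The distinction can be made concrete with a toy sketch. Nothing here reflects anything Google actually uses; the scoring formulas and page data are invented purely to illustrate the difference between changing the algorithm and refreshing its inputs.

```python
# Toy illustration of "algorithm update" vs. "data refresh".
# The formulas and page data are entirely made up.

def rank(pages, score):
    """Order pages by a scoring function, best first."""
    return sorted(pages, key=score, reverse=True)

pages = [
    {"url": "a.com", "links": 120, "spam_signals": 8},
    {"url": "b.com", "links": 40, "spam_signals": 0},
]

# Original algorithm: score purely on link count.
v1 = lambda p: p["links"]

# An *algorithm update* changes how pages are scored...
v2 = lambda p: p["links"] - 20 * p["spam_signals"]

# ...while a *data refresh* keeps the algorithm the same and only
# updates its inputs, e.g. newly detected spam signals for a.com:
pages[0]["spam_signals"] = 10

print([p["url"] for p in rank(pages, v2)])
```

Same ranking machinery, two very different kinds of change: one swaps the scoring function, the other just updates what the function sees.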

Cutts was also asked which aspect of SEO we are spending too much time on, which led him to one of the main practices Penguin focuses on: link building. Too many SEOs are still putting their faith in that single practice, even as it is outweighed by areas that more directly affect the quality of users’ experiences, such as creating compelling content. Instead, Matt urges SEOs to pay more attention to design and speed, emphasizing the need to create the best web experience possible.

Cutts’ video is below, but the message is that Google is going to keep growing and evolving, whether you like it or not. If you listen to what they tell you about handling your SEO, you may have to give up some of your old habits, but you’ll spend much less time worrying about the next algorithm update.

Well, the big event that the SEO community has been talking about for weeks has finally hit and everything is… mostly the same, unless you run sites known for spammy practices like porn or gambling. Two days ago, Google started rolling out Penguin 2.0. By Matt Cutts’ estimate, 2.3 percent of English-U.S. queries were affected.

While 2.3 percent of searches doesn’t sound like a lot, in reality that means thousands of websites hit with penalties and sudden drops in the rankings. But if you’ve been keeping up with Google’s best practices, chances are you are safe.

Nonetheless, in SEO it is always best to stay informed about these types of updates, and Penguin 2.0 does change the way Google handles search a bit. To fill everyone in on all the details, Search Engine Journal’s John Rampton and Murray Newlands made a YouTube video covering everything you could want to know about Penguin 2.0.

Oh, and if you’ve been wanting to know why it’s called Penguin 2.0: according to Cutts, this is the fourth Penguin-related launch Google has done, but it gets the new version number because it is an updated algorithm, not just a data refresh.

Google is always fighting to maintain diversity on their search engine results pages (SERPs). It has proven difficult over time to walk the line between offering searchers the content they want in easily browsable form and keeping the big established sites from completely dominating the results.

Matt Cutts, head of Google’s Webspam team, recently used one of his YouTube videos to talk about how Google is managing this, and highlight an upcoming change that will hopefully keep you from getting pages full of essentially the same results. No one wants to see eight results from Yelp when they are looking for a restaurant review.

The change Google is making is aimed at making it harder for multiple results from the same domain name to rank for the same terms. Basically, once you’ve seen three or four results from a domain, even spread over a few results pages, it becomes increasingly hard for any more pages from that domain to rank.

If you don’t quite get what this means, it is easier to understand in context. In the video, Matt walks us through the history of Google’s domain result diversity efforts. It also shows how Google tries to manage bringing you the best authoritative and reputable search results without allowing bigger brands to form monopolies on the results.

You can see the full breakdown of the domain diversity history at Search Engine Land or in Cutts’ video, but basically, when Google started out there were no restrictions on the number of results per domain. It quickly became apparent that this system didn’t work, because you would get page upon page of results from the single highest-ranked domain. Then came different forms of “host clustering,” which prevented more than two results per domain from being shown in the search results, but spammers easily worked around this.

More recently, Google has used a sort of tiered system where the first SERPs for a term are as diverse as possible, allowing only a few results from the same domain; as you progress into the later search result pages, more and more results from repeat domains are allowed. Now, Google is tightening the belt and making it harder for those repeat domains to even get onto the later SERPs.
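The host-clustering idea described above can be sketched roughly: each result from a domain beyond the first couple needs a progressively better score to keep its position. To be clear, the cap, penalty factor, and scores below are invented for illustration; Google's actual mechanism is not public.

```python
# Rough, invented sketch of tiered "host clustering": the first couple of
# results from a domain rank normally, and every repeat beyond that cap
# has its score cut, so other domains can surface. Not Google's real code.

from urllib.parse import urlparse

def diversify(results, base_cap=2, penalty=0.5):
    """Demote repeat-domain results. Each result past the first
    `base_cap` from a domain has its score multiplied by `penalty`
    once per extra repeat, so repeats fade fast."""
    seen = {}
    adjusted = []
    for score, url in sorted(results, reverse=True):
        domain = urlparse(url).netloc
        repeats_over_cap = max(0, seen.get(domain, 0) - base_cap + 1)
        adjusted.append((score * (penalty ** repeats_over_cap), url))
        seen[domain] = seen.get(domain, 0) + 1
    return [url for _, url in sorted(adjusted, reverse=True)]

results = [
    (0.95, "http://yelp.com/review1"),
    (0.93, "http://yelp.com/review2"),
    (0.91, "http://yelp.com/review3"),
    (0.90, "http://localblog.com/review"),
    (0.89, "http://yelp.com/review4"),
]
print(diversify(results))
```

In this toy run, the third and fourth yelp.com results get demoted, letting the independent review site climb past them, which is exactly the kind of outcome the change is after.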

In my opinion, you can never read too many opinions and advice columns on how to manage your PPC campaigns. Sure, some may turn out to be full of bad advice, but I believe every bit of information can either guide you to improving your own campaigns, or steer you away from looming mistakes. At the very least, it’s good to see what other people are doing in order to inspire you to come up with your own methods.

With that in mind, how could you pass up Chris Kent’s article at Search Engine Journal called ’10 Golden Rules of AdWords’? It’s loaded with good information. Some of it borders on cliché, such as logging in to your account at least once a day and testing every conceivable movable piece. But even these have been repeated for a reason: they are important and key to building a successful campaign.

My favorite piece of advice is a suggestion for determining how much to bid on certain keywords. For many, this seems to be a guessing game, which is not good. Also, remember to link your PPC ads to the specific page your ad refers to. Don’t just leave traffic at your doorstep; invite them in and put them right where you want them. In other words, bypass your homepage and get users as close to a conversion as you can.

A PPC war has started between Bing and Google and Microsoft Search Network’s GM fired the most recent shots. David Pann has bashed the effectiveness of AdWords Enhanced Campaigns for larger advertisers because of its bundling of desktop and tablet targeting options.

“For smaller advertisers that don’t distinguish between mobile, tablets and PCs Enhanced Campaigns may make sense. But for larger advertisers which understand that their messages must be different according to the device it will be harder and they will have to create workarounds,” Pann said.

Pann has a point, and there have been many independent reviewers who have essentially made the same critique since Google unveiled Enhanced Campaigns.

Take his opinions with a grain of salt, however, considering he works for a direct competitor, one that just happens to be rolling out its own version of Enhanced Campaigns in the coming months. Pann says Bing’s version will allow users to choose whether to combine mobile and desktop campaigns or to keep them separate. Bing plans to launch the new product in beta sometime before fall and have a full release by the end of summer 2014.

For more, check out Jessica Davies’ article at The Drum.

The Short Cutts

Are you familiar with Matt Cutts, the head of the Google Webspam team, and his YouTube videos? I share them here frequently, but even the ones I write about are just a selection of some of his best. Since he started making these short, informative videos in 2009, Cutts has made over five hundred of them.

Five hundred videos is a lot to sort through, and YouTube isn’t the best at helping you navigate large numbers of videos, so Cutts’ catalog was starting to get a bit jumbled. That’s why the online marketing company Click Consult created The Short Cutts, a site which organizes all of the Cutts videos into an easily usable resource for any SEO question you may have.

For anyone not already aware of Cutts’ YouTube posts, they all follow the same pattern. A Google user asks a question about a topic, and Cutts answers the question as well as he can within a short two or maybe three minutes. Some question the usefulness of the videos because Cutts often can’t go into depth in the short time limit, but I think anyone can understand how important it is to hear information and answers to common SEO questions straight from his mouth, even if it is a little vague.

Possibly the best part of The Short Cutts is its method of displaying videos above two blocks of text which may help give you a quick answer. The first block consists of the question Cutts is asked, and the second gives a quick “yes or no” type answer, which can resolve many of the simpler issues on its own.

Do you use the AdWords tools ‘Google Keyword Tool’ or ‘AdWords Traffic Estimator’? If so, this is news you’ll need to sit up and take notice of. Both tools appear to be on their way out, replaced by a new tool unveiled earlier this month: ‘AdWords Keyword Planner’.

Keyword Planner is a streamlined, focused way to launch new campaigns. Its easy-to-use wizard interface guides you step-by-step through the process of creating new campaigns and new ad groups.

Larry Kim, of Search Engine Land, has all the details of how to use the tool and what it is capable of doing. However, you may check your AdWords account and find no sign of the Keyword Planner. Right now, it’s only been made available in about 20 percent of accounts, but more accounts are being added all the time.

Have you received an unnatural link penalty from Google? Are you worried about getting one? Or maybe you are just curious what constitutes an unnatural link. The answers out there are often woefully incomplete, or contradict other reputable sources.

It can sometimes feel like every different major SEO news source has their own exact definition of unnatural links, and sometimes they aren’t even that consistent. The problem just gets worse as these varying definitions are then interpreted by other writers trying to offer tips on how to recover from the penalties many have received.

If we can’t agree on a single definition of unnatural links, how are we supposed to agree on a unified way to deal with the penalties? All the confusion does is lead site owners trying to get their sites back on track down yet another wrong path.

Well, Search Engine Journal’s Pratik Dholakiya undertook the mammoth task of condensing all the information anyone could ever need to know about unnatural links and the penalties that come from them all into one informative article. From the basic information of how unnatural penalties became a huge problem for the SEO community and a singular definition for unnatural links, all the way to the secret tips many professionals haven’t been sharing, everything you need to find is there.

Another day, another Matt Cutts Google Webmaster Help video to talk about. This recent one deals with how SEO professionals pay close attention to any new Google patent remotely related to search or search quality, then speculate until some come to believe very incorrect ideas about how Google is operating.

Cutts was asked which SEO misconception he would most “like to put to rest,” and you could almost see the relief in his eyes as he began explaining that patents aren’t necessarily put into practice.

“Just because a patent issues that has somebody’s name on it or someone who works at search quality or someone who works at Google, that doesn’t necessarily mean we are using that patent at that moment,” Cutts explained. “Sometimes you will see speculation Google had a patent where they mentioned using the length of time that a domain was registered. That doesn’t mean that we are necessarily doing that, it just means that mechanism is patented.”

Basically, SEO professionals, especially bloggers and writers, have a habit of speculating based on patents they see have been filed. That speculation can grow into tips and suggestions about how to run your website, all stemming from a patent that isn’t in use, and it comes together to create widespread misinformation.

For example, consider the speculation that comes every time Apple files patents for future phones. While they’ve recently had trouble with physical prototypes leaking in various ways, in the past Apple kept its secrets well guarded, and the speculation based on its patents was often outlandish, and frequently completely wrong.

That doesn’t mean you can’t learn from and make predictions based on patents, especially if you see indicators that one has been implemented, but it is important to take every patent with a grain of salt. While Google has created the mechanisms described in these patents, unless you see evidence they are in use, they probably aren’t worth getting worked up over.