Tag Archive for: Google

A recent Google Webmaster Hangout seems to have implied that Google is pushing out Penguin updates without announcing them. Penguin has only been officially updated twice since its initial release, and the last update was in October 2012. In the video, John Mueller of Google suggests that Google has been updating Penguin on a regular basis but has not announced every refresh. The comments come around the four-minute mark in the video below.

When asked for clarification by Search Engine Land, Mueller said he was referring to general “link analysis” refreshes, which do not include the Penguin algorithm. Google also confirmed the last update was the one announced in October.

One reason some questioned whether Penguin was being refreshed is that Panda, the update most often mentioned alongside Penguin, has been refreshed roughly monthly. Google hasn’t confirmed another update is coming, but they have been arriving steadily, and there are signs a new one should land in the next few days.

The simple goal for your AdWords campaign should be to get the most conversions possible while spending as little as possible. If you have a good ratio there, you’re likely doing everything right. However, there are plenty of potential pitfalls that can cost you conversions or make each one far more expensive.

Check out Patrick McDaniel’s tips at Business2Community and find out if you could be saving money somewhere or getting more out of your campaigns.
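The “good ratio” in question is usually tracked as cost per conversion: total spend divided by total conversions. A minimal sketch of that arithmetic (the campaign names and figures are hypothetical, not from the article):

```python
def cost_per_conversion(spend, conversions):
    """Return the average cost per conversion; None if there were no conversions."""
    if conversions == 0:
        return None
    return spend / conversions

# Hypothetical campaign data: (total spend in dollars, conversions)
campaigns = {
    "brand_search": (250.0, 50),
    "broad_match": (400.0, 20),
}

for name, (spend, conv) in campaigns.items():
    cpa = cost_per_conversion(spend, conv)
    print(f"{name}: ${cpa:.2f} per conversion")
```

Comparing this number across campaigns is the quickest way to spot where the budget is leaking.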

It is impossible to overstate just how quickly SEO changes and how important it is to keep up. Strategies change, and search engines update countless times. Google’s Penguin and Panda updates are clearly the most talked about, but Google has rolled out plenty of other updates with less catchy names over the last year, like the Knowledge Graph (okay, that one has a catchy name too).

Penguin and Panda changed the search landscape completely, and strategies have had to adapt quickly, though SEOs who weren’t leaning on gray-area tactics like link buying were mostly unaffected. That doesn’t mean they are exempt from the new guidelines.

Most of these guidelines are fairly broad, but Don Pathak, a writer for Search Engine Journal, has tried to simplify and explain them, distilling them into a few specific points.

Many writers, usually with vested interests, have argued that SEO success can’t be achieved with great content alone, and it is true to an extent: the internet is competitive enough that great content by itself won’t quite get you to the top search result. However, Google has also made it very clear that it wants to favor the quality of content over SEO tactics. Keeping a site fresh and relevant will give you as much of a boost as any behind-the-scenes tweak can.

The new Google also favors locality, so if your business has a local presence in a marketplace, optimizing for that location will help customers find your service. You can get started by simply establishing a local profile on Google Places for Business, and encourage customers to give you reviews on the site.

SEO will likely always concern itself with the technical dealings behind the curtain of a website, but Google wants to give preference to those who operate valuable, well-made websites, not those manipulating every loophole for a market advantage. As with anything run mostly by algorithms, there will always be “hacks” or weaknesses, but rather than exploiting them as they open, it is better to simply create a website with real value.

Any time Google’s Penguin or Panda updates are mentioned, site owners and bloggers alike work themselves into a mini frenzy over the possibility that their totally legitimate website might have been penalized. It’s warranted, in a way, because a few innocent bystanders have been affected, but for the most part Google is policing those who break the rules.

Meanwhile, bloggers have tended to downplay just how much rule breaking there is. Black hat SEO is treated as a fringe problem when in reality it is widespread. Writers tend to focus on one small aspect of black hat SEO, in which competitors use shady links and other tactics to bring your site down, and that is incredibly rare. Google considers all explicit spam to be black hat, and by that definition, black hat SEO is the most pervasive type of SEO around.

It is also the type of spam Google spends most of its time fighting. Matt Cutts, Google’s webspam team leader, took to YouTube recently to answer a question about how many notifications Google sends to website owners, and according to Cutts, 90 percent of Google’s manual penalties still target blatant spam pages.

Google sends out hundreds of thousands of notifications each month, but the chances of the average SEO or website owner seeing one are slim. There is a chance, though. The other 10 percent of notifications cover problems that novices or SEOs who have fallen out of the loop may have been drawn into, such as link buying and link selling, as well as hacking notifications.

Building a backlink profile is considered a staple of SEO techniques, but eventually you may have to do some cleaning up, especially now that Google has introduced multiple algorithms to clamp down on the use of low-quality links.

If you’ve seen a sudden drop in traffic or rankings lately, it is likely you were hit by one of these algorithms. You may have received a notification of a penalty, but unless it was a manual action, you probably got no warning that you were hit by the changes. Either way, one path toward repairing the drop in traffic is to prune your backlinks, removing the low-quality links pointing to your site.

Cleaning up your links is neither fast nor easy. It takes time and patience, but with effort you can restore your site’s health. You can’t just cut out random links and hope to solve the issue. Attacking the problem too broadly could cause more problems, and pruning backlinks is considered a last-ditch effort according to SEO.com: “You should exhaust all of your other efforts like updating your content, building higher quality links and producing good content to promote and engage users before you consider removing bad links.”

After you have tried all these methods and determined whether your website was hit by a penalty or an algorithm update, then you can create a strategy for fixing your backlinks. Neither problem can be fixed automatically. If you received a manual penalty, you will have to do everything you can to fix the issue identified, and submit a reconsideration request. Algorithm updates, on the other hand, require changing your methods and waiting to see positive growth for your site.

If you are ready to put in the work and time to try to properly repair your site, and you’ve already tried everything else, then it is time to really get your hands dirty. SEO.com has a full tutorial for cleaning up backlinks, and it walks you through every step, including suggesting tools for analyzing backlinks.
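As a rough illustration of the pruning step, here is a sketch that filters a backlink export down to the links worth reviewing for removal. The CSV columns, domains, and quality threshold are all hypothetical; real exports from backlink-analysis tools vary, and the scores here stand in for whatever quality metric your tool provides.

```python
import csv
from io import StringIO

# Hypothetical backlink export: linking domain, quality score (0-100), anchor text
BACKLINKS_CSV = """domain,quality,anchor
example-blog.com,72,useful guide
spammy-directory.net,8,cheap pills
link-farm.biz,15,best seo links
news-site.org,64,company name
"""

def flag_low_quality(csv_text, threshold=30):
    """Return linking domains whose quality score falls below the threshold."""
    reader = csv.DictReader(StringIO(csv_text))
    return [row["domain"] for row in reader if int(row["quality"]) < threshold]

print(flag_low_quality(BACKLINKS_CSV))  # ['spammy-directory.net', 'link-farm.biz']
```

The flagged domains are candidates for manual review, not automatic removal; as noted above, cutting links indiscriminately can do more harm than good.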

AdWords for Video launched last August, but it took until earlier this month for it to gain the much-needed analytical tools that have become standard for regular old AdWords for text.

As Katie Ingram reports for CMS Wire, AdWords for Video has added three essential tools to help advertisers track who is seeing their ads.

Reach and Frequency Reporting lets users see how many unique viewers their ad has reached, which seems like something that shouldn’t have taken six months to include.

Column Sets takes a company’s marketing goals and shows them relevant metrics to reach said goal. Users can use default columns, such as Website and Conversions or Views and Audience, or make their own out of the available metrics.

GeoMap simply shows where viewers of your ad are located. None of these are groundbreaking inventions, but rather relevant and useful tools to help make AdWords for Video as effective and popular as the original flavor.

Last week, Matt Cutts responded to a question he receives fairly regularly concerning the PageRank feature in the Google toolbar. Specifically, why haven’t they removed it? It is apparent that many believe that the PageRank feature is “widely used by link sellers as a link grading system.”

There is, of course, some truth to this. While spammers do take advantage of the PageRank system, Cutts says it is still relevant to many others. “There are a lot of SEOs and people in search who look at the PageRank toolbar, but there are a ton of regular users as well.” Apparently, many internet users see the PageRank feature as an indicator of reputability, and Google doesn’t plan on forcing them to stop using it.

That doesn’t mean PageRank is here to stay forever. While Google plans to keep supporting it as long as it is relevant to users, it is telling that Chrome does not have the PageRank feature built in. Now IE 10 is dropping support for add-ons, meaning Google’s toolbar will no longer work with that browser.

Considering that Internet Explorer was the only browser supporting the Google toolbar, it is highly likely the PageRank feature, as well as the toolbar as a whole, will fade away before long. As Matt Cutts puts it, “the writing is on the wall” that the new iteration of IE could be the end of PageRank, but we will have to wait and see.

After the big shift to content-focused SEO this year, a lot of the talk has been about the technical methods experts can use to chase higher rankings behind the scenes. Everyone talks about how important content is, but many are still more distracted by the ways they can mathematically manipulate that content to suit Google’s algorithms.

What too many are missing is that now the best way to tailor to Google is to turn your focus towards what consumers and visitors want.

The truth is, the top sites online have been doing this for years, because the most popular sites are those that provide quality content. Smaller SEOs seem to have trouble accepting this for two reasons. The first is that it is hard to quantify what makes content effective. There isn’t necessarily a magic formula for the best blog, even for search engines.

Search engines run on algorithms, and it is an SEO’s job to adapt or even build a site to best fit those algorithms’ needs. However, trying to take advantage of those algorithms has led more and more SEOs to questionable practices meant to “trick” Google into ranking sub-par content higher. This led Google to institute the Penguin and Panda updates, so that low-quality sites had a much harder time making their way to the top.

The other reason SEOs often have trouble accepting that great content has ALWAYS been important is the competitive nature of website rankings and business in general. Excellent content alone has never been enough, and never will be, because there is plenty of behind-the-scenes work that has to be done for the great content to ever be noticed. The trick is finding the line between staying competitive and slipping into more questionable practices.

But there are thousands of pages worth of articles on how to tackle all of that behind-the-scenes SEO. When it comes to lessons on how to actually create the high-quality content your visitors and the search engines want to see, there is far less to work with. Rebecca Garland, in an article for One Extra Pixel, gives some great pointers on how to improve the quality of your content while also catering to the current search engine climate.

If Eric Schmidt’s book, “The New Digital Age”, is to be believed, Google’s authorship markup is going to play a huge role in search engine results pages before long. Granted, as Search Engine Watch points out, Schmidt has a “talk first, think later” habit that has produced some great, though not always reliable, soundbites, but the fact that this appears in his upcoming book, rather than an offhand interview, lends it quite a bit of credibility.

The Wall Street Journal published some excerpts from the book, and one in particular has caught the eye of SEO professionals.

“Within search results, information tied to verified online profiles will be ranked higher than content without such verification, which will result in most users naturally clicking on the top (verified) results. The true cost of remaining anonymous, then, might be irrelevance.”

Google introduced its authorship markup in 2011, stating at the time that it was “looking closely at ways this markup could help us highlight authors and rank search results,” but since then authorship has faded into the background in many ways. Google’s plans for the future bring it very much back onto the table. Schmidt’s comment makes it very clear that Google wants to use Google+ as a verification device. On one hand, it would be one of the best weapons against spammers imaginable. On the other, do we really want a future where we are forced to be on Google+ just so people can find our websites?
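At the page level, authorship is signaled by a link carrying rel="author" that points to the author’s Google+ profile. As a rough illustration, here is a sketch using Python’s standard html.parser to detect that markup on a page (the sample page and profile URL are made up for the example):

```python
from html.parser import HTMLParser

class AuthorLinkFinder(HTMLParser):
    """Collects href values from <a> or <link> tags that carry rel="author"."""

    def __init__(self):
        super().__init__()
        self.author_links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # rel can hold multiple space-separated tokens, so split before checking
        if tag in ("a", "link") and "author" in attrs.get("rel", "").split():
            self.author_links.append(attrs.get("href"))

# Hypothetical page using the authorship markup
page = '<html><head><link rel="author" href="https://plus.google.com/0000/"></head></html>'
finder = AuthorLinkFinder()
finder.feed(page)
print(finder.author_links)  # ['https://plus.google.com/0000/']
```

A page with no such link would come back empty, which under Schmidt’s scenario is exactly the “cost of remaining anonymous.”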

Some companies can afford to just throw money at their online marketing campaigns and get results, but if you run a small business with an equally small advertising budget, you’ll want to look into these tips from Business2Community.

1. Google Search

Surprisingly, they suggest those with a limited budget restrict their ads to search only. While it is likely the most efficient platform, I would argue that, depending on your business, you could see great results with other options as well. But it is a great place to start, and if your budget is extremely limited, maybe also a great place to stop.

2. Keywords

Choosing the right keywords gets you a high conversion rate at a low cost per click. Keywords are a huge money saver and home in on the users most likely to be looking for you. Also, be sure to learn about “long tail keywords” to get the most out of your ads.

3. Geotargeting

Chances are, if you have a small business and a limited budget, you are only interested in consumers living in your area. Use Google’s tools to show ads only to those in your vicinity. You can set parameters by city, ZIP code, or even a mile radius around your physical address.
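Under the hood, the mile-radius option is just a distance check against your address’s coordinates. A sketch of that check using the haversine great-circle formula (the storefront and user coordinates below are made up for the example):

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_MILES = 3959.0

def miles_between(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two points, via the haversine formula."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

def in_radius(store, user, radius_miles):
    """True if the user falls inside the mile radius around the store."""
    return miles_between(*store, *user) <= radius_miles

store = (40.7580, -73.9855)   # hypothetical storefront
nearby = (40.7484, -73.9857)  # under a mile away
far = (34.0522, -118.2437)    # across the country

print(in_radius(store, nearby, 10))  # True
print(in_radius(store, far, 10))     # False
```

The ad platform does this matching for you; the point is just that a tight radius keeps the budget focused on users who could plausibly walk through your door.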

4. Day Parting

This one requires some legwork on your part. Check into your campaigns and find out when the peak hours are for conversions. You can then choose to either turn ads off during down times, or turn ads off during some of these peak hours when costs are at their highest. Either way, it is an opportunity to save some coin.

I do not endorse turning off ads simply because your business is closed for the day, however. Many consumers do conduct searches outside of normal business hours, which means you could be missing out on a large part of the market.
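Finding those peak hours comes down to bucketing your conversions by hour of day. A minimal sketch over hypothetical conversion timestamps (a real report export would be much longer):

```python
from collections import Counter
from datetime import datetime

# Hypothetical conversion timestamps pulled from a campaign report
conversions = [
    "2013-05-06 09:15", "2013-05-06 09:42", "2013-05-06 14:05",
    "2013-05-07 09:30", "2013-05-07 20:11", "2013-05-08 09:58",
]

def peak_hours(timestamps, top=2):
    """Return the hours of day with the most conversions, busiest first."""
    hours = Counter(datetime.strptime(t, "%Y-%m-%d %H:%M").hour for t in timestamps)
    return [hour for hour, _ in hours.most_common(top)]

print(peak_hours(conversions))  # the 9 a.m. hour dominates this sample
```

With the peaks identified, you can schedule ads off during dead hours, or off during the priciest peaks, whichever fits your budget.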

5. Device Targeting

The main reason to use this tool is to ensure ads designed for mobile devices are shown only on mobile devices, while ads better suited to larger screens are shown only on laptops and tablets.