It took a couple of weeks for everything to even out after the recent Penguin update, and now it's time to start looking forward to what is coming up in SEO. It is an especially good time to make predictions for the rest of 2013, as we are just now passing the halfway point of the year and Google has made some of their intentions moving forward very clear.

Google has pulled out the big guns in their fight against spam, and has publicly stated their interest in user experience through site design and quality content. None of that is a surprise, but at the turn of the year none of it had actually been confirmed by anyone within the search engine juggernaut. A few months later, Matt Cutts has basically confirmed everything we assumed: focus on the user, don't try to cheat or loophole your way to the top, and you should be fine.

Still, Google isn’t content to simply focus on one or two things at a time, and there are bound to be quite a few other changes in the near future that we haven’t been told about. Jayson DeMers analyzed all of the evidence from Google’s more subtle changes and announcements in the past few months to attempt to make predictions for what we might be seeing in the next year or so in SEO. They are all just guesses from the information available, but it’s always good to stay ahead of the curve and aware of changes that may be on the horizon.

For those still pushing backlinks as the golden goose of SEO, a recent revision to Google's Ranking help guidelines could be frightening. But if you've been watching the changes in SEO over the past few years, it shouldn't come as much of a surprise. Google has become more and more strict about backlink quality and link building methods, and links were bound to be dethroned.

As reported by Search Engine Watch, it was spotted late last week that Google updated the Ranking help article to say “in general, webmasters can improve the rank of their sites by creating high-quality sites that users will want to use and share.” Before, it told webmasters that they could improve their rank “by increasing the number of high-quality sites that link to their pages.”

There have been countless signs that Google would officially step back from linkbuilding as one of the most important ranking signals. There were widespread complaints for a while about competitors using negative SEO techniques like pointing bad links to websites, and every Penguin iteration that comes out is a significant event in SEO.

To top it all off, when Matt Cutts, the esteemed Google engineer, was asked about the top 5 basic SEO mistakes, he spent a lot of time talking about the misplaced emphasis on link building.

“I wouldn’t put too much of a tunnel vision focus on just links,” Cutts said. “I would try to think instead about what I can do to market my website to make it more well known within my community, or more broadly, without only thinking about search engines.”

Depending on your skill set, a recent Webmaster video may be good or bad news to bloggers and site owners out there. Most people have never considered whether stock photography or original photography has any effect on search engine rankings. As it happens, not even Matt Cutts has thought about it much.

There are tons of writers out there who don’t have the resources or talent with a camera to take pictures for every page or article they put out. Rather than deliver countless walls of text that people don’t like looking at, most of us without the artistic talent instead use stock photos to make the pages less boring and help our readers understand us more. For now, we have nothing to worry about.

Cutts, the head of Google’s Webspam team, used his latest Webmaster Chat to address this issue, and he says that to the best of his knowledge, original vs. stock photography has no impact on how your pages rank. However, he won’t rule it out for the future.

“But you know what that is a great suggestion for a future signal that we could look at in terms of search quality. Who knows, maybe original image sites might be higher quality, whereas a site that just repeat the same stock photos over and over again might not be nearly as high quality. But to the best of my knowledge, we don’t use that directly in our algorithmic ranking right now.”

Logically, if Google does decide to start considering photo originality on web pages, Cutts appears to be more worried about sites that use the same images “over and over” than about those who search for relevant and unique stock images for their articles. Penalizing every website owner who doesn't have a hired photographer continuously producing images for every new page would seem a bit overkill.

Matt Cutts, head of Google’s Webspam team, recently announced via Twitter that a new ranking update focusing on spammy queries has officially gone live, according to Danny Goodwin from Search Engine Watch. At the same time, Google has made it clear that if you don’t have a quality mobile website, you’re going to start seeing your rankings dropping.

Spammy Queries Ranking Update

The ranking update for spammy queries is supposed to affect 0.3 to 0.5 percent of English queries, but it shouldn’t be much of a shock to anyone who has been listening to what Cutts says. It was one of the most notable updates Cutts spoke about in an earlier Google Webmaster video where he discussed what to expect from Google this summer.

Cutts says the updates are specifically focused on queries notorious for spam such as “payday loans” on Google.co.uk as well as pornographic queries. The roll-out of the update will be similar to many of Google’s recent changes in that it is being implemented gradually over the next few months.

Smartphone Ranking Changes

It appears we’ve finally reached the point where slacking on mobile SEO is going to objectively hurt your site as a whole. A recent post added to the Google Webmaster Central Blog warns that “we plan to roll out several ranking changes in the near future that address sites that are misconfigured for smartphone users.”

Google named two primary mobile mistakes as their targets: faulty redirects and smartphone-only errors. Faulty redirects occur “when a desktop page redirects smartphone users to an irrelevant page on the smart-phone optimized website,” such as when you get automatically sent to a homepage on a smartphone rather than the actual content you searched for. Smartphone-only errors, on the other hand, occur when sites show desktop users the content of a page but give smartphone users errors.
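As a rough illustration of the faulty-redirect pattern, the check below compares the page a smartphone user actually lands on against the page that was originally requested. The `is_faulty_redirect` helper and the example URLs are hypothetical, invented for this sketch; Google has not published how it detects the pattern:

```python
from urllib.parse import urlparse

def is_faulty_redirect(requested_url: str, mobile_final_url: str) -> bool:
    """Flag the pattern Google describes: a smartphone request for a
    specific desktop page gets bounced to the site's homepage instead
    of the equivalent mobile content."""
    requested = urlparse(requested_url)
    final = urlparse(mobile_final_url)
    # Redirecting to the same path on a mobile subdomain is fine;
    # landing on "/" when a deeper page was requested is the faulty pattern.
    landed_on_homepage = final.path in ("", "/")
    asked_for_deep_page = requested.path not in ("", "/")
    return asked_for_deep_page and landed_on_homepage

print(is_faulty_redirect("http://example.com/article/42",
                         "http://m.example.com/"))            # True: dumped on homepage
print(is_faulty_redirect("http://example.com/article/42",
                         "http://m.example.com/article/42"))  # False: equivalent page
```

A real audit would of course fetch each URL with a smartphone User-Agent and follow the redirect chain; the heuristic above only classifies the outcome.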

This is Google’s first big move in making mobile configuration a ranking consideration, but their advice signals their intent to keep paying attention to mobile. They suggest that you “try to test your site on as many different mobile devices and operating systems, or their emulators, as possible.” It isn’t acceptable to only pay attention to desktop anymore.

Image Courtesy of Wikipedia Commons

Penguin 2.0 only affected 2.3% of search queries, but you would think it did much more from the response online. Ignoring all of the worrying before the release, there have been tons of comments about the first-hand effects many are dealing with in the post-Penguin 2.0 web. Those stung by the new Penguin algorithm have even accused Google of releasing the update solely to increase their profitability.

Matt Cutts, head of Google’s Webspam team, used his recent Webmaster Chat video to attack that idea head on. The main question he was asked is what aspect of Google updates the SEO industry doesn’t understand. While Matt expresses concern about the number of people who don’t get the difference between algorithm updates and data refreshes, his main focus is the notion that Google is hurting web owners to improve their profits.

Most notably, the algorithm updates simply aren’t profitable. Google experienced decreases in their revenue from almost all their recent updates, but Cutts says that money isn’t the focus. Google is aiming at improving the quality of the internet experience, especially search. While site owners using questionable methods are upset, most searchers will hopefully feel that the updates have improved their experience, which will keep them coming back and using Google.

As for the misunderstanding between algorithm updates and data refreshes, Cutts has expanded on the problem elsewhere. The key difference is that an algorithm update changes how the ranking system works, while a data refresh leaves the system unchanged and simply updates the information it operates on.

Cutts was also asked which aspect of SEO we are spending too much time on, which led him to one of the main practices Penguin focuses on: link building. Too many SEOs are still putting too much faith in that single practice, even as it is overshadowed by factors that more directly affect the quality of users’ experiences, such as compelling content. Instead, Matt urges SEOs to pay more attention to design and speed, emphasizing the need to create the best web experience possible.

Cutts’ video is below, but the message is that Google is going to keep growing and evolving, whether you like it or not. If you listen to what Google tells you about handling your SEO, you may have to give up some old habits, but you’ll spend much less time worrying about the next algorithm update.

You run a small local business with brick and mortar locations. What reason do you have to invest in online marketing? Actually, there are quite a few reasons local businesses can benefit from online marketing.

You want your business to be reaching out to customers everywhere they are looking for you or services like yours, and more and more people are turning to the internet before they make a purchase. If they aren’t buying straight off the web, they are checking reviews and public perception of the products they are looking for.

A recent BIA/Kelsey report said that 97% of consumers use online media before making local purchases, and Google suggests that 9 out of 10 internet searches lead to follow-up actions such as calling or visiting businesses. That means the majority of consumers are turning to the internet, and if your business isn’t there, they will find others.

Online marketing isn’t as intimidating as many think, either. Search Engine Land says that 50% of small businesses’ online listings are wrong, and the majority of small business owners claim they don’t have the time to keep online listings up to date. Keeping Google’s information on your business updated only takes a few minutes, and that is where most will find you. You can create a local business listing even if you don’t have a website or sell anything online.

The next step up is to embrace social media. Many smaller businesses focus almost their entire web presence on Facebook, Google+, and Twitter, because these are the places where brands can reach out directly to consumers.

If you do wish to fully capitalize on online marketing but don’t think you have the time, hiring someone to manage your online brand and website eventually pays for itself by raising public awareness of your brand and cementing its identity as a trusted business in the community. However, you can’t just do a little. A shoddy or out-of-date website can hurt public perception of your company, so keeping your site up to date with current web standards is important to maintaining your brand’s integrity.

Well, the big event that the SEO community has been talking about for weeks has finally hit and everything is… mostly the same, unless you run sites known for spammy practices like porn or gambling. Two days ago, Google started rolling out Penguin 2.0. By Matt Cutts’ estimate, 2.3 percent of English-U.S. queries were affected.

While 2.3 percent of searches doesn’t sound like a lot, in reality that is thousands of websites being hit with penalties and sudden drops in the rankings. But if you’ve been keeping up with Google’s best practices, chances are you are safe.

Nonetheless, in SEO it is always best to stay informed on these types of updates, and Penguin 2.0 does change the way Google handles search a bit. To fill everyone in on the details, Search Engine Journal’s John Rampton and Murray Newlands made a YouTube video covering everything you could want to know about Penguin 2.0.

Oh, and if you’ve been wondering why it’s called Penguin 2.0: Cutts explains that while this is the fourth Penguin-related launch Google has done, it is an updated algorithm rather than just a data refresh, which earned it the new version number.

Google is always fighting to maintain diversity on their search engine results pages (SERPs). It has proven difficult over time to walk the line between offering searchers the content they want in an easily browsable form and keeping the big established sites from completely dominating the results.

Matt Cutts, head of Google’s Webspam team, recently used one of his YouTube videos to talk about how Google is managing this, and highlight an upcoming change that will hopefully keep you from getting pages full of essentially the same results. No one wants to see eight results from Yelp when they are looking for a restaurant review.

The change Google is making is aimed at making it harder for multiple results from the same domain name to rank for the same terms. Basically, once you’ve seen three or four results from a domain, even over the spread of a few results pages, it will become increasingly harder for any more pages from that domain to rank.

If you don’t quite get what this means, it is easier to understand in context. In the video, Matt walks us through the history of Google’s domain result diversity efforts. It also shows how Google tries to manage bringing you the best authoritative and reputable search results without allowing bigger brands to form monopolies on the results.

You can see the full breakdown of the domain diversity history at Search Engine Land or in Cutts’ video, but basically when Google started out there were no restrictions on the number of results per domain. It quickly became apparent that this system didn’t work, because you could get page upon page of results from the single highest-ranked domain. Then came different forms of “host clustering,” which prevented more than two results per domain from being shown in the search results, but this was easily worked around by spammers.

More recently, Google has used a sort of tiered system where the first SERPs for a term are as diverse as possible, allowing only a few results from the same domain; as you progress into the later result pages, more and more results are allowed from repeat domains. Now, Google is tightening the belt and making it harder for those repeat domains to even get onto the later SERPs.
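To make the tiered idea concrete, here is a toy sketch in Python. The per-domain cap, page size, and the rate at which the cap relaxes on later pages are all invented numbers for illustration; Google has not published its actual clustering algorithm:

```python
from collections import defaultdict

def diversify(ranked_results, base_cap=2, cap_growth_per_page=1, page_size=10):
    """Toy tiered host clustering: early pages allow only a couple of
    results per domain, and later pages relax the cap slightly.
    `ranked_results` is a relevance-ordered list of {"domain": ...} dicts."""
    output = []
    seen = defaultdict(int)           # results shown so far, per domain
    remaining = list(ranked_results)
    while remaining:
        page_index = len(output) // page_size
        cap = base_cap + cap_growth_per_page * page_index
        # take the best-ranked result whose domain is still under the cap
        pick = next((r for r in remaining if seen[r["domain"]] < cap), None)
        if pick is None:              # every leftover domain is at its cap
            output.extend(remaining)
            break
        remaining.remove(pick)
        seen[pick["domain"]] += 1
        output.append(pick)
    return output
```

With these made-up parameters, a domain holding the top five raw positions would still surface only twice on the first page, with its remaining pages pushed deeper into the results, which mirrors the behavior Cutts describes.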

In my opinion, you can never read too many opinions and advice columns on how to manage your PPC campaigns. Sure, some may turn out to be full of bad advice, but I believe every bit of information can either guide you to improving your own campaigns, or steer you away from looming mistakes. At the very least, it’s good to see what other people are doing in order to inspire you to come up with your own methods.

With that in mind, how could you pass up Chris Kent’s article at Search Engine Journal, ’10 Golden Rules of AdWords’? It’s loaded with good information. Some of it borders on cliché, such as logging in to your account at least once a day and testing every conceivable moving piece. But even these points have been repeated for a reason: they are important and key to building a successful campaign.

My favorite piece of advice is a suggestion for determining how much to bid on certain keywords; for many, this seems to be a guessing game, which is not good. Also, remember to link your PPC ads to the specific page the ad refers to. Don’t just leave traffic at your doorstep; invite visitors in and put them right where you want them. In other words, bypass your homepage and get users as close to a conversion as you can.

A PPC war has started between Bing and Google and Microsoft Search Network’s GM fired the most recent shots. David Pann has bashed the effectiveness of AdWords Enhanced Campaigns for larger advertisers because of its bundling of desktop and tablet targeting options.

“For smaller advertisers that don’t distinguish between mobile, tablets and PCs Enhanced Campaigns may make sense. But for larger advertisers which understand that their messages must be different according to the device it will be harder and they will have to create workarounds,” Pann said.

Pann has a point, and many independent reviewers have offered essentially the same critique since Google unveiled Enhanced Campaigns.

Take his opinions with a grain of salt, however, considering he works for a direct competitor that just happens to be rolling out its own version of Enhanced Campaigns in the coming months. Pann says Bing’s version will allow users to choose whether to combine mobile and desktop campaigns or to keep them separate. Bing plans to launch the new product in beta sometime before fall and have a full release by the end of summer 2014.

For more, check out Jessica Davies’ article at The Drum.