There has been quite a bit of speculation ever since Matt Cutts publicly stated that Google wouldn’t be updating the PageRank meter in the Google Toolbar before the end of the year. PageRank has been assumed dead for a while, yet Google refuses to issue the death certificate, assuring us they currently have no plans to outright scrap the tool.

Search Engine Land reports that yesterday, speaking at Pubcon, Cutts finally explained what is going on and why there have been no updates. Google’s ability to update the toolbar is actually broken, and repairing the “pipeline” isn’t a major priority by any means. The search engine feels too many marketers obsess over PageRank, a metric Google doesn’t consider very important.

But, Cutts did give some insight as to why Google has been hesitant to completely kill off PageRank or the toolbar. They have consistently maintained they intend to keep the meter around because consumers actually use the tool almost as much as marketers. However, at this point that data is nearly a year out of date, so suggesting consumers are the main motive for keeping PageRank around is disingenuous.

No, it turns out Google actually uses PageRank internally for ranking pages, and the meter has been consistently updated within the company during the entire period the public has been waiting for an update. It is also entirely possible Google likes keeping the toolbar around because it wants the data users are constantly sending back to the search engine.

While the toolbar may be useful for the company internally, PageRank has reached the point where it needs to be updated or removed. Data from a year ago isn’t reliable enough to offer anyone much value, and most browsers have done away with installable toolbars anyway. If a repair isn’t a high enough priority for Google to get to at all this year, it probably isn’t worth leaving the toolbar lingering around forever.

If you have been reading up on SEO, blogging, or content marketing, chances are you’ve been told to “nofollow” certain links. If you’re like most, you probably didn’t quite understand what that means, and you may or may not have followed the advice blindly.

But, even if you’ve been using the nofollow tag for a while, if you don’t understand what it is or how it works you may be hurting yourself as much as you’re helping.

The nofollow tag is how publishers can tell search engines to ignore certain links to other pages. Normally, these links count like votes in favor of the linked content, but in some circumstances this can make search engines think you are abusing optimization or blatantly breaking their guidelines. Nofollowing the right links prevents search engines from thinking you are trying to sell your influence or are involved in link schemes.
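In practice, the tag is just a rel="nofollow" attribute on a link. As a minimal sketch, here is one way a publisher might add it to outbound links they don’t want to vouch for, such as paid or user-submitted links (this assumes Python with the BeautifulSoup library and a hypothetical example.com domain; it is an illustration, not the only way to apply the tag):

```python
# Minimal sketch: add rel="nofollow" to external links in a chunk of HTML.
# Assumes the BeautifulSoup (bs4) library; example.com is a hypothetical domain.
from urllib.parse import urlparse

from bs4 import BeautifulSoup

OWN_DOMAIN = "example.com"

html = """
<p>Read our <a href="/about">about page</a> or visit
<a href="http://sponsor.example.net/deal">our sponsor</a>.</p>
"""

soup = BeautifulSoup(html, "html.parser")
for link in soup.find_all("a", href=True):
    host = urlparse(link["href"]).netloc
    # Internal links stay followed; only external links get the nofollow hint.
    if host and host != OWN_DOMAIN:
        rel = set(link.get("rel", []))
        rel.add("nofollow")
        link["rel"] = sorted(rel)

print(soup)
# The sponsor link now carries rel="nofollow", telling search engines
# not to count it as an endorsement of the linked page.
```

The end result in the page source is simply an anchor tag with rel="nofollow", which search engines read as “don’t pass credit through this link.”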

To help webmasters and content creators understand exactly when to nofollow, and how it affects their online presence, the team from Search Engine Land put together an infographic explaining when and how to use the tag. They also created a comprehensive guide to the tag for those who prefer long walls of text to nice and easy infographics.

Google is always making changes and updates, but it seems like the past couple weeks have been especially crazy for the biggest search engine out there. There have been tons of changes both big and small, but best of all, they seem to all be part of one comprehensive plan with a long term strategy.

Eric Enge sums up all the changes when he says Google is pushing people away from a tactical SEO mindset toward a more strategic and valuable approach. To understand exactly what that means going forward, it is best to review the biggest changes. By seeing what has been revamped, it is easier to make sense of what the future looks like for Google.

1. ‘(Not Provided)’

One of the biggest changes for both searchers and marketers is Google’s move to make all organic searches secure starting in late September. For users, this means more privacy when browsing, but for marketers and website owners it means we are no longer able to see keyword data from most users coming to sites from Google searches.

This means marketers and site-owners are having to deal with a lot less information, or they’re having to work much harder to get it. There are ways to find keyword data, but it’s no longer easily accessible from any Google tool.

This was one of the bigger hits to technical SEO, though there are many workarounds for those looking for them.

2. No PageRank Updates

PageRank has long been a popular tool for many optimizers, but it has also been commonly used by actual searchers to get a general idea of the quality of the sites they visit. However, Google’s Matt Cutts has openly said not to expect another update to the tool this year, and it seems it won’t be available much longer on any platform. The toolbar has never been available on Chrome, and with Internet Explorer revamping how toolbars work on the browser, it seems PageRank is going to be left without a home.

This is almost good news in many ways. PageRank has always been considered a crude measurement tool, so if the tool goes away, many will have to turn to more accurate measurements.

3. Hummingbird

Google’s Hummingbird algorithm seemed minor to most people using the search engine, but it was actually a major overhaul under the hood. Google vastly improved its ability to understand conversational search, which entirely changes how people can search.

The most notable difference with Hummingbird is Google’s ability to contextualize searches. If you search for a popular sporting arena, Google will find you all the information you previously would have expected, but if you then search “who plays there”, you will get results that are contextualized based on your last search. Most won’t find themselves typing these kinds of searches, but for those using their phones and voice capabilities, the search engine just got a lot better.

For marketers, the consequences are a bit heavier. Hummingbird greatly changes the keyword game and has huge implications for the future. With the rise of conversational search, we will see that exact keyword matches become less relevant over time. We probably won’t feel the biggest effects for at least a year, but this is definitely the seed of something huge.

4. Authorship

Authorship isn’t exactly new, but it has become much more important over the past year. As Google is able to recognize the creators of content, they are able to begin measuring which authors are consistently getting strong responses such as likes, comments, and shares. This means Google will be more and more able to filter those who are creating the most valuable content and rank them highest, while those consistently pushing out worthless content will see their clout dropping the longer they fail to actually contribute.

5. In-Depth Articles

Most users are looking for quick answers to their questions and needs with their searches, but Google estimates that “up to 10% of users’ daily information needs involve learning about a broad topic.” To reflect that, they announced a change to search in early August, which would implement results for more comprehensive sources for searches which might require more in-depth information.

What do these all have in common?

These changes may all seem separate and unique, but there is an undeniably huge level of interplay between how all these updates function. Apart, they are all moderate to minor updates. Together, they are a huge change to search as we know it.

We’ve already seen how link building and over-attention to keywords can hurt your optimization when improperly managed, but Google seems keen on devaluing these search factors even more moving forward. Instead, they are opting for signals which offer the most value to searchers. Their search has become more contextual so users can find their answers more easily, no matter how they search. But, the more conversational search becomes, the less rankings depend on exact keywords.

In the future, expect Google to place more and more emphasis on authorship and the value that these publishers are offering to real people. Optimizers will always focus on pleasing Google first and foremost, but Google is trying to synergize these efforts so that your optimization efforts are improving the experience of users as well.

A couple weeks ago, Google released an update directly aimed at the “industry” of websites which host mugshots, which many aptly called The Mugshot Algorithm. It was one of the more specific updates to search in recent history, but was basically meant to target sites aiming to extort money out of those who had been arrested. Google purposefully targeted the sites that were ranking well for names and displayed arrest photos, names, and details.

Seeing how a week went by without response, you could be forgiven for thinking that was the end of the issue, but finally one of the biggest sites affected, Mugshots.com, publicly responded to Google’s update. Barry Schwartz reported Mugshots.com published a blog post in which they claim Google is endangering the safety of Americans.

Mugshots.com was among the three sites that suffered the most from the algorithm, the others being BustedMugshots and JustMugshots.

In their statement, they say, “Google’s decision puts every person potentially at risk who performs a Google search on someone.”

If Mugshots.com could tone down the theatrics, they might have been able to make a reasonable argument. However, they also ignore that there are many other ways for employers and even ordinary citizens to find arrest records and details in less humiliating and more contextualized forms.

Advertisers on Facebook won’t have to go through demand-side platforms (DSPs) to manage their retargeting campaigns for much longer. According to Search Engine Watch, Facebook is creating new retargeting options that won’t force you to go through FBX (Facebook Ad Exchange) or any other platform other than Facebook’s own interface.

Up until now, advertisers using FBX have only been able to serve their ads on desktops within the news feed or right sidebar, and they must buy their ad space through separate DSPs. Considering how many Facebook users are accessing the social media platform via smartphones or tablets, it is surprising it has taken this long for Facebook to allow advertisers to target individuals on mobile devices.

What’s New?

The big new feature will be Custom Audiences, which will allow advertisers to set up their retargeting campaigns directly through Facebook’s interface. That will include the ability to overlay standard Facebook targeting options as well.

The ability to target mobile devices is of course another huge aspect of this update, as it is undeniable that a remarkable percentage of Facebook users are primarily using mobile devices for social media.

What is FBX Still Better At?

FBX still has benefits over the options that will be available through the Facebook interface. The most important of those benefits is predictive buying. If an individual continuously browses for a certain product or type of service, FBX’s predictive buying capabilities allow advertisers to show an ad reflecting that interest.

There have never been more opportunities for local businesses online than now. Search engines cater more and more to local markets as shoppers make more searches from smartphones to inform their purchases. But, in the more competitive markets that also means local marketing has become quite complicated.

Your competitors may be using countless online tactics aimed at ensuring their online success over yours, and to stand a chance you also have to employ a similarly vast set of strategies. When this heats up and online competition begins to grow convoluted, some things get overlooked. The more you have to juggle, the more likely you are to make a serious mistake.

In true Halloween fashion, Search Engine Watch put together the four most terrifying local search mistakes that can frighten off potential customers.

Ignoring the Data Aggregators

A common tactic is to optimize Google+ listings, as well as maybe Yelp, or a few other high-profile local directories. But, why stop there? Google crawls thousands and thousands of sites that contain citations every day, so optimizing only a few listings is missing out on serious opportunities.

To handle this efficiently and optimize the listings most visible to customers, businesses should focus on the data sources Google actually uses to understand local online markets. The best way to do this is to submit business data to the biggest data aggregators, such as Neustar Localeze, InfoUSA, Acxiom, and Factual.

Not Having an Individual Page for Each Business Location

A few years ago Matt Cutts, one of Google’s most respected engineers, said, “if you want your store pages to be found, it’s best to have a unique, easily crawlable URL for each store.” These days organic ranking factors have become much more influential in Google’s method of ranking local businesses, so this advice has become more potent than ever before.

There are also numerous non-ranking reasons you should have an optimized page for each location. If individual locations don’t have their own pages, Google isn’t indexing that content separately, and instead only sees the results offered by a business locator. Think of it like optimizing a product site without product pages. If the results don’t have separate pages, they lose context and usability.
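As a rough illustration of the difference (a sketch only, using Python’s Flask framework with made-up routes and location data), each location gets its own stable, crawlable URL instead of existing only behind a locator search form:

```python
# Sketch: one crawlable URL per store location, rather than a locator-only setup.
# Flask is used here purely for illustration; the routes and data are hypothetical.
from flask import Flask, abort

app = Flask(__name__)

# In practice this data would come from a database or CMS.
LOCATIONS = {
    "austin-tx": {"name": "Austin, TX", "address": "123 Congress Ave"},
    "denver-co": {"name": "Denver, CO", "address": "456 Blake St"},
}

@app.route("/locations/<slug>")
def location_page(slug):
    loc = LOCATIONS.get(slug)
    if loc is None:
        abort(404)
    # Each location page lives at a stable URL such as /locations/austin-tx,
    # with location-specific content Google can index on its own.
    return f"<h1>{loc['name']}</h1><p>{loc['address']}</p>"
```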

Ignoring the Opportunity to Engage Your Customers

Whether you want to face it or not, word of mouth has become more important than ever as consumers talk about businesses online on social media. Each opinion has an exponentially larger audience than ever in history, so a single bad review is seen by hundreds or thousands of potential customers. Thankfully, that one review doesn’t have to be your downfall.

First, if bad reviews get seen by more people, the same can be said for good reviews. If a bad review is an outlier, it might not make such an impact on viewers. But, more importantly, every mention, review, or interaction with your business gives you the opportunity to engage that customer back. If you see a positive mention online, showing gratitude for the remark opens up an entirely new connection with your brand. Similarly, a bad review can be salvaged by simply asking how you can improve that customer’s experience in the future.

Not Using Localized Content

Pretty much every local online marketer has heard about the importance of using the relevant keywords in their content so their website ranks for those terms. But, they tend to only use this logic for the products or types of services they offer.

Local keywords including ZIP codes, neighborhoods, or popular attractions can do as much to help you stand out for important searches as product based keywords can. Simply including information about traffic or directions can help you start ranking for search terms your competitors are missing.

Google’s Carousel may seem new to most searchers, but it has actually been rolling out since June. That means enough time has passed for marketing and search analysts to really start digging in to see what makes the carousel tick.

If you’ve yet to encounter it, the carousel is a black bar filled with listings that runs along the top of the screen for specific searches, especially those that are location based or for local businesses such as hotels and restaurants. The carousel includes images, the businesses’ addresses, and aggregated review ratings all readily available at the top, in an order that seems less hierarchical than the “10 pack” listings previously used for local searches.

Up until now, we’ve only been able to guess how these listings were decided based on surface-level observations. But, this week Digital Marketing Works (DMW) published a study which finally gives us a peek under the hood and shows how businesses may be able to take some control of their place in the carousel. Amanda DiSilvestro explains the process used for the study:

  • They examined more than 4,500 search results in the category of hotels in 47 US cities and made sure that each SERP featured a carousel result.
  • For each of the top 10 hotels found on each search, they collected the name, rating, quantity of reviews, travel time from the hotel to the searched city, and the rank displayed in the carousel.
  • They used four hotel search terms in equal measure—hotels in [city]; best hotels in [city]; downtown [city] hotels; cheap hotels in [city].
  • This earned them nearly 42,000 data points on approximately 19,000 unique hotels.
  • They looked at the correlation between a hotel’s rank in a search result and each of the factors collected above to determine which were the most influential (a rough sketch of this kind of analysis follows the list).
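For readers curious what that final step looks like in practice, here is a rough sketch of a rank-correlation analysis (not DMW’s actual code; the pandas column names and numbers below are made up to mirror the data points described above):

```python
# Rough sketch of correlating carousel position with each collected factor.
# The DataFrame below is hypothetical; Spearman correlation is used because
# carousel position is an ordinal rank.
import pandas as pd

df = pd.DataFrame({
    "carousel_rank":  [1, 2, 3, 4, 5],
    "review_rating":  [4.7, 4.5, 4.2, 3.9, 3.6],
    "review_count":   [980, 640, 420, 210, 95],
    "travel_minutes": [4, 6, 9, 12, 18],
})

# A strongly negative value means higher factor values tend to go with
# better (lower-numbered) carousel positions.
correlations = df.corr(method="spearman")["carousel_rank"].drop("carousel_rank")
print(correlations.sort_values())
```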

Their report goes into detail on many of the smaller factors that play a role, but DMW’s biggest findings were on the four big factors which determine which businesses are shown in the carousel and where they are placed.

1. Google Reviews – The factor that correlated most with the best placement in the carousel was by far Google review ratings. Both quantity and quality of reviews clearly play a big role in Google’s placement of local businesses, and marketers should be sure to pay attention to reviews moving forward. However, it is unclear how Google is handling paid or fake reviews, so many might be inspired to try to rig their reviews. For long-term success, I would suggest otherwise.

2. Location, Location, Location – Seeing as how the Google Carousel seems built around local businesses, it shouldn’t be a surprise that location does matter quite a bit. Of the 1,900 hotels in the study, 50 percent were within 2 miles of the search destination, while 75 percent were within 13 minutes of travel. Businesses would benefit from urging customers to search for specific landmarks or areas of cities, as you never know exactly where Google will establish the city “center”.

3. Search Relevancy and Wording – According to the findings, Google seems to change the weight of different ranking factors depending upon the actual search. For example, searching “downtown [city] hotels” will result in listings with an emphasis on location, while “best hotels in [city]” gives results most dependent on review rankings.

4. Primary Markets and Secondary Markets – It seems both small and larger businesses are on a relatively level playing field when it comes to the carousel. Many small hotels are able to make it into the listings right next to huge chains. The bigger businesses may have more capacity to solicit reviews, but no hotel is too small to be considered for the carousel.

Google made waves last week when they announced the expansion of how “Shared Endorsements” are used in ads, as well as the change to their terms of service to reflect this. The funny thing is, most people don’t understand what is actually changing.

The majority were simply confused when they heard that Google was implementing the use of social information into ads, because that has been going on for about two years now. But, as Danny Sullivan explains, the devil is in the details.

Throughout 2011, Google made changes which allowed advertisers to begin integrating images of people who liked their pages on Google+ into text and display ads. All that really showed was a small profile picture, and the phrase “+1’d this page.”

Starting on November 11, that won’t quite be the case. Ads will show more than simply the people who +1’d a page. For example, if you comment, leave a review, or even follow a particular brand, those types of actions can be shown in ads on Google. A mockup of how it will appear is below.

These changes won’t take place until November, but don’t expect a prompt roll-out. You may start seeing the changes on the 11th, but more likely they will appear gradually over the span of a few days or even a couple of weeks.

Not much else is known about how advertisers will be able to create these types of ads yet. Most likely, Google would not have announced the update this early, except they had to get the terms of service updated before they could even begin to implement this feature.

If you don’t want to appear in any of these types of ads, you can go to this page and click the tickbox at the bottom to opt out of all such ads in the future.

Bing gave people more control over what shows up about them online last week when they partnered with Klout to create Bing Personal Snapshots. Personal Snapshots are an extension of the previously implemented People Snapshots, giving you some say in how you appear within the Snapshot column on Bing.

Bing and other search engines are among the most common ways to find information about people, but those search engines usually gather that information from social media, which isn’t always full of information we want displayed to everyone who searches our names.

These new Personal Snapshots allow you to ensure the information you want displayed is shown while your more personal or embarrassing details can be withheld.

This works by allowing users to sign up for Klout and claim a profile, which Bing will then connect to your social networking profiles. From there, you’ll have some ability to manage your digital appearance and persona. The update will also allow Bing to show your most influential moments from social media within the same bar, along with a verified badge.

This isn’t total control over your online identity, but the change gives more power over your online presence than previously available.

If you don’t have a profile with Klout already, you should be aware that it is a social ranking website which relies on analytics to evaluate individuals’ online influence over social networks.

It seems like everything looks different over at Google these days. Not only has their logo subtly flattened out, but the way we see a significant number of searches has been greatly altered with the introduction of the Google Carousel. Now, AdWords seems to be following suit as reports have started to come in of a new logo and web UI design.

As Search Engine Land reported, Rick Galan tweeted out a screenshot of the new appearance. The logo is now integrated directly into the navigation bar and the green coloring of the bar has been replaced by Google’s widely used desaturated blue-grey.

The new AdWords logo might be signaling a redesign of all Google product logos toward a flatter design, such as what they have done with their flagship logo. Their old logo is below for comparison.

It could also simply be a test, as Google has not released any public statement or announcement about the logo, so much is unclear, especially how long a roll-out might take. No one knows when we will see the change, but don’t be surprised if your AdWords experience looks different in the near future.