If you’ve spent much time online in the past year or two, you’ve almost certainly come across an infographic. The public enjoys them, and they’re educational as well, which is why more companies and content creators than ever are using infographics to communicate and share knowledge. Some may say it is just a trend, but either way the data shows that searches for infographics rose over 800 percent in just two years, from 2010 to 2012.

Even if you don’t know what an infographic is, chances are you have seen one in your Facebook feed, a news article, or maybe even your email. Infographics are images intended to share information, data, or knowledge in a quick and easily comprehensible way. They turn boring information into interesting visuals that not only make the information easier to understand, but also make the average viewer more interested in what is being communicated.

According to Albert Costill, multiple studies have found that 90 percent of the information we retain and remember is based on visual impact. Considering how much information we take in on a day-to-day basis, that means your content should be visually impressive if you want any hope of viewers remembering it. If you’re still unsure about infographics, there are several reasons you should consider at least including them occasionally within your content strategy.

  1. Infographics are naturally more eye-catching than printed words, and a well laid-out infographic will catch viewers’ attention in ways standard text can’t. You’re free to use more images, colors, and even movement, all of which are more immediately visually appealing.
  2. The average online reader tends to scan text rather than reading every single word. Infographics combat this tendency by making viewers more likely to engage with all of the information on the screen, and they also make it easier for those who still scan to find the information most important to them.
  3. Infographics are more easily sharable than most other types of content. Most social networks are image friendly, so users are given two very simple ways to show their friends their favorite infographics. Readers can share a link directly to your site, or they can save the image and share it directly. The more easily content can be shared, the more likely it is to go viral.
  4. Infographics can subliminally help reinforce your brand image, so long as you are consistent. Using consistent colors, shapes, and messages, combined with your logo all work to raise your brand awareness. You can see how well this works when you notice that every infographic relating to Facebook naturally uses “Facebook Blue” and reflects the style of their brand.

Obviously you shouldn’t be putting out an infographic every day. Blog posts still have their place in any content strategy. Plus, if you are creating infographics daily, it is likely their quality will suffer. Treat infographics as a tool that can be reserved for special occasions or pulled out when necessary. With the right balance, you’ll find your infographics can be more powerful and popular than you ever imagined.

Facebook advertisers using the social platform’s API and Power Editor tool have had access to the Custom Audiences ad targeting tool for some time. But many other advertisers haven’t been able to use the targeting tool until now.

Starting yesterday, Facebook has begun rolling out the ad targeting tool to a limited number of US advertisers, with a global roll-out beginning next week. Amy Gesenhues says all advertisers around the world can expect to see the feature by late November.

This is especially of interest to small businesses, who will be able to use their own customer lists to directly reach out to people on Facebook. You will also be able to use MailChimp lists with Custom Audiences for the first time.

Facebook already claims thousands of advertisers are using Custom Audiences, but this will open the door for countless other advertisers to access the feature via Facebook’s ad interface. You will even be able to access the feature from Facebook’s mobile app, assuming you have already uploaded your contacts.

You might not have noticed, but AdWords has been working a little differently since an algorithm update was quietly introduced on Tuesday. For the most part, not much is different, but there is one notable change: ad extensions now work as a factor in determining ad positioning.

The update was announced in a blog post detailing all of its parts. But the big takeaway is that AdWords was updated mainly to take into account the new features that have rolled out over the past year.

Ad extensions now have an effect on how ads are positioned in Google’s search results. To show how this works, Google gave the example of two identical ads with the same bid and quality score. With the update, the ad with extensions is more likely to appear in the higher ad position.

You may also see that a higher quality score, bid, or a combination of both also increases the likelihood of extensions appearing. Ad Rank also plays a similar role in deciding whether extensions appear.
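The interplay Google describes can be sketched with a toy calculation. The real Ad Rank formula is not public, so the function name, the weights, and the way extension impact is folded in below are all illustrative assumptions, not Google’s actual algorithm:

```python
def ad_rank(bid, quality_score, extension_impact=0.0):
    """Toy Ad Rank: bid times quality score, boosted by the expected
    impact of extensions and formats (all weights here are made up)."""
    return bid * quality_score * (1.0 + extension_impact)

# Google's example: two identical ads with the same bid and quality score.
plain = ad_rank(bid=2.00, quality_score=7)
with_ext = ad_rank(bid=2.00, quality_score=7, extension_impact=0.1)

# The ad with extensions edges into the higher position.
print(with_ext > plain)  # True
```

Even in this crude sketch, the hypothetical extension boost is what breaks the tie between otherwise identical ads, which matches the behavior Google describes.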

As for what this means to marketers, Chris Roat, Staff Software Engineer for Google, says that Google expects ads with extensions to perform better and possibly see a lower cost per click with a higher click-through rate:

“You may see lower or higher average CPCs in your account. You may see lower CPCs if your extensions and formats are highly relevant, and we expect a large positive performance impact relative to other competitors in the auction. In other cases, you may see higher CPCs because of an improvement in ad position or increased competition from other ads with a high expected impact from formats.”

There has been quite a bit of speculation ever since Matt Cutts publicly stated that Google wouldn’t be updating the PageRank meter in the Google Toolbar before the end of the year. PageRank has been assumed dead for a while, yet Google refuses to issue the death certificate, assuring us they currently have no plans to outright scrap the tool.

Search Engine Land reports that yesterday, speaking at Pubcon, Cutts finally explained what is going on and why there have been no updates. Google’s ability to update the toolbar is actually broken, and repairing the “pipeline” isn’t a major priority by any means. The search engine already feels that too many marketers are obsessing over PageRank, while Google doesn’t see it as very important.

But, Cutts did give some insight as to why Google has been hesitant to completely kill off PageRank or the toolbar. They have consistently maintained they intend to keep the meter around because consumers actually use the tool almost as much as marketers. However, at this point that data is nearly a year out of date, so suggesting consumers are the main motive for keeping PageRank around is disingenuous.

No, it turns out Google actually uses PageRank internally for ranking pages, and the meter has been consistently updated within the company during the entire period the public has been waiting for an update. It is also entirely possible Google likes keeping the toolbar around because Google wants the data users are constantly sending back to the search engine.

While the toolbar may be useful for the company internally, PageRank has reached the point where it needs to be updated or removed. Data from a year ago isn’t reliable enough to offer anyone much value, and most browsers have done away with installable toolbars anyway. If a repair isn’t a high enough priority for Google to get around to it at all this year, it probably isn’t worth leaving the toolbar lingering around forever.

If you have been reading up on SEO, blogging, or content marketing, chances are you’ve been told to “nofollow” certain links. If you’re like most, you probably didn’t quite understand what that means, and you may or may not have followed the advice blindly.

But, even if you’ve been using the nofollow tag for a while, if you don’t understand what it is or how it works you may be hurting yourself as much as you’re helping.

The nofollow tag is how publishers tell search engines to ignore certain links to other pages. Normally, links count as votes in favor of the linked content, but in some circumstances this can make search engines think you are abusing optimization or blatantly breaking their guidelines. Nofollowing the right links prevents search engines from thinking you are trying to sell your influence or are involved in link schemes.
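Concretely, a nofollowed link is just an anchor tag carrying a rel="nofollow" attribute. As a rough sketch, here is how you might tag outbound links in Python; the regex approach is deliberately simplistic and only for illustration (real pages deserve a proper HTML parser):

```python
import re

def nofollow_links(html):
    """Add rel="nofollow" to every <a> tag that lacks a rel attribute.
    Simplistic regex, for illustration only."""
    return re.sub(r'<a (?![^>]*\brel=)', '<a rel="nofollow" ', html)

link = '<a href="http://example.com/sponsor">Our sponsor</a>'
print(nofollow_links(link))
# <a rel="nofollow" href="http://example.com/sponsor">Our sponsor</a>
```

The search engine still follows the link for users; the attribute simply tells crawlers not to pass any ranking credit through it.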

To help webmasters and content creators understand exactly when to nofollow, and how it affects their online presence, the team from Search Engine Land put together an infographic explaining when and how to use the tag. They also created a comprehensive guide to the tag for those who prefer long walls of text to nice and easy infographics.

Google is always making changes and updates, but it seems like the past couple weeks have been especially crazy for the biggest search engine out there. There have been tons of changes both big and small, but best of all, they seem to all be part of one comprehensive plan with a long term strategy.

Eric Enge sums up all the changes when he says Google is pushing people away from a tactical SEO mindset to a more strategic and valuable approach. To understand exactly what that means going forward, it is best to review the biggest changes. By seeing what has been revamped, it is easier to make sense of what the future looks like for Google.

1. ‘(Not Provided)’

One of the biggest changes for both searchers and marketers is Google’s move to make all organic searches secure starting in late September. For users, this means more privacy when browsing, but for marketers and website owners it means we are no longer able to see keyword data from most users coming to sites from Google searches.

This means marketers and site-owners are having to deal with a lot less information, or they’re having to work much harder to get it. There are ways to find keyword data, but it’s no longer easily accessible from any Google tool.

This was one of the bigger hits for technical SEO, though there are many workarounds for those looking for them.

2. No PageRank Updates

PageRank has long been a popular tool for many optimizers, but it has also been commonly used by actual searchers to get a general idea of the quality of the sites they visit. However, Google’s Matt Cutts has openly said not to expect another update to the tool this year, and it seems it won’t be available much longer on any platform. The toolbar has never been available on Chrome, and with Internet Explorer revamping how toolbars work on the browser, it seems PageRank is going to be left without a home.

This is almost good news in many ways. PageRank has always been considered a crude measurement tool, so if the tool goes away, many will have to turn to more accurate measurements.

3. Hummingbird

Google’s Hummingbird algorithm seemed minor to most people using the search engine, but it was actually a major overhaul under the hood. Google vastly improved its ability to understand conversational search, which entirely changes how people can search.

The most notable difference with Hummingbird is Google’s ability to contextualize searches. If you search for a popular sporting arena, Google will find you all the information you previously would have expected, but if you then search “who plays there”, you will get results that are contextualized based on your last search. Most won’t find themselves typing these kinds of searches, but for those using their phones and voice capabilities, the search engine just got a lot better.

For marketers, the consequences are a bit heavier. Hummingbird greatly changes the keyword game and has huge implications for the future. With the rise of conversational search, we will see that exact keyword matches become less relevant over time. We probably won’t feel the biggest effects for at least a year, but this is definitely the seed of something huge.

4. Authorship

Authorship isn’t exactly new, but it has become much more important over the past year. As Google is able to recognize the creators of content, they are able to begin measuring which authors are consistently getting strong responses such as likes, comments, and shares. This means Google will be more and more able to filter those who are creating the most valuable content and rank them highest, while those consistently pushing out worthless content will see their clout dropping the longer they fail to actually contribute.

5. In-Depth Articles

Most users are looking for quick answers to their questions and needs with their searches, but Google estimates that “up to 10% of users’ daily information needs involve learning about a broad topic.” To reflect that, they announced a change to search in early August, which would implement results for more comprehensive sources for searches which might require more in-depth information.

What do these all have in common?

These changes may all seem separate and unique, but there is an undeniably huge level of interplay between how all these updates function. Apart, they are all moderate to minor updates. Together, they are a huge change to search as we know it.

We’ve already seen how link building and over-attention to keywords can hurt your optimization when improperly managed, but Google seems keen on devaluing these search factors even more moving forward. Instead, they are opting for signals which offer the most value to searchers. Their search has become more contextual so users can find their answers more easily, no matter how they search. But the rankings become less about keywords the more conversational search becomes.

In the future, expect Google to place more and more emphasis on authorship and the value that these publishers are offering to real people. Optimizers will always focus on pleasing Google first and foremost, but Google is trying to synergize these efforts so that your optimization efforts are improving the experience of users as well.

A couple weeks ago, Google released an update directly aimed at the “industry” of websites which host mugshots, which many aptly called The Mugshot Algorithm. It was one of the more specific updates to search in recent history, but was basically meant to target sites aiming to extort money from those who had been arrested. Google purposefully targeted sites that were ranking well for names and displayed arrest photos, names, and details.

After a week went by without response, you could be forgiven for thinking that was the end of the issue, but one of the biggest sites affected, Mugshots.com, has finally responded publicly to Google’s update. Barry Schwartz reported Mugshots.com published a blog post in which they claim Google is endangering the safety of Americans.

Mugshots.com was among the three sites that suffered the most from the algorithm, the others being BustedMugshots and JustMugshots.

In their statement, they say, “Google’s decision puts every person potentially at risk who performs a Google search on someone.”

If Mugshots.com could tone down the theatrics, they might have been able to make a reasonable argument. However, they also ignore that employers and even common citizens have many other ways to find arrest records and details that are less humiliating and better contextualized.

Advertisers on Facebook won’t have to go through demand-side platforms (DSPs) to manage their retargeting campaigns for much longer. According to Search Engine Watch, Facebook is creating new retargeting options that won’t force you to go through FBX (Facebook Ad Exchange) or any other platform other than Facebook’s own interface.

Up until now, advertisers using FBX have only been able to serve their ads on desktops within the news feed or right sidebar, and they must buy their ad space through separate DSPs. Considering how many Facebook users access the social media platform via smartphones or tablets, it is surprising it has taken this long for Facebook to allow advertisers to target individuals on mobile devices.

What’s New?

The big new feature will be Custom Audiences, which will allow advertisers to set up their retargeting campaigns directly through Facebook’s interface. That will include the ability to overlay standard Facebook targeting options as well.

The ability to target mobile devices is of course another huge aspect of this update, as it is undeniable that a remarkable percentage of Facebook users primarily use mobile devices for social media.

What is FBX Still Better At?

FBX still has benefits over the options that will be available through the Facebook interface. The most important of those benefits is predictive buying. If an individual continuously browses for a certain product or type of service, FBX’s predictive buying capabilities allow advertisers to show an ad reflecting that interest.

There have never been more opportunities for local businesses online than now. Search engines cater more and more to local markets as shoppers make more searches from smartphones to inform their purchases. But in the more competitive markets, that also means local marketing has become quite complicated.

Your competitors may be using countless online tactics to ensure their success over yours, and to stand a chance you have to employ a similarly vast set of strategies. When things heat up and online competition grows convoluted, some things get overlooked. The more you have to juggle, the more likely you are to make a serious mistake.

In true Halloween fashion, Search Engine Watch put together the four most terrifying local search mistakes that can frighten off potential customers.

Ignoring the Data Aggregators

A common tactic is to optimize Google+ listings, as well as maybe Yelp, or a few other high-profile local directories. But, why stop there? Google crawls thousands and thousands of sites that contain citations every day, so optimizing only a few listings is missing out on serious opportunities.

To handle this efficiently and optimize the listings most visible to customers, businesses should focus on the data sources Google actually uses to understand local online markets. The best way to do this is to submit business data to the biggest data aggregators, such as Neustar Localeze, InfoUSA, Acxiom, and Factual.

Not Having an Individual Page for Each Business Location

A few years ago Matt Cutts, one of Google’s most respected engineers, said, “if you want your store pages to be found, it’s best to have a unique, easily crawlable URL for each store.” These days organic ranking factors have become much more influential in Google’s method of ranking local businesses, so this advice has become more potent than ever before.

There are also numerous non-ranking reasons you should have an optimized page for each location. If your locations don’t have individual pages, Google isn’t indexing that content separately, and instead only sees the results offered in a business locator. Think of it like optimizing a product site without product pages. If the locations don’t have separate pages, they lose context and usability.
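Cutts’ advice boils down to giving every store its own crawlable URL. A minimal sketch of generating those URLs in Python (the domain, path scheme, and slug rules here are assumptions for illustration, not a prescribed structure):

```python
def location_url(base, city, state):
    """Build a unique, crawlable URL for one store location."""
    slug = f"{city} {state}".lower().replace(" ", "-")
    return f"{base}/locations/{slug}"

# Each location gets its own indexable page, not a shared store locator.
print(location_url("http://example.com", "Tulsa", "OK"))
# http://example.com/locations/tulsa-ok
```

Because each slug is unique, every location page can carry its own address, hours, and local content for search engines to index separately.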

Ignoring the Opportunity to Engage Your Customers

Whether you want to face it or not, word of mouth has become more important than ever as consumers talk about businesses on social media. Each opinion has an exponentially larger audience than at any time in history, so a single bad review is seen by hundreds or thousands of potential customers. Thankfully, that one review doesn’t have to be your undoing.

First, if bad reviews get seen by more people, the same can be said for good reviews. If a bad review is an outlier, it might not make much of an impact on viewers. But more importantly, every review, mention, or interaction with your business gives you the opportunity to engage customers in return. If you see a positive mention online, showing gratitude for the remark opens up an entirely new connection with your brand. Similarly, a bad review can be salvaged by simply asking how you can improve the customer’s experience in the future.

Not Using Localized Content

Pretty much every local online marketer has heard about the importance of using the relevant keywords in their content so their website ranks for those terms. But, they tend to only use this logic for the products or types of services they offer.

Local keywords including ZIP codes, neighborhoods, or popular attractions can do as much to help you stand out for important searches as product based keywords can. Simply including information about traffic or directions can help you start ranking for search terms your competitors are missing.

Google’s Carousel may seem new to most searchers, but it has actually been rolling out since June. That means enough time has passed for marketing and search analysts to really start digging in to see what makes the carousel tick.

If you’ve yet to encounter it, the carousel is a black bar filled with listings that runs along the top of the screen for specific searches, especially those that are location based or for local businesses such as hotels and restaurants. The carousel includes images, the businesses’ addresses, and aggregated review ratings all readily available at the top, in an order that seems less hierarchical than the “10 pack” listings previously used for local searches.

Up until now, we’ve only been able to guess how these listings were decided based on surface-level observations. But this week Digital Marketing Works (DMW) published a study which finally gives us a peek under the hood and shows how businesses may be able to take some control of their place in the carousel. Amanda DiSilvestro explains the process used for the study:

  • They examined more than 4,500 search results in the category of hotels in 47 US cities and made sure that each SERP featured a carousel result.
  • For each of the top 10 hotels found on each search, they collected the name, rating, quantity of reviews, travel time from the hotel to the searched city, and the rank displayed in the carousel.
  • They used four hotel search terms equally—hotels in [city]; best hotels in [city]; downtown [city] hotels; cheap hotels in [city].
  • This earned them nearly 42,000 data points on approximately 19,000 unique hotels.
  • They looked at the correlation between a hotel’s rank in a search result and each of the factors collected above to determine which were the most influential.

Their report goes into detail on many of the smaller factors that play a role, but DMW’s biggest findings were on the four big factors which determine which businesses are shown in the carousel and where they are placed.
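The kind of rank-versus-factor correlation analysis DMW describes can be sketched in a few lines of Python. The hotel numbers below are made up purely for illustration; DMW’s real dataset is only summarized above:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical hotels: Google review rating vs. carousel position (1 = best).
ratings = [4.8, 4.5, 4.1, 3.9, 3.2]
positions = [1, 2, 3, 4, 5]

# A strongly negative value: higher ratings go with better (lower) positions.
print(round(pearson(ratings, positions), 2))
```

A correlation near -1 here would suggest review ratings track carousel placement closely, which is the shape of the relationship DMW reports for Google reviews.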

1. Google Reviews – The factor that correlated most with the best placement in the carousel was by far Google review ratings. Both quantity and quality of reviews clearly play a big role in Google’s placement of local businesses, and marketers should be sure to pay attention to reviews moving forward. However, it is unclear how Google is handling paid or fake reviews, so many might be inspired to try to rig their reviews. For long-term success, I would suggest otherwise.

2. Location, Location, Location – Seeing as how the Google Carousel seems built around local businesses, it shouldn’t be a surprise that location matters quite a bit. Of the roughly 19,000 hotels in the study, 50 percent were within 2 miles of the search destination, while 75 percent were within 13 minutes of travel. Businesses would benefit from urging customers to search for specific landmarks or areas of cities, as you never know exactly where Google will establish the city “center”.

3. Search Relevancy and Wording – According to the findings, Google seems to change the weight of different ranking factors depending upon the actual search. For example, searching “downtown [city] hotels” will result in listings with an emphasis on location, while “best hotels in [city]” gives results most dependent on review rankings.

4. Primary Markets and Secondary Markets – It seems both small and larger businesses are on a relatively flat playing field when it comes to the carousel. Many small hotels are able to make it into the listings, right next to huge chains. The bigger businesses may have more capabilities to solicit reviews, but no hotel is too small to be considered for the carousel.