Now that the dust has settled after some extended debate, it seems clear that responsive design is here to stay. It won’t last forever, but it certainly isn’t a flashy trend that is going to fade away soon. It makes sense that responsive design caught on the way it has: it makes designing for the multitude of devices used to access the internet easier than ever before.

Nearly as many people are accessing the internet right this moment on smartphones and tablets as on desktops, but they aren’t all using the same devices. A website designed to look great on a desktop won’t look good on a smartphone, and similarly, a site designed to work well on the newest iPhone won’t deliver the same results on a Galaxy Note 3.

This problem has two feasible solutions for designers. Either you can design multiple versions of a website, so that there is a workable option for smartphones, tablets, and desktops, or you can create a responsive website which will look good on every device. Both options require you to test your site on numerous devices to ensure it actually works great across the board, but a responsive site means you only have to actually design one site. The rest of the work is in the tweaking to optimize the site for individual devices.
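At its core, a responsive site is one set of markup whose layout adapts through CSS media queries. A minimal sketch of the idea (the class names and the 600px breakpoint are illustrative, not taken from any particular framework):

```html
<!-- Tell mobile browsers to use the device width instead of a zoomed-out desktop viewport -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  /* Default: two columns side by side for wide screens */
  .column { float: left; width: 50%; }

  /* On narrow screens, stack the columns full-width instead */
  @media (max-width: 600px) {
    .column { float: none; width: 100%; }
  }
</style>

<div class="column">Main content</div>
<div class="column">Sidebar</div>
```

The same HTML serves every device; only the CSS presentation changes, which is what makes the per-device work "tweaking" rather than redesigning.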

That all explains why designers love responsive design as a solution to the rapidly expanding range of browsing devices, but our designs have to please other people as well. Thankfully, responsive design has benefits for everyone involved. It is even great for search engine optimization, which is unusual, since design and optimization rarely pull in the same direction. Saurabh Tyagi explains how responsive design benefits SEO as much as it does consumers.

Google Favors Responsive Sites

SEO professionals spend a lot of their time and effort simply trying to appease the Google Gods, following the current best practices while also managing to outplay their competition. Google has officially included responsive design in its best practice guidelines, as well as issuing public statements calling for websites to adopt the strategy, so naturally SEOs have come to love it.

One of the biggest reasons Google loves responsive sites is that it allows websites to use the same URL for a mobile site as they do for a desktop site, instead of redirecting users. A site with separate URLs will have a harder time gaining in the rankings than one with a single functional URL.
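For comparison, Google’s documented setup for sites that do keep separate mobile URLs requires annotations linking the two versions together (example.com and m.example.com below are placeholders):

```html
<!-- On the desktop page (http://example.com/page): point to the mobile version -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="http://m.example.com/page">

<!-- On the mobile page (http://m.example.com/page): point back to the desktop version -->
<link rel="canonical" href="http://example.com/page">
```

A responsive site skips this bookkeeping entirely, since there is only one URL to rank.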

Improves the Bounce Rate

Getting users to stay on your page is actually easier than you might think. If you represent yourself honestly to search engines, and offer a functional, readable, and generally enjoyable website, users that click on your page are likely to stay there. By ensuring your website is functional and enjoyable on nearly every device, you ensure users are less likely to hit the back button.

Save on SEO

Having a separate mobile site from your desktop site means double the SEO work. Optimization is neither cheap, fast, nor easy, so it doesn’t make sense to spend all that extra time and work on what are essentially duplicate efforts. Instead of optimizing two sites, a responsive website lets SEOs put everything into one site, saving you money and providing a more focused optimization effort.

Avoids Duplicate Content

When you’re running two sites for the same business, it is highly likely you will eventually place duplicate content on one of them by accident. If this becomes a regular problem, you can expect penalties from search engines that could easily be avoided by having just one site. Responsive design also makes it easier to direct users to the right content. One of Google’s biggest mobile pet peeves of the moment is the practice of redirecting mobile users to the front page of the mobile site, rather than to the mobile version of the content they asked for. Responsive design avoids these types of issues altogether.

If you’ve spent much time online in the past year or two, it is almost certain you’ve come across an infographic. They are highly enjoyed by the public, as well as being educational. This is why more companies and content creators are using infographics to communicate and share knowledge with the public than ever before. Some may say it is just a trend, but either way the data shows that searches for infographics have risen over 800 percent in just two years, from 2010 to 2012.

Even if you don’t know what an infographic is, chances are you have seen one in your Facebook feed, a news article, or maybe even your email. Infographics are images intended to share information, data, or knowledge in a quick and easily comprehensible way. They turn boring information into interesting visuals which not only make the information easier to understand, but also make the average viewer more interested in what is being communicated.

According to Albert Costill, multiple studies have found that 90 percent of the information we retain and remember is based on visual impact. Considering how much information we take in on a day-to-day basis, your content needs to be visually impressive if you want any hope of viewers remembering it. If you’re still unsure about infographics, there are several reasons you should consider at least including them occasionally in your content strategy.

  1. Infographics are naturally more eye-catching than printed words, and a well laid-out infographic will catch viewers’ attention in ways standard text can’t. You’re free to use more images, colors, and even movement, all of which are more immediately visually appealing.
  2. The average online reader tends to scan text rather than reading every single word. Infographics combat this tendency by making viewers more likely to engage all of the information on the screen, but they also make it easier for those who still scan to find the information most important to them.
  3. Infographics are more easily sharable than most other types of content. Most social networks are image friendly, so users are given two very simple ways to show their friends their favorite infographics. Readers can share a link directly to your site, or they can save the image and share it directly. The more easily content can be shared, the more likely it is to go viral.
  4. Infographics can subliminally help reinforce your brand image, so long as you are consistent. Using consistent colors, shapes, and messages, combined with your logo all work to raise your brand awareness. You can see how well this works when you notice that every infographic relating to Facebook naturally uses “Facebook Blue” and reflects the style of their brand.

Obviously you shouldn’t be putting out an infographic every day. Blog posts still have their place in any content strategy. Plus, if you are creating infographics daily, it is likely their quality will suffer. Treat infographics as a tool that can be reserved for special occasions or pulled out when necessary. With the right balance, you’ll find your infographics can be more powerful and popular than you ever imagined.

There has been quite a bit of speculation ever since Matt Cutts publicly stated that Google wouldn’t be updating the PageRank meter in the Google Toolbar before the end of the year. PageRank has been assumed dead for a while, yet Google refuses to issue the death certificate, assuring us it currently has no plans to outright scrap the tool.

Search Engine Land reports that yesterday, Cutts finally explained what is going on and why there have been no updates while speaking at Pubcon. Google’s ability to update the toolbar is actually broken, and repairing the “pipeline” isn’t a major priority by any means. The search engine already feels that too many marketers are obsessing too much over PageRank, while Google doesn’t see it as very important.

But, Cutts did give some insight as to why Google has been hesitant to completely kill off PageRank or the toolbar. They have consistently maintained they intend to keep the meter around because consumers actually use the tool almost as much as marketers. However, at this point that data is nearly a year out of date, so suggesting consumers are the main motive for keeping PageRank around is disingenuous.

No, it turns out Google actually uses PageRank internally for ranking pages, and the meter has been consistently updated within the company during the entire period the public has been waiting for an update. It is also entirely possible Google likes keeping the toolbar around because Google wants the data users are constantly sending back to the search engine.

While the toolbar may be useful for the company internally, PageRank has reached the point where it needs to be updated or removed. Data from a year ago isn’t reliable enough to offer anyone much value, and most browsers have done away with installable toolbars anyway. If a repair isn’t a high enough priority for Google to get around to it at all this year, it probably isn’t worth leaving the toolbar lingering around forever.

If you have been reading up on SEO, blogging, or content marketing, chances are you’ve been told to “nofollow” certain links. If you’re like most, you probably didn’t quite understand what that means, and you may or may not have followed the advice blindly.

But, even if you’ve been using the nofollow tag for a while, if you don’t understand what it is or how it works you may be hurting yourself as much as you’re helping.

The nofollow tag is how publishers tell search engines to ignore certain links to other pages. Normally, links count as something like votes in favor of the linked content, but in some circumstances this can make search engines think you are abusing optimization or blatantly breaking their guidelines. Nofollowing the right links prevents search engines from thinking you are trying to sell your influence or are involved in link schemes.
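Applying the tag is as simple as adding a rel attribute to a link; the URL below is a placeholder:

```html
<!-- This link will not pass ranking credit to the destination -->
<a href="http://example.com/sponsor" rel="nofollow">Our sponsor</a>

<!-- Page-wide alternative: tell search engines not to follow any link on this page -->
<meta name="robots" content="nofollow">
```

The per-link form is what you want for individual paid or untrusted links; the page-wide meta tag is a much blunter instrument.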

To help webmasters and content creators understand exactly when to nofollow, and how it affects their online presence, the team from Search Engine Land put together an infographic explaining when and how to use the tag. They also created a comprehensive guide to the tag for those who prefer long walls of text to nice and easy infographics.

Google is always making changes and updates, but it seems like the past couple weeks have been especially crazy for the biggest search engine out there. There have been tons of changes both big and small, but best of all, they seem to all be part of one comprehensive plan with a long term strategy.

Eric Enge sums up all the changes when he says Google is pushing people away from a tactical SEO mindset toward a more strategic and valuable approach. To understand exactly what that means going forward, it is best to review the biggest changes. By seeing what has been revamped, it is easier to make sense of what the future looks like for Google.

1. ‘(Not Provided)’

One of the biggest changes for both searchers and marketers is Google’s move to make all organic searches secure starting in late September. For users, this means more privacy when browsing, but for marketers and website owners it means we are no longer able to see keyword data from most users coming to sites from Google searches.

This means marketers and site-owners are having to deal with a lot less information, or they’re having to work much harder to get it. There are ways to find keyword data, but it’s no longer easily accessible from any Google tool.

This was one of the bigger hits to technical SEO, though there are many workarounds for those looking for them.

2. No PageRank Updates

PageRank has long been a popular tool for many optimizers, but it has also been commonly used by actual searchers to get a general idea of the quality of the sites they visit. However, Google’s Matt Cutts has openly said not to expect another update to the tool this year, and it seems it won’t be available much longer on any platform. The toolbar has never been available on Chrome, and with Internet Explorer revamping how toolbars work on the browser, it seems PageRank is going to be left without a home.

This is almost good news in many ways. PageRank has always been considered a crude measurement tool, so if the tool goes away, many will have to turn to more accurate measurements.

3. Hummingbird

Google’s Hummingbird algorithm seemed minor to most people using the search engine, but it was actually a major overhaul under the hood. Google vastly improved its ability to understand conversational search, which entirely changes how people can search.

The most notable difference with Hummingbird is Google’s ability to contextualize searches. If you search for a popular sporting arena, Google will find you all the information you previously would have expected, but if you then search “who plays there”, you will get results that are contextualized based on your last search. Most won’t find themselves typing these kinds of searches, but for those using their phones and voice capabilities, the search engine just got a lot better.

For marketers, the consequences are a bit heavier. Hummingbird greatly changes the keyword game and has huge implications for the future. With the rise of conversational search, we will see that exact keyword matches become less relevant over time. We probably won’t feel the biggest effects for at least a year, but this is definitely the seed of something huge.

4. Authorship

Authorship isn’t exactly new, but it has become much more important over the past year. As Google is able to recognize the creators of content, they are able to begin measuring which authors are consistently getting strong responses such as likes, comments, and shares. This means Google will be more and more able to filter those who are creating the most valuable content and rank them highest, while those consistently pushing out worthless content will see their clout dropping the longer they fail to actually contribute.
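At the time of writing, authorship is established by linking content to a Google+ profile; a minimal sketch (the profile URL and author name below are placeholders, not a real account):

```html
<!-- In the page head: claim this article for a specific Google+ profile -->
<link rel="author" href="https://plus.google.com/112233445566778899000">

<!-- Or inline, in a byline -->
<a href="https://plus.google.com/112233445566778899000" rel="author">Jane Doe</a>
```

Either form lets Google associate the page with the author, which is the data it needs to start measuring who consistently earns strong responses.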

5. In-Depth Articles

Most users are looking for quick answers to their questions and needs with their searches, but Google estimates that “up to 10% of users’ daily information needs involve learning about a broad topic.” To reflect that, in early August they announced a change to search that surfaces results from more comprehensive sources for searches that may require in-depth information.
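Google’s announcement suggested that sites can improve their eligibility for these in-depth results with schema.org Article markup (alongside authorship and proper pagination); a hedged sketch using microdata, with placeholder values:

```html
<article itemscope itemtype="http://schema.org/Article">
  <h1 itemprop="headline">A Comprehensive Look at Responsive Design</h1>
  <meta itemprop="datePublished" content="2013-11-01">
  <div itemprop="articleBody">
    <p>The full text of the in-depth piece goes here...</p>
  </div>
</article>
```

The markup doesn’t guarantee inclusion; it just makes it easier for Google to recognize a page as a comprehensive article.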

What do these all have in common?

These changes may all seem separate and unique, but there is an undeniably huge level of interplay between how all these updates function. Apart, they are all moderate to minor updates. Together, they are a huge change to search as we know it.

We’ve already seen how link building and over-attention to keywords can hurt your optimization when improperly managed, but Google seems keen on devaluing these ranking factors even further moving forward. Instead, they are opting for signals which offer the most value to searchers. Their search has become more contextual so users can find their answers more easily, no matter how they search. But the more conversational search becomes, the less the rankings are about keywords.

In the future, expect Google to place more and more emphasis on authorship and the value that these publishers are offering to real people. Optimizers will always focus on pleasing Google first and foremost, but Google is trying to synergize these efforts so that your optimization efforts are improving the experience of users as well.

A couple weeks ago, Google released an update directly aimed at the “industry” of websites which host mugshots, which many aptly called The Mugshot Algorithm. It was one of the more specific updates to search in recent history, but was basically meant to target sites aiming to extort money from those who had been arrested. Google purposefully targeted sites that were ranking well for people’s names and displayed arrest photos, names, and details.

After a week went by without a response, you could be forgiven for thinking that was the end of the issue, but one of the biggest sites affected, Mugshots.com, has finally responded publicly to Google’s update. Barry Schwartz reported that Mugshots.com published a blog post in which it claims Google is endangering the safety of Americans.

Mugshots.com was among the three sites that suffered the most from the algorithm, alongside BustedMugshots and JustMugshots.

In their statement, they say, “Google’s decision puts every person potentially at risk who performs a Google search on someone.”

If Mugshots.com could tone down the theatrics, it might have been able to make a reasonable argument. However, the site also ignores that there are many other ways for employers and even ordinary citizens to find arrest records and details through less humiliating and better contextualized channels.

There have never been more opportunities for local businesses online than now. Search engines cater more and more to local markets as shoppers make more searches from smartphones to inform their purchases. But in the more competitive markets, that also means local marketing has become quite complicated.

Your competitors may be using countless online tactics aiming to ensure their success over yours, and to stand a chance you also have to employ a similarly vast set of strategies. When this heats up and online competition grows convoluted, some things get overlooked. The more you have to juggle, the more likely you are to make a serious mistake.

In true Halloween fashion, Search Engine Watch put together the four most terrifying local search mistakes that can frighten off potential customers.

Ignoring the Data Aggregators

A common tactic is to optimize Google+ listings, as well as maybe Yelp, or a few other high-profile local directories. But, why stop there? Google crawls thousands and thousands of sites that contain citations every day, so optimizing only a few listings is missing out on serious opportunities.

The most efficient way to handle this and optimize the listings most visible to customers is to focus on the data sources that Google actually uses to understand local online markets. The best way to do that is to submit business data to the biggest data aggregators, such as Neustar Localeze, InfoUSA, Acxiom, and Factual.

Not Having an Individual Page for Each Business Location

A few years ago Matt Cutts, one of Google’s most respected engineers, said, “if you want your store pages to be found, it’s best to have a unique, easily crawlable URL for each store.” These days organic ranking factors have become much more influential in Google’s method of ranking local businesses, so this advice has become more potent than ever before.

There are also numerous non-ranking reasons you should have an optimized page for each location. If individual locations don’t have their own pages, Google isn’t indexing that content separately; it only sees the results offered in a business locator. Think of it like optimizing a product site without product pages. When results don’t have separate pages, they lose context and usability.

Ignoring the Opportunity to Engage Your Customers

Whether you want to face it or not, word of mouth has become more important than ever as consumers talk about businesses on social media. Each opinion has an exponentially larger audience than ever before in history, so a single bad review is seen by hundreds or thousands of potential customers. Thankfully, that one review doesn’t have to be your undoing.

First, if bad reviews are seen by more people, the same can be said for good reviews. If a bad review is an outlier, it might not make much of an impact on viewers. More importantly, every mention, review, or interaction with your business gives you the opportunity to engage back. If you see a positive mention online, showing gratitude for the remark opens up an entirely new connection with your brand. Similarly, a bad review can be salvaged by simply asking the customer what you can do to improve their experience in the future.

Not Using Localized Content

Pretty much every local online marketer has heard about the importance of using the relevant keywords in their content so their website ranks for those terms. But, they tend to only use this logic for the products or types of services they offer.

Local keywords including ZIP codes, neighborhoods, or popular attractions can do as much to help you stand out for important searches as product based keywords can. Simply including information about traffic or directions can help you start ranking for search terms your competitors are missing.

Google’s Carousel may seem new to most searchers, but it has actually been rolling out since June. That means enough time has passed for marketing and search analysts to really start digging in to see what makes the carousel tick.

If you’ve yet to encounter it, the carousel is a black bar filled with listings that runs along the top of the screen for specific searches, especially those that are location based or for local businesses such as hotels and restaurants. The carousel includes images, the businesses’ addresses, and aggregated review ratings all readily available at the top, in an order that seems less hierarchical than the “10 pack” listings previously used for local searches.

Up until now, we’ve only been able to guess how these listings were decided based on surface-level observations. But this week Digital Marketing Works (DMW) published a study which finally gives us a peek under the hood and shows how businesses may be able to take some control of their place in the carousel. Amanda DiSilvestro explains the process used for the study:

  • They examined more than 4,500 search results in the category of hotels in 47 US cities and made sure that each SERP featured a carousel result.
  • For each of the top 10 hotels found on each search, they collected the name, rating, quantity of reviews, travel time from the hotel to the searched city, and the rank displayed in the carousel.
  • They used four hotel search terms in equal measure: hotels in [city]; best hotels in [city]; downtown [city] hotels; cheap hotels in [city].
  • This earned them nearly 42,000 data points on approximately 19,000 unique hotels.
  • They looked at the correlation between a hotel’s rank in a search result and each of the factors collected above to determine which were the most influential.

Their report goes into detail on many of the smaller factors that play a role, but DMW’s biggest findings were on the four big factors which determine which businesses are shown in the carousel and where they are placed.

1. Google Reviews – The factor that correlated most strongly with the best placement in the carousel was by far Google review ratings. Both the quantity and quality of reviews clearly play a big role in Google’s placement of local businesses, and marketers should be sure to pay attention to reviews moving forward. However, it is unclear how Google is handling paid or fake reviews, so many might be tempted to rig their reviews. For long-term success, I would suggest otherwise.

2. Location, Location, Location – Seeing as how the Google Carousel seems built around local businesses, it shouldn’t be a surprise that location matters quite a bit. Of the roughly 19,000 hotels in the study, 50 percent were within 2 miles of the search destination, while 75 percent were within 13 minutes of travel. Businesses would benefit from urging customers to search for specific landmarks or areas of cities, as you never know exactly where Google will establish the city “center”.

3. Search Relevancy and Wording – According to the findings, Google seems to change the weight of different ranking factors depending upon the actual search. For example, searching “downtown [city] hotels” will result in listings with an emphasis on location, while “best hotels in [city]” gives results most dependent on review rankings.

4. Primary Markets and Secondary Markets – It seems both small and larger businesses are on a relatively flat playing field when it comes to the carousel. Many small hotels are able to make it into the listings, right next to huge chains. The bigger businesses may have more capabilities to solicit reviews, but no hotel is too small to be considered for the carousel.

Bing gave people more control over what shows up about them online last week when it partnered with Klout to create Bing Personal Snapshots. Personal Snapshots are an extension of the previously implemented People Snapshots, giving you some say in how you appear within the Snapshot column on Bing.

Bing and other search engines are among the most common ways to find information about people, but those search engines usually gather that information from social media, which isn’t always full of details we want displayed to everyone who searches our names.

These new Personal Snapshots allow you to ensure the information you want displayed is shown, while your more personal or embarrassing details can be withheld.

This works by allowing users to sign up for Klout and claim a profile, which Bing will then connect to your social networking profiles. From there, you’ll have some ability to manage your digital appearance and persona. The update will also allow Bing to show your most influential moments from social media within the same bar, along with a verified badge.

This isn’t total control over your online identity, but the change gives more power over your online presence than previously available.

If you don’t have a profile with Klout already, you should be aware that it is a social ranking website which relies on analytics to evaluate individuals’ online influence over social networks.

Leave it to Matt Cutts to always be there to clear the air when an issue is causing webmasters confusion. One webmaster, Peter, asked Matt Cutts whether geo-detection techniques are actually against Google’s policies, as it is common for websites to be designed so that users are given the information (price, USPs) most relevant to them based on geo-location.

In some understandings of Google’s policies, this may be against the rules, but it turns out all is fine, so long as you avoid one issue.

In one of his Webmaster Chat videos, Cutts explained that directing users to a version of a site, or delivering specific information based on location are not spammy or against Google’s policies. It only makes sense to offer viewers information that actually applies to their lives.

What Google does consider spam is directing its crawlers, or GoogleBot, to a page of content that users cannot see. Sending GoogleBot to a different location than what visitors see is a bad idea, and is considered spam or a form of cloaking. Instead, treat GoogleBot as you would any user: check the location information and send the crawler to the normal page reflecting that data.