No matter how bad a shape your website is in, Google will crawl it. Google crawls and indexes seemingly the entire internet. Though we know they may not look as deeply into low-quality websites, that doesn’t mean they haven’t at least crawled and indexed the landing page. It takes something truly special to keep Google from crawling and indexing a page, but there are two common mistakes that can actually manage to keep Google away.

Technical SEO is one of the most difficult aspects of optimization to grasp, but if you are making these two simple mistakes, they can keep search engines, especially Google, from correctly indexing your website. If your site isn’t getting correctly indexed, you have absolutely no chance of ranking well. Until you fix the problem, your site is going to be severely crippled, so it is imperative that you don’t ignore these issues.

1. The 301 Redirects on Your Website Are Broken

It is a commonly accepted practice to use 301 redirects after a website redesign. As Free-SEO-News mentioned in their latest newsletter, using these redirects properly allows you to retain the ranking equity you’ve built with your website, rather than having to start again from the bottom.

The problem comes when these 301 redirects aren’t implemented properly. Even worse, redirects that work properly can suddenly falter, so you can’t place your faith in them working correctly forever. Code changes, new plugins, or broken databases can cause your working 301s to begin pointing to nonexistent pages.

Broken links are an automatic wrecking ball to all your efforts building a solid link portfolio. The best way to ensure that all your links are working is to download a website audit tool, such as SEOprofiler, which automatically checks all of your links and redirects. If your links or redirects suddenly stop working, you will be warned before you start getting punished by the search engines.
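
If you want to spot-check things yourself between full audits, even a small script can catch the most obvious breakage. Here is a rough sketch (assuming Python and the third-party requests library; the old URLs listed are hypothetical) that follows each redirect chain and flags any that end in an error:

```python
# A rough sketch: follow each old URL's redirect chain and flag broken ones.
# Assumes the third-party "requests" library; the URLs listed are hypothetical.
import requests

OLD_URLS = [
    "http://www.example.com/old-page",
    "http://www.example.com/old-category/old-post",
]

for url in OLD_URLS:
    try:
        # allow_redirects=True follows the whole 301/302 chain for us.
        response = requests.get(url, allow_redirects=True, timeout=10)
    except requests.RequestException as error:
        print(f"{url} -> request failed: {error}")
        continue

    chain = " -> ".join(step.url for step in response.history)
    if response.status_code >= 400:
        # The redirect chain resolved to a missing or broken page.
        print(f"BROKEN: {url} ({chain or 'no redirect'}) ends in {response.status_code}")
    elif not response.history:
        print(f"NO REDIRECT: {url} answered {response.status_code} directly")
    else:
        print(f"OK: {url} -> {response.url} ({response.status_code})")
```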

2. Rel=canonical Attributes Are Causing Problems

Just as with 301 redirects, the rel=canonical attribute serves a legitimate purpose when used correctly. The attribute can help you avoid problems with duplicate content, but those using the tag without knowing what they are doing can find themselves with some major issues.

Two of the biggest faux pas that we regularly see site owners commit are adding a rel=canonical attribute that points to the index page on every web page, and pointing the attribute at pages that use the ‘noindex’ attribute. In both scenarios, Google won’t index the affected web pages at all.

The best advice is to simply stay away from the rel=canonical attribute unless you are absolutely sure of what you’re doing. The only proper place to use the attribute is on duplicate pages; using it anywhere else will result in significant problems. The problems that come from using the attribute incorrectly are much worse than those you might see by failing to use the tag on duplicate pages.
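
If you do use the attribute and want a sanity check, something along these lines can catch both of the mistakes above. This is only a rough sketch: it assumes the requests library, the URLs are hypothetical, and the regexes are deliberately crude stand-ins for a proper HTML parser.

```python
# A rough sketch: spot-check rel=canonical tags on a few pages.
# A correct canonical on a duplicate page looks like this in the <head>:
#   <link rel="canonical" href="https://www.example.com/the-original-page/">
# Assumes the "requests" library; the regexes are crude and the URLs hypothetical.
import re
import requests

HOMEPAGE = "https://www.example.com/"
PAGES = [
    "https://www.example.com/products?sort=price",
    "https://www.example.com/blog/some-post/",
]

CANONICAL_RE = re.compile(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']', re.I)
NOINDEX_RE = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex', re.I)

for page in PAGES:
    html = requests.get(page, timeout=10).text
    match = CANONICAL_RE.search(html)
    if not match:
        print(f"{page}: no canonical tag (fine, unless the page is a duplicate)")
        continue
    target = match.group(1)
    if target.rstrip("/") == HOMEPAGE.rstrip("/") and page.rstrip("/") != HOMEPAGE.rstrip("/"):
        # Mistake 1: every page canonicalized to the index page.
        print(f"{page}: canonical points at the homepage -- this page won't be indexed")
    elif NOINDEX_RE.search(requests.get(target, timeout=10).text):
        # Mistake 2: canonical points at a noindexed page.
        print(f"{page}: canonical target {target} is noindexed -- this page won't be indexed")
    else:
        print(f"{page}: canonical -> {target} looks sane")
```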

You may remember that Google recently started testing large banner ads on branded searches. It raised quite a stir in the online community, mostly because it seemed that Google blatantly broke an older promise to never show banner ads. But, Bing is taking branded search result ads to the next level.

Larry Kim reports that last week, at the Bing Ads Next conference, Bing Ads announced their new ad format for exact match keyword searches, specifically those done within the latest Windows 8 update. Instead of a relatively small banner ad, Bing Ads are rolling out Bing Hero Ads, a full, landing-page-like layout that aggressively promotes the exact brand.

Just as with Google’s banner ads, Bing Hero Ads are only starting with a small number of prominent brand advertisers, such as Disney, Home Depot, Land Rover, and Volkswagen. It will also be a while before you can expect to see Hero Ads on your average search. For the moment, they are only appearing in a small selection of searches done in Windows 8.1 within the US.

It will be interesting to see how the public reacts to these types of branded semi-landing pages. Google’s banner ads looked fairly customized for each brand, and only took up a relatively small amount of on-page real estate. A full-page ad experience for exact match branded searches may be welcomed as a quick and efficient way to connect with the brand searchers are looking for. It is also possible that consumers will be turned off by the seemingly uniform ad experience.

The one clear advantage Bing’s Hero Ads have over Google’s banner ads is their ability to deep link directly to a larger number of pages on a site. They offer links such as “contact us”, “find a store”, and “request a quote”, which speed up users’ experiences and allow them to convert more quickly.

Last week, Apple announced their new iPads: the iPad Air, a thinner, lighter, and more powerful version of their full-size tablet, as well as an updated iPad mini with Retina Display. The broad public response to Apple’s latest products seems to be underwhelming, but that doesn’t seem to have dented how popular Apple products are with consumers.

The day of Apple’s big announcement, ad network Chitika released their analysis of tablet traffic from North America, and it appears the negative market analysis has done little to diminish Apple’s grip on mobile traffic. The iPhone owned mobile traffic through the entire rise of smartphones, and now it seems the iPad has just as strong a stranglehold on tablet traffic, raking in 81 percent of the market.

According to Marketing Land, this is actually a decrease from their 84 percent traffic share in June, but Chitika says no other single competitor has directly benefited. Their traffic may be down, but not by a remarkable amount by any means.

Recent data from the Pew Research Center says that 35 percent of Americans over the age of 16 own a tablet, and clearly the iPad is the most popular option for browsing the internet. However, the Kindle has proven to be the most successful Android tablet in North America, so it may be that consumers are simply choosing the tablet most suited for their needs: e-books or the internet.

If there is one way to concisely explain the changes Google’s search algorithms have gone through in the past couple of years, it would boil down to “bigger is not always better.” Gone are the days when you could jam as many keywords as would fit into a paragraph of text, or buy up countless thousands of links, and hope to rank highly.

However, the more you do to offer quality content and information to your users while staying in line with Google’s practices, the more success you’ll see.

Those two ideas are fairly common knowledge now, but they have created their own fair share of questions. Where should the balance between quantity and quality lie? How is this content evaluated? Does quantity of content outweigh quality of content?

Google has given some insight into how content is evaluated in the past, and it is clear that you won’t get far with an excessive amount of paper-thin content. Still, the number of indexed pages your site has does indeed have an effect on your ranking. So how exactly does this work and what is the balance?

Matt Cutts, Google’s head of Webspam, addressed this type of issue head-on in his most recent Webmaster Chat video. He was asked, “Does a website get a better overall ranking if it has a large amount of indexed pages?”

Cutts explained that having more indexed pages isn’t a magic ticket to higher rankings. He said, “I wouldn’t assume that just because you have a large number of indexed pages that you automatically get a high ranking. That’s not the case.”

However, having more indexed pages does have some clear benefits. The more pages you have, the more opportunities you have to rank for different keywords. But, this is only because you should be covering a larger variety of keywords and topics across that larger group of pages.

A larger number of indexed pages is also likely to improve your overall links and PageRank, which can affect your ranking. But, the link isn’t direct. Simply having more pages won’t improve much for you. Instead, you have to use those extra pages to deliver valuable content and information to your users. If you’re just filling your site with a meaningless wealth of pages to be indexed, you won’t be seeing any improvement anytime soon.

Last week some people began noticing that large banner ads were appearing on Google for a select few branded search results. This test of huge banner ads has caused quite a bit of a stir across the internet, especially because it seems to break a promise Google made all the way back in 2005.

When Google partnered with AOL eight years ago, Marissa Mayer, then Google VP of search products and user experience, issued a promise that users would never see banner ads on their results. She said:

“There will be no banner ads on the Google homepage or web search result pages. There will not be crazy, flashy, graphical doodads flying and popping all over the Google site. Ever.”

One could argue that some of the Google Doodle homepage logos commemorating special events would qualify as “crazy, flashy, graphical doodads”, but those have never caused any worry because they are simply a flourish added to the homepage logo. However, it is indisputable that the new ad tests Google is running break their “no banner ads” promise outright. But, is it a bad thing?

The most notable aspect of the banner ads is that they only appear for branded searches. That means, if you search for Crate & Barrel, you might be shown the banner for Crate & Barrel. You won’t, however, be seeing any ads for random companies unrelated to your search, as you would normally associate with the term ‘banner ad’.

These ads are also linked to the brand’s website, providing users with an obvious, visually pleasing way to immediately find the business they are looking for. With careful moderation of banners, they could potentially allow businesses to essentially own their branded searches.

One of the biggest concerns for consumers regarding these ads is how they are used. Few users will be upset by an easily identifiable link with an aesthetically pleasing image appearing when they search for a specific brand. However, if this test expands and advertisers are ever allowed to use these banners to advertise sales or run more traditional advertising-style banners, there may be a backlash.

Currently, an estimated 30 advertisers are involved in the test, including Southwest Airlines, Virgin America, and Crate & Barrel. The test banner ads are also only being shown for 5 percent or less of search queries, so it is entirely possible you won’t run into one for quite a while.

Search Engine Land has created a FAQ for advertisers curious how this might affect the future of Google marketing, and Google released a statement on Friday, which read:

“We’re currently running a very limited, US-only test, in which advertisers can include an image as part of the search ads that show in response to certain branded queries. Advertisers have long been able to add informative visual elements to their search ads, with features like Media Ads, Product Listing Ads and Image Extensions.”

Now that the dust has settled after some extended debate, it seems clear that responsive design is here to stay. It won’t last forever, but it certainly isn’t a flashy trend that is going to fade away soon. It makes sense responsive design would catch on like it has, as it makes designing for the multitude of devices used to access the internet much easier than ever before.

Almost as many people accessing the internet right this moment are doing so on a smartphone or tablet as on a desktop, but they aren’t all using the same devices. A normal website designed to look great on a desktop won’t look good on a smartphone, and similarly, a site designed to work well on the new iPhone won’t have the same results on a Galaxy Note 3.

This problem has two feasible solutions for designers. Either you can design multiple versions of a website, so that there is a workable option for smartphones, tablets, and desktops, or you can create a responsive website which will look good on every device. Both options require you to test your site on numerous devices to ensure it actually works great across the board, but a responsive site means you only have to actually design one site. The rest of the work is in the tweaking to optimize the site for individual devices.

That all explains why designers love responsive design as a solution for the ever-expanding range of internet browsing options, but we have to please other people with our designs as well. Thankfully, responsive design has benefits for everyone involved. The design solution is even great for search engine optimization, which isn’t normally the case when design and optimization have to work together. Saurabh Tyagi explains how responsive design benefits SEO as much as it does consumers.

Google Favors Responsive Sites

SEO professionals spend a lot of their time and effort simply trying to appease the Google Gods, or trying to follow the current best practices while also managing to outplay their competition. Google has officially included responsive design in its best practice guidelines and has issued public statements calling for websites to adopt the design strategy, so naturally SEOs have come to love it.

One of the biggest reasons Google loves responsive sites is that it allows websites to use the same URL for a mobile site as they do for a desktop site, instead of redirecting users. A site with separate URLs will have a harder time gaining in the rankings than one with a single functional URL.

Improves the Bounce Rate

Getting users to stay on your page is actually easier than you might think. If you represent yourself honestly to search engines, and offer a functional, readable, and generally enjoyable website, users that click on your page are likely to stay there. By ensuring your website is functional and enjoyable on nearly every device, you ensure users are less likely to hit the back button.

Save on SEO

Having a separate mobile site from your desktop site means double the SEO work. Optimization is neither cheap, fast, nor easy, so it doesn’t make sense to waste all that extra time and work on what are basically duplicate efforts. Instead of having to optimize two sites, a responsive website allows SEOs to put all their effort into one site, saving you money and providing a more focused optimization effort.

Avoids Duplicate Content

When you’re managing two sites for the same business, it is highly likely you will eventually end up accidentally placing duplicate content on one of them. If this becomes a regular problem, you can expect punishments from search engines that could easily be avoided by simply having one site. Responsive design also makes it easier to direct users to the right content. One of Google’s biggest mobile pet peeves of the moment is the practice of consistently redirecting mobile users to the front page of the mobile site, rather than to the mobile version of the content they asked for. Responsive design avoids these types of issues altogether.

If you’ve spent much time online in the past year or two, it is almost certain you’ve come across an infographic. They are highly enjoyed by the public, as well as being educational. This is why more companies and content creators are using infographics to communicate and share knowledge with the public than ever before. Some may say it is just a trend, but either way the data shows that searches for infographics have risen over 800 percent in just two years, from 2010 to 2012.

Even if you don’t know what an infographic is, chances are you have still seen one in your Facebook feed, a news article, or maybe even your email. Infographics are images intended to share information, data, or knowledge in a quick and easily comprehensible way. They turn boring information into interesting visuals that not only make the information easier to understand, but also make the average viewer more interested in what is being communicated.

According to Albert Costill, multiple studies have found that 90 percent of the information we retain and remember is based on visual impact. Considering how much information we take in on a day-to-day basis, that means your content should be visually impressive if you want to have any hope of viewers remembering it. If you’re still unsure about infographics, there are several reasons you should consider at least including them occasionally within your content strategy.

  1. Infographics are naturally more eye-catching than printed words, and a well laid-out infographic will catch viewers’ attention in ways standard text can’t. You’re free to use more images, colors, and even movement, all of which are more immediately visually appealing.
  2. The average online reader tends to scan text rather than reading every single word. Infographics combat this tendency by making viewers more likely to engage all of the information on the screen, but they also make it easier for those who still scan to find the information most important to them.
  3. Infographics are more easily sharable than most other types of content. Most social networks are image friendly, so users are given two very simple ways to show their friends their favorite infographics. Readers can share a link directly to your site, or they can save the image and share it directly. The more easily content can be shared, the more likely it is to go viral.
  4. Infographics can subliminally help reinforce your brand image, so long as you are consistent. Using consistent colors, shapes, and messages, combined with your logo, all works to raise your brand awareness. You can see how well this works when you notice that every infographic relating to Facebook naturally uses “Facebook Blue” and reflects the style of their brand.

Obviously you shouldn’t be putting out an infographic every day. Blog posts still have their place in any content strategy. Plus, if you are creating infographics daily, it is likely their quality will suffer. Treat infographics as a tool that can be reserved for special occasions or pulled out when necessary. With the right balance, you’ll find your infographics can be more powerful and popular than you ever imagined.

There has been quite a bit of speculation ever since Matt Cutts publicly stated that Google wouldn’t be updating the PageRank meter in the Google Toolbar before the end of the year. PageRank has been assumed dead for a while, yet Google refuses to issue the death certificate, assuring us they currently have no plans to outright scrap the tool.

Search Engine Land reports that yesterday, while speaking at Pubcon, Cutts finally explained what is going on and why there have been no updates. Google’s ability to update the toolbar is actually broken, and repairing the “pipeline” isn’t a major priority by any means. The search engine already feels that too many marketers obsess over PageRank, while Google itself doesn’t see it as very important.

But, Cutts did give some insight as to why Google has been hesitant to completely kill off PageRank or the toolbar. They have consistently maintained they intend to keep the meter around because consumers actually use the tool almost as much as marketers. However, at this point that data is nearly a year out of date, so suggesting consumers are the main motive for keeping PageRank around is disingenuous.

No, it turns out Google actually uses PageRank internally for ranking pages, and the meter has been consistently updated within the company during the entire period the public has been waiting for an update. It is also entirely possible Google likes keeping the toolbar around because it wants the data users are constantly sending back to the search engine.

While the toolbar may be useful for the company internally, PageRank has reached the point where it needs to be updated or removed. Data from a year ago isn’t reliable enough to offer anyone much value, and most browsers have done away with installable toolbars anyway. If a repair isn’t a high enough priority for Google to get around to at all this year, it probably isn’t worth leaving the toolbar lingering around forever.

If you have been reading up on SEO, blogging, or content marketing, chances are you’ve been told to “nofollow” certain links. If you’re like most, you probably didn’t quite understand what that means, and you may or may not have followed the advice blindly.

But, even if you’ve been using the nofollow tag for a while, if you don’t understand what it is or how it works you may be hurting yourself as much as you’re helping.

The nofollow tag is how publishers can tell search engines to ignore certain links to other pages. Normally, these links count as something like votes in favor of the linked content, but in some circumstances they can make search engines think you are abusing optimization or blatantly breaking their guidelines. Nofollowing the right links prevents search engines from thinking you are trying to sell your influence or are involved in link schemes.
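
In practice, a nofollow is nothing more than a rel attribute added to the link itself. As a rough illustration, here is what the markup looks like, along with a hypothetical helper that nofollows every outbound link you don’t want to vouch for (the domain is made up, and a real implementation would use an HTML parser rather than a regex):

```python
# A rough sketch of the markup plus a crude helper (hypothetical domain;
# a real implementation would use an HTML parser, not a regex).
#
# A followed link versus a nofollowed one:
#   <a href="https://example.org/partner">followed: counts toward rankings</a>
#   <a href="https://example.org/sponsor" rel="nofollow">nofollowed: ignored</a>
import re

MY_DOMAIN = "www.example.com"  # hypothetical


def nofollow_external_links(html: str) -> str:
    """Add rel="nofollow" to <a> tags whose href points off MY_DOMAIN."""
    def rewrite(match: re.Match) -> str:
        tag, href = match.group(0), match.group(1)
        is_external = href.startswith("http") and MY_DOMAIN not in href
        if is_external and "rel=" not in tag:
            # Drop the closing ">" and append the nofollow attribute.
            return tag[:-1] + ' rel="nofollow">'
        return tag

    return re.sub(r'<a\s[^>]*href=["\']([^"\']+)["\'][^>]*>', rewrite, html)


print(nofollow_external_links('<p><a href="https://sponsor.example.org/">Sponsored</a></p>'))
# -> <p><a href="https://sponsor.example.org/" rel="nofollow">Sponsored</a></p>
```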

To help webmasters and content creators understand exactly when to nofollow, and how it affects their online presence, the team from Search Engine Land put together an infographic explaining when and how to use the tag. They also created a comprehensive guide to the tag for those who prefer long walls of text to nice and easy infographics.

Google is always making changes and updates, but it seems like the past couple weeks have been especially crazy for the biggest search engine out there. There have been tons of changes both big and small, but best of all, they seem to all be part of one comprehensive plan with a long term strategy.

Eric Enge sums up all the changes when he says Google is pushing people away from a tactical SEO mindset to a more strategic and valuable approach. To try to understand exactly what that means going forward, it is best to review the biggest changes. By seeing what has been revamped, it is easier to make sense of what the future looks like for Google.

1. ‘(Not Provided)’

One of the biggest changes for both searchers and marketers is Google’s move to make all organic searches secure, starting in late September. For users, this means more privacy when browsing, but for marketers and website owners it means we are no longer able to see keyword data from most users coming to sites from Google searches.

This means marketers and site-owners are having to deal with a lot less information, or they’re having to work much harder to get it. There are ways to find keyword data, but it’s no longer easily accessible from any Google tool.

This was one of the bigger hits for technical SEO, though there are many workarounds for those looking for them.

2. No PageRank Updates

PageRank has long been a popular tool for many optimizers, but it has also been commonly used by actual searchers to get a general idea of the quality of the sites they visit. However, Google’s Matt Cutts has openly said not to expect another update to the tool this year, and it seems it won’t be available much longer on any platform. The toolbar has never been available on Chrome, and with Internet Explorer revamping how toolbars work on the browser, it seems PageRank is going to be left without a home.

This is almost good news in many ways. PageRank has always been considered a crude measurement tool, so if the tool goes away, many will have to turn to more accurate measurements.

3. Hummingbird

Google’s Hummingbird algorithm seemed minor to most people using the search engine, but it was actually a major overhaul under the hood. Google vastly improved its ability to understand conversational search, which entirely changes how people can search.

The most notable difference with Hummingbird is Google’s ability to contextualize searches. If you search for a popular sporting arena, Google will find you all the information you previously would have expected, but if you then search “who plays there”, you will get results that are contextualized based on your last search. Most won’t find themselves typing these kinds of searches, but for those using their phones and voice capabilities, the search engine just got a lot better.

For marketers, the consequences are a bit heavier. Hummingbird greatly changes the keyword game and has huge implications for the future. With the rise of conversational search, we will see that exact keyword matches become less relevant over time. We probably won’t feel the biggest effects for at least a year, but this is definitely the seed of something huge.

4. Authorship

Authorship isn’t exactly new, but it has become much more important over the past year. As Google is able to recognize the creators of content, they are able to begin measuring which authors are consistently getting strong responses such as likes, comments, and shares. This means Google will be more and more able to filter those who are creating the most valuable content and rank them highest, while those consistently pushing out worthless content will see their clout dropping the longer they fail to actually contribute.

5. In-Depth Articles

Most users are looking for quick answers to their questions and needs with their searches, but Google estimates that “up to 10% of users’ daily information needs involve learning about a broad topic.” To reflect that, they announced a change to search in early August, which surfaces results from more comprehensive sources for searches that might require more in-depth information.

What do these all have in common?

These changes may all seem separate and unique, but there is an undeniably huge level of interplay between how all these updates function. Apart, they are all moderate to minor updates. Together, they are a huge change to search as we know it.

We’ve already seen how link building and over-attention to keywords can hurt your optimization when improperly managed, but Google seems keen on devaluing these search factors even more moving forward. Instead, they are opting for signals which offer the most value to searchers. Their search has become more contextual so users can find their answers more easily, no matter how they search. But the more conversational search becomes, the less the rankings are about keywords.

In the future, expect Google to place more and more emphasis on authorship and the value that these publishers are offering to real people. Optimizers will always focus on pleasing Google first and foremost, but Google is trying to synergize these efforts so that your optimization efforts are improving the experience of users as well.