Google is making it easier for webmasters to identify and address smartphone-specific errors they might not have known about in the past. Previously, detecting and fixing errors that occur on smartphones was complicated, so the search engine added a section to the Crawl Errors report in Webmaster Tools that displays the most common mistakes Google sees webmasters make in how mobile users access their sites.

Pierre Far, Google Webmaster Trends Analyst, announced the feature earlier today, saying that some of the errors may “significantly hurt your website’s user experience and are the basis of some of our recently-announced ranking changes for smartphone search results.” While Google is trying to make it easier for webmasters to solve problems with their sites, the search engine is also using this as another means to push webmasters toward making their sites more mobile friendly.

The new report for smartphone errors looks like this:

Smartphone Errors

Some of the errors included are:

  • Server errors: A server error occurs when Googlebot receives an HTTP error status code when it crawls a page.
  • Not found errors and soft 404s: A page can show a “not found” message to Googlebot, either by returning an HTTP 404 status code or when the page is detected as a soft error page.
  • Faulty redirects: A faulty redirect is a smartphone-specific error that occurs when a desktop page redirects smartphone users to a page that is not relevant to their query. A typical example is when all pages on the desktop site redirect smartphone users to the homepage of the smartphone-optimized site.
  • Blocked URLs: A blocked URL is when the site’s robots.txt explicitly disallows crawling by Googlebot for smartphones. Typically, such smartphone-specific robots.txt disallow directives are erroneous. You should investigate your server configuration if you see blocked URLs reported in Webmaster Tools.
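To make the categories above concrete, here is a rough sketch of how a crawl result could be bucketed into them. This is only an illustration of the report's categories, not Google's actual detection logic; the `classify_crawl_result` helper and the example URLs are hypothetical.

```python
from urllib.parse import urlparse

def classify_crawl_result(status_code, requested_url, final_url, robots_allowed=True):
    """Bucket a smartphone crawl result into the error categories from the
    Webmaster Tools report. (Illustrative only, not Google's logic.)"""
    if not robots_allowed:
        return "blocked URL"       # robots.txt disallows Googlebot for smartphones
    if status_code >= 500:
        return "server error"      # 5xx returned to the smartphone crawler
    if status_code == 404:
        return "not found"         # hard 404; soft 404s need content inspection
    # Faulty redirect: a deep desktop URL that sends smartphone users
    # to the mobile homepage instead of the equivalent page.
    if final_url != requested_url:
        if urlparse(final_url).path in ("", "/") and \
           urlparse(requested_url).path not in ("", "/"):
            return "faulty redirect"
    return "ok"

# A deep article page redirected to the mobile homepage:
print(classify_crawl_result(200, "http://example.com/article", "http://m.example.com/"))
```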

Not only are these errors capable of ruining the user experience for visitors on mobile devices, they can severely damage your site’s visibility if you don’t resolve the issues quickly. At least now there is a convenient way for you to find the problems.

Google's John Mueller, courtesy of Google+

Recently I discussed a common issue where a misplaced noindex tag on the front page of a site can keep search engines from crawling or indexing it. It happens all the time, but it isn’t the only reason your site might not be crawled. The good news is that there is little to no long-term damage done to your site or your SEO, according to a recent statement from Google’s John Mueller.

Barry Schwartz noticed Mueller had responded to a question on the Google Webmaster Help forums from an employee of a company that had accidentally blocked Googlebot from crawling and indexing its site. In John Mueller’s words:

From our point of view, once we’re able to recrawl and reprocess your URLs, they’ll re-appear in our search results. There’s generally no long-term damage caused by an outage like this, but it might take a bit of time for things to get back to “normal” again (with the caveat that our algorithms change over time, so the current “normal” may not be the same state as it was before).

So don’t worry too much if you discover your site has been having problems with crawling or indexing. What matters is how quickly you respond and fix the problem. Once the issue is solved, everything should return to relatively normal. Of course, as Mueller mentions, you might not return to your exact same state, because these things are always fluctuating.

Local SEO Infographic Banner

It constantly surprises me how many local businesses don’t believe in investing in proper online marketing and optimization. Granted, I see every day how establishing a quality online presence and optimizing it for higher visibility can benefit a business. Still, many local businesses hold the misconception that online marketing is only important for national-level businesses, and they couldn’t be more wrong.

Current estimates say that more than 2.6 billion local searches are conducted every month. More importantly, statistics show that these local searchers are moving ever faster from search to purchase thanks to smartphones that let them search on the go. Nearly 86 million people regularly use their mobile phones to look up local business information, and these searchers are highly primed to convert. Simply put, without an online presence and the optimization to make your brand visible, you are missing out on a large chunk of potential customers.

Hubshout recently created an infographic to illustrate how important local search engine optimization (SEO) really is for your business. Not only does the infographic show what you are missing out on by neglecting your online presence, it also shows how many businesses have yet to establish themselves online in a meaningful way. There is still a lot of untapped opportunity online; you just have to make the leap.

Local SEO Infographic

Source: Hubshout

Stop Sign

Source: Sharon Pruitt

Sometimes the source of the problem is so glaringly simple that you would never consider it. This is the case for many webmasters frustrated with their sites not being indexed or ranked by search engines. While there are numerous more technical reasons search engines might refuse to index your pages, a surprising amount of the time the problem is caused by you telling the search engine not to index your site with a noindex tag.

This is frequently overlooked, but it can put a complete halt to your site’s rankings and visibility. Thankfully it is also very easy to fix. The biggest hassle is actually finding the noindex tag, as it can be hard to spot when redirects are involved. You can use an HTTP header checker tool to verify what is being served before the page redirects.
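A noindex directive can hide in two places: the `X-Robots-Tag` response header or a robots meta tag in the page source. A minimal sketch of a check for both is below; the `find_noindex` helper is my own illustration, not a standard tool, and when fetching a page yourself you would want to request it with redirects disabled so you see the pre-redirect response.

```python
import re

def find_noindex(headers, html):
    """Report where a noindex directive appears in a fetched page.
    `headers` is a dict of response headers, `html` is the body text.
    (Helper name and logic are illustrative.)"""
    findings = []
    # Servers can send noindex via the X-Robots-Tag header, which is
    # easy to miss because it never appears in the page source.
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        findings.append("X-Robots-Tag header")
    # The classic case: <meta name="robots" content="noindex">.
    if re.search(r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
                 html, re.I):
        findings.append("robots meta tag")
    return findings

# A page that blocks indexing both ways:
print(find_noindex({"X-Robots-Tag": "noindex, nofollow"},
                   '<meta name="robots" content="noindex">'))
```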

Don’t be embarrassed if this small mistake has been keeping you down. As Barry Schwartz mentions on SEO Roundtable, there have been large Fortune 500 companies with these same problems. John Mueller also recently ran into someone with a noindex on their homepage. He noticed a thread in the Google Webmaster Help forums where a site owner had been working to fix his problem all day with the help of the other forum members. John explained the problem wasn’t nearly as complex as everyone else had suggested. It was much more obvious:

It looks like a lot of your pages had a noindex robots meta tag on them for a while and dropped out because of that. In the meantime, that meta tag is gone, so if you can keep it out, you should be good to go :).

When you encounter a problem with your site ranking or being indexed, it is always best to start with the most obvious possible causes before going to the bigger and more difficult mistakes. While we all like to think we wouldn’t make such a simple mistake, we all also let the small things slip by.

Matt Cutts

Usually, Matt Cutts, esteemed Google engineer and head of Webspam, uses his regular videos to answer questions which can have a huge impact on a site’s visibility. He recently answered questions about using the Link Disavow Tool if you haven’t received a manual action, and he often delves into linking practices which Google views as spammy. But, earlier this week he took to YouTube to answer a simple question and give a small but unique tip webmasters might keep in mind in the future.

Specifically, Cutts addressed the need to have a unique meta tag description for every individual page on your site. In an age where blogging causes pages to be created every day, creating a meta tag description can seem like a fruitless time-waster, and according to Cutts it kind of is.

If you take the time to create a unique meta tag description for every page, you might see a slight boost in SEO over your competitors, but the difference will be negligible compared to the other aspects of your site you could spend that time improving. In fact, overall it may be better to simply leave the meta description empty than to invest your time in such a small detail. Indeed, on his own blog, Cutts doesn’t bother to use meta descriptions at all.

Cutts does say that you shouldn’t try to skimp on the meta tag descriptions by using copy directly from your blog. It is better to have no meta tag description than to possibly raise issues with duplicate content, and Google automatically scans your content to create a description any time you don’t make one.
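If you do keep meta descriptions, the thing to avoid is duplicating the same blurb across pages. As a hedged sketch of how you might audit a site for that (the function and sample pages are my own illustration):

```python
from collections import defaultdict

def duplicate_descriptions(pages):
    """Given {url: meta_description}, return descriptions shared by more
    than one page. Empty descriptions are ignored, since Cutts suggests
    omitting the tag entirely is fine."""
    seen = defaultdict(list)
    for url, desc in pages.items():
        if desc:
            seen[desc.strip().lower()].append(url)
    return {d: urls for d, urls in seen.items() if len(urls) > 1}

pages = {
    "/post-1": "Widgets and how to use them.",
    "/post-2": "Widgets and how to use them.",  # copied -> flagged
    "/post-3": "A different, unique summary.",
    "/post-4": "",                              # empty is acceptable
}
print(duplicate_descriptions(pages))
```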

Page Rank

Source: Felipe Micaroni Lalli

Ever since the roll-out of Google’s Penguin algorithm there has been a substantial amount of confusion regarding the current state of link building within the search marketing community. Thanks to Google’s vague practices everyone has an opinion on an algorithm which few actually understand in depth. Everything we know on this side comes from what Google has told us and what we’ve seen from data and analysis in the two years since Penguin came out.

The fact of the matter is that link building in the post-Penguin climate is risky business, but it is important for your online presence. If anything, links are more potent for your visibility than ever before. The problem is the rules are stricter now. You can’t buy and sell wholesale links, and bad links can be heavily damaging to your traffic and profits.

If you acquire quality links, your site is likely excelling in numerous areas and seeing success in both web traffic and search engine visibility. However, getting the wrong types of inbound links is almost certain to result in penalties from Google. In fact, Jayson DeMers from Search Engine Land says it is often more expensive to clean up the mess from bad backlinks than it would be to just acquire good links to begin with.

So what exactly constitutes a bad link? A bad link is any which is gained through questionable methods or goes against Google’s best practices. DeMers pinpointed six of these link building tactics which are likely to cause you problems if you attempt them.

Paid Links – Buying or selling links in the post-Penguin market is the same as putting a target on your website’s metaphorical back. Your site will get seen and penalized. Google has openly stated multiple times that buying or selling links is a huge no-no, and even links from long ago can come back to haunt you.

Article Directory Links – Article directory links were once a staple of link building because they were easy to get and they worked. But low-quality spun content and distribution software relegated them to the spammy category. At this point, Google has outright penalized many article directories, and this practice won’t help your SEO anymore.

Link Exchanges – For years link exchanges were a highly popular form of link building. It almost seemed like common courtesy to practice the concept of “you link to me and I’ll link back to you”, but of course many began to abuse the system. Once it was compromised and turned into a large scale pattern of link scheming, Google shut it down.

Low-Quality Press Releases – A press release is still a popular means of announcing important company information to the public, but don’t expect them to help your SEO. Most free press release submission websites are entirely ignored by Google.

Low-Quality Directory Links – While there are still a small number of industry-specific directories that are great for helping certain industries gain good links and traffic, the majority of old, free directory sites have been de-indexed by Google, and the search engine has publicly denounced the practice. In general, you should stay away from low-quality directory links.

Link Pyramids, Wheels, Etc. – Over time, many SEOs came to believe they could get around Google’s watchful eye by using methods to artificially pass PageRank through multiple layers of links, obscuring the distribution pattern. But in May, Matt Cutts, Google’s head of Webspam, mentioned how the new version of Penguin has been refined to further fight link spammers and more accurately measure link quality. While we don’t know for sure what practices Cutts was referencing, it is widely believed he was talking about link pyramids and wheels.

Top 20 Local Search Ranking Factors

Local ranking has come into its own over the past couple of years. A combination of increased visibility and more shoppers using their smartphones to find local businesses on the go has made local SEO a significant part of online marketing, and it can almost be treated entirely separately from traditional SEO practices. By that I mean that while traditional SEO will still help your local optimization efforts, local SEO has its own list of unique ranking factors that local marketers have to keep in mind.

Starting in 2008, David Mihm began identifying and exploring these unique local SEO ranking factors. After five years, Local Search Ranking Factors 2013 has identified 83 foundational ranking factors. Each factor helps decide your placement in online search results, and how well you manage all of these individual factors determines how you end up ranking. They can be the difference between a boost in business and a heightened profile in your market, or a wasted investment and a floundering online presence.

While you can find the full list of ranking factors on the Moz page for Local Search Ranking Factors 2013, the Moz team also took the time to create an illustrated guide to the 20 most important ranking factors for local businesses. While none of the factors they illustrate will come as a surprise to an experienced local marketer, they will help new website owners get their business out of the middle of the pack and into the top of the local market.

Google’s Matt Cutts

With the big crackdown on spammy link building practices over the past two years at Google, there are still many webmasters left with questions about what exactly constitutes a spammy practice. Google has previously advised against using links in forum “signatures” as a means of link building, but what about using a link in a comment when it is topically relevant and contributes to the conversation? That is exactly the question Matt Cutts answered in a Webmaster Chat video on Wednesday.

The short answer is that using links to your site in your comments is fine the majority of the time. Everyone who actually contributes to forums has a habit of linking to relevant information, and that often includes their own blogs. But, like everything, it can be abused.

Matt gave some tips to ensure your comments don’t get flagged as spammy by Google or the sites you are commenting on.

  • If you can, use your real name when commenting. Using a company name or anchor text you want to rank for gives the appearance of commenting for commercial marketing purposes, which raises the spam alarm.
  • If leaving links in blog post comments is your primary means of link building and the majority of your links come from blog comments, Google will probably flag you.

You can see the video below.

Google Hangouts Icon

This Monday, site owners looking for advice will have the opportunity to have their websites briefly reviewed by Google, as John Mueller announced on Google+. The short site reviews will take place November 18th at 10am EDT and will last one hour. Search Engine Land suggests the event will be led by Mueller, though no one is quite sure what format it will take.

To have your site reviewed, you have to add the site to this Google Moderator page. Then, if Google has the time and chooses your site, it will be reviewed live this upcoming Monday via Google+ Hangouts.

You can also RSVP for the event by going to this page and adding it to your calendar.

John’s statement explained the event, saying:

For this hangout, we’ll review sites that are submitted via the moderator page and give a short comment on where you might want to focus your efforts, assuming there are any issues from Google’s point of view :).

Bing Featured Video

On Monday, Bing rolled out a brand-new music video search results page. The new feature allows you to search for a music video by song title, artist, or album; users will see a box at the top of the results that highlights the most popular music videos related to the search, along with a list of “Top Songs” for the query.

Bing’s results page collects videos from “leading sites including YouTube, Vimeo, MTV, Artist Direct, and more.” The videos listed beneath the featured video are ranked based on relevancy to the search, so a search for an artist’s name will mostly show that artist’s own videos, while a search for a specific song returns more covers and amateur music videos.

Bing Videos Screenshot

Users are able to preview songs without clicking by simply mousing over them.

You will also notice a sidebar on the music video search results page with related artist and related album lists, so you can more easily find music in the same vein as what you enjoy.

One nice little feature is that Bing orders certain videos as they originally appeared on an album. Search Engine Land reports that a search for Pink Floyd’s Dark Side of the Moon results in Bing listing the songs in their original order along with the featured video.

Bing music video results for Dark Side of the Moon