
Usually Matt Cutts, esteemed Google engineer and head of Webspam, uses his regular videos to answer questions that can have a huge impact on a site's visibility. He recently answered questions about using the Link Disavow Tool if you haven't received a manual action, and he often delves into linking practices Google views as spammy. But earlier this week he took to YouTube to answer a simple question and give a small but unique tip webmasters might want to keep in mind in the future.

Specifically, Cutts addressed whether you need a unique meta description for every individual page on your site. In an age where blogging causes new pages to be created every day, writing a meta description for each one can seem like a fruitless time-waster, and according to Cutts it kind of is.

If you take the time to create a unique meta description for every page, you might see a slight boost in SEO over your competitors, but the difference will be negligible compared to the other aspects of your site you could spend that time improving. Overall, it may be better to simply leave the meta description empty than to invest your time in such a small detail. In fact, on his own blog, Cutts doesn't bother to use meta descriptions at all.

Cutts does say that you shouldn't skimp on meta descriptions by using copy taken directly from your content. It is better to have no meta description than to risk issues with duplicate content, and Google automatically scans your content to create a description any time you don't provide one.
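If you're curious where your own site stands, the check is easy to script. Below is a minimal sketch of a meta description audit, assuming Python with the requests and beautifulsoup4 packages installed; the URLs are hypothetical placeholders for your own pages. It flags pages with no description, which Google will handle for you, and descriptions duplicated across pages, the one situation Cutts warns against.

```python
# Minimal meta description audit: flags missing and duplicated
# descriptions. The page list is an illustrative placeholder.
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

PAGES = [
    "https://example.com/",
    "https://example.com/blog/post-1",
    "https://example.com/blog/post-2",
]

seen = defaultdict(list)  # description text -> pages that use it
for url in PAGES:
    html = requests.get(url, timeout=10).text
    tag = BeautifulSoup(html, "html.parser").find(
        "meta", attrs={"name": "description"})
    if tag is None or not tag.get("content", "").strip():
        print(f"MISSING description: {url}")  # Google will generate one
    else:
        seen[tag["content"].strip()].append(url)

# Duplicated descriptions are worse than none, per Cutts's advice.
for desc, urls in seen.items():
    if len(urls) > 1:
        print(f"DUPLICATE description on {len(urls)} pages: {urls}")
```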

PageRank

Source: Felipe Micaroni Lalli

Ever since the roll-out of Google's Penguin algorithm, there has been a substantial amount of confusion about the current state of link building within the search marketing community. Because Google says so little about how its algorithms work, everyone has an opinion on an algorithm few actually understand in depth. Everything we know on this side comes from what Google has told us and from the data and analysis gathered in the two years since Penguin came out.

The fact of the matter is that link building in the post-Penguin climate is risky business, but it is important for your online presence. If anything, links are more potent for your visibility than ever before. The problem is the rules are stricter now. You can’t buy and sell wholesale links, and bad links can be heavily damaging to your traffic and profits.

If you acquire quality links, your site is likely excelling in numerous areas and seeing success in both web traffic and search engine visibility. However, getting the wrong types of inbound links is almost certain to result in penalties from Google. In fact, Jayson DeMers from Search Engine Land says it is often more expensive to clean up the mess from bad backlinks than it would be to just acquire good links to begin with.

So what exactly constitutes a bad link? A bad link is any link gained through questionable methods or in ways that go against Google's best practices. DeMers pinpointed six of these link building tactics which are likely to cause you problems if you attempt them.

Paid Links – Buying or selling links in the post-Penguin market is the same as putting a target on your website’s metaphorical back. Your site will get seen and penalized. Google has openly stated multiple times that buying or selling links is a huge no-no, and even links from long ago can come back to haunt you.

Article Directory Links – Article directory links were once a staple of link building because they were easy to get and they worked. But low-quality spun content and mass-distribution software relegated the practice to the spammy category. At this point, Google has outright penalized many article directories, and this tactic won't help your SEO anymore.

Link Exchanges – For years link exchanges were a highly popular form of link building. It almost seemed like common courtesy to practice the concept of "you link to me and I'll link back to you," but of course many began to abuse the system. Once it was compromised and turned into a large-scale pattern of link scheming, Google shut it down.

Low-Quality Press Releases – A press release is still a popular means of announcing important company information to the public, but don’t expect them to help your SEO. Most free press release submission websites are entirely ignored by Google.

Low-Quality Directory Links – There are still a small number of industry-specific directories that can help certain industries gain good links and traffic, but the majority of old, free directory sites have been de-indexed by Google, and the search engine has publicly denounced the practice. In general, you should stay away from low-quality directory links.

Link Pyramids, Wheels, Etc. – Over time, many SEOs came to believe they could get around Google's watchful eye by using methods to artificially pass PageRank through multiple layers of links, obscuring the distribution pattern. But in May, Matt Cutts, Google's head of Webspam, mentioned that the new version of Penguin has been refined to further fight link spammers and more accurately measure link quality. While we don't know for sure what practices Cutts was referencing, it is widely believed he was talking about link pyramids and wheels.

Top 20 Local Search Ranking Factors

Local ranking has come into its own over the past couple of years. A combination of increased visibility and more shoppers using their smartphones to find local businesses on the go has made local SEO a significant part of online marketing, and it can almost be treated entirely separately from traditional SEO practices. By that I mean that while traditional SEO will still help your local optimization efforts, local SEO has its own list of unique ranking factors that local marketers have to keep in mind.

Starting in 2008, David Mihm began identifying and exploring these unique local SEO ranking factors. After five years, Local Search Ranking Factors 2013 has identified 83 foundational ranking factors. Each factor helps decide your placement in online search results, and how well you manage all of these individual factors helps determine how you end up ranking. They can be the difference between a boost in business and a heightened profile in your market, or a wasted investment and a floundering online presence.

While you can find the full list of ranking factors on the Moz page for Local Search Ranking Factors 2013, the Moz team also took the time to create an illustrated guide to the 20 most important ranking factors for local businesses. Though none of the factors they illustrate will come as a surprise to an experienced local marketer, they will help new website owners get their business out of the middle and into the top of the local market.

It is hard to ignore how quickly mobile traffic has grown to become an essential part of how people access the internet, but there are still a fair number of brands burying their heads in the sand and pretending nothing has really changed. It is almost astounding to see how many are stuck in the past and refuse to invest in going mobile. With some brands estimating that half of their traffic comes from mobile devices, it is clear that brands who refuse to step up are going to begin suffering very soon.

We know how popular smartphones and tablets are now, but we don't actually know how much of all online traffic comes from these devices. Some analysts estimate that as little as 15 percent of all traffic comes from mobile devices, while others have said that as much as a third comes from non-desktop devices. With such a large range, it is difficult to discern the exact amount of mobile traffic, but these studies do give us insight into the direction things are going.

Mobile Traffic Report

For example, Greg Sterling reports that public relations firm Walker Sands released their latest quarterly index of mobile traffic to their clients’ websites, and they estimate 28 percent of their clients’ traffic is coming from smartphones and tablets. The problem is their sample is too small for their estimate to be very relevant when dealing with the big picture. However, because of how regularly they compile and release this data, we can use their report to see the direction the market is going, and the market is largely going mobile.

Walker Sands actually found a small drop from 29 percent of traffic coming from mobile devices to 28 percent, but those numbers are a big leap from 17.5 percent at this time last year, and a one percent drop in mobile traffic isn't large enough to conclude that mobile traffic is faltering.

It becomes even more apparent that mobile is a hugely important consideration for online marketing when you note that Facebook currently estimates a third of its users access the site strictly from mobile devices, and Yelp says that 59 percent of its searches now come from mobile.

The big takeaway, as Sterling points out, is that marketers are doing themselves a massive disservice by ignoring mobile traffic, or even by treating it as secondary. For some markets, it may even be best to put mobile ahead of desktop in their priorities.
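Estimates like these come down to simple arithmetic over traffic logs: classify each hit's user agent as mobile or not, then divide. Here is a rough sketch in Python of how such a share might be computed; the token list and sample user-agent strings are illustrative stand-ins, not a serious device-detection library.

```python
# Back-of-the-envelope mobile-share estimate from user-agent strings.
# The tokens and sample hits below are illustrative only.
MOBILE_TOKENS = ("iphone", "ipad", "android", "mobile", "silk")

def is_mobile(user_agent: str) -> bool:
    """Crude check: does the user agent mention a mobile token?"""
    ua = user_agent.lower()
    return any(token in ua for token in MOBILE_TOKENS)

# Hypothetical user-agent strings pulled from a server log:
hits = [
    "Mozilla/5.0 (iPhone; CPU iPhone OS 7_0 like Mac OS X) ...",
    "Mozilla/5.0 (Windows NT 6.1; WOW64) ...",
    "Mozilla/5.0 (Linux; Android 4.3; Nexus 7) Safari/537.36",
]

mobile_hits = sum(1 for ua in hits if is_mobile(ua))
print(f"Mobile share: {100 * mobile_hits / len(hits):.1f}% of {len(hits)} hits")
```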

Android

Source: Google

Smartphones have revolutionized how we browse the web, but most browsing still happens within the same web browsers we have all grown accustomed to. For the most part, we do our searches and actual browsing from Chrome, Safari, or Firefox, while we limit our apps to games, reading the news, or taking care of business. But, that all could change in the near future.

Google announced late last week that they would begin allowing Android app developers to have their app content indexed. Users will then be able to open that content directly in the corresponding apps on their Android devices. It is a large step toward a more seamless user experience on smartphones and tablets, rather than the disjointed experience we currently put up with.

Googlebot has been improved to index the content of apps, either through a sitemap file or through Google's Webmaster Tools, though the feature is currently in a testing phase. This means indexing is only available to a small selection of developers for now, and signed-in users won't begin to see app content in their results for a few weeks.

The update means that searches will be able to return information from app content, which will then open directly in the intended app. For websites which tend to offer the same content on both their website and their app, such as news sites, it means users will be able to pick their desired experience, whether it be from within the browser or within the app.

Jennifer Slegg reports that app developers can sign up to let Google know they are interested in having their apps indexed by filling out an application of interest. Before you do, though, you should know that your app must have deep linking enabled, and you will have to provide Google with information about alternate URLs, either within your sitemap or in a link element within the pages of your site.
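To give a sense of what those annotations look like: the deep link combines your app's package name and URL scheme, and the resulting URI goes into a rel=alternate link element (or the equivalent sitemap entry). Below is a small illustrative sketch; the package name, scheme, and path are hypothetical, so check Google's own documentation for the exact format before relying on it.

```python
# Sketch of the alternate-URL annotation for Android app indexing.
# PACKAGE and SCHEME are hypothetical values for an imaginary app.
PACKAGE = "com.example.newsreader"  # the app's package name
SCHEME = "example"                  # the URL scheme its deep links use

def app_deep_link(page_path: str) -> str:
    """Build an android-app:// URI for a page's in-app equivalent."""
    return f"android-app://{PACKAGE}/{SCHEME}/{page_path}"

def link_element(page_path: str) -> str:
    """The <link> element to place in the web page's <head>."""
    return f'<link rel="alternate" href="{app_deep_link(page_path)}" />'

print(link_element("news/story-123"))
# -> <link rel="alternate"
#    href="android-app://com.example.newsreader/example/news/story-123" />
```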

Indexing is only available for Android apps currently, and Google has yet to comment on when or if they will extend the capability to iPhone or Windows apps.

No matter what kind of shape your website is in, Google will crawl it. Google crawls and indexes seemingly the entire internet. Though we know they may not look as deeply into low-quality websites, that doesn't mean they haven't at least crawled and indexed the landing page. It takes something truly special to keep Google from crawling and indexing a page, but there are two common mistakes that can actually manage to keep Google away.

Technical SEO is one of the most difficult aspects of optimization to grasp, and these two simple mistakes can keep search engines, especially Google, from correctly indexing your website. If your site isn't getting correctly indexed, you have absolutely no chance of ranking well. Until you fix the problem, your site is going to be severely crippled, so it is imperative that you don't ignore these issues.

1. The 301 Redirects on Your Website are Broken

It is a commonly accepted practice to use 301 redirects after a website redesign. As Free-SEO-News mentioned in their latest newsletter, using these redirects properly allows you to retain the ranking equity you’ve built with your website, rather than having to start again from the bottom.

The problem comes when these 301 redirects aren't implemented properly. Even worse, sometimes properly working redirects can suddenly falter, so you can't place your faith in them working correctly forever. Code changes, new plugins, or broken databases can cause your working 301s to begin pointing to non-existent pages.

Broken links are an automatic wrecking ball to all your efforts building a solid link portfolio. The best way to ensure that all your links are working is to download a website audit tool, such as SEOprofiler, which automatically checks all of your links and redirects. If your links or redirects suddenly stop working, you will be warned before you start getting punished by the search engines.
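The underlying check is simple enough to sketch yourself, even if a dedicated tool runs it at scale and on a schedule. A minimal version in Python using the requests package might look like this; the old URLs are placeholders for pages you redirected during a redesign.

```python
# Minimal 301 health check: follow each old URL's redirect chain and
# flag anything that no longer lands on a live page. URLs are placeholders.
import requests

OLD_URLS = [
    "https://example.com/old-page",
    "https://example.com/old-category/old-post",
]

for url in OLD_URLS:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = [r.status_code for r in resp.history]  # e.g. [301] or [301, 302]
    if resp.status_code != 200:
        print(f"BROKEN: {url} -> {resp.url} ({resp.status_code}), hops: {hops}")
    elif not hops:
        print(f"NO REDIRECT: {url} serves content directly")
    elif hops[0] != 301:
        print(f"NOT A 301: {url} first hop returned {hops[0]}")
    else:
        print(f"OK: {url} -> {resp.url}")
```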

2. Rel=canonical Attributes Are Causing Problems

Just as with 301 redirects, the rel=canonical attribute serves a legitimate purpose when used correctly. The attribute can help you avoid problems with duplicate content, but those using the tag without knowing what they are doing can find themselves with some major issues.

Two of the biggest faux pas we regularly see site owners commit are adding a rel=canonical attribute to every web page that points back to the index page, and pointing the attribute at pages that use the 'noindex' attribute. In both scenarios, Google won't index the web pages at all.

The best advice is to simply stay away from the rel=canonical attribute unless you are absolutely sure of what you're doing. The only proper time to use the attribute is on duplicate pages, and using it anywhere else can result in significant problems. The problems that can come from using the attribute incorrectly are much worse than those you might see by failing to use the tag on duplicate pages.
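Both mistakes are easy to test for from the outside. The sketch below assumes Python with the requests and beautifulsoup4 packages, and the URLs are hypothetical; it flags a page whose canonical points at the index page, and a page whose canonical target itself carries a noindex directive.

```python
# Flag the two rel=canonical mistakes described above.
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def audit_canonical(page_url: str, home_url: str) -> None:
    html = requests.get(page_url, timeout=10).text
    canonical = BeautifulSoup(html, "html.parser").find("link", rel="canonical")
    if canonical is None:
        return  # no canonical attribute, nothing to misuse
    target = urljoin(page_url, canonical.get("href", ""))
    # Mistake 1: the page canonicalizes to the index page.
    if target.rstrip("/") == home_url.rstrip("/") != page_url.rstrip("/"):
        print(f"WARNING: {page_url} canonicalizes to the index page")
    # Mistake 2: the canonical target itself refuses indexing.
    target_html = requests.get(target, timeout=10).text
    robots = BeautifulSoup(target_html, "html.parser").find(
        "meta", attrs={"name": "robots"})
    if robots and "noindex" in robots.get("content", "").lower():
        print(f"WARNING: {page_url} canonicalizes to a noindex page: {target}")

audit_canonical("https://example.com/blog/post-1", "https://example.com/")
```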

You may remember that Google recently started testing large banner ads on branded searches. It raised quite a stir in the online community, mostly because it seemed that Google blatantly broke an older promise to never show banner ads. But, Bing is taking branded search result ads to the next level.

Larry Kim reports that last week, at the Bing Ads Next conference, Bing Ads announced a new ad format for exact match keyword searches, specifically those done within the latest Windows 8 update. Instead of a relatively small banner ad, Bing Ads is rolling out Bing Hero Ads, a full, landing-page-like layout that aggressively promotes the exact brand.

Just as with Google’s banner ads, Bing Hero Ads are only starting with a small number of prominent brand advertisers, such as Disney, Home Depot, Land Rover, and Volkswagen. It will also be a while before you can expect to see Hero Ads on your average search. For the moment, they are only appearing in a small selection of searches done in Windows 8.1 within the US.

It will be interesting to see how the public reacts to these types of branded semi-landing pages. Google's banner ads looked fairly customized for each brand and only took up a relatively small amount of on-page real estate. A full-page ad experience for exact match branded searches may be welcomed as a quick and efficient way to connect with the brand searchers are looking for. It is also possible that consumers will be turned off by the seemingly uniform ad experience.

The one clear advantage Bing's Hero Ads have over Google's banner ads is their ability to deep link directly to a larger number of pages on a site. They offer links such as "contact us", "find a store", and "request a quote", which speed up users' experiences and allow them to convert more quickly.

Last week, Apple announced their new iPads, the iPad Air, a thinner, lighter, and more powerful version of their full-size tablet, as well as an updated iPad mini with Retina Display. The broad public response to Apple’s latest products seems to be underwhelming, but it hasn’t seemed to sway how popular Apple products are with consumers.

The day of Apple’s big announcement, ad network Chitika released their analysis of tablet traffic from North America, and it appears the negative market analysis has done little to diminish Apple’s grip on mobile traffic. The iPhone owned mobile traffic through the entire rise of smartphones, and now it seems the iPad has just as strong of a stranglehold on tablet traffic, raking in 81 percent of the market.

According to Marketing Land, this is actually a decrease from their 84 percent traffic share in June, but Chitika says no other single competitor has directly benefited. The iPad's traffic may be down, but not by a remarkable amount by any means.

Recent data from the Pew Research Center says that 35 percent of Americans over the age of 16 own a tablet, and clearly the iPad is the most popular option for browsing the internet. However, the Kindle Fire has proven to be the most successful Android-based tablet in North America, so it may be that consumers are simply choosing the tablet most suited to their needs: e-books or the internet.

If there is one way to concisely explain the changes Google's search algorithms have gone through in the past couple of years, it boils down to "bigger is not always better." Gone are the days when you could jam as many keywords as would fit into a paragraph of text, or buy up thousands of links, and hope to rank highly.

However, the more you do to offer quality content and information to your users while staying in line with Google’s practices, the more success you’ll see.

Those two ideas are fairly common knowledge now, but they have created their own fair share of questions. Where should the balance between quantity and quality lie? How is this content evaluated? Does quantity of content outweigh quality of content?

Google has given some insight into how content is evaluated in the past, and it is clear that you won’t get far with an excessive amount of paper-thin content. Still, the number of indexed pages your site has does indeed have an effect on your ranking. So how exactly does this work and what is the balance?

Matt Cutts, Google’s head of Webspam, addressed this type of issue head-on in his most recent Webmaster Chat video. He was asked, “Does a website get a better overall ranking if it has a large amount of indexed pages?”

Cutts explained that having more indexed pages isn't a magic ticket to higher rankings. He said, "I wouldn't assume that just because you have a large number of indexed pages that you automatically get a high ranking. That's not the case."

However, having more indexed pages does have some clear benefits. The more pages you have, the more opportunities you have to rank for different keywords. But, this is only because you should be covering a larger variety of keywords and topics across that larger group of pages.

A larger number of indexed pages is also likely to improve your overall links and PageRank, which can affect your ranking. But the connection isn't direct. Simply having more pages won't improve much for you. Instead, you have to use those extra pages to deliver valuable content and information to your users. If you're just filling your site with meaningless pages to be indexed, you won't be seeing any improvement anytime soon.

Now that the dust has settled after some extended debate, it seems clear that responsive design is here to stay. It won’t last forever, but it certainly isn’t a flashy trend that is going to fade away soon. It makes sense responsive design would catch on like it has, as it makes designing for the multitude of devices used to access the internet much easier than ever before.

Nearly as many people are accessing the internet right this moment from a smartphone or tablet as from a desktop, but they aren't all using the same devices. A normal website designed to look great on a desktop won't look good on a smartphone, and similarly, a site designed to work well on the new iPhone won't have the same results on a Galaxy Note 3.

This problem has two feasible solutions for designers. Either you can design multiple versions of a website, so that there is a workable option for smartphones, tablets, and desktops, or you can create a responsive website that will look good on every device. Both options require you to test your site on numerous devices to ensure it works well across the board, but a responsive site means you only have to design one site. The rest of the work is in tweaking to optimize the site for individual devices.

That all explains why designers love responsive design as a solution for the rapidly expanding range of browsing options, but we have to please other people with our designs as well. Thankfully, responsive design has benefits for everyone involved. The design solution is even great for search engine optimization, which is not normally the case when design and optimization have to work together. Saurabh Tyagi explains how responsive design benefits SEO as much as it does consumers.

Google Favors Responsive Sites

SEO professionals spend a lot of their time and effort simply trying to appease the Google gods, or trying to follow the current best practices while also managing to outplay their competition. Google has officially included responsive design in its best practice guidelines, and has issued public statements calling for websites to adopt the design strategy, so naturally SEOs have come to love it.

One of the biggest reasons Google loves responsive sites is that it allows websites to use the same URL for a mobile site as they do for a desktop site, instead of redirecting users. A site with separate URLs will have a harder time gaining in the rankings than one with a single functional URL.
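You can verify this behavior from the outside by requesting the same page with a desktop and a mobile user agent and comparing where each request lands. A quick sketch, with an illustrative URL and user-agent strings:

```python
# Does a site serve one URL to every device, or redirect mobile visitors
# to a separate m. site? The URL and user agents are illustrative.
import requests

URL = "https://example.com/some-article"
AGENTS = {
    "desktop": "Mozilla/5.0 (Windows NT 6.1; WOW64)",
    "mobile": "Mozilla/5.0 (iPhone; CPU iPhone OS 7_0 like Mac OS X)",
}

for device, ua in AGENTS.items():
    resp = requests.get(URL, headers={"User-Agent": ua}, timeout=10)
    print(f"{device}: final URL {resp.url} ({resp.status_code})")

# A responsive site prints the same final URL for both requests; a
# separate mobile site typically redirects the mobile one to m.example.com.
```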

Improves the Bounce Rate

Getting users to stay on your page is actually easier than you might think. If you represent yourself honestly to search engines, and offer a functional, readable, and generally enjoyable website, users that click on your page are likely to stay there. By ensuring your website is functional and enjoyable on nearly every device, you ensure users are less likely to hit the back button.

Save on SEO

Having a separate mobile site from your desktop site means double the SEO work. Optimization is not cheap, fast, or easy, so it doesn't make sense to waste all that extra time and work on what are essentially duplicate efforts. Instead of having to optimize two sites, responsive websites allow SEOs to put all their efforts into one site, saving you money and providing a more focused optimization effort.

Avoids Duplicate Content

When you’re having to manage running two sites for the same business, it is highly likely you will eventually end up accidentally placing duplicate content on one of the sites. If this becomes a regular problem, you can expect punishments from search engines which could be easily avoidable by simply having one site. Responsive design also makes it easier to direct users to the right content. One of Google’s biggest mobile pet peeves of the moment is the practice of consistently redirecting mobile users to the front page of the mobile site, rather than to the mobile version of the content they asked for. Responsive design avoids these types of issues altogether.