Sometimes the source of a problem is so glaringly simple that you would never consider it. This is the case for many webmasters frustrated that their sites aren't being indexed or ranked by search engines. While there are numerous more technical reasons search engines might refuse to index your page, a surprising amount of the time the problem is that you told the search engine not to index your site with a noindex tag.
This is frequently overlooked, but it can put a complete halt to your site's rankings and visibility. Thankfully, it is also very easy to fix. The biggest hassle is actually finding the noindex directive, as it can be hard to spot when pages redirect. An HTTP header checker tool will show you the headers a page returns before any redirect fires.
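If you'd rather check from your own machine than a web tool, a short script can surface both places a noindex can hide: the X-Robots-Tag response header and the robots meta tag in the page's HTML. Here's a minimal sketch in Python (the URL is a hypothetical placeholder); it uses http.client deliberately because it does not follow redirects, so you see the headers of the exact URL you requested:

```python
import http.client
from urllib.parse import urlparse

URL = "https://www.example.com/some-page"  # hypothetical URL to check

parts = urlparse(URL)
conn = http.client.HTTPSConnection(parts.netloc)
# http.client does not follow redirects, so these are the headers of
# this exact URL, before any redirect fires.
conn.request("GET", parts.path or "/")
resp = conn.getresponse()

print("Status:", resp.status)  # 301/302 means the page redirects
print("Location:", resp.getheader("Location"))  # redirect target, if any
print("X-Robots-Tag:", resp.getheader("X-Robots-Tag"))  # header-level noindex

# The noindex can also live in the page's HTML as a robots meta tag.
body = resp.read().decode("utf-8", errors="replace").lower()
if "noindex" in body and 'name="robots"' in body:
    print("The HTML appears to contain a robots noindex meta tag.")
conn.close()
```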
Don’t be embarrassed if this small mistake has been keeping you down. As Barry Schwartz mentions on SEO Roundtable, there have been large Fortune 500 companies with these same problems. John Mueller also recently ran into someone with a noindex on their homepage. He noticed a thread in the Google Webmaster Help forums where a site owner had been working to fix his problem all day with the help of the other forum members. John explained the problem wasn’t nearly as complex as everyone else had suggested. It was much more obvious:
It looks like a lot of your pages had a noindex robots meta tag on them for a while and dropped out because of that. In the meantime, that meta tag is gone, so if you can keep it out, you should be good to go :).
When you encounter a problem with your site ranking or being indexed, it is always best to rule out the most obvious possible causes before moving on to bigger and more difficult ones. While we all like to think we wouldn't make such a simple mistake, we all also let the small things slip by.
Usually Matt Cutts, esteemed Google engineer and head of Webspam, uses his regular videos to answer questions which can have a huge impact on a site’s visibility. He recently answered questions about using the Link Disavow Tool if you haven’t received a manual action, and he often delves into linking practices which Google views as spammy. But, earlier this week he took to YouTube to answer a simple question and give a small but unique tip webmasters might keep in mind in the future.
Specifically, Cutts addressed the need to have a unique meta tag description for every individual page on your site. In an age where blogging causes pages to be created every day, creating a meta tag description can seem like a fruitless time-waster, and according to Cutts it kind of is.
If you take the time to create a unique meta tag description for every page, you might see a slight boost in SEO over your competitors, but the difference will be negligible compared to the other aspects of your site you could spend that time improving. Overall, it may be better to simply leave the meta description empty than to invest your time in such a small detail. In fact, on his own blog, Cutts doesn't bother to use meta descriptions at all.
Cutts does say that you shouldn’t try to skimp on the meta tag descriptions by using copy directly from your blog. It is better to have no meta tag description than to possibly raise issues with duplicate content, and Google automatically scans your content to create a description any time you don’t make one.
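If you want to see where your own site stands, a small script can flag pages with missing or duplicated meta descriptions, which is exactly the situation Cutts describes. This is a rough sketch in Python using only the standard library; the page list is a hypothetical placeholder:

```python
from collections import defaultdict
from html.parser import HTMLParser
import urllib.request

class MetaDescriptionParser(HTMLParser):
    """Pulls the content of <meta name="description"> out of a page."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content") or ""

# Hypothetical list of pages to audit.
PAGES = [
    "https://www.example.com/",
    "https://www.example.com/blog/post-1",
    "https://www.example.com/blog/post-2",
]

seen = defaultdict(list)
for url in PAGES:
    with urllib.request.urlopen(url) as resp:
        parser = MetaDescriptionParser()
        parser.feed(resp.read().decode("utf-8", errors="replace"))
    if not parser.description:
        print("No meta description (Google will generate one):", url)
    else:
        seen[parser.description].append(url)

# Duplicate descriptions are the case Cutts warns about.
for description, urls in seen.items():
    if len(urls) > 1:
        print("Duplicate description on:", ", ".join(urls))
```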
Ever since the roll-out of Google’s Penguin algorithm there has been a substantial amount of confusion regarding the current state of link building within the search marketing community. Thanks to Google’s vague practices everyone has an opinion on an algorithm which few actually understand in depth. Everything we know on this side comes from what Google has told us and what we’ve seen from data and analysis in the two years since Penguin came out.
The fact of the matter is that link building in the post-Penguin climate is risky business, but it is important for your online presence. If anything, links are more potent for your visibility than ever before. The problem is the rules are stricter now. You can’t buy and sell wholesale links, and bad links can be heavily damaging to your traffic and profits.
If you acquire quality links, your site is likely excelling in numerous areas and seeing success in both web traffic and search engine visibility. However, getting the wrong types of inbound links is almost certain to result in penalties from Google. In fact, Jayson DeMers from Search Engine Land says it is often more expensive to clean up the mess from bad backlinks than it would be to just acquire good links to begin with.
So what exactly constitutes a bad link? A bad link is any link gained through questionable methods or in a way that goes against Google's best practices. DeMers pinpointed six of these link building tactics which are likely to cause you problems if you attempt them.
Paid Links – Buying or selling links in the post-Penguin market is the same as putting a target on your website’s metaphorical back. Your site will get seen and penalized. Google has openly stated multiple times that buying or selling links is a huge no-no, and even links from long ago can come back to haunt you.
Article Directory Links – Article directory links were once a staple of link building because they were easy to get and they worked. But, low-quality spun content and distribution software relegated the practice to the spammy category. At this point, Google has outright penalized many article directories, and this practice won't help your SEO anymore.
Link Exchanges – For years link exchanges were a highly popular form of link building. It almost seemed like common courtesy to practice the concept of “you link to me and I’ll link back to you”, but of course many began to abuse the system. Once it was compromised and turned into a large scale pattern of link scheming, Google shut it down.
Low-Quality Press Releases – A press release is still a popular means of announcing important company information to the public, but don’t expect them to help your SEO. Most free press release submission websites are entirely ignored by Google.
Low Quality Directory Links – While there are still a small number of industry-specific directories that are great for helping certain industries gain good links and traffic, the majority of old, free directory sites have been de-indexed by Google, and the search engine has publicly denounced the practice. In general, you should stay away from low-quality directory links.
Link Pyramids, Wheels, Etc. – Over time, many SEOs came to believe they could get around Google's watchful eye by using methods to artificially pass PageRank through multiple layers of links, obscuring the distribution pattern. But, in May, Matt Cutts, Google's head of Webspam, mentioned how the new version of Penguin has been refined to further fight link spammers and more accurately measure link quality. While we don't know for sure what practices Cutts was referencing, it is widely believed he was talking about link pyramids and wheels.
Local ranking has come into its own over the past couple of years. A combination of increased visibility and more shoppers using their smartphones to find local businesses on the go has made local SEO a significant part of online marketing, and it can almost be treated entirely separate from traditional SEO practices. By that I mean that while traditional SEO will still help your local optimization efforts, local SEO has its own list of unique ranking factors that local marketers have to keep in mind.
Starting in 2008, David Mihm began identifying and exploring these unique local SEO ranking factors. After five years, Local Search Ranking Factors 2013 has found 83 foundational ranking factors. Each factor helps decide your placement in online search results, and how well you manage all of these individual factors helps determine how you end up ranking. They can be the difference between a boost in business and a heightened profile in your market, or a wasted investment and a floundering online presence.
While you can find the full list of ranking factors on the Moz page for Local Search Ranking Factors 2013, the Moz team also took the time to create an illustrated guide to the 20 most important ranking factors for local businesses. While none of the factors they illustrate will come as a surprise to an experienced local marketer, they will help new website owners get their business out of the middle and into the top of the local market.
Despite Google's big crackdown on spammy link building practices over the past two years, many webmasters are still left with questions about what exactly constitutes a spammy practice. Google has previously advised against using links in forum “signatures” as a means of link building, but what about using a link in a comment when it is topically relevant and contributes to the conversation? That is exactly the question Matt Cutts answered in a Webmaster Chat video on Wednesday.
The short answer is that using links to your site in your comments is fine the majority of the time. Everyone who actually contributes to forums has a habit of linking to relevant information, and that often includes their own blogs. But, like everything, it can be abused.
Matt gave some tips to ensure your comments don’t get flagged as spammy by Google or the sites you are commenting on.
If you can, use your real name when commenting. Using a company name or anchor text you want to rank for gives the appearance of commenting for commercial marketing purposes, which raises the spam alarm.
If leaving links in blog post comments is your primary means of link building and the majority of your links come from blog comments, Google will probably flag you.
This Monday, site owners looking for advice will have the opportunity to have their website briefly reviewed by Google, as John Mueller announced on Google+. The short site reviews will take place November 18th at 10am EDT and will last one hour. Search Engine Land suggests the event will be led by Mueller, though no one is quite sure what format it will take.
To have your site reviewed, you have to add the site to this Google Moderator page. Then, if Google has the time and chooses your site, it will be reviewed live this upcoming Monday via Google+ Hangouts.
You can also RSVP for the event by going to this page and adding it to your calendar.
John’s statement explained the event, saying:
For this hangout, we’ll review sites that are submitted via the moderator page and give a short comment on where you might want to focus your efforts, assuming there are any issues from Google’s point of view :).
On Monday, Bing rolled out a brand new music video search results page. The new feature allows users to search for a music video by song title, artist, or album, and displays a box at the top of the results that highlights the most popular music videos related to the search, along with a list of “Top Songs” for the query.
Bing's result page collects videos from “leading sites including YouTube, Vimeo, MTV, Artist Direct, and more.” The videos listed beneath the featured video are ranked based on relevance to the search, so a search for an artist's name will mostly show their own videos, while a search for a specific song returns more covers and amateur music videos.
Users are able to preview songs without clicking by simply mousing over.
You will also notice a sidebar on the music video search results page that includes a related artists or related albums list, so you can more easily find music in the same vein as what you already enjoy.
One nice little feature is that Bing has collected certain videos as they were originally ordered on an album. Search Engine Land reports a search for Pink Floyd’s Dark Side of the Moon results in Bing listing the songs in the original order along with the featured video.
There’s a new manual action showing up in Google Webmaster Tools, according to Jessica Lee from Search Engine Watch. Webmaster Tools was updated over the summer so that site owners could be notified when a specific type of manual action had been taken against the site, and since then the waters have been fairly quiet. This new type of manual action, referred to as “image mismatch” is the first change we’ve seen since then.
If you see this message on the Manual Actions page, it means that some of your site’s images may be displaying differently on Google’s search results pages than they are when viewed on your site.
As a result, Google has applied a manual action to the affected portions of your site, which will affect how your site’s images are displayed in Google. Actions that affect your whole site are listed under Site-wide matches. Actions that affect only part of your site are listed under Partial matches.
If you end up receiving that message, it is up to you to ensure that your site is showing the same images to users both on your site and within Google image search results. It is possible “anti-hotlinking” tools can cause the issue, so you may have to look through your site’s code on the server.
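One way to check yourself before filing for reconsideration is to request an image from your own server a few different ways and compare the bytes. Below is a minimal diagnostic sketch in Python; the image URL is a hypothetical placeholder, and note that anti-hotlinking rules usually key on the Referer header rather than the user agent, which is why both are tested:

```python
import hashlib
import urllib.request

IMAGE_URL = "https://www.example.com/images/photo.jpg"  # hypothetical image

def fetch_digest(headers):
    """Fetch the image with the given request headers and hash the bytes."""
    req = urllib.request.Request(IMAGE_URL, headers=headers)
    with urllib.request.urlopen(req) as resp:
        return hashlib.sha256(resp.read()).hexdigest()

# A direct visit, a visit that appears to come from Google Images,
# and a request using Googlebot-Image's user agent string.
direct = fetch_digest({"User-Agent": "Mozilla/5.0"})
via_google = fetch_digest({"User-Agent": "Mozilla/5.0",
                           "Referer": "https://www.google.com/"})
as_googlebot = fetch_digest({"User-Agent": "Googlebot-Image/1.0"})

if len({direct, via_google, as_googlebot}) > 1:
    print("Mismatch: your server serves different image bytes to Google.")
else:
    print("Same image served in all three cases.")
```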
As with all manual penalties, once the problem is fixed you have to submit your site for reconsideration and wait. And wait. And wait. Eventually, after you've waited for what seems like forever, you'll get a message in your Webmaster Tools account informing you whether the manual action has been revoked after review.
Manual actions are penalties that real, living Google employees have placed against your site after determining that you are violating Google's guidelines. The majority of manual penalties relate to outright spammy practices such as user-generated spam, hidden text, and unnatural links.
It is hard to ignore how quickly mobile traffic has grown to become an essential part of how people access the internet, but there are still a fair number of brands burying their heads in the sand and pretending nothing has really changed. It is almost astounding to see how many are stuck in the past and refuse to invest in going mobile. With some brands estimating that half of their traffic comes from mobile devices, it is clear that brands who refuse to step up are going to begin suffering very soon.
We know how popular smartphones and tablets are now, but we don't actually know how much of all online traffic comes from these devices. Some analysts estimate as low as 15 percent of all traffic is coming from mobile devices, while others have said that as much as a third is coming from non-desktop devices. With such a large range, it is difficult to discern the exact amount of mobile traffic, but these studies do give us insight into the direction things are going.
For example, Greg Sterling reports that public relations firm Walker Sands released their latest quarterly index of mobile traffic to their clients’ websites, and they estimate 28 percent of their clients’ traffic is coming from smartphones and tablets. The problem is their sample is too small for their estimate to be very relevant when dealing with the big picture. However, because of how regularly they compile and release this data, we can use their report to see the direction the market is going, and the market is largely going mobile.
Walker Sands actually found a small drop from 29 percent of traffic coming from mobile devices to 28 percent, but those numbers are a big leap from 17.5 percent at this time last year, and a one percent drop isn't large enough to conclude that mobile traffic is faltering.
It becomes even more apparent that mobile is becoming a hugely important consideration for online marketing when you consider that Facebook currently estimates that a third of their users access the site strictly from mobile devices and Yelp says that 59 percent of their searches are now coming from mobile.
The big takeaway, as Sterling points out, is that marketers are doing themselves a massive disservice by ignoring mobile traffic or even by just treating it as secondary. For some markets, it may even be best to put mobile ahead of desktop in their priorities.
Smartphones have revolutionized how we browse the web, but most browsing still happens within the same web browsers we have all grown accustomed to. For the most part, we do our searches and actual browsing from Chrome, Safari, or Firefox, while we limit our apps to games, reading the news, or taking care of business. But, that all could change in the near future.
Google announced late last week that they would begin allowing Android app developers to have their app content indexed. That content will then be able to be opened directly through apps on Android devices. It is a large step towards a more seamless user experience on smartphones and tablets, rather than the disjointed experience we currently enjoy.
Googlebot has been improved to be able to index the content of apps, either through a sitemap file or through Google's Webmaster Tools, though the feature is currently only in the testing phase. This means indexing is currently available to only a small selection of developers, and signed-in users won't begin to see app content in their results for a few weeks.
The update means that searches will be able to return information from app content, which will then open directly in the intended app. For websites which tend to offer the same content on both their website and their app, such as news sites, it means users will be able to pick their desired experience, whether it be from within the browser or within the app.
Jennifer Slegg reports that app developers can let Google know they are interested in having their apps indexed by filling out an application of interest. Before you do, though, you should know that your app must have deep linking enabled, and you will have to provide Google with information about alternate URLs, either within your sitemap or in a link element within the pages of your site.
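For illustration, the annotation ties a page on your site to an android-app:// URI built from your app's package name. Here's a small sketch in Python that simply formats that link element; the package name and paths are hypothetical placeholders:

```python
def app_indexing_link(package, scheme, host_path):
    """Format the <link rel="alternate"> annotation for an Android deep link."""
    uri = "android-app://{}/{}/{}".format(package, scheme, host_path)
    return '<link rel="alternate" href="{}" />'.format(uri)

# A news site's article page might annotate itself like this:
print(app_indexing_link("com.example.news", "http", "example.com/articles/123"))
# -> <link rel="alternate"
#        href="android-app://com.example.news/http/example.com/articles/123" />
```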
Indexing is only available for Android apps currently, and Google has yet to comment on when or if they will extend the capability to iPhone or Windows apps.