By now, the hacker craze of the ’90s and early 2000s has died down quite a bit. Most people don’t worry much about hackers, as long as they run solid antivirus software and keep their routers protected. Big businesses may have to worry about Anonymous’ hijinks, but the average person doesn’t tend to concern themselves with the issue. At first glance, hacking doesn’t seem like much of an issue for SEO, either.
But hackers can actually do your site real damage, and can even get your site dropped entirely from the Google search index. Sites get blacklisted when hackers inject malicious code onto their servers, as Google seeks to protect searchers’ computers from being compromised.
While Google doesn’t immediately drop sites from their index, being blacklisted leads to a complete drop in organic traffic and can be a crisis for SEO. Blacklisting starts as a warning to searchers that a site may be compromised, and few will continue past that alarm.
This has become a rather significant problem for Google. To support the increasing number of webmasters dealing with compromised servers, Google has launched the ‘Webmasters Help for Hacked Sites’ support center. It gives detailed information on how to clean and repair your server and prevent your site from being dropped entirely from the Google index.
If you think this sort of hacking isn’t a big deal, check out the charts below, which show just how frequent this type of malicious activity has become. It isn’t just banks and large corporations dealing with it; small businesses are just as much at risk as international franchises. The most common form of attack is an automated set of processes that indiscriminately discovers and exploits vulnerabilities on servers, many of which are left completely unprotected.
Search Engine Journal recently explored the issue in more depth, unpacking why it is such a large concern for Google and webmasters alike. Compromised sites can destroy a search engine’s credibility just as easily as your own, so the problem has to be taken very seriously.
Originally published August 5, 2013: Can Hacking Get You Blacklisted by Google?
Everyone working in SEO knows that Google uses a multitude of factors to determine the order of search results, and the majority of these ranking factors are based either on the content of the webpage or on signs of authenticity and reputability. That was the case for the longest time, but since 2010 Google has made significant shifts toward a focus on usability, and the harbinger of this change was the addition of website speed to its ranking factors.
The problem is, website speed and other usability issues aren’t exactly objectively defined. What exactly is a slow-loading site? What is the cutoff? No one has gotten a definitive answer from Google, but in June Matt Cutts explicitly stated that slow-loading sites, especially on mobile platforms, will begin seeing search ranking penalties soon.
Obviously these changes are good for searchers. Searchers want sites that load quickly, offer quality user experience, and deliver great content. And, the emphasis on speed is certainly highlighted on mobile platforms where on-the-go users are likely to go back to the results if the site takes too long for their liking. The issue we face as search optimization professionals is trying to figure out exactly what Google is measuring and how that information is being used.
Matt Peters from Moz decided to break through Google’s intentionally vague information and figure out exactly how site speed affects rankings, with the help of Zoompf. They can’t definitively rule out a causal link between site speed and rankings, given the number of other algorithmic ranking factors that complicate the study. But their results showed little to no correlation between page load time and ranking.
I wouldn’t take this information as gospel, but it does suggest that loading time isn’t a huge factor in long-tail searches and isn’t something to worry about too much. If your site loads quickly enough to please the people coming to it, it will likely pass Google’s expectations as well.
Originally published August 1, 2013: How Does Site Speed Really Affect Search Rankings?
Internet security and privacy have been at the forefront of many people’s minds with the recent headlines about the NSA keeping data on the public’s online activity, and the issue has had subtle effects on search engines. We’ve seen a small group of searchers migrating to search engines with stricter privacy policies. Those who are truly outraged by the NSA news would expect to see a pretty large shift, and so far the change has been slow, but it is picking up momentum.
More and more people are learning about how Google actually decides which results to show you, as an individual, and many are a little concerned. While Google sees the decision to collect data on users as an attempt to individually tailor results, a few raise their eyebrows at the idea that a search engine and huge corporation is keeping fairly detailed tabs on the internet activities of users. The internet comes with an assumption that our activity is at least fairly private, though that notion is getting chipped away at daily. But, there is still the widespread assumption that our e-mails or simple search habits are our business alone, an assumption that is also being proved wrong.
These privacy issues have a fair number of people looking for search engines that keep searches completely anonymous and don’t run data collection processes. The most notable option people seem to be moving to is DuckDuckGo.com, a search engine whose privacy policy states it will not retain any personal information or share that information with other sites. The search engine has seen traffic rise by close to 2 million searches per day since the NSA scandal broke.
There are numerous debates surrounding these issues. Political discourse focuses on the legality and ethics of the government and large corporations working together to collect information on every citizen of the United States (other companies named in the NSA story include Yahoo, Facebook, and Microsoft). But, as SEO professionals, the bigger question for us is the ethical and practical reality of individually tailored results, which rely entirely on data collection.
If you’ve ever taken a look at the ads on the edges of websites, you’ve probably noticed that they are loosely based on your personal information. The ads reflect your gender, age, location, and sometimes portions of your search history, and they are chosen based on information your computer relays to almost every site you access. Google acts the same way, but it collects this data and combines it with extended data from your search history to deliver results it believes are more relevant to you.
There is a practicality to this. We all have finely tuned personal tastes, and we innately want search engines to show us exactly what we want with the first result, every time. While poll responses say that the majority of people don’t want personalized search results, our online actions belie our true desire for efficient search. The best way to deliver that is to gather data and use it to fine-tune results. On a broad scale, we don’t want results for a grocery store in Los Angeles when we are physically located in Oklahoma. On a smaller scale, we don’t want Google showing us sites we never visit when our favorite resource for a topic sits a few results down the page.
In this respect, the move towards search engines like DuckDuckGo is actually a step back. These privacy-focused search engines are essentially acting how Google used to. They use no personal information, and simply try to show the best results for a specific search. It is a trade of privacy for functionality, and this could possibly explain the slow uptake or migration to these types of search engines. But, people are moving.
The longer the NSA story stays in the news, the more searches DuckDuckGo receives, and this could potentially have a significant effect on the search market in the future. The question is, do we want to sacrifice personal privacy and assumed online anonymity for searches that match our lives? Andrew Lazaunikas recently wrote an article on the debate for Search Engine Journal. He admits DuckDuckGo delivers excellent, unbiased results, but in the end, “when I want to know the best pizza place or car dealer in my area, the local results that Google and Bing shows are superior.”
Lazaunikas isn’t deterred by that drawback, and notes, “I can still get the information I need from DuckDuckGo by modifying my search.” He ends by vowing to use DuckDuckGo more in the future, but the question is whether the public at large will follow. For the moment, it seems as though most people prefer quick, easy searches and familiarity over trying out these new search engines.
Originally published July 31, 2013: Will You Alter Your Searches After The Internet Privacy Scandal?
After two fairly explicit warnings about advertorials this year, Google has added advertorials, along with other popular spammy linking techniques, to the Link Schemes help document in their webmaster guidelines.
Google Continues To Downplay Links
The biggest change is the removal of the entire first paragraph from the help article, which addressed how incoming links influence rankings. Search Engine Journal says the removed paragraph read:
Your site’s ranking in Google search results is partly based on analysis of those sites that link to you. The quantity, quality, and relevance of links influences your ranking. The sites that link to you can provide context about the subject matter of your site, and can indicate its quality and popularity.
Links have been steadily falling out of favor throughout the past few years, and it appears we are finally reaching a tipping point for Google’s reduction of linking’s role in search algorithms. Or, as Google has been advising, high-quality sites matter much more than links of any quality.
Keyword-Rich/Optimized Anchor Text Links
Google also tackled the heavily optimized anchor text used in press releases that are distributed across other sites. The technique has enjoyed a quick rise in highly competitive markets, and Google appears to finally be putting a stop to the practice. They did note that guest posting is still a popular practice, which can be valuable when done correctly; however, sites that accept guest posts have been avoiding issues by adding nofollow to the links or by using the plain URL as the anchor text rather than optimized keywords, as in the sketch below.
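For illustration only (the URL and anchor text here are invented), this is roughly what those two safer options look like in the page markup, compared with the keyword-rich link Google is targeting:

    <!-- keyword-rich anchor text that passes PageRank: the kind of link Google is flagging -->
    <a href="http://example.com/widgets">best cheap widgets online</a>

    <!-- option 1: keep the anchor text but add nofollow so the link no longer passes PageRank -->
    <a href="http://example.com/widgets" rel="nofollow">best cheap widgets online</a>

    <!-- option 2: use the plain URL as the anchor text instead of optimized keywords -->
    <a href="http://example.com/widgets">http://example.com/widgets</a>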
Advertorials
And of course, the final change is the addition of advertorials as an example of unnatural links that violate Google guidelines.
Advertorials or native advertising where payment is received for articles that include links that pass PageRank.
Google has been making swift changes to linking policy and practice, so it is highly likely changes like this will keep occurring. Links can still be a strong weapon in your SEO strategy, but you have to tread carefully, and they maybe shouldn’t be your highest priority when optimizing.
Originally published July 30, 2013: Google Makes Changes To Their Link Schemes Help Document
Site audits can be ugly work. Nothing can be more disastrous to a client-SEO relationship than informing someone of all the issues wrong with their site in too harsh of a way. They’ve spent time and money having a site created that they think works well for their business, and then we audit the page and everything under the hood and have to break the news that their site is sick or badly put together.
The process is similar to playing “website mechanic,” as Stuntdubl SEO put it. We offer diagnostic information and recommendations that are absolutely critical to keeping a site relevant and valuable to organic search, but many people don’t want to hear what bad shape their car, or their website, is in. Not only will it be costly to fix, but they’ve developed a sentimental attachment to the site they have.
To be able to break the news in the best way possible, we have to be as prepared and informed as possible, which means running extensive auditing and answering a lot of questions. This also means understanding all of the tools at your disposal so that you can get the best answers in the fastest way possible.
Todd Malicoat took 50 of the most important questions for site audits, and identified the best tool available for answering every inquiry. Not only will this speed up your data collection and auditing, it will make you more prepared to create a better site and communicate properly with the clients you are working with.
Originally published July 29, 2013: Finding The Best Tools To Answer All Your Site Audit Questions
Everybody talks about SEO as if it is a monolithic entity. At most, you might hear conversation about local SEO and every few weeks someone will chime in to remind us about international SEO, but the vast majority of the dialogue just refers to SEO as a whole.
But even setting aside its constantly changing nature, SEO is a lot harder to pin down than that. Great optimization bends and molds to match the client and the unique needs of their market. What works for a local plumbing company may not translate to a small tech startup or a healthcare provider. The absolute basics are the same, but all of these companies have different online needs that can’t be handled with a “one-size-fits-all” mentality.
Hotels are one market with especially unique needs, and with summer winding down and many people trying to squeeze in a vacation before the kids return to school, now is as relevant a time as any to talk about hotel SEO. Aleh Barvsevich broke the topic down in detail, covering how search results for hotels are chosen and displayed and what opportunities hotel clients have in PPC and SEO.
Duplicate content has always been viewed as a serious no-no for webmasters and search engines. In general, it is associated with spamming or low-quality content, and thus Google usually penalizes sites with too much duplicate content. But, what does that mean for necessary duplicate content like privacy policies, terms and conditions, and other types of legally required content that many websites must have?
This has been a reasonable point of confusion for many webmasters, and those in the legal or financial sectors especially find themselves worried that their site could be hurt by the sheer number of disclaimers they must display.
Well of course Matt Cutts is here to sweep away all your concerns. He used his recent Webmaster Chat video to address the issue, and he clarified that unless you’re actively doing something spammy like keyword stuffing within these sections of legalese, you shouldn’t worry about it.
He said, “We do understand that a lot of different places across the web require various disclaimers, legal information, and terms and conditions, that sort of stuff, so it’s the sort of thing where if we were to not rank that stuff well, that would hurt our overall search quality. So, I wouldn’t stress out about that.”
Originally published July 25, 2013: How Does Google Handle Legally Required Duplicate Content?
It isn’t uncommon for webmasters or SEOs who operate numerous sites in a network to ask how many of them they can link together without bringing down the ax of Google. Finally, that question made its way to Google’s head of webspam, who responded in one of his regular YouTube videos.
The question was phrased, “should a customer with twenty domain names link it all together or not?” While blog networks can easily find legitimate reasons to link together twenty or more sites (though Cutts advises against it), it is interesting to use that number to discuss normal websites. As Cutts put it, “first off, why do you have 20 domain names? […] If it is all, you know, cheap-online-casinos or medical-malpractice-in-ohio, or that sort of stuff… having twenty domain names can look pretty spammy.”
When I think of networks with numerous full sites within them, I think of Gawker or Vice, two online news sources that spread their coverage across multiple sites focused on distinct topics. For example, Vice also runs Motherboard, a tech-focused website, as well as Noisey, a site devoted to music. Gawker, on the other hand, runs Deadspin, Gizmodo, io9, Kotaku, and Jezebel, among a couple of others. Note that at most those networks run eight unique sites. There is little reason for any network of unique but connected sites to have many more parts than that.
However, there are times when having up to twenty distinct domain names can make sense without being spammy. Cutts points out that when you have many different domain names that are all localized versions of your site, it is fine to link between them. Even in that scenario, however, you shouldn’t link them all in the footer. The suggested fix is to place them in a drop-down menu where users can still reach them, along the lines of the sketch below.
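As a rough illustration (the domain names here are invented), a simple country selector like this gives users access to each localized version without stuffing a block of cross-links into every footer:

    <!-- hypothetical country selector; each option points to a localized version of the same site -->
    <select onchange="window.location.href = this.value;">
      <option value="https://example.com">United States</option>
      <option value="https://example.co.uk">United Kingdom</option>
      <option value="https://example.de">Germany</option>
    </select>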
Originally published July 24, 2013: At What Point Does Linking Domains Together Become a Linking Scheme?
Despite telling us that it would no longer confirm when new Panda updates occur, Google announced today that it is rolling out a new update that is “more finely targeted” than the original release of Penguin 2.0.
Unlike with many Penguin updates, most webmasters actually seem happy to see the new version, as many are already reporting recoveries from the original algorithm.
Google has said that their plan is to release Panda algorithm updates monthly, rolled out over a ten-day period, but Matt Cutts, head of Google’s webspam team, implied there was a delay for this refresh because they wanted to ensure the signals would be loosened up a little from the last release.
The official statement from Google simply says, “In the last few days we’ve been pushing out a new Panda update that incorporates new signals so it can be more finely targeted.”
Originally published July 23, 2013: The Newest Penguin 2.0 Refresh is Here
Google has been very clear about their stance on manipulative or deceptive behavior on websites. While they can’t tackle every shady practice sites have been adopting, they have narrowed their sights on a few manipulative tactics they plan on taking down.
The first warning came when Google directly stated their intention to penalize sites who direct mobile users to unrelated mobile landing pages rather than the content they clicked to access. While that frustrating practice isn’t exactly manipulative, it is an example of sites redirecting users without their consent and can be terrible to try to get out of (clicking back often just leads to the mobile redirect page, ultimately placing you back at the page you didn’t ask for in the first place).
Now, Google is aiming at a similar tactic in which site owners insert fake pages into the browser history, so that when users attempt to exit, they are directed to a fake search results page filled entirely with ads or deceptive links, like the one below. It is basically a twist on the tactic that keeps placing users who are trying to exit back on the page they originally clicked. The only way out is a flurry of clicks that ends up putting you much further back in your history than you intended. You may not have seen it yet, but it has been popping up more and more lately.
The quick upswing is probably what raised Google’s interest in the tactic. As Search Engine Watch explains, deceptive behavior on sites has pretty much always been against Google’s guidelines, and for them to issue a special warning to sites adopting this practice suggests it is spreading widely among sites that are okay with pushing Google’s limits.