As part of its #NoHacked campaign to raise awareness and prevent site hacking, Google released its latest annual review of hacked sites this week. As the data shows, site hacks will continue to be a major issue for webmasters for the foreseeable future.

From 2015 to 2016, the number of hacked sites grew by 32%. According to Google, hackers are becoming more aggressive, but many webmasters are also letting their guard down. Instead of proactively keeping their sites and security measures up to date, a significant number of webmasters are letting their sites become vulnerable and outdated, making them easy targets for hackers.

While the number of sites getting hacked is on the rise, Google is willing to show forgiveness to those affected. The company says it approved 84% of reconsideration requests from webmasters who cleaned up their sites after a hack. However, Google also says it was unable to notify over half (61%) of affected site owners because their sites were not verified in Search Console.

What To Do If Your Site Has Been Hacked

In addition to the report, Google has also released several new documents aimed at educating webmasters about what to do if their sites get hacked and how to protect themselves.

These include help documents focused on specific types of common site hacks, such as the Gibberish Hack, the Japanese Keywords Hack, and the Cloaked Keywords Hack.

How To Prevent Site Hacks

As always, an ounce of prevention is worth a pound of cure. Google’s top recommendation for facing the epidemic of site hacking is to avoid letting it happen in the first place. Specifically, they suggest keeping all software and plug-ins on your site up-to-date and keeping an eye on any announcements from your Content Management System (CMS) provider.

Also, be sure your site is verified in Search Console so Google can notify you in the event your website does get hacked.

Giving your visitors a place to comment on content or in a forum on your site is a great way to encourage interaction and build a bond with potential customers. But, it can be a headache trying to keep any sort of open comment area clean from spammers, trolls, and other sorts of nogoodniks.

This creates two different problems. If visitors see your pages and blog posts followed by nothing but spam and other vandalism, they’re likely to think less of your brand and potentially move on to someone else. Additionally, you can be penalized by search engines like Google if they detect an abundance of spammy or malicious links or code on your site.

So what can you do to keep your forums and blog comments clean of those seeking to use the opportunity for their own ends without shutting it all down? Google recently offered a few tips to make sure the only comments and posts your visitors see are from real humans interested in building a valuable discussion around your brand and products:

  • Keep your forum software updated and patched. Take the time to keep your software up-to-date and pay special attention to important security updates. Spammers take advantage of security issues in older versions of blogs, bulletin boards, and other content management systems.
  • Add a CAPTCHA. CAPTCHAs require users to confirm they are human and not an automated script before posting. Services such as reCAPTCHA, Securimage, or JCaptcha make this easy to add.
  • Block suspicious behavior. Many forums allow you to set time limits between posts, and you can often find plugins to look for excessive traffic from individual IP addresses or proxies and other activity more common to bots than human beings. For example, phpBB, Simple Machines, myBB, and many other forum platforms enable such configurations.
  • Check your forum’s top posters on a daily basis. If a user joined recently and has an excessive amount of posts, then you probably should review their profile and make sure that their posts and threads are not spammy.
  • Consider disabling some types of comments. For example, it’s good practice to close very old forum threads that are unlikely to get legitimate replies.
  • If you no longer plan to monitor your forum and users are no longer interacting with it, turn off posting completely to prevent spammers from abusing it.
  • Make good use of moderation capabilities. Consider enabling features in moderation that require users to have a certain reputation before links can be posted or where comments with links require moderation.
  • If possible, change your settings so that you disallow anonymous posting and make posts from new users require approval before they’re publicly visible.
  • Moderators, along with trusted friends, colleagues, and other users, can help you review and approve posts while spreading the workload. Keep an eye on your forum’s new users by reviewing their posts and activity.
  • Consider blacklisting obviously spammy terms. Block clearly inappropriate comments with a blacklist of spammy terms (e.g., illegal streaming or pharma-related terms). Add off-topic terms that only spammers use, learning from the spam posts you see on your own forum and others. Built-in features or plugins can delete comments or mark them as spam for you.
  • Use the “nofollow” attribute for links in the comment field. This will deter spammers from targeting your site. By default, many blogging sites (such as Blogger) automatically add this attribute to any posted comments.
  • Use automated systems to defend your site. Comprehensive systems like Akismet, which has plugins for many blog and forum platforms, are easy to install and do most of the work for you.
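Two of the tips above, the spam-term blacklist and holding link-heavy posts for moderation, can be combined into a simple pre-moderation filter. A minimal sketch in Python; the terms and the link threshold below are illustrative assumptions, not values from Google's recommendations:

```python
# A minimal sketch of a comment pre-moderation filter.
# SPAM_TERMS and MAX_LINKS are example values, not recommendations.
import re

SPAM_TERMS = {"free streaming", "cheap pharma", "casino bonus"}  # illustrative only
MAX_LINKS = 2  # comments with more links than this are held for review


def classify_comment(text: str) -> str:
    """Return 'spam', 'moderate', or 'ok' for a submitted comment."""
    lowered = text.lower()
    # Reject anything containing a blacklisted term outright.
    if any(term in lowered for term in SPAM_TERMS):
        return "spam"
    # Heavy linking is a common bot signal; hold the post for a moderator.
    links = re.findall(r"https?://\S+", text)
    if len(links) > MAX_LINKS:
        return "moderate"
    return "ok"
```

In practice a filter like this would feed the platform's own moderation queue (holding "moderate" posts as unapproved) rather than deleting anything outright.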

Google HTTPS Warning

Google is making some changes to protect users’ sensitive information online, and it could lead to your site being marked as non-secure by Google’s web browser at the end of this month.

Google released a warning that as of the end of January 2017, Chrome will mark sites without HTTPS as non-secure if they collect private information like passwords or credit cards.


“Enabling HTTPS on your whole site is important, but if your site collects passwords, payment info, or any other personal information, it’s critical to use HTTPS.”

The company has encouraged implementing HTTPS in the past by making it a (very minor) search ranking signal. Now, from the sound of the alert, the company says an entire site will need to be HTTPS if any pages collect payment or sensitive information.

Switching over to HTTPS is an easy process, but you should begin preparing to make the switch now if your site fits the criteria. Otherwise, you are likely to be flagged as non-secure in February and lose a large amount of your web traffic.
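One routine chore in the switch is upgrading hard-coded http:// links to your own pages, so they don't trigger mixed-content warnings once the site is served over HTTPS. A minimal sketch, assuming absolute internal URLs on a placeholder domain ("example.com"); a real migration would apply this across templates and stored content:

```python
# Sketch: rewrite hard-coded internal http:// links to https://.
# "example.com" is a placeholder domain for illustration.
import re


def upgrade_internal_links(html: str, domain: str = "example.com") -> str:
    """Rewrite http:// links pointing at `domain` to https://."""
    pattern = re.compile(r"http://(www\.)?" + re.escape(domain))
    # Preserve an optional "www." prefix while swapping the scheme.
    return pattern.sub(lambda m: "https://" + (m.group(1) or "") + domain, html)
```

Links to other domains are deliberately left alone, since you can't guarantee third-party sites serve HTTPS.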

Negative SEO Alert

There’s a new malicious SEO tactic making the rounds and your Google My Business listings could easily be the victim, according to web security company Sucuri. The company says individuals are sneaking inappropriate or damaging photos into GMB listings with the intent of damaging a business’s reputation and image.

What makes this type of exploit unique, however, is that it doesn’t take any hacking skills to do. Unlike other negative SEO tactics, this specific technique does not include hosting images on a client server, malicious code, or even breaking into an account.

Ultimately, the attack is taking advantage of Google’s lax rules for uploading photos to a business’s location in Google Maps. Anyone can upload images to a business’s listing, and any of these images can be used for Knowledge Graph data about the business.

While Sucuri doesn’t have evidence of this, it is possible for a person to spam a business’s listing with lewd images and then send fake hits to them to increase their perceived popularity – all with the end goal of making sure they come up when people see your business online.

How To Protect Your Listings

Unfortunately, the nature of this type of attack makes it difficult to guard against. There is no way to limit who can upload photos to your listings or determine which image gets used in Knowledge Graphs. The best you can really do is to actively keep an eye on your listings and which photos are appearing next to your listings.

You can also watch to make sure no one is uploading inappropriate pictures to your Google My Business photos. While you can’t stop people from uploading lewd images, you can easily remove any associated with your location.

Google Guaranteed Verification

After months of testing in the San Francisco area exclusively for locksmiths and plumbers, Google has officially launched the “Google guaranteed” verification process.

If your business gets “Google guaranteed”, you get a special green badge next to your business in the search results – and customers get a few perks and protections through Google.

If you tap on any of the results, you are then taken to the home service ad specifically for that business, along with some extra details about what the Google guarantee really means:

Tapping on Learn More goes even more in-depth, with a full page of details about what the Google guarantee covers and how it works:

“When you book an eligible home service pro on Google, you are protected by the Google guarantee. If you’re not satisfied with the work quality, Google may refund up to the amount paid for the job.”

What Businesses Need To Know

Currently, the Google guarantee is limited to just locksmiths and plumbers. This is because both industries have had recent issues with ad fraud and abusive advertising practices which Google is attempting to clean up.

There does not appear to be a public sign-up process for businesses hoping to be verified, and it is unclear what the verification process includes. However, this is likely to become more transparent as the verification process is extended nationwide.

What Customers Need To Know

To activate the Google guarantee, fill out this form before your first appointment. You can also call customer support at (844) 885-0761 to submit a claim or ask questions about your coverage.

Coverage

  • If you’re unhappy with the work performed, you can submit a claim and Google will cover the invoice amount up to a lifetime cap of $2,000.
  • The job must be booked through Google Home Services. Any future work completed by the same provider, unless booked through Home Services, is not covered.
  • Jobs completed before September 14, 2016, are not covered.
  • Currently only locksmith and plumbing jobs are covered.

HTTPS

It has now been two years since Google announced it would be making HTTPS a minor ranking signal, and a recent study from Moz shows just how many sites have made the switch since then.

After Google’s announcement, there was an initial surge in sites changing from HTTP to HTTPS, but many held back to assess just how important the security protocol was to the search engine and ultimately decided it wasn’t worth the risk. Google only considers HTTPS a minor factor in their ranking algorithm and there has been concern about potential risks when making the switch.

To check how far along the transition is, Dr. Pete Meyers from Moz compiled data to see just how close Google is to moving the web over to HTTPS.

Before Google started including HTTPS in its algorithm, Meyers says, only around 7% of pages featured on the first page of Google search results used the more secure protocol. A week after the change, that number had climbed to 8%. Since then, it has risen steadily, reaching over 30% this year.

Moz reports that “as of late June, our tracking data shows that 32.5% (almost one-third) of page-1 Google results now use the ‘https:’ protocol.”

However, Meyers says he is not convinced that everyone who made the switch was motivated by algorithms and ranking signals. Instead, he believes the trend is a sign that Google’s PR campaign to make HTTPS more attractive and desirable for sites is working.

Meyers also says that in another 1 to 1.5 years we are likely to see 50% of the sites on the first page of search results using HTTPS, which he predicts will lead Google to strengthen the ranking signal.

Ultimately, many site owners are still hesitant about migrating their entire sites to HTTPS, given the risks that come with site-wide changes like this. However, Meyers says it is wise to keep an eye on how many sites in your industry are using the protocol and to watch for any upcoming algorithm updates that may make HTTPS even more prominent in search results.

Ad Quality Lookback

Every year Bing highlights its efforts to keep the web safe with its “Bad Ads Report,” and this year’s edition shows that the endless war against online scammers and hackers has remained largely consistent.

Despite constant efforts to derail the malicious actors on their platform, tech support scams and purposely misleading ads remain the biggest problems on Bing Ads. The company blocked over 15 million ads for running tech support scams alone.

Overall, Bing says it has rejected over 250 million ads in the past year, as well as blocking 50,000 sites and banning 150,000 ads for breaking its guidelines.

Considering Bing’s trademark usage policies are relatively loose compared to competitors like Google, it comes as a surprise that the company says it dismissed more than 50 million ads in 2015 for trademark infringements.

The rest of the report is less surprising. Phishing attacks remain a relatively minor issue, and pharma and counterfeit goods are still being delisted by the hundreds of thousands.

Find out more from Bing’s ad report here.

Google is continuing its efforts to combat online display advertising fraud, with new defenses against a scam technique known as clickjacking.

If you’ve ever tried to press play on a video, open a link, or start a song and wound up on another page unexpectedly, clickjacking is most likely the culprit.

This is done by overlaying an essentially transparent layer over a legitimate web page. Everything looks normal, but as soon as you try to take any action, you trigger a behavior on the transparent overlay instead. The action may be used to trigger one-click orders on Amazon, take you to malware-laden sites, generate Facebook or Twitter likes, commit ad fraud, or carry out any number of other malicious behaviors.

To fight back against this, Google is removing publishers engaged in clickjacking from its network entirely. The company has also developed a new filter specifically to exclude invalid traffic on display ads from clickjacked pages on both mobile and desktop.
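Site owners can also defend their own pages against being loaded inside an attacker's invisible frame. This defense isn't part of Google's announcement; it's the widely used anti-framing response headers (X-Frame-Options and the CSP frame-ancestors directive), sketched here as a small helper:

```python
# Sketch: the standard anti-framing response headers that block
# a page from being embedded in a clickjacking overlay.
# (This defense is not from Google's post; it is the common
# X-Frame-Options / Content-Security-Policy mechanism.)


def anti_clickjacking_headers() -> dict:
    """Response headers telling browsers not to render the page in a frame."""
    return {
        # Legacy header, still honored by all major browsers.
        "X-Frame-Options": "DENY",
        # Modern equivalent via Content Security Policy.
        "Content-Security-Policy": "frame-ancestors 'none'",
    }
```

Serving these on every page means an attacker's page that tries to frame your site gets a blank frame instead of clickable content.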

In a blog post about the new efforts to fight clickjacking, Andres Ferrate, Chief Advocate of Ad Traffic Quality at Google, explained:

“When our system detects a Clickjacking attempt, we zero-in on the traffic attributed to that placement, and remove it from upcoming payment reports to ensure that advertisers are not charged for those clicks.”

Source: Robert Scoble / Flickr

Google has released its annual “bad ads” report, though they’ve changed the name a bit. Every year Google uses its bad ads report to highlight the efforts they are taking to rid AdWords of scammers, malware, and fraudulent ads. This year, they covered pretty much the same areas but chose to focus on the positive, calling its annual report the “Better Ads Report.”

This year’s report says the search giant disabled over 780 million ads last year for policy violations, up from 524 million ads disabled in 2014, and 350 million ads disabled in 2013.

Google describes “bad ads” as advertisements carrying malware, blocking the visibility of content, promoting fake or illegal goods, or leading to phishing scams. The company used a team of over 1,000 people around the world to constantly fight these ads, and the majority of the time it is able to block them before they are ever seen by regular users.

Google also went into detail, listing the most common bad ads it encountered in 2015:

  • Counterfeiters: Over 10,000 sites and 18,000 accounts were suspended for attempting to sell counterfeit goods (imitation designer watches, for example).
  • Pharmaceuticals: Over 12.5 million ads were blocked for violating Google’s healthcare and medicines policy, such as advertising pharmaceuticals that have not been approved for use or ads that made misleading claims about the effectiveness of prescription drugs.
  • Weight loss scams: Over 30,000 sites were suspended for making weight loss promises that were dishonest and typically impossible to achieve.
  • Phishing: Over 7,000 sites were blocked for attempting to steal user information, aka phishing.
  • Unwanted software: More than 10,000 sites were disabled for forcing unwanted software and unapproved downloads via Google ads.
  • Trick-to-click: Over 17 million ads were rejected for attempting to mislead users to click an ad that would redirect them to unrelated pages.
  • Bad apps: Google also blocked over 25,000 mobile apps from displaying Google ads due to policy violations. Approximately 1.4 million apps were rejected from ever being able to display Google ads in the future.

Looking forward, Google says it is going to start cracking down on ads that may lead to accidental clicks. It also says it has developed technology capable of determining when mobile ad clicks are accidental, and will be able to prevent users from being taken to ad sites they didn’t intend to visit.

Google also plans to bolster efforts to cut down on weight loss ads in 2016 by adding additional restrictions on what advertisers can say is effective for weight loss.

View the full report here.

Google is continuing its efforts to promote privacy in search by prioritizing indexing HTTPS pages over their HTTP equivalents.

In the announcement, Google explains that its long-term aim is to eventually direct users to secure webpages over a private connection. Preferring the HTTPS version of a page for indexing when an HTTP equivalent exists is the most recent move in this process, following the small rankings boost given to HTTPS pages last year.

Unlike the change to Google’s algorithm in August 2014, this move will not have any effect on rankings. Instead, it simply means that Googlebot will only index the HTTPS version of a URL when both an HTTPS and HTTP version exist.

While Google’s commitment to secure search may lead to more rankings boosts for HTTPS pages in the future, this change is mostly to improve the efficiency of Google’s current indexing process. As they explain in their announcement:

“Browsing the web should be a private experience between the user and the website, and must not be subject to eavesdropping, man-in-the-middle attacks, or data modification. This is why we’ve been strongly promoting HTTPS everywhere.”