Today is International Small Business Day, and Google is celebrating the day with a new hub full of marketing resources for small business owners and marketers.

Many of the tools and features included were developed as part of the ‘Google for Small Business’ initiative and were created using feedback from small business owners around the world. 

In the announcement, a Product Management Director at Google described the initiative as a personal effort to help business owners save time and grow more effectively:

“I’ve had the opportunity to get to know many small business owners and the challenges they face. Most of them tell me that they need help saving time at work, or that they need easy tools to help them promote their business.”

In the new Google for Small Business hub, you will find:

  • Personalized Plans: By answering a few quick questions about your business and your current goals, you will receive a customized, step-by-step plan you can follow to accomplish them.
  • In-Person Workshops: Stay up-to-date with any upcoming free Grow with Google workshops happening in your local area.
  • Latest News: Stay updated with the latest news about tools and services for small businesses.

A recent Wall Street Journal investigation has landed Google once again in the hot seat as the report claims Google Maps is filled with millions of fake business listings. 

Over the course of the article, reporters say they found some Maps search result pages where more than half of the local results included fraudulent or misleading information characteristic of a fake listing.  

For example:

“A search for plumbers in a swath of New York City found 13 false addresses out of the top 20 Google search results. Only two of the 20 are located where they say and accept customers at their listed addresses, requirements for pushpin listings on Google Maps.”

In some cases, the fake listings are simply phantom businesses that serve no real purpose or exist only to misdirect customers. However, the Journal believes others are designed to scam potential customers out of large amounts of money. 

As you would expect, all of these practices are expressly forbidden by Google, but the Wall Street Journal says the policy is poorly enforced. 

In fact, the report says hundreds of thousands of fake listings are appearing monthly:

“Hundreds of thousands of false listings sprout on Google Maps each month, according to experts. Google says it catches many others before they appear.”

How This Hurts Businesses

The fake listings do more than cause consumers unnecessary frustration or potentially scam customers. They also hurt legitimate businesses that are pushed out of the top search results by fraudulent listings.

Getting your business into the organic local results without paying for ads is already a gamble that can involve hours of hard work optimizing your website and listing. Adding fake competition just makes the arena even more competitive and encourages more businesses to spend money on local ads instead. 

How Google Fights Fake Listings

Google openly acknowledges that it has an issue with fake business listings, though the company says it is already taking extensive steps to fight back. 

In an article on the company’s blog, Google explained:

“It’s a constant balancing act and we’re continually working on new and better ways to fight these scams using a variety of ever-evolving manual and automated systems. But we can’t share too many details about these efforts without running the risk of actually helping scammers find new ways to beat our systems—which defeats the purpose of all the work we do.”

Specifically, the search engine says it has removed more than 3 million fake business profiles over the past year – 90% of which were removed before they could ever be seen by users. 

Approximately 85% of these profiles were removed by Google’s automated internal systems, while around 250,000 fake business listings were reported by users and then removed. 

Google may be making significant efforts to fight the problem of fake business listings, but The Wall Street Journal makes it clear there is still much to be done.

Facebook is making some changes to how it handles comments in its algorithm to better promote real discussion.

Everyone knows that Facebook uses an algorithm to help sort which posts get shown to users, but you may not be aware that the social network uses a similar system to help rank comments.

With the new update, the company says it will do a better job of highlighting comments with specific “positive” quality signals, while demoting low-quality comments.

Comment Quality Signals

According to the new announcement, Facebook will be using four types of signals to analyze comments:

  1. Integrity Signals
  2. User Indicated Preferences
  3. User Interaction Signals
  4. Moderation Signals

Integrity Signals

Facebook’s “Integrity Signals” are designed to assess the authenticity of comments. Specifically, the system will be looking to see whether comments violate community standards or qualify as “engagement bait.”

Engagement bait is a practice that involves explicitly encouraging users to react, like, share, subscribe, or take any other action in exchange for something else. This can even be something as innocuous as asking followers to do push-ups.

User Indicated Preferences

User Indicated Preferences are established through Facebook’s direct polling of users. By doing this, the social network is able to directly ask users what they want to see in comments and what they think promotes real discussion.

User Interaction Signals

These are fairly self-explanatory. User Interaction Signals are indications of whether a user has interacted with a post.

Moderation Signals

Moderation Signals are based on whether other users choose to hide or delete comments made on their post. Facebook explains this practice in a bit more detail, saying:

“People can moderate the comments on their post by hiding, deleting, or engaging with comments.

Ranking is on by default for Pages and people with a lot of followers, but Pages and people with a lot of followers can choose to turn off comment ranking.

People who don’t have as many followers will not have comment ranking turned on automatically since there are less comments overall, but any person can decide to enable comment ranking by going to their settings.”

Why Facebook Ranks Comments

As with Facebook’s post ranking algorithms, the primary goal of Facebook’s new comment algorithm update is to promote the best quality content within people’s feeds while hiding spammy or low-quality content. As the company says in its announcement:

“To improve relevance and quality, we’ll start showing comments on public posts more prominently when:

  • The comments have interactions from the Page or person who originally posted; or

  • The comments or reactions are from friends of the person who posted.”

You can read the full announcement from Facebook here.

Thanks to its high level of adaptability, JavaScript (JS) has been in use in some shape or form for more than 20 years and remains one of the most popular programming languages used to build websites.

However, Google’s Martin Splitt, a webmaster trends analyst, recently suggested that webmasters should begin moving away from the language if they want new content to appear in search results as quickly as possible.

In an SEO Mythbusting video exploring the topic of web performance and search engine optimization, Splitt and Ada Rose Cannon of Samsung found themselves talking about JavaScript.

Specifically, they discussed how using too much JS can drag down a site’s performance and potentially hold it back in Google’s search index.

How JavaScript Holds Content Back

One of the biggest issues with overusing JS arises when sites publish content on a daily basis.

Google uses a two-pass indexing process to help verify content before it is added to the search index. In the case of a JavaScript-heavy page, Google first renders the non-JS elements like HTML and CSS. Then, the page gets put into a queue for more advanced crawling to render the rest of the content as processing resources are available.

This means JavaScript-heavy pages may not be completely crawled and indexed for up to a week after being published.

For time-sensitive information, this can be the difference between being on the cutting-edge and getting left behind.

What You Can Do Instead

Splitt offers a few different techniques developers can use to ensure their site is being efficiently crawled and indexed as new content is published.

One way to get around the issue would be to use dynamic rendering, which provides Google with a static, pre-rendered version of your page – saving the search engine the time and effort of rendering the page itself.
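
For illustration, here is a minimal sketch of one way dynamic rendering can be set up with Node and Express. It is an assumption-laden example, not Splitt’s or Google’s own implementation: the crawler list, the www.example.com domain, and the prerender.example.com rendering service are all placeholders for whatever tools you actually use.

```javascript
const express = require('express');
const fetch = require('node-fetch'); // assumed dependency for proxying requests

const app = express();

// A few common crawler user agents -- not an exhaustive list.
const BOT_AGENTS = /googlebot|bingbot|yandexbot|duckduckbot/i;

// Placeholder for a pre-rendering service (e.g. headless Chrome) that returns
// a static HTML snapshot of a fully rendered page.
const PRERENDER_URL = 'https://prerender.example.com/render';

app.use(async (req, res, next) => {
  const userAgent = req.headers['user-agent'] || '';
  if (!BOT_AGENTS.test(userAgent)) {
    return next(); // regular visitors get the normal JavaScript-driven page
  }
  // Crawlers get a static snapshot, so Google doesn't have to render the JS itself.
  const pageUrl = `https://www.example.com${req.originalUrl}`;
  const snapshot = await fetch(`${PRERENDER_URL}?url=${encodeURIComponent(pageUrl)}`);
  res.set('Content-Type', 'text/html');
  res.send(await snapshot.text());
});

// Everything below serves the regular client-side rendered app.
app.use(express.static('public'));
app.listen(3000);
```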

The best course of action, though, would be to simply rely primarily on HTML and CSS for time-sensitive content.
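
As a quick sketch of what that looks like in practice, the hypothetical route below renders the latest headlines directly into the HTML response, so they are visible to Google on its first (HTML) pass rather than after the deferred JavaScript render. The getLatestArticles() helper is a stand-in for however your CMS or database exposes new content.

```javascript
const express = require('express');
const app = express();

// Stand-in for a real CMS or database query (assumption for illustration).
async function getLatestArticles() {
  return [{ title: 'Example headline', url: '/articles/example' }];
}

app.get('/news', async (req, res) => {
  const articles = await getLatestArticles();
  // Build the list server-side so the headlines live in the initial HTML,
  // not in a client-side fetch that Google has to render later.
  const items = articles
    .map(a => `<li><a href="${a.url}">${a.title}</a></li>`)
    .join('');
  res.send(`<!doctype html>
    <html>
      <head><title>Latest News</title></head>
      <body><h1>Latest News</h1><ul>${items}</ul></body>
    </html>`);
});

app.listen(3000);
```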

Splitt takes time to explain that JavaScript is not inherently bad for your SEO or search rankings. Once they are indexed, JS-heavy sites “rank just fine.” The issue is ensuring content is crawled and indexed as quickly and efficiently as possible, so you can always be on the cutting edge.

The conversation gets pretty technical, but you can view the entire discussion in the full video below:

Google is in the process of rolling out a significant broad core update to its search algorithm, which appears to be having a big impact on search results.

The company announced the update on June 2nd, the day before it began rolling out. This raised some eyebrows at the time because Google generally doesn’t notify the public about algorithm updates beforehand, if at all.

As Danny Sullivan from Google explained recently, the only reason they decided to talk about the update is that it would be “definitely noticeable.”

While the update is seemingly still rolling out, the early indications are that the effects of this update certainly are noticeable and could have a big impact on your site’s performance.

What Does This Mean For You?

Unfortunately, Google is never too keen to go into the specifics of its algorithm updates, and it is too early to definitively tell what this one has changed.

All that is clear from reports around the web is that the update has caused a seemingly dramatic shift for sites affected by previous Google algorithm updates. Some are reporting massive recoveries and improved traffic, while others are saying their rankings have tanked over the past week.

What Does Google Say To Do?

Oddly enough, Google has provided a little bit of guidance with this latest update, though it may not be what you want to hear.

The company says to essentially do nothing because there is nothing to “fix.”

Some experts within Google have also suggested results may normalize somewhat in the coming weeks as the search engine releases further tweaks and updates.

In the meantime, the best course of action is to monitor your website analytics and watch Google Search Console for notifications or big changes.

If you do see a major shakeup, you might wait to see if your site recovers within the coming days, or conduct an assessment to evaluate what your site could do better for both search engines and potential customers.

A new study suggests that although high-ranking sites may be well optimized for search engines, they are failing to make themselves accessible to a large number of actual people – specifically, those with visual impairments.

The study from Searchmetrics used Google Lighthouse to test the technical aspects of sites ranking on Google. Unsurprisingly, it showed that high-ranking websites were largely fast, built with the latest web technologies, and relatively secure.

However, the analysis revealed that these high-ranking websites were lagging behind when it came to accessibility for those with disabilities.

Based on scores from Google’s own tools, the average overall score for accessibility for sites appearing in the top 20 positions on the search engine was 66.6 out of 100.

That is the lowest score of the four ranking categories analyzed in the study.

Google’s Lighthouse accessibility score analyzes a number of issues that many users never notice but that are hugely important for those with disabilities or impairments – such as color contrast and the presence of alt tags that provide context and meaning for visual elements.
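
If you want to check where your own site stands, Lighthouse can also be run programmatically. The sketch below is a minimal example assuming the lighthouse and chrome-launcher npm packages and a placeholder www.example.com URL; it simply pulls the accessibility category score the Searchmetrics study refers to.

```javascript
const chromeLauncher = require('chrome-launcher');
const lighthouse = require('lighthouse');

(async () => {
  // Launch headless Chrome for Lighthouse to drive.
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });

  // Audit only the accessibility category (color contrast, alt text, etc.).
  const result = await lighthouse('https://www.example.com', {
    port: chrome.port,
    onlyCategories: ['accessibility'],
    output: 'json',
  });

  // Lighthouse reports category scores from 0 to 1; multiply by 100 to match
  // the 0-100 scale cited in the study.
  console.log('Accessibility score:',
    result.lhr.categories.accessibility.score * 100);

  await chrome.kill();
})();
```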

As Daniel Furch, director of marketing EMEA at Searchmetrics, explains, this can be a major issue for sites that are otherwise performing very well on search engines:

“If you don’t make your site easily accessible to those with disabilities, including those with impaired vision, you cut yourself off from a large group of visitors.

Not only is it ethically a good idea to be inclusive, but also obviously you could be turning away potential customers. And some sites have even faced lawsuits for failing on this issue.”