Facebook is making some changes to how it handles comments in its algorithm to better promote real discussion.

Everyone knows that Facebook uses an algorithm to help sort which posts get shown to users, but you may not be aware that the social network uses a similar system to help rank comments.

With the new update, the company says it will do a better job of highlighting comments with specific “positive” quality signals, while demoting low-quality comments.

Comment Quality Signals

According to the new announcement, Facebook will be using four types of signals to analyze comments:

  1. Integrity Signals
  2. User Indicated Preferences
  3. User Interaction Signals
  4. Moderation Signals

Integrity Signals

Facebook’s “Integrity Signals” are designed to assess the authenticity of comments. Specifically, Facebook will look at whether comments violate its Community Standards or qualify as “engagement bait.”

Engagement bait is the practice of explicitly encouraging users to react, like, share, subscribe, or take some other action in exchange for something else. This can even be something as innocuous as asking followers to do push-ups.

User Indicated Preferences

User Indicated Preferences are established through Facebook’s direct polling of users. By doing this, the social network is able to directly ask users what they want to see in comments and what they think promotes real discussion.

User Interaction Signals

These are fairly self-explanatory. User Interaction Signals indicate whether a user has interacted with a post.

Moderation Signals

Moderation Signals are based on whether other users choose to hide or delete comments made on their post. Facebook explains this practice in a bit more detail, saying:

“People can moderate the comments on their post by hiding, deleting, or engaging with comments.

Ranking is on by default for Pages and people with a lot of followers, but Pages and people with a lot of followers can choose to turn off comment ranking.

People who don’t have as many followers will not have comment ranking turned on automatically since there are less comments overall, but any person can decide to enable comment ranking by going to their settings.”

Why Facebook Ranks Comments

As with its post ranking algorithms, the primary goal of Facebook’s new comment ranking update is to promote the best quality content within people’s feeds while hiding spammy or low-quality content. As the company says in its announcement:

“To improve relevance and quality, we’ll start showing comments on public posts more prominently when:

  • The comments have interactions from the Page or person who originally posted; or

  • The comments or reactions are from friends of the person who posted.”

You can read the full announcement from Facebook here.

Thanks to its high level of adaptability, JavaScript (JS) has been in use in some shape or form for more than 20 years and remains one of the most popular programming languages used to build websites.

However, Google’s Martin Splitt, a webmaster trends analyst, recently suggested that webmasters move away from the language when they want content to rank as quickly as possible on search engines.

In an SEO Mythbusting video exploring the topic of web performance and search engine optimization, Splitt and Ada Rose Cannon of Samsung found themselves talking about JavaScript.

Specifically, they discussed how using too much JS can drag down a site’s performance and potentially hurt its standing in Google’s search index.

How JavaScript Holds Content Back

One of the biggest issues with overusing JS arises when sites publish content on a daily basis.

Google uses a two-pass indexing process to help verify content before it is added to the search index. In the case of a JavaScript-heavy page, Google first processes the non-JS elements like HTML and CSS. The page is then put into a queue, and the remaining JS-driven content is rendered and crawled as processing resources become available.
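
To illustrate what ends up waiting for that second pass, here is a minimal sketch of a page that injects its content with client-side JavaScript. The /api/latest-post endpoint and the element ID are hypothetical, chosen only for illustration:

```typescript
// Minimal client-side rendering sketch. The initial HTML contains only an
// empty <div id="post">, so Google's first pass sees no article text; the
// content only exists after this script runs during the second, rendered pass.
async function renderLatestPost(): Promise<void> {
  // Hypothetical JSON endpoint returning { title: string; body: string }
  const response = await fetch("/api/latest-post");
  const post: { title: string; body: string } = await response.json();

  const container = document.getElementById("post");
  if (container) {
    container.innerHTML = `<h1>${post.title}</h1><p>${post.body}</p>`;
  }
}

void renderLatestPost();
```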

This means JavaScript-heavy pages may not be completely crawled and indexed for up to a week after being published.

For time-sensitive information, this can be the difference between being on the cutting-edge and getting left behind.

What You Can Do Instead

Splitt offers a few different techniques developers can use to ensure their site is being efficiently crawled and indexed as new content is published.

One way to get around the issue is to use dynamic rendering, which serves Google a static, pre-rendered version of your page, saving it the time and effort of rendering the page itself.
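
As a rough sketch of the idea, assuming an Express server and a hypothetical renderToStaticHtml helper (real setups typically hand this step to a tool like Rendertron or Puppeteer), dynamic rendering might look like this:

```typescript
import express from "express";

const app = express();

// Simple user-agent check for known crawlers; production setups rely on
// maintained bot lists rather than a short regex like this one.
const BOT_PATTERN = /googlebot|bingbot|baiduspider/i;

// Hypothetical helper standing in for a headless-Chrome renderer
// (e.g. Puppeteer or Rendertron) that returns fully rendered HTML.
async function renderToStaticHtml(url: string): Promise<string> {
  return `<html><body><h1>Pre-rendered view of ${url}</h1></body></html>`;
}

app.get("*", async (req, res, next) => {
  const userAgent = req.headers["user-agent"] ?? "";
  if (BOT_PATTERN.test(userAgent)) {
    // Crawlers receive static HTML, so no JavaScript needs to execute
    // before the content can be crawled and indexed.
    res.send(await renderToStaticHtml(req.originalUrl));
  } else {
    // Regular visitors fall through to the normal JS-driven app.
    next();
  }
});

app.listen(3000);
```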

The best course of action, though, would be to simply rely primarily on HTML and CSS for time-sensitive content.

Splitt takes time to explain that JavaScript is not inherently bad for your SEO or search rankings. Once they are indexed, JS-heavy sites “rank just fine.” The issue is ensuring content is crawled and indexed as quickly and efficiently as possible, so you can always be on the cutting edge.

The discussion gets pretty technical, but you can watch the entire exchange in the full video below:

Google is in the process of rolling out a significant broad core update to its search algorithm, which appears to be having a big impact on search results.

The company announced the update on June 2nd, the day before it began rolling out. This raised some eyebrows at the time because Google generally doesn’t notify the public about algorithm updates beforehand, if at all.

As Danny Sullivan from Google explained recently, the only reason they decided to talk about the update is that it would be “definitely noticeable.”

While the update is seemingly still rolling out, the early indications are that the effects of this update certainly are noticeable and could have a big impact on your site’s performance.

What Does This Mean For You?

Unfortunately, Google is never too keen to go into the specifics of its algorithm updates, and it is too early to tell definitively what this one has changed.

All that is clear from reports around the web is that the update has caused a seemingly dramatic shift for sites affected by previous Google algorithm updates. Some are reporting massive recoveries and improved traffic, while others are saying their rankings have tanked over the past week.

What Does Google Say To Do?

Oddly enough, Google has provided a little bit of guidance with this latest update, though it may not be what you want to hear.

The company says to essentially do nothing because there is nothing to “fix.”

Some experts within Google have also suggested results may normalize somewhat in the coming weeks as the search engine releases further tweaks and updates.

In the meantime, the best course of action is to monitor your website analytics and watch Google Search Console for notifications or big changes.

If you do see a major shakeup, you might watch to see if your rankings recover within the coming days, or conduct an assessment of what your site could do better for both search engines and potential customers.

A new study suggests that although top-ranking sites may be well optimized for search engines, they are failing to make their pages accessible to a large number of actual people, specifically those with visual impairments.

The study from Searchmetrics used Google Lighthouse to test the technical aspects of sites ranking on Google. Unsurprisingly, it showed that high-ranking websites were largely fast, built with the latest web technologies, and relatively secure.

However, the analysis revealed that these high-ranking websites were lagging behind when it came to accessibility for those with disabilities.

Based on scores from Google’s own tool, the average accessibility score for sites appearing in the top 20 positions on the search engine was 66.6 out of 100.

That is the lowest score of the four ranking categories analyzed in the study.

Google’s Lighthouse accessibility score checks a number of issues that many users never notice but that are hugely important for those with disabilities or impairments, such as color contrast and the presence of alt text to provide context for visual elements.
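
For reference, this is roughly how a site owner could pull the same 0–100 accessibility score for their own pages. It is a sketch, not the study’s exact methodology, and it assumes the lighthouse and chrome-launcher npm packages are installed:

```typescript
import lighthouse from "lighthouse";
import * as chromeLauncher from "chrome-launcher";

async function accessibilityScore(url: string): Promise<number | null> {
  // Launch a headless Chrome instance for Lighthouse to drive.
  const chrome = await chromeLauncher.launch({ chromeFlags: ["--headless"] });
  try {
    const result = await lighthouse(url, {
      port: chrome.port,
      onlyCategories: ["accessibility"],
      output: "json",
    });
    // Lighthouse reports category scores on a 0-1 scale; scale to 0-100
    // to match the figures cited in the study.
    const score = result?.lhr.categories.accessibility.score;
    return score == null ? null : Math.round(score * 100);
  } finally {
    await chrome.kill();
  }
}

accessibilityScore("https://example.com").then((score) =>
  console.log(`Accessibility score: ${score ?? "n/a"}`)
);
```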

As Daniel Furch, director of marketing EMEA at Searchmetrics, explains, this can be a major issue for sites that are otherwise performing very well on search engines:

“If you don’t make your site easily accessible to those with disabilities, including those with impaired vision, you cut yourself off from a large group of visitors.

Not only is it ethically a good idea to be inclusive, but also obviously you could be turning away potential customers. And some sites have even faced lawsuits for failing on this issue.”