Facebook has announced sweeping changes to its news feed and the way it handles groups or pages that violate the company’s content policies.
The changes, which include a new algorithm signal, are aimed at reducing the reach of misinformation by judging the authority of the sites that content comes from.
If Facebook determines that the site behind content shared on the platform is not reputable, it will reduce that content's reach in the news feed so fewer people see it.
How Facebook is Changing its Algorithm
In the past, Facebook has teamed up with highly respected organizations like the Associated Press to validate sites spreading content across the platform.
Now, the company says it is introducing a “click-gap” metric designed to automatically evaluate the inbound and outbound linking patterns of a site to judge if it is authoritative.
Essentially, the click-gap signal compares a site's inbound and outbound linking patterns to determine whether the clicks it receives on Facebook are out of proportion to its popularity across the rest of the internet. This lets the company distinguish the artificial spread of content from organic virality.
As Facebook explains in the announcement:
“This new signal, Click-Gap, relies on the web graph, a conceptual “map” of the internet in which domains with a lot of inbound and outbound links are at the center of the graph and domains with fewer inbound and outbound links are at the edges.
Click-Gap looks for domains with a disproportionate number of outbound Facebook clicks compared to their place in the web graph. This can be a sign that the domain is succeeding on News Feed in a way that doesn’t reflect the authority they’ve built outside it and is producing low-quality content.”
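Facebook has not published the actual formula behind Click-Gap, but the basic idea can be illustrated with a rough sketch. The function below is purely hypothetical (none of the names, numbers, or thresholds come from Facebook); it compares a domain's share of clicks on the platform with its share of links in the wider web graph and returns a score where a large gap suggests the domain is getting more Facebook traffic than its off-platform authority would justify.

```python
# Hypothetical illustration of a "click-gap" style signal.
# None of these names, numbers, or thresholds come from Facebook;
# they only sketch the idea of comparing on-platform clicks to
# a domain's prominence in the wider web graph.

def click_gap_score(platform_clicks: int,
                    total_platform_clicks: int,
                    inbound_links: int,
                    outbound_links: int,
                    total_web_links: int) -> float:
    """Return the ratio of a domain's click share on the platform
    to its link share across the web; higher means a bigger gap."""
    click_share = platform_clicks / max(total_platform_clicks, 1)
    # A domain's "place in the web graph" is approximated here by its
    # share of inbound and outbound links (an assumption for illustration).
    link_share = (inbound_links + outbound_links) / max(total_web_links, 1)
    return click_share / max(link_share, 1e-9)


# Example: a domain drawing 2% of platform clicks but holding only
# 0.01% of web links scores 200 -- a large click gap -- while a
# well-linked site whose click share matches its link share scores ~1.
suspicious = click_gap_score(2_000_000, 100_000_000, 500, 300, 8_000_000)
established = click_gap_score(2_000_000, 100_000_000, 80_000, 80_000, 8_000_000)
print(f"suspicious: {suspicious:.1f}, established: {established:.1f}")
```

In this sketch, a domain whose Facebook click share far exceeds its link share would be a candidate for reduced news feed distribution, which is the behavior Facebook describes in its announcement.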
Changes to Groups
Notably, this new algorithmic signal isn't just being applied to the news feed. The company explained it will also use these algorithms to automatically remove low-quality content posted in groups, including private groups.
The company defended the decision by saying it can now identify and remove harmful groups, whether they are public, closed, or secret.
“We can now proactively detect many types of violating content posted in groups before anyone reports them and sometimes before few people, if any, even see them.”
Admins are Required to Police Content
Along with these changes, Facebook clarified that its algorithms will consider which posts a group's admins approve when determining whether the group is harmful and eligible for removal.
The company says it will close down groups if an admin regularly approves content that is false, misleading, or against Facebook’s content guidelines.
This is how Facebook explained the new policy:
“Starting in the coming weeks, when reviewing a group to decide whether or not to take it down, we will look at admin and moderator content violations in that group, including member posts they have approved, as a stronger signal that the group violates our standards.”
What This Means for You
As long as the pages you participate in or run share content from reliable sources, the new policies should have little effect on your day-to-day operations. However, the changes could have a considerable impact on brands or influencers who rely on sources that contradict mainstream science or otherwise fall outside what Facebook considers authoritative. These kinds of accounts have flourished on the platform for years, but they may soon face a reckoning if Facebook's new content guidelines are as strict as they sound.