
Facebook has announced sweeping changes to its news feed and the way it handles groups or pages that violate the company’s content policies.

The new changes, including a new algorithm signal, are aimed at reducing the reach of sites that spread misinformation by judging the authority of the sites the content comes from.

If Facebook decides the site behind content shared on the platform is not reputable, it will reduce that content’s news feed reach, cutting the number of people who see it.

How Facebook is Changing its Algorithm

In the past, Facebook has teamed up with highly respected organizations like the Associated Press to validate sites spreading content across the platform.

Now, the company says it is introducing a “click-gap” metric designed to automatically evaluate the inbound and outbound linking patterns of a site to judge if it is authoritative.

Essentially, the Click-Gap signal compares how often a domain is clicked on Facebook with how popular that domain is across the rest of the internet. This allows the company to distinguish the forced spread of content from organic virality.

As Facebook explains in the announcement:

“This new signal, Click-Gap, relies on the web graph, a conceptual “map” of the internet in which domains with a lot of inbound and outbound links are at the center of the graph and domains with fewer inbound and outbound links are at the edges.

Click-Gap looks for domains with a disproportionate number of outbound Facebook clicks compared to their place in the web graph. This can be a sign that the domain is succeeding on News Feed in a way that doesn’t reflect the authority they’ve built outside it and is producing low-quality content.”
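Facebook has not published how Click-Gap is actually calculated, but the underlying idea of comparing a domain’s Facebook traffic to its standing on the wider web can be sketched in a few lines. The snippet below is purely illustrative: the inputs (a count of outbound Facebook clicks and a web-graph popularity estimate) and the example numbers are hypothetical, not part of Facebook’s real system.

```python
# Illustrative sketch only: Facebook has not disclosed the Click-Gap formula.
# We assume two hypothetical inputs per domain: an estimate of its popularity
# in the wider web graph, and the volume of outbound clicks it gets from
# Facebook's news feed.

def click_gap_score(facebook_clicks: int, web_graph_popularity: float) -> float:
    """Return the ratio of Facebook-driven clicks to the domain's broader
    popularity. A high value suggests the domain is getting far more news
    feed traffic than its standing on the open web would predict."""
    # Guard against division by zero for domains with almost no web presence.
    return facebook_clicks / max(web_graph_popularity, 1.0)

# Hypothetical example: a fringe site with little presence on the wider web
# but heavy Facebook traffic scores far higher than an established outlet.
fringe = click_gap_score(facebook_clicks=50_000, web_graph_popularity=200)
established = click_gap_score(facebook_clicks=50_000, web_graph_popularity=80_000)
print(f"fringe: {fringe:.1f}, established: {established:.1f}")
```

In this toy example the fringe site’s score is hundreds of times higher than the established outlet’s, which is the kind of gap the signal is described as looking for.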

Changes to Groups

Notably, this new algorithmic signal isn’t just being applied to news feeds. The company explained it will also be using these algorithms to automatically remove low-quality content posted in groups, including private groups.

The company defended the decision by saying it can now identify and remove harmful groups, whether they are public, closed, or secret.

“We can now proactively detect many types of violating content posted in groups before anyone reports them and sometimes before few people, if any, even see them.”

Admins are Required to Police Content

Along with these changes, Facebook clarified that its algorithms will consider which posts a group’s admins approve when determining whether the group is harmful and eligible for removal.

The company says it will close down groups if an admin regularly approves content that is false, misleading, or against Facebook’s content guidelines.

This is how Facebook explained the new policy:

“Starting in the coming weeks, when reviewing a group to decide whether or not to take it down, we will look at admin and moderator content violations in that group, including member posts they have approved, as a stronger signal that the group violates our standards.”

What This Means for You

As long as the pages you participate in or run share content from reliable sources, the new policies should have little effect on your day-to-day operations. However, the changes could have a considerable impact on brands or influencers who go against mainstream science or rely on other non-approved sources. These types of accounts have flourished on the platform for years, but they may soon face a reckoning if Facebook’s new content guidelines are as strict as they sound.

If you’ve spent much time trying to promote your business on Facebook, you’ve probably noticed that the social platform isn’t exactly known for transparency.

There have long been questions about what exactly you can and can’t post, which made it all the more frustrating that there was no way to appeal if Facebook decided to remove your content for violating its hidden guidelines.

That is beginning to change, however. Likely thanks to months of criticism and controversy over Facebook’s lack of transparency and its reckless handling of users’ data, the company has been making several big changes to increase transparency and regain people’s trust.

The latest move in this direction is the release of Facebook’s entire Community Standards guidelines to the public for the first time in the company’s history.

These guidelines have been used internally for years to moderate comments, messages, and images posted by users for inappropriate content. A portion of the Community Standards was also leaked last year by The Guardian.

The 27-page set of guidelines covers a wide range of topics, including bullying, violent threats, self-harm, nudity, and many others.

“These are issues in the real world,” Monika Bickert, head of global policy management at Facebook, told a room full of reporters. “The community we have using Facebook and other large social media mirrors the community we have in the real world. So we’re realistic about that. The vast majority of people who come to Facebook come for very good reasons. But we know there will always be people who will try to post abusive content or engage in abusive behavior. This is our way of saying these things are not tolerated. Report them to us, and we’ll remove them.”

The guidelines also apply to every country where Facebook is currently available. As such, the guidelines are available in more than 40 languages.

The rules also apply to Facebook’s sister services like Instagram; however, there are some tweaks across the different platforms. For example, Instagram does not require users to share their real name.

In addition to this release, Facebook is also introducing plans for an appeals process for incorrect takedowns. This will give the company a way to restore content that turns out to be appropriate once the context surrounding the images is considered.

If your content gets removed, Facebook will now personally notify you through your account. From there, you can choose to request a review, which will be conducted within 24 hours. If Facebook decides the takedown was enacted incorrectly, it will restore the post and notify you of the change.

Facebook has long been the favorite social media platform for sharing content, but if a report from the New York Times is any indication, content creators may soon be looking for a new platform that lets them share their content while still attracting users to their own websites.

According to the report, Facebook may be considering hosting linked content directly on its own site, and serving ads on that content, rather than linking out to content creators’ sites. Not only does this mean a drop in traffic from Facebook users, but the change could also cause site owners to lose revenue from declining traffic and ads on their own sites.

The change is supposedly going to be limited to mobile devices, but it has already stirred up quite a controversy with content creators and marketers.

Facebook seems to believe the change could be more convenient for users, but those who create content see it as something closer to content syndication or even content theft. Whatever the convenience to users, many content creators depend on revenue from page views and ads, which would be significantly impacted if Facebook ends up hosting their content.

In the wake of the controversy, Facebook has even opened a discussion on the possibility of sharing revenue with websites that own the content being hosted. The New York Times described Facebook’s profit-sharing proposal:

Facebook hopes it has a fix for all that. The company has been on something of a listening tour with publishers, discussing better ways to collaborate. The social network has been eager to help publishers do a better job of servicing readers in the News Feed, including improving their approach to mobile in a variety of ways. One possibility it mentioned was for publishers to simply send pages to Facebook that would live inside the social network’s mobile app and be hosted by its servers; that way, they would load quickly with ads that Facebook sells. The revenue would be shared.

That kind of wholesale transfer of content sends a cold, dark chill down the collective spine of publishers, both traditional and digital insurgents alike. If Facebook’s mobile app hosted publishers’ pages, the relationship with customers, most of the data about what they did and the reading experience would all belong to the platform. Media companies would essentially be serfs in a kingdom that Facebook owns.

The real question appears to be whether there will be an opt-out option. There has been no mention of an opt-out or of hosting being optional. Even if it is left up to the publisher, the change could still hurt content creators who choose to keep hosting content on their own pages, as content hosted on the social platform is likely to look more attractive and convenient to users.