For the first time ever, Facebook is revealing the most clicked and most viewed pages, posts, and more across the site in a new quarterly Widely Viewed Content Report.

The lists specifically focus on the pages, domains, links, and posts that got the most views in the U.S. between April 1, 2021, and June 30, 2021.

Here’s what the report tells us:

Overall Takeaways from Facebook’s Widely Viewed Content Report

Before we get into the more detailed lists, the report also gives us some surprising takeaways about content on Facebook:

  • The most viewed content is not necessarily the content that gets the most engagement.
  • More than half (57%) of posts that people see come from their family and friends. 
  • Less than 13% of content views were on posts containing links.
  • Despite the perception that news sources dominate the platform, the most viewed news domains accounted for just 0.31% of all content views.
  • However, approximately a quarter of the most viewed posts that included links came from the most viewed news publishers.

Most Viewed Domains

Facebook’s Widely Viewed Content Report lists the top 20 domains on the platform by content views. Below, we are sharing the top 10:

  1. youtube.com (181.3M views)
  2. amazon.com (134.6M views)
  3. unicef.org (134.4M views)
  4. gofundme.com (124.8M views)
  5. twitter.com (116.1M views)
  6. media1.tenor.co (115.6M views)
  7. m.tiktok.com (110.7M views)
  8. open.spotify.com (93.0M views)
  9. playeralumniresources.com (89.9M views)
  10. abcnews.go.com (88.1M views)

Most Viewed Links

The most viewed links are a surprising and often confusing mishmash of landing pages, videos, store pages, news articles, and more. Here are the top 10 most viewed links on Facebook:

  1. https://www.playeralumniresources.com/ (87.2M views)
  2. https://purehempshop.com/collections/all (72.1M views)
  3. https://www.unicef.org/coronavirus/unicef-responding-covid-19-india (62.7M views)
  4. https://myincrediblerecipes.com/ (58.9M views)
  5. https://reppnforchrist.com/ (51.6M views)
  6. http://www.yahoo.com/ (51.0M views)
  7. https://64.media.tumblr.com/2d32d91bcdfa6e17f18df90f1fada473/6094b00761d82f16-76/s400x600/f0383899ecb1484b10e3420a368d871d7dc68f91.gifv (49.1M views)
  8. https://stevefmvirginia.iheart.com/ (48.2M views) 
  9. https://www.londonedge.com/index.html (44.3M views)
  10. https://subscribe.theepochtimes.com/p/?page=email-digital-referral (44.2M views)

Most Viewed Pages

The most viewed pages give a glimpse into those who are driving the most engagement and building the most connected audience:

  1. Unicef (153.2M views)
  2. Kitchen Fun With My 3 Sons (112.3M views)
  3. Sassy Media (109.5M views)
  4. The Dodo (104.5M views)
  5. LADbible (104.4M views)
  6. Woof Woof (104.1M views)
  7. A Woman’s Soul (98.3M views)
  8. 3am Thoughts (92.1M views)
  9. Lori Foster (89.5M views)
  10. World Health Organization (WHO) (88.9M views)

Most Viewed Posts

While the full report includes the top 20 posts from the platform, we aren’t going to share them here. The collection is largely made up of simple text posts with an image, some bordering on spam. The third most viewed post has since been deleted or made private. If anything, this section reveals that Facebook doesn’t require intricately constructed content to go viral. All it takes is knowing your audience and motivating them to respond.


As you might expect from all of this, the reaction to the report has been mixed (at best).

It is certainly interesting to see exactly what pages and content are getting the most traction across Facebook, but it doesn’t exactly paint the most impressive picture.

For better or worse, however, this is what has been most widely viewed on Facebook in the U.S. this quarter.

For the full report, click here.

Facebook is making major changes to its news feeds in a new bid to create a better experience for users in the near future. Before it can do so, though, the company is seeking feedback from users.

As the company recently announced, it is revamping parts of the news feed system to gather four specific types of user feedback to better understand content. In the future, Facebook intends to use this information to create new ranking signals that directly decide what content users see.

Specifically, the company says it aims to gather answers to these four questions to get better at providing quality content in the future:

Is This Post Inspirational?

Facebook’s feeds have a bad reputation for highlighting negative content which can turn into a feedback loop of endless “doom scrolling.” With this in mind, the social network is looking to deliver more inspirational or uplifting content for users.

As the announcement says:

“To this end, we’re running a series of global tests that will survey people to understand which posts they find inspirational. We’ll incorporate their responses as a signal in News Feed ranking, with the goal of showing people more inspirational posts closer to the top of their News Feed.”

Is This Content Interesting?

Perhaps the most important factor for users scrolling through content is whether any of it is actually interesting to them. At times, it can feel like you could scroll for hours without seeing anything exciting or particularly relevant to your interests.

“… we know sometimes even your closest friends and family share posts about topics that aren’t really interesting to you, or that you don’t want to see. To address this, we’ll ask people whether they want to see more or fewer posts about a certain topic, such as Cooking, Sports or Politics, and based on their collective feedback, we’ll aim to show people more content about the topics they’re more interested in, and show them fewer posts about topics they don’t want to see.”

Do You Want To See Less of This Content?

A big part of Facebook’s reputation for negative content comes from the sheer amount of political content shared on the social network.

Since many turn to social media to connect with family and friends and to get away from the pressures of the real world, a large amount of political content can be tiresome and may make them less likely to check their feed regularly.

Further, there are times when you show an interest in a topic and start seeing an influx of tangentially related content that is not especially useful to you. Think of clicking one intriguing headline and suddenly seeing tons of posts on that topic, even though your interest was only passing.

To help with this, the company will start surveying users about content they have responded negatively to in order to create a ranking signal to deliver more relevant and positive content.

Was Giving Feedback Easy?

In some form or another, Facebook has given users the ability to deliver this type of feedback for several years. The problem is that finding the tools to do so was often a game of hide and seek. 

To make it easier for users to give feedback, the company is testing a new post design which will include a more prominent button to hide “irrelevant, problematic, or irritating” content and see less content like it in the future.

How This Will Affect Facebook Rankings

For now, it is unclear exactly how much this will change the content appearing in our news feeds every day. 

The company appears to know it has gained a nasty reputation for being overly political, sharing divisive information, and generally being a somewhat negative place to spend your time. 

Still, it remains to be seen whether this will lead to a massive shift or if these ranking signals will be too little to effectively change what gets highly ranked and what people are sharing on the platform in general.

“Overall, we hope to show people more content they want to see and find valuable, and less of what they don’t. While engagement will continue to be one of many types of signals we use to rank posts in News Feed, we believe these additional insights can provide a more complete picture of the content people find valuable, and we’ll share more as we learn from these tests.”
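Facebook hasn’t said how these survey responses will be weighted against engagement, but conceptually they amount to extra terms in a post’s ranking score. The Python sketch below is purely illustrative: the signal names, weights, and scoring function are assumptions, not Facebook’s actual system.

```python
# Illustrative only: a toy ranking score blending engagement with
# hypothetical survey-derived signals. All names and weights are invented.

def rank_score(post, user_prefs):
    """Score a post for one user's feed.

    post: dict with 'engagement' (0..1), 'inspirational_rate' (share of
    surveyed users who found similar posts inspirational), and 'topic'.
    user_prefs: topic -> preference from -1.0 ("see fewer") to +1.0 ("see more").
    """
    score = 1.0 * post["engagement"]                   # classic engagement signal
    score += 0.5 * post["inspirational_rate"]          # survey: "is this inspirational?"
    score += 0.8 * user_prefs.get(post["topic"], 0.0)  # survey: topic interest
    return score

posts = [
    {"id": 1, "engagement": 0.9, "inspirational_rate": 0.1, "topic": "Politics"},
    {"id": 2, "engagement": 0.6, "inspirational_rate": 0.7, "topic": "Cooking"},
]
prefs = {"Politics": -1.0, "Cooking": 0.8}  # from "see more / see fewer" feedback

feed = sorted(posts, key=lambda p: rank_score(p, prefs), reverse=True)
print([p["id"] for p in feed])  # [2, 1]
```

The toy output makes the point: once a user’s “see fewer posts about Politics” feedback is factored in, a high-engagement political post can still rank below a less-engaging one.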

Facebook has announced sweeping changes to its news feed and the way it handles groups or pages that violate the company’s content policies.

The new changes, including a new algorithm signal, are aimed at reducing the reach of misinformation by judging the authority of the sites that content comes from.

If Facebook believes the site producing content shared on the platform is not reputable, it will decrease its news feed reach and reduce the number of people seeing the content.

How Facebook is Changing its Algorithm

In the past, Facebook has teamed up with highly respected organizations like the Associated Press to validate sites spreading content across the platform.

Now, the company says it is introducing a “Click-Gap” metric designed to automatically evaluate a site’s inbound and outbound linking patterns to judge whether it is authoritative.

Essentially, the Click-Gap signal compares how often a domain’s links are clicked on Facebook with how popular that domain is across the rest of the internet. A domain that draws far more Facebook clicks than its broader web presence would suggest is likely spreading content through artificial means rather than organic virality.

As Facebook explains in the announcement:

“This new signal, Click-Gap, relies on the web graph, a conceptual “map” of the internet in which domains with a lot of inbound and outbound links are at the center of the graph and domains with fewer inbound and outbound links are at the edges.

Click-Gap looks for domains with a disproportionate number of outbound Facebook clicks compared to their place in the web graph. This can be a sign that the domain is succeeding on News Feed in a way that doesn’t reflect the authority they’ve built outside it and is producing low-quality content.”
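Facebook has not published the Click-Gap formula, but the description above implies comparing two ratios: a domain’s share of outbound Facebook clicks against its share of links in the wider web graph. Here is a minimal, hypothetical sketch of that idea in Python; the field names, normalization, and threshold are all invented for illustration.

```python
# Hypothetical sketch of a "Click-Gap"-style check. The formula, field
# names, and threshold are assumptions; Facebook has not published them.

def click_gap(domain, total_fb_clicks, total_web_links):
    """Ratio of a domain's Facebook click share to its web-graph link share.

    domain: dict with 'fb_clicks' (outbound clicks from Facebook) and
    'web_links' (inbound + outbound links across the open web).
    """
    fb_share = domain["fb_clicks"] / total_fb_clicks
    web_share = max(domain["web_links"], 1) / total_web_links  # avoid division by zero
    return fb_share / web_share

# A domain with heavy Facebook traffic but almost no footprint elsewhere on
# the web yields a huge gap, suggesting forced rather than organic spread.
stats = {"fb_clicks": 500_000, "web_links": 40}
gap = click_gap(stats, total_fb_clicks=10_000_000, total_web_links=1_000_000_000)
if gap > 100:  # invented threshold
    print(f"Click-Gap {gap:,.0f}: candidate for News Feed demotion")
```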

Changes to Groups

Notably, this new algorithmic signal isn’t just being applied to news feeds. The company explained it will also be using these algorithms to automatically remove low-quality content posted in groups, including private groups.

The company defended the decision by saying it can now identify and remove harmful groups, whether they are public, closed, or secret.

“We can now proactively detect many types of violating content posted in groups before anyone reports them and sometimes before few people, if any, even see them.”

Admins are Required to Police Content

Along with these changes, Facebook clarified that its algorithms will consider what posts a group’s admins approve as a way of determining if they are a harmful group or eligible for removal.

The company says it will close down groups if an admin regularly approves content that is false, misleading, or against Facebook’s content guidelines.

This is how Facebook explained the new policy:

“Starting in the coming weeks, when reviewing a group to decide whether or not to take it down, we will look at admin and moderator content violations in that group, including member posts they have approved, as a stronger signal that the group violates our standards.”
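Facebook hasn’t detailed how admin approvals feed into this decision, but one way to picture “a stronger signal” is as a heavier weight on violations an admin or moderator approved. The toy sketch below, with invented weights and threshold, is only meant to make that idea concrete.

```python
# Toy sketch: weighting admin-approved violations more heavily when scoring
# a group for removal. Weights and threshold are invented for illustration.

def group_violation_score(violations):
    """violations: dicts with 'severity' (0..1) and 'approved_by_admin' (bool)."""
    score = 0.0
    for v in violations:
        # Posts an admin or moderator approved count as a stronger signal.
        weight = 3.0 if v["approved_by_admin"] else 1.0
        score += weight * v["severity"]
    return score

recent = [
    {"severity": 0.4, "approved_by_admin": True},
    {"severity": 0.2, "approved_by_admin": False},
]
if group_violation_score(recent) > 1.0:  # illustrative takedown threshold
    print("Flag group for review / possible takedown")
```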

What This Means for You

As long as the pages you participate in or run share content from reliable sources, the new policies should have little effect on your day-to-day operations. However, the changes could hit brands and influencers who trade in content that goes against mainstream science or comes from other non-approved sources. These types of pages have flourished on the platform for years, but they may soon face a reckoning if Facebook’s new content guidelines are as strict as they sound.

If you’ve spent much time trying to promote your business on Facebook, you’ve probably recognized the social platform isn’t exactly the best at transparency.

There have long been questions about what exactly you can and can’t post, which made it all the more frustrating that there was no way to appeal if Facebook decided to remove your content for violating its hidden guidelines.

That is beginning to change, however. Likely thanks to months of criticism and controversy over Facebook’s lack of transparency and its reckless handling of users’ data, the company has been making several big changes to increase transparency and regain people’s trust.

The latest move in this direction is the public release of Facebook’s entire Community Standards guidelines for the first time in the company’s history.

These guidelines have been used internally for years to moderate user-posted comments, messages, and images for inappropriate content. A portion of the Community Standards was also leaked last year by The Guardian.

The 27-page set of guidelines covers a wide range of topics, including bullying, violent threats, self-harm, and nudity, among many others.

“These are issues in the real world,” Monika Bickert, head of global policy management at Facebook, told a room full of reporters. “The community we have using Facebook and other large social media mirrors the community we have in the real world. So we’re realistic about that. The vast majority of people who come to Facebook come for very good reasons. But we know there will always be people who will try to post abusive content or engage in abusive behavior. This is our way of saying these things are not tolerated. Report them to us, and we’ll remove them.”

The guidelines also apply to every country where Facebook is currently available. As such, the guidelines are available in more than 40 languages.

The rules also apply to Facebook’s sister services like Instagram, though there are some tweaks across the different platforms. For example, Instagram does not require users to share their real name.

In addition to this release, Facebook is also introducing plans for an appeals process for incorrect takedowns. This will allow the company to reconsider content that may be appropriate given the context surrounding it.

If your content gets removed, Facebook will now personally notify you through your account. From there, you can choose to request a review, which will be conducted within 24 hours. If Facebook decides the takedown was enacted incorrectly, it will restore the post and notify you of the change.

Facebook has long been the favorite social media platform for sharing content, but if a report from the New York Times is any indication, content creators may soon be looking for a new platform to share their content while still attracting users to their own websites.

According to the report, Facebook may be considering hosting linked content directly on its own site, and serving ads on that content, rather than linking out to content creators’ sites. Not only would this mean a drop in traffic from Facebook users, but the change could also cost site owners revenue from the declining traffic and the ads on their own sites.

The change is supposedly going to be limited to mobile devices, but it has already stirred up quite a controversy with content creators and marketers.

Facebook seems to believe the change would be more convenient for users, but those who create content see it as something closer to content syndication or even content theft. Whatever the convenience to users, many content creators depend on revenue from page views and ads, which would be significantly impacted if Facebook does end up hosting content.

In the wake of the controversy, Facebook has even opened a discussion on the possibility of sharing revenue with websites that own the content being hosted. The New York Times article describes Facebook’s profit-sharing proposal:

“Facebook hopes it has a fix for all that. The company has been on something of a listening tour with publishers, discussing better ways to collaborate. The social network has been eager to help publishers do a better job of servicing readers in the News Feed, including improving their approach to mobile in a variety of ways. One possibility it mentioned was for publishers to simply send pages to Facebook that would live inside the social network’s mobile app and be hosted by its servers; that way, they would load quickly with ads that Facebook sells. The revenue would be shared.

That kind of wholesale transfer of content sends a cold, dark chill down the collective spine of publishers, both traditional and digital insurgents alike. If Facebook’s mobile app hosted publishers’ pages, the relationship with customers, most of the data about what they did and the reading experience would all belong to the platform. Media companies would essentially be serfs in a kingdom that Facebook owns.”

The real question appears to be whether there will be an opt-out available. So far, there has been no mention of an opt-out or of the hosting being optional at all. Even if it is left up to the publisher, the change could still hurt content creators who choose to keep hosting their own content, as content hosted on the social platform is likely to look more attractive and convenient.