Facebook announced Wednesday that it is trying to combat the sharing of false news on the platform by updating a “news literacy campaign” that provides “tips” for spotting such content, among other measures.
The tech giant, in an apparent attempt at transparency, released a short film called “Facing Facts” that aims to lay out its efforts against misinformation.
“The film is one of the first pieces of content from Inside Feed, a new site dedicated to shedding light on the people and processes behind Facebook’s products and algorithms,” John Hegeman, head of News Feed for Facebook, wrote in a company blog post.
The film details Facebook’s process, identifying three main problem categories: “bad actors,” “bad behaviors,” and “bad content.”
Bad behavior, for example, covers activity deemed “polarizing,” “misleading,” “spamming,” “impersonation,” “sensationalizing,” and “engagement bait.”
Bad content includes “false news,” “hate speech,” “spam,” “graphic violence,” “clickbait,” and “links to low quality web experiences” like ad farms, reiterating that news content isn’t Facebook’s only target.
One of the main potential problems is what’s behind those proprietary products and algorithms, the intricacies of which are usually kept under wraps due to their unique, likely exclusive features and components.
Google, for example, launched a fake news-fighting initiative, then suspended it months later in January after an investigation by The Daily Caller News Foundation showed the algorithms powering the widget had fundamental flaws. (RELATED: Are Faulty Algorithms To Blame For Google’s Fact-Checking Mess?)
Facebook, feeling the public pressure, has also experimented with altering its platform to make the discourse and content less vitriolic and more truthful. It ditched its once-highly anticipated plans to split the News Feed into two separate sections. Shortly before that, in December, Facebook disclosed that it was jettisoning its “Disputed Flags” program, which generated warning indicators for articles its algorithms deemed potentially fraudulent, whether they appeared in the trending news sidebar or in users’ news feeds.
The decision stemmed from academic research showing “that putting a strong image, like a red flag, next to an article may actually entrench deeply held beliefs — the opposite effect to what” it intended. How different the ostensibly new tips will be from those earlier efforts is not yet clear.
Send tips to [email protected].
Content created by The Daily Caller News Foundation is available without charge to any eligible news publisher that can provide a large audience. For licensing opportunities of our original content, please contact [email protected].