Five months after the U.S. presidential election, and after weathering severe criticism, Facebook is moving ahead with its plan to stem the flow of fake news and propaganda through its platform — even though it still says fake news on Facebook in 2016 was “marginal” compared to total political discussion.

Facebook has just published a white paper structured around the topic of civic engagement, in which the social network outlines how it plans to stop the spread of misinformation.

“We believe civic engagement is about more than just voting — it’s about people connecting with their representatives, getting involved, sharing their voice, and holding their governments accountable,” Facebook’s threat intelligence manager Jen Weedon, threat intelligence analyst William Nuland, and Chief Security Officer Alex Stamos wrote in the white paper released Thursday.

“Given the increasing role that Facebook is playing in facilitating civic discourse, we wanted to publicly share what we are doing to help ensure Facebook remains a safe and secure forum for authentic dialogue,” they added.

Information Operations

“Information operations” is the term Facebook uses to describe “actions taken by organized actors (governments or non-state actors) to distort domestic or foreign political sentiment, most frequently to achieve a strategic and/or geopolitical outcome.”

That term encompasses “false news, disinformation, or networks of fake accounts aimed at manipulating public opinion.” Those networks of fake accounts are called “false amplifiers.”

To stop various types of misinformation and abuse on the platform — which it categorizes as targeted data collection, content creation, and false amplification through fake accounts and spam that spread misinformation — Facebook is taking a few specific steps.

To stop bad actors from collecting data, Facebook is implementing and promoting security options such as two-factor authentication. The social network will also offer users who are targeted by attackers custom recommendations on how to respond.
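For context, two-factor authentication of this kind typically pairs a password with a short-lived one-time code derived from a secret shared between the service and the user's device. The sketch below is a minimal, purely illustrative TOTP (RFC 6238) example in Python — it is not Facebook's implementation, and the secret shown is a made-up sample value.

```python
# Minimal sketch of time-based one-time passwords (TOTP, RFC 6238).
# Illustrative only; not Facebook's code. The secret below is a sample value.
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Derive the current one-time code from a shared base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval            # 30-second time step
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


if __name__ == "__main__":
    # Both the server and the user's authenticator app hold the same secret,
    # so each can independently compute and compare the same short-lived code.
    shared_secret = "JBSWY3DPEHPK3PXP"                # hypothetical example secret
    print(totp(shared_secret))
```

Because the code changes every 30 seconds and is derived from a secret that never travels over the network at login time, a stolen password alone is not enough to take over an account — which is the point of promoting it as a defense against targeted data collection.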

False Amplification

Facebook ran a case study on false amplification around the 2016 U.S. elections. In its report, the social media giant concluded that “the reach of the content shared by false amplifiers was marginal compared to the overall volume of civic content shared during the U.S. election.”

“In short, while we acknowledge the ongoing challenge of monitoring and guarding against information operations, the reach of known operations during the US election of 2016 was statistically very small compared to overall engagement on political issues,” Facebook’s report said.

Concluding its white paper, Facebook acknowledged the need for wider efforts, but for the most part the paper outlined initiatives the company has already been taking.

The social network works with campaigns and political parties to counter security risks; offers governments guidance on how to do this themselves; runs the Facebook Journalism Project for news organizations; and supports media literacy programs for Facebook users.

“These are complicated issues and our responses will constantly evolve, but we wanted to be transparent about our approach,” Facebook’s report said.

Earlier this month, Facebook added an alert dubbed “Tips for spotting false news” to the top of the News Feed.