Meta is reportedly conducting experiments to see whether it is possible to engage Facebook users in discussions about complex political issues, such as how to deal with the significant number of posts on its platforms that contain incorrect information, and ultimately to come up with practical solutions.
One of the main moderation problems facing social networks like Facebook and Twitter is intentional disinformation, along with misinformation rooted in ignorance or misunderstanding.
Significantly inaccurate information can be found on a variety of issues, including the Covid-19 vaccine and climate change, the latter being particularly difficult to address.
An investigation by the environmental organization Stop Funding Heat found 45,000 posts that downplayed or denied the climate crisis, The Guardian reported last year.
Between February and April, the platform brought together three user groups (250 users in total from five countries) to find an answer to the question: "What should Meta do about problematic climate information on Facebook?"
Meta was particularly interested in what regular users would want in terms of moderation, given the right information on the topic.
Meta commissioned the Behavioral Insights Team (BIT), a UK-based policy consultancy, to invite 250 Facebook users to participate in the policy-making process.
According to BIT, "problematic information" is content that is not necessarily false but represents viewpoints that may contain misleading, low-quality, or incomplete information likely to lead to incorrect conclusions.
Over two weekends, Facebook users were brought together online in the three groups to learn about platform regulations and climate issues. They were given access to Facebook staff as well as independent experts on language and climate issues.
Facebook presented users with a number of possible responses to problematic climate information, and users discussed and voted on their preferred outcomes.
The results of the investigation are being analyzed by Facebook’s policy teams and have not been made public.
However, in a post-activity poll, 80% of participants said Facebook users like them need to have a voice in policy-making.
Meta says it intends to maintain this strategy.
"We don't think we should be making so many of these decisions ourselves," said Brent Harris, the company's vice president of governance.
"You've heard us repeat that, and we mean it," he added.
BIT describes the process as "deliberative democracy" and says it brings "depth and legitimacy" to policy decisions.
To improve and scale user reflections, BIT intends to continue collaborating with Meta.
“Together we look forward to developing governance mechanisms that enable social media users to meaningfully influence the design, content and regulation of the platforms that shape their lives. By doing so, we also hope to pave the way for governance innovation across social media platforms and institutions around the world,” the company noted.