Research Indicates that Facebook’s Structure Hinders its Ability to Manage False Information

Amid the COVID-19 pandemic, when false information thrived on the internet, several platforms introduced measures and strategies to counteract the dissemination of misinformation. Did these initiatives prove effective?
Credit: Pixabay

According to a study published in Science Advances, Facebook, the world’s largest social media platform, did not effectively combat COVID-19 vaccine misinformation through its policies. The research, titled “The Effectiveness of Facebook’s Measures Against Vaccine Misinformation During the COVID-19 Pandemic,” involved contributions from researchers at Johns Hopkins University.

Conducted by a team from the George Washington University, the study revealed that Facebook’s efforts were hampered by the inherent design elements of the platform itself.

Design and Architecture in the Battle Against Misinformation

David Broniatowski, the lead author of the study and an associate professor of engineering management and systems engineering at GW, commented, “Current discussions about social media platforms and artificial intelligence governance often revolve around content and algorithms. However, to effectively address misinformation and other online threats, we must expand our focus beyond content and algorithms to consider design and architecture.”

Broniatowski added, “Our findings demonstrate that simply removing content or altering algorithms may prove ineffective if it doesn’t align with the platform’s core purpose, which, in Facebook’s case, involves facilitating connections among community members with shared interests, such as vaccine hesitancy, and enabling them to access information they seek.”

Facebook’s Community-Centric Design

Facebook’s architecture is primarily centered around fostering communities based on users’ interests. It employs various design features, including fan pages that promote brands and public figures, empowering a relatively small number of influencers to reach wide audiences.

These influencers can then establish groups explicitly designed to cultivate communities where members exchange information, including misinformation and links to engaging content hosted off the platform.

Group members, particularly administrators (often page creators), utilize Facebook’s newsfeed algorithms to ensure that information reaches interested parties.

Researchers discovered that despite Facebook’s substantial efforts to remove anti-vaccine content during the COVID-19 pandemic, engagement with such content did not decline relative to earlier trends and, in some cases, even increased.

Combating Health Misinformation in the Public Domain

Lorien Abroms, a study author and a professor of public health at GW’s Milken Institute School of Public Health, expressed concern about this finding, stating that it highlights the challenge society faces in eradicating health misinformation from public spaces.

Content that remained on the platform increasingly linked to unreliable external sites and to misinformation hosted on alternative social media platforms such as Gab and Rumble, especially within anti-vaccine groups.

Furthermore, the anti-vaccine content remaining on Facebook became more misleading, featuring sensationalist false claims about vaccine side effects that could not be fact-checked in real time. The platform’s policies may also have removed some pro-vaccine content, contributing to increased political polarization in vaccine-related discussions.

Anti-vaccine content creators used the platform more effectively than their pro-vaccine counterparts, coordinating content distribution across pages, groups, and users’ news feeds.

Facebook’s Architecture and Resistance to Change

Even when Facebook adjusted its algorithms and removed content and accounts to combat vaccine misinformation, the platform’s architecture resisted change.

Broniatowski likened Facebook’s architecture to a building designed for specific purposes, highlighting the need to rethink the platform’s structure to achieve a balance between user behaviors and public health concerns.

Broniatowski suggested that social media platform designers could enhance public health and safety by collaboratively establishing “building codes” informed by scientific evidence to reduce online harm.

He drew parallels to building architecture, where compliance with codes for safety, security, and public health is vital, emphasizing the need for industry, government, and community organizations to develop these rules in partnership and informed by sound science and practice.

The study represents the first and only scientific assessment of the effectiveness of the world’s largest social media platform’s systematic efforts to eliminate misinformation and misleading accounts.


Read the original article on: Phys.org