How Facebook's Newsfeed Algorithm Shapes Childhood Vaccine Hesitancy: An Algorithmic Fairness, Accountability, and Transparency (FAT) Perspective

Affiliation

University of Texas at El Paso (Calderon); Virginia Tech (James, Lowry)

Summary

"If people do not understand how the algorithms manipulate the content they see to maximize their engagement, they are more likely to be captured and influenced by social media echo chambers."

The proliferation of vaccine content on social media has contributed to the problem of vaccine hesitancy, which is the delay or refusal of vaccination when vaccines are available. Most of the early information systems (IS) research on disinformation (i.e., fake news) emphasised the creation of mechanisms that help users identify fake news. Less IS research has investigated the behaviours online dis(mis)information may prompt, which is the area to which this study contributes. Specifically, the study examines how users' perceptions of the fairness, accountability, and transparency (FAT) of the Facebook newsfeed algorithm, which determines the content social media users see, influence their intention to vaccinate.

The main rationale for this study is that, because the details of the logic newsfeed algorithms use to determine the content users see are largely unknown and proprietary, it is up to users to evaluate whether (i) the logic the newsfeed algorithm employs to display the content they see is fair, (ii) Facebook can be held to account for how the newsfeed operates, and (iii) they understand how the algorithm determines the content they see. Because the details of the algorithm's logic are unknown, it is important to understand what users think about the FAT of newsfeed algorithms and how those perceptions influence the effectiveness of the dis(mis)information the algorithms convey.

The study uses the theory of planned behaviour (TPB), which posits that people's attitudes toward a behaviour, the perceived social norms regarding the behaviour, and the perceived control people have to execute the behaviour influence their intentions to perform it, to examine the effectiveness of vaccine dis(mis)information on Facebook. This study proposes that Facebook users' perceptions of the newsfeed algorithm's FAT will shape the TPB factors and thus influence behavioural intent.

In brief, the model outlined in the article has three reflective exogenous variables that correspond to users' perceptions of the fairness, accountability, and transparency of the Facebook newsfeed algorithm. The reflective mediating variables are negative attitude toward vaccination and Facebook antivaccination descriptive and injunctive norms; these mediate the effect of FAT perceptions on the outcome, intention to vaccinate.
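To make the hypothesised structure concrete, the path model can be sketched in a few lines of code. The coefficients below are illustrative placeholders, not the study's estimates; only their signs follow the reported findings (fairness and transparency increase negative attitude; all three FAT perceptions decrease perceived antivaccination norms; both mediators decrease intention to vaccinate).

```python
# Illustrative sketch of the hypothesised path model.
# All weights are made-up standardised coefficients, NOT the study's estimates;
# only their signs match the reported findings.

# Paths from FAT perceptions to the two mediators.
PATHS_TO_MEDIATORS = {
    "negative_attitude": {"fairness": +0.20, "transparency": +0.15},
    "antivax_norms": {"fairness": -0.10, "accountability": -0.12,
                      "transparency": -0.08},
}

# Paths from the mediators to the outcome: both reduce intention to vaccinate.
PATHS_TO_INTENTION = {"negative_attitude": -0.30, "antivax_norms": -0.25}

def predict(fat_scores: dict) -> dict:
    """Propagate standardised FAT perception scores through the path model."""
    mediators = {
        mediator: sum(weight * fat_scores.get(source, 0.0)
                      for source, weight in sources.items())
        for mediator, sources in PATHS_TO_MEDIATORS.items()
    }
    intention = sum(PATHS_TO_INTENTION[m] * value
                    for m, value in mediators.items())
    return {**mediators, "intention_to_vaccinate": intention}

# Example: a user who perceives the algorithm as fair and transparent
# but does not believe Facebook can be held accountable.
print(predict({"fairness": 1.0, "accountability": 0.0, "transparency": 1.0}))
```

Under these placeholder weights, perceiving the algorithm as fair and transparent raises negative attitude and lowers perceived antivaccination norms, and the two mediated effects on intention partially offset each other, mirroring the paper's competing pathways.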

The researchers test their model on Facebook users who are connected to more vaccine opponents than proponents. They surveyed people in the United States (US) who have (i) at least one Facebook social connection who has posted antivaccination content and (ii) more Facebook social connections who post antivaccination content than pro-vaccination content. The final sample comprised 294 male and 211 female participants, all of whom had or were expecting children.

The investigation finds that people's perceptions of the Facebook newsfeed algorithm's fairness and transparency increase their negative attitudes toward vaccination. This finding suggests that vaccine dis(mis)information may be effective in turning people against vaccinating their children when that dis(mis)information is displayed by a newsfeed algorithm that users believe employs fair logic. One implication of this result is that vaccine dis(mis)information can be persuasive and that its proliferation on social media by newsfeed algorithms that prioritise engaging content may be exacerbating the problem.

However, people's perceptions of the Facebook newsfeed algorithm's fairness, accountability, and transparency all decrease users' perceptions of antivaccination norms on Facebook. This latter finding suggests that even if the newsfeed algorithm is perceived as fair, users are unwilling to attribute antivaccination norms to their Facebook social connections, even when more of those connections are vaccine opponents. One explanation may be that people holding antivaccination beliefs are more vocal about those beliefs than those holding pro-vaccination beliefs; Facebook users may therefore believe that they see little pro-vaccination content because their pro-vaccination connections are unlikely to post about it, rather than attributing the imbalance to an unfair algorithm.

Another finding: Negative attitudes toward vaccination decrease the intention to vaccinate, as do perceptions of Facebook antivaccination norms.

Per the researchers, the study makes three primary contributions to research and theory:

  • The newsfeed algorithm is the key mechanism that determines what many social media users see. This study examines social media users' perceptions of the algorithm that feeds the dis(mis)information to them rather than their perceptions of the dis(mis)information itself. Moreover, it examines how these perceptions can influence users' intentions to perform an offline behaviour: vaccinating their children.
  • This study contributes to the IS literature on the persuasiveness of social media. Researchers have investigated both advice generated by algorithms (e.g., recommender systems) and advice generated by peers (e.g., reviews). This context provides a mix of the two, because the advice (e.g., antivaccination content) may come from peers or algorithms (e.g., bots), but whether users see that advice is determined by an algorithm. Therefore, users' perceptions of the newsfeed algorithm's FAT are critical to their interpretation of that content within the broader social context.
  • The results confirm that negative attitudes toward vaccination and perceptions that one's Facebook social connections hold antivaccination norms can negatively influence intention to vaccinate. This study clarifies the role the newsfeed algorithm plays in this problem.

Implications for society and practice include:

  • It is critical for social media companies to be more transparent about the logic of their newsfeed algorithms or, alternatively, to give users more control over adapting the newsfeed algorithm's display logic (e.g., prioritising, snoozing, blocking content). It is equally important to educate the public about what is and is not known about how the algorithms work.
  • Exposure to dis(mis)information by an algorithm deemed fair and transparent, and for which the company is thought to be accountable, may not be enough for users to assign antivaccination descriptive and injunctive norms to their Facebook social connections at large. Especially if normative influence is powerful, this leaves open the possibility that keeping users out of dis(mis)information echo chambers may minimise its effectiveness; tactics that discourage perceptions of antivaccination norms may therefore be worth encouraging.

In conclusion, the researchers suggest that methods for combatting vaccine dis(mis)information may include technical remedies that decrease the prominence of dis(mis)information, but also that social remedies, such as educating the public about social media newsfeeds, may be advisable.

Source

Data and Information Management, Volume 7, Issue 3 (2023). https://doi.org/10.1016/j.dim.2023.100042. Image credit: Pixabay (public domain stock image)