
dc.contributor.author	Vicente, Lucía
dc.contributor.author	Blanco Bregón, Fernando
dc.contributor.author	Matute, Helena
dc.date.accessioned	2023-10-16T09:30:59Z
dc.date.available	2023-10-16T09:30:59Z
dc.date.issued	2023-03-24
dc.identifier.citation	Vicente, L., Blanco, F., & Matute, H. (2023). I want to believe: Prior beliefs influence judgments about the effectiveness of both alternative and scientific medicine. Judgment and Decision Making, 18, E1. doi:10.1017/jdm.2022.3	es_ES
dc.identifier.uri	https://hdl.handle.net/10481/85022
dc.description.abstract	Previous research suggests that people may develop stronger causal illusions when the existence of a causal relationship is consistent with their prior beliefs. In the present study, we hypothesized that prior pseudoscientific beliefs would influence judgments about the effectiveness of both alternative medicine and scientific medicine. Participants (N = 98) were exposed to an adaptation of the standard causal illusion task in which they had to judge whether two fictitious treatments, one described as conventional medicine and the other as alternative medicine, could heal the crises caused by two different syndromes. Since both treatments were completely ineffective, those believing that either of the two medicines worked were exhibiting a causal illusion. Participants also responded to the Pseudoscience Endorsement Scale (PES) and to some questions about trust in alternative therapies taken from the Survey on the Social Perception of Science and Technology conducted by FECYT. The results replicated the causal illusion effect and extended it by revealing an interaction between prior pseudoscientific beliefs and the scientific/pseudoscientific status of the fictitious treatment. Individuals reporting stronger pseudoscientific beliefs were more vulnerable to the illusion in both scenarios, whereas participants with low adherence to pseudoscientific beliefs seemed to be more resistant to the illusion in the alternative medicine scenario.

Alternative medicine refers to a wide range of health practices not included in the healthcare system and not considered conventional or scientific medicine (World Health Organization, 2022). A common feature of alternative therapies is the lack of scientific evidence on their effectiveness, with some popular examples being homeopathy (Hawke et al., 2018; Peckham et al., 2019) and reiki (Zimpel et al., 2020). Therefore, they can often be considered pseudoscientific (i.e., practices or beliefs that are presented as scientific but are unsupported by scientific evidence; Fasce and Picó, 2019). Understanding why some people rely on alternative medicine despite this lack of evidence is relevant, since its usage can pose a threat to a person’s health (Freckelton, 2012; Hellmuth et al., 2019; Lilienfeld, 2007), either by replacing evidence-based treatments (Chang et al., 2006; Johnson et al., 2018a, 2018b; Mujar et al., 2017) or by reducing their effectiveness (Awortwe et al., 2018).

In this research, we assume that people assess the effectiveness of a given treatment (whether scientific or alternative) by estimating the causal link between the treatment (potential cause) and symptom relief (outcome). To do this, people can resort to various information sources, but they could certainly use their own experience of covariation between the treatment and the symptoms. However, biases can occur in this process. In particular, the causal illusion is the systematic error of perceiving a causal link between unrelated events that happen to occur in close temporal proximity (Matute et al., 2015). This cognitive bias could explain why people sometimes judge that completely ineffective treatments cause health benefits (Matute et al., 2011), particularly when both the administration of the treatment (i.e., the cause) and the relief of the symptoms (i.e., the outcome) occur with high frequency (Allan et al., 2005; Hannah and Beneteau, 2009; Musca et al., 2010; Perales et al., 2005; Vadillo et al., 2010). Although the causal illusion is subject to variations in the probability with which the potential cause and the outcome occur, and hence most theoretical analyses of the phenomenon have focused on how people acquire contingency information (e.g., Matute et al., 2019), the participant’s prior beliefs could also play a role, and this is the focus of the current paper.

In fact, the influence of prior beliefs seems common in other cognitive biases that enable humans to protect their worldviews. A good example is the classical phenomenon of belief bias (Evans et al., 1983; Klauer et al., 2000; Markovits and Nantel, 1989). This consists of people’s tendency to accept the conclusion of a deductive inference based on their prior knowledge and beliefs rather than on the logical validity of the arguments. For example, the syllogism ‘All birds can fly. Eagles can fly. Therefore, eagles are birds’ is invalid because the conclusion does not follow from the premises, but people often judge it as valid just because the conclusion seems in line with their previous knowledge. There is a specific form of belief bias known as ‘motivated reasoning’ (Trippas et al., 2015), in which people exhibit a strong preference or motivation to arrive at a particular conclusion when making an inference (Kunda, 1990). Thus, individuals draw the conclusion they want to believe from the available evidence. To do this, people tend to dismiss information that is incongruent with their prior beliefs and to focus excessively on evidence that supports prior conceptions, which resembles the well-known confirmation bias (Oswald and Grosjean, 2004). Additionally, some evidence indicates that motivated reasoning can specifically affect causal inferences (Kahan et al., 2017), particularly when people learn about cause–effect relationships from their own experience (Caddick and Rottman, 2021). Thus, if these cognitive biases show the effect of prior beliefs, it should not be surprising that causal illusions operate in a similar way. In fact, some experimental evidence suggests that this is the case. For example, Blanco et al. (2018) found that political ideology could modulate the causal illusion so that the resulting inference fits previous beliefs. In particular, the results from their experiments suggest that participants developed a causal illusion selectively to favor the conclusions that they were more inclined to believe from the beginning. Thus, we predict that prior beliefs about science and pseudoscience could also bias causal inferences about treatments and their health outcomes. More specifically, we suggest that, when people attempt to assess the effectiveness of a pseudoscientific or scientific medical treatment, their causal inferences may be biased to suit their prior beliefs and attitudes about both types of treatments.

In line with this idea, a recent study by Torres et al. (2020) explicitly examined the relationship between causal illusion in the laboratory and belief in pseudoscience. These authors designed a causal illusion task with a pseudoscience-framed scenario: participants had to decide whether an infusion made from an Amazonian plant (i.e., a fictitious natural remedy that mimicked the characteristics of alternative medicine) was effective at alleviating headache. They found that participants who held stronger pseudoscientific beliefs (assessed by means of a questionnaire) showed a greater degree of causal illusion in their experiment, overestimating the ability of the herbal tea to alleviate the headache. Importantly, note that this experiment contained only one cover story, framed in a pseudoscientific scenario. We argue that the results observed by Torres et al. (2020) have two possible interpretations: the first is that people who believed in pseudoscience were more prone to causal illusion in general, regardless of the cover story of the task; the second, based on the effect observed by Blanco et al. (2018) in the context of political ideology, is that the illusion is produced to confirm previous beliefs; that is, those participants who had a positive attitude toward alternative medicine were more inclined to believe that the infusion was working to heal the headache, and the causal illusion developed to favor this conclusion. Given that only one pseudoscientific scenario was used in Torres et al.’s experiment, it is impossible to distinguish between the two interpretations. Thus, further research is necessary to analyze how individual differences in pseudoscientific beliefs modulate the intensity of the causal illusion, and whether this modulation interacts with the scenario so that prior beliefs are reinforced.

To sum up, the present research aims to fill this gap by assessing the participants’ attitude toward pseudoscience and then presenting an experimental task in which participants are asked to judge the effectiveness of two fictitious medical treatments: one presented as conventional/scientific, and the other as alternative/pseudoscientific. Neither of these treatments was causally related to recovery. Our main hypothesis is that the intensity of the observed causal illusion will depend on the interaction between previous beliefs about pseudoscience and the type of medicine presented. Specifically, we expect that:

• Participants with less positive previous beliefs about pseudoscience will develop weaker illusions in the pseudoscientific scenario than in the scientific scenario. For those participants, the conclusion that an alternative medicine is working is not very credible according to their prior beliefs.

• Participants with more positive beliefs about pseudoscience could either show the opposite pattern (so that they find the conclusion that the pseudoscientific medicine works more believable than the conclusion that the scientific medicine works), which would be consistent with the studies by Blanco et al. (2018) on political ideology, or, alternatively, they could show similar levels of (strong) causal illusion for both treatments, which would suggest that pseudoscientific beliefs are associated with stronger causal illusions in general, as has been previously suggested (Torres et al., 2020).	es_ES
dc.description.sponsorship	Grant PID2021-126320NB-I00 from the Agencia Estatal de Investigación of the Spanish Government	es_ES
dc.description.sponsorship	Grant IT1696-22 from the Basque Government	es_ES
dc.language.iso	eng	es_ES
dc.publisher	Cambridge University Press	es_ES
dc.rights	Atribución 4.0 Internacional	*
dc.rights.uri	http://creativecommons.org/licenses/by/4.0/	*
dc.title	I want to believe: Prior beliefs influence judgments about the effectiveness of both alternative and scientific medicine	es_ES
dc.type	journal article	es_ES
dc.rights.accessRights	open access	es_ES
dc.identifier.doi	10.1017/jdm.2022.3
dc.type.hasVersion	VoR	es_ES

