Nikita Salovich visited the show on episode 064, Evaluative Mindsets & Sociopolitical Consciousness, to discuss her recent publication examining the ability of an evaluative mindset to reduce the impact of misinformation.
Laurence Woodruff
We are lucky to have one of the authors of this paper in our studio with us right now.
Michael Ralph
Yeah, so this was written by Nikita A. Salovich, Anya M. Kirsch, and David N. Rapp. Did I say their names correctly?
Nikita Salovich
You got it, they’d be very proud.
Michael Ralph
And this was published in Cognition in 2022. So we are joined by lead author Nikita Salovich. She is a PhD candidate in cognitive psychology at Northwestern University. She studies why people are influenced by false information even when they know better, and why it still influences their decisions. Thank you for joining us. Welcome.
Nikita Salovich
Thanks so much for having me.
Michael Ralph
So this paper, and several others that you’ve written recently, are focused on this idea of misinformation, and in particular, that phrase, evaluative mindsets. It sounds like there’s a lot in that. So can you tell us a little bit about evaluative mindsets?
Nikita Salovich
Yeah, for sure. So the idea of being evaluative, at least how we’re using it in this paper and others, is this idea in which people approach information in a way where they’re prioritizing accuracy, considering the validity of the information. So instead of just reading information for, say, entertainment or interest, or making these non-evaluative judgments as we process it, we are explicitly and deliberately considering whether that information is true or false. And so an evaluative mindset means you are in this overall frame of mind where you are being critical of the information that is provided to you, versus focusing on other, non-evaluative factors.
Laurence Woodruff
I really enjoyed the background literature review portion of this paper as a framework for understanding the patterns of misinformation and their effect on individuals. One of the things that stood out: presenting false information early may override correct information when it’s retrieved in the future. As a practicing teacher in a classroom, that right there encapsulates the fear of accidentally teaching something wrong the first time, because it makes the misconception sticky, because it was the first time it was presented. And then later, when the kids are like, “Hey, this is how it works,” I’m like, “No, no, I did that backwards, and now it’s gonna be backwards in your head forever.” That’s what’s happening in the classroom when I accidentally misrepresent something. And when we are exposed to inaccuracies in so many formats, the opportunity for that to happen is something we’re encountering all the time in our lives.
Nikita Salovich
Right, all the time. And something that we find consistently in our work is that having accurate prior knowledge doesn’t actually protect you from being influenced by false information in very obvious ways. Even if you know that Saturn is not the largest planet in the solar system, that it’s Jupiter, just being exposed to the idea once can lead you to reproduce it on a later general knowledge test. That’s what we find here, but it can show up in other types of assessments as well. So I think that your concern is real, right? Incorrect worked examples are something that teachers often use in classrooms as a method to show what you don’t want students to do. But it becomes an issue when that’s the first thing that students learn. There is an idea that we don’t necessarily address in this paper, but it is a very related concept, called the continued influence effect. It really goes back to this idea that the first thing people learn is really sticky, and it’s really hard to correct. On what you were saying about “oh, I accidentally taught you something wrong”: it’s worse, in terms of these reliance consequences, if you say, “That’s wrong, just ignore that,” but don’t provide the alternative. It turns out to be actually quite effective if you say, “This is wrong, this is why, and instead, this is how it works.” So there is hope in terms of correcting these sorts of inaccuracies. But I definitely feel the sentiment that the first thing people learn can be sticky, and it’s tricky when it’s wrong. And it does go back to those incorrect worked examples. Folks listening didn’t see it, but I saw Michael waving his hands just a minute ago, when I was talking about them.
Michael Ralph
I was getting excited about your comments about using negative examples, because that was the primary classroom application I was thinking about when I read this paper. Like, I make mistakes sometimes. But there are also some deliberate choices we can make differently in the classroom that recognize the persistent harms misinformation can cause. I think one of them is that moment where a student is interested in a topic that’s current-events related. I taught biology when I was in the K-12 sector. And so they’re like, “Hey, this current event came out: they just took the first picture of a DNA molecule, right? That’s cool that we finally got a photo of it.” And I’m like, well, I don’t know. I don’t think that’s the first picture, and I can’t actually nail down when the first picture was. I want to answer that question in this moment, but I know that I don’t know. And that’s a spot where nobody else is competing with me, so I don’t have to be first to market. Saying, “Here’s what I think it is; let me get back to you tomorrow about whether that is correct,” is a very different decision than saying, “I don’t know what it is; let me get back to you tomorrow with an answer.” Those are decisions a teacher can make very deliberately, and with intention, because if I get this wrong, I can’t repair it in the same way as if I am more judicious about when I release that information.
Laurence Woodruff
From a practitioner perspective, in addition to regulating ourselves in terms of communicating our confidence in information, like when I’m confident that this is how it works or this is how we’re describing it, versus when I’m not, and in addition to trying to avoid initial mistakes, there’s a flip side of that initial-mistake problem. Three weeks later, when you’re working on the board and you make a mistake, and one of your students calls you out on it, that is like the best feeling in the world, right? Because that means you have created this critical-thinking environment where they’re not just passively consuming the information; they’re actually thinking about what you’re presenting; they are using an evaluative mindset. And they realize the external information authority figure is not infallible. They know that I make mistakes. And so they’re able to say, “Mr. Woodruff, have you made a mistake here? I thought it worked this way.” When that happens, I get so excited. In my classroom (I teach ninth through what I call 13th graders), I have a special set of stickers that I give out when kids correct my mistakes and call me out, to try to encourage that behavior. Because that’s that evaluative mindset. And the more they can practice it, be it in my classroom or anywhere else, the more they’re going to be able to get themselves into that space in the other facets of their life.
Michael Ralph
Well, and that was one of the findings from your studies. I know that we haven’t talked about the details, the nuts and bolts, of the three experiments you ran, which were great and rigorous experiments. One of the key findings that you were really drilling down on, as I understand it, is that general protective quality of practicing an evaluative mindset, and its applicability outside of the prompt itself to be evaluative, right? There was considerable outperformance for participants in your experiments: when they were told at any point in the experiment to have an evaluative mindset, they fell prey to recalling misinformation less often than the folks who were not trained at all. Which I think speaks to cultivating that classroom mentality of “evaluate everything.” It can lend you some confidence that students are going to be able to carry that into other parts of their lives, and other places where they’re consuming information. Anyway, I’m telling you about your own study. How am I doing?
Nikita Salovich
No, honestly, that was beautiful. Hearing my study told back to me is always wonderful, particularly when someone gets it right; that means I did a good job as a scientist in terms of science communication and writing. So thank you for that. Yeah, that was spot on. And I think that is one of the biggest takeaways: people don’t always think about accuracy, they may not be doing it all the time, but it is amenable to prompting, and it can spill over. In situations where people are nudged toward developing these sorts of accuracy-focused, evaluative mindsets, they don’t just apply them to the information you are asking them to evaluate, but also to other information that might be associated with non-evaluative goals. So let’s say you’re going through one example, and you’re asking people to pay attention to whether all of the steps, as you’re working through something, are correct. Well, let’s say you go through something else later in the day; there could potentially be spillover, right? It’s an empirical question. But based on what we know, that sort of nudge leads people to develop these goals, these framings of how they’re processing information, that can lend themselves to other situations besides the ones where evaluation is explicitly instructed.
Michael Ralph
That was something I thought about as I was reading your general discussion later in the paper. If I’m understanding correctly, a big goal for cognitive scientists is that these prompts have the power to disrupt some of these harmful effects of misinformation, but we really need some way to understand how people can get that disruption without explicit prompting. How can we get away from having to explicitly prompt people? All of your studies used participants from a single participant-recruitment platform, and so it made me wonder whether there were ways to do intentional sampling of different careers or different life experiences, to identify what kinds of life pathways might confer some of that generalized critical reading ability, right? If we repeated your study with journalists, would they have something closer to an evaluative mindset without any prompting? I want that answer to be yes. I don’t know if it is what I want it to be. So that was one of the things I was thinking about: whether there was some way to connect these sorts of classroom experiences, some way to eventually understand at what level, at what depth, to what extent people need to have those experiences to start to see what I would expect to be a generalized protection against some of this misinformation. I don’t know.
Nikita Salovich
Right, for sure. I think that’s spot on, and this idea of something like journalists is extremely important. One of the things I like to emphasize to folks is that, even though people may accept that misinformation is an issue in our day-to-day, a lot of the focus is put on people believing, sharing, and using that false information. But another thing that is really dangerous about it is that it leads people to question and doubt true information as well, whether that’s true information they are being provided, or accurate information they might hold as their prior understanding. So yes, any way to encourage people to engage in this sort of evaluative practice not only reduces people’s reliance on the inaccuracies they may be presented with or see in their day-to-day, which we can’t avoid (there’s no world I can imagine where we can just cleanse it of all false or misleading information), it also helps protect that true information. Going back to your point of how we encourage evaluation without just asking people to do so, like asking people to think like journalists: one answer, I think, is the sort of training that goes into these careers where fact-checking is part of the day-to-day, and that is media literacy education, right? So incorporating that within classrooms, but also maybe as a separate unit all in itself. In order to be evaluative, you have to know what is worth evaluating, how to evaluate it, how to look for sources, especially with false information becoming more and more difficult to spot day to day, with these deepfake videos created to look like people are doing and saying things they never actually did. Older folks tend to be very susceptible to false information and are a population contributing heavily to its spread online, and one of the reasons could potentially be that they’re not recognizing how prevalent and easy Photoshop or video altering is nowadays. So I think media literacy education is huge. But I can also talk about other things we have done: now that we have identified that people are not always evaluating, what can we do about it? We have some creative ideas we’ve worked on in the lab, part of my dissertation, which hopefully will be done within the next few days, that have been pretty successful. The goal is always to try to apply the sorts of manipulations that are rooted in our lab-based studies to real-world circumstances. But yeah, it definitely is an open question, and an important question with what’s going on in the world right now.
Michael Ralph
So those comments made me think of something you wrote about in your paper that, when I summarized it, I thought was super important, so I double-boxed it in my notes, because I thought it was maybe the most important finding. I can feel myself sometimes starting to slip toward this idea of nihilism: well, I just never know things. There’s a statement, and I have two degrees in biology and it’s a biology statement, but if I just don’t know, then I can go play video games, right? In the classroom, especially at the intermediate and undergraduate levels of instruction, I think we need to be diligent as teachers about not letting students slip into that nihilistic place of, “Here’s the data, but it’s inconclusive.” If I had a nickel for every lab report I’ve read that said “this is inconclusive because of human error,” just as a routine. It’s easier to say inconclusive, and then I can just opt out of applying sensemaking, applying my prior knowledge, and applying my critical evaluation of the information. So I think a really key step is to think about how we can facilitate students engaging in this evaluative mindset without letting them slip into that place of, “Well, just don’t ever know anything, and then it won’t be a problem,” because you laid out in your paper that those are different things.
Nikita Salovich
Right, yes, exactly. David Rapp, my PhD advisor, and I had a paper, I think it was 2018, where we outline a model of why we can’t just disregard fake news or false information, based on what it does and how it incurs consequences on our later decisions and actions. Those three things, in progression from least bad to worst, are confusion, doubt, and reliance. First, it leaves us confused. When we encounter false information, we demonstrate that sometimes people slow down, right? Like, “Whoa, what’s going on?” When we say that George Washington was the third President of the United States, people are like, “That seems weird; that doesn’t coincide with what I know to be true.” So there’s this element of confusion when we encounter false information. The next is doubt. And the doubt is not just doubt of the world, but doubt of ourselves and what we know, right? So encountering false information in the world, let’s say that George Washington was the third President of the United States, leads us to start questioning what we know. You’re like, “Wait, I thought he was the first. Is what I’m reading true, or is what I already know true?” So there’s a doubt of prior knowledge, which leads to these consequences of withholding prior knowledge in the future. And the third is reliance. That’s reliance on the false information we are exposed to over our accurate prior knowledge, if it exists. That is, for example, regurgitating or believing, or some sort of demonstration that one has changed their belief, or is using this idea that George Washington was the third President of the United States. But going back to your previous comment, this second idea, doubt of prior knowledge, is so, so important, because not only do we want people to not rely on inaccuracies they were exposed to, we also want to help people protect their accurate prior knowledge and remain confident that what they know is correct and can be used. And the idea we grapple with within the paper is that evaluation, or evaluative mindsets, could lead people to make less use of presented inaccuracies, or demonstrate less reliance on inaccurate information we present to them; however, it might not correct for the impact that exposure has on people’s prior knowledge, on what they think is true. It could be that people are just withholding all answers, right? We’re asking you to be evaluative; you’re starting to question the information we’re presenting you. You don’t use the false information we present you, sure, but how does that impact your use of your accurate prior knowledge? It could still be that exposure to that false information has enough impact that people are like, “Okay, yeah, I don’t think that Saturn is the largest planet in the solar system, but now I’m not confident enough to report that it’s Jupiter,” right? Even though that’s something they may have grown up knowing. And so we find, thankfully, that being in this sort of evaluative mindset doesn’t lead to this overall conservativeness in responding; it doesn’t lead people to hold back. It increases the correct responses people offer, as well as decreases reproductions of inaccuracies. So this is the essential part: protecting what people know to be true while protecting against the influence of false information.
Laurence Woodruff
I have one more thing, but I don’t know if… I have half a thing. My half a thing is about our inability to assess the quality of our own prior knowledge, maybe a Dunning-Kruger type of influence. Like, I know a lot of molecular biology, I know a lot of ecology, so I can trust myself to evaluate news reports, rumors, posts about climate change and how viruses work, because I’ve studied those things extensively. But most people haven’t, in the same way that I haven’t studied immigration and most people haven’t. So even if they have achieved this internalized evaluative mindset, we’re all so confident in our assessments… Yeah, let’s back up. Is there research, whether you’ve done it or read it, about people’s ability to assess the quality of their own prior knowledge when using an evaluative mindset? Do we know anything about that? Is that established, explored territory?
Nikita Salovich
Right, I think that it hasn’t directly been applied to this particular context. But you hit it spot on when you mentioned Dunning-Kruger: in general, people have this tendency to overestimate how much they know. Another thing worth mentioning is that the ideas we explore in this paper are purposefully general knowledge statements, and not political opinions or ideas loaded around people’s ideologies, because that becomes a lot more complicated. I chose, for my entire dissertation work, to have experimental control, in a way, over people’s prior knowledge. So we use what we call inaccurate declarative statements, false declarative statements, in this paper. These are facts that are explicitly true or explicitly false. Like, Chicago is not the capital of Illinois; it never will be; it’s Springfield. That statement is just false, right? Versus the idea that vitamin C cures colds. That is false too, in the sense that there is no scientific support right now for that as a direct causal statement. However, you can imagine situations where, actually, having orange juice can make you feel more comfortable, and therefore you’re eating more, and that could help your body recover. Those sorts of ideas, where it’s the preponderance of evidence that makes something true or false, become kind of fuzzy. And actually, most people possess the incorrect conception that vitamin C cures colds. So when we’re asking people to be evaluative of that sort of statement, what happens when people already hold the misconception to begin with? I think that is an open question, and something that researchers, including myself, are grappling with.
Laurence Woodruff
I think I’m concerned about this particular concept because in a science classroom, we science teachers actually need to operate from the assumption that the kids, all of the kids, every single kid, will come in with some misconceptions and misinformation about how the scientific concepts work. And they won’t be the same as each other; some of them will overlap, and some of them won’t. So we have to do a lot of work to put students into an evaluative mindset, and into a comfortable place, so that they can use that evaluative mindset to essentially change those concepts themselves. That’s why, when I’m reading your paper, I’m reading it from that angle: well, what about the misconceptions of the individual? What about the lower quality of prior knowledge of the individual? I think I got stuck there, because that’s where I work as a practitioner.
Michael Ralph
So for me, I really liked your vitamin C example, because the most salient takeaway for me is actually research buttressing a rhetorical opinion I already had, one that I’m now ready to go advocate for. And it’s really about the importance of avoiding negative examples and false premises as anchoring and engagement exercises. I just don’t like it as a rhetorical device to say, “You think vitamin C cures colds? In the next 15 minutes, I’m going to tell you why that’s false.” That might get you some clicks on a headline, but you are explicitly working against yourself by opening with that statement. And it does not matter that you explicitly tell them it’s a false statement during your presentation; you have already done the damage just by opening with it. The same goes for negative case examples like, “We’re going to talk about the fact that bears only eat berries, because that’s a useful way to talk about energy flow. That’s not a true statement, but we’re simplifying for right now.” You’ve caused a problem by starting with that false statement, and it doesn’t matter that you told them it’s false. I think this is the strongest evidence I have seen to date for avoiding negative rhetorical examples, because they actually cause a problem. They’re not just a distasteful tool to me; they cause a problem, because that memory (“hey, they said something about bears only eating berries, they said something about vitamin C curing colds, and I don’t remember the details”) is one I’m just as likely to retrieve as what I actually learned ten years before. So do not do it. The “should” is: avoid negative examples whenever at all possible, because they cause harm, and harm that science does not currently know how to fix.
Nikita Salovich
No, no. And I think that in cases where you can’t avoid having people encounter false information, whether that be on social media or in classrooms, if teachers are leveraging fiction, whether in the form of a movie representation of a historical event or a fictional novel used to discuss a particular concept, it helps to disclose, prior to having students engage with that material, that some of the stuff in here is wrong. And not just that, but actually point it out; go through it. When it’s avoidable, try not to present false information in any way, shape, or form. But in cases where it’s not avoidable, because authors or directors take liberties with the truth in order to make things entertaining, there can be real value, in terms of entertainment, in getting students to engage with and learn that material. They should also be encouraged to be skeptical of that material at the same time: explicitly highlight those inaccuracies, or have students point them out based on prior knowledge you have hopefully discussed earlier, in classroom discussions like, “Okay, so what about this is actually incorrect, based on what we have learned about the world?” and have that sort of discussion. Yeah, I think that is definitely a big takeaway for classrooms.
Michael Ralph
Thank you for joining us. This has been a really satisfying conversation, and it’s really useful work as we think about how we manage the information we present in our classrooms.
Nikita Salovich
Thanks so much for having me, seriously. It’s been wonderful talking to you both, learning more about what you do, more about beer, and hearing back what I do from a fresh perspective. I do want to plug that I have a website; it’s my last name, Salovich, S-A-L-O-V-I-C-H, dot com. Making it easy for you: you just have to remember one name. You can read more of my research there. All of the PDFs of my papers are listed, as well as Twitter threads I have written about my work, which hopefully are a little more accessible and friendly to folks who may not want to read, you know, 15 pages and dive through figures, so on and so forth, summarized by yours truly. I’m also on Twitter. My username is psylovich, which is supposed to be a pun on both “psych” and “Salovich,” but I don’t know if that actually worked out, and I’m too scared to change it. Also, I am defending next month, and I accepted a research job in industry. I will be a researcher at Meta starting this summer, on the news team for the Facebook app, so I’ll be able to continue doing research that hopefully makes the content people see online more informative and safe. And for any grad student or anyone who is trying to pivot from academia to industry research right now, I’d be happy to chat about my transition, because I know it can be a scary and big one. So I did want to put myself out there as a resource for folks.