SCAS Talks Spotlight - Episode 7

Sciences against Misinformation

Published: 15 May 2024

Summary
In this episode within the series of SCAS Talks Spotlight we have gathered some of the voices from the symposium "Sciences against Misinformation" which brought together experts from different research areas pertaining to the challenges of mis- and disinformation. From biology to philosophy, they unravel the complex web of misleading information, highlighting the crucial role of social media and the chilling impact on democracy. Learn how to spot misleading information, understand the science of trust, and discover innovative solutions to combat this global crisis. The discussion also underscores the significance of interdisciplinary research and collaboration in countering misinformation. Finally, the speakers stress the urgent need for institutional and societal responses to address the spread of misleading information and protect democratic processes.

Keywords
Misinformation, democracy, critical thinking, trust, post-truth era

Transcript of the Episode

Natalie von der Lehr 00:17
Welcome to SCAS Talks, a podcast by the Swedish Collegium for Advanced Study. My name is Natalie von der Lehr, and in this episode, within our series of SCAS Talks Spotlight, we have gathered some of the voices from the symposium "Sciences Against Misinformation" which brought together experts from different research areas pertaining to the challenges of mis- and disinformation. The symposium was held at SCAS in April 2024 and we jump right in with a short background and some terminology. The three organizers, Dan Larhammar, Kathrin Glüer-Pagin and Thomas Nygren, came to the studio to tell us more.

Dan Larhammar 01:00
So my name is Dan Larhammar. I'm a Professor of Molecular Cell Biology in the Department of Medical Cell Biology here at Uppsala University, and I've been involved with SCAS for several years.

Kathrin Glüer-Pagin 01:13
I'm Kathrin Glüer, I'm a Professor of Theoretical Philosophy in Stockholm, Stockholm University, and I'm a fellow at SCAS this term.

Thomas Nygren 01:22
And my name is Thomas Nygren. I'm a Professor of History and Social Studies Education here at the Department of Education at Uppsala.

Natalie von der Lehr 01:29
We do a little time jump here. I would actually like to know a little bit about the background of this conference. Maybe you, Dan, can tell us a little bit about how this symposium came about.

Dan Larhammar 01:41
The origin is actually a conference that I was involved in organizing in Stockholm in 2017 and that conference was called "Knowledge resistance and how to cure it". And I believe it might have been one of the very first of its kind that brought together people with different kinds of expertise pertinent to that topic. I was co-organizing this together with Åsa Wikforss and Arne Jarrick who is a professor at Stockholm University, and we had a large number of international prominent speakers in that symposium. So now it was about time to repeat that, because so much has happened since then. The situation has gotten worse in several regards, when it comes to spreading of misinformation and disinformation. And a lot of research has been done since then, so there's a lot more to report, and we really saw this in this meeting.

Thomas Nygren 02:41
Isn't it kind of interesting to think that the first symposium was about the cures, and then things have gotten worse since 2017?

Dan Larhammar 02:49
We couldn't have imagined back in 2017 that it would get so bad. So there's every need to do everything we can, and fortunately, there is a lot more research now to base our activities on, both to spread knowledge, to describe how knowledge is obtained, and to increase the trustworthiness of science. It takes time to earn and deserve trust, and the best way to do that is, of course, to show that science works.

Natalie von der Lehr 03:23
And then some terminology. Some people talk about misinformation. We also have disinformation, and last but not least, we have mal-information. Can we just have a little definition here? What is what?

Kathrin Glüer-Pagin 03:36
There isn't clear consensus amongst those who research this. But some people suggest that one should go about it like this: you think of misinformation as information that is basically false, where the spreader of the information does not need to be aware of the falsity of the information, nor do they need to have any kind of harmful intentions. And that changes when you want to talk about disinformation. With disinformation, many people think what is important is that there is an intentional spread of something that you at least believe to be false. So you're aware of the falsity of what you are spreading, and you do it nevertheless for certain reasons. Then some people also think that we should include in disinformation things that are actually true but heavily misleading, where you have cherry-picked the information or baked it into a narrative that suggests something that is actually false. So many people think that the intention with which disinformation is spread is one that has something to do with instilling false beliefs in the listener, in the audience. Some other people think that that is when you should talk about mal-information: either that, or where the intention actually is to inflict some sort of harm on the audience. Because it doesn't seem necessary for something to be disinformation that you want to inflict harm. You can just do it for your own financial gain, which is not necessarily harmful to anyone else, but nevertheless you spread something that you know or believe to be false.

Thomas Nygren 05:14
And actually in Swedish, mis- and mal-information are not common words that are used that often. We are sort of pushing them into the Swedish language now, but before, we just talked about misleading information. And then harmful information would be mal-information.

Dan Larhammar 05:30
But there are disagreements, so we have to be careful with the term mal-information.

Thomas Nygren 05:35
If you want to have the umbrella term that covers it all, then people talk about misinformation. And that's why we have the conference labeled "Sciences against misinformation", because it's all types of information that may be misleading that we've been looking at.

Natalie von der Lehr 05:59
In particular, social media has been pointed out as a source for misinformation and a way to spread it. This, in turn, has many downstream effects on society. One of the more concerning is, of course, the effect on democracy. Åsa Wikforss and Jesper Strömbäck elaborate on this.

Åsa Wikforss 06:17
I'm Åsa Wikforss, I'm a Professor of Theoretical Philosophy at Stockholm University, and I work on theory of knowledge, language, mind in philosophy.

Jesper Strömbäck 06:28
And I'm Jesper Strömbäck from Gothenburg, Professor in Journalism and Political Communication at the University of Gothenburg. My research deals with the relationship between media, media use, media consumption related to political knowledge, political participation, political perceptions, public opinion, conspiracy theories and the like.

Åsa Wikforss 06:55
Maybe I should add that we also together are in a big research program on knowledge resistance.

Natalie von der Lehr 07:01
Jesper, could you tell us a little bit more about - you said you study the media, and in your talk yesterday, you talked about the media both as a problem and the solution to misinformation. Can you tell our listeners who haven't been at your talk a bit more about that?

Jesper Strömbäck 07:16
It's kind of a multifaceted argument that I try to make, but at the core of it is that the purpose of journalism is to provide people with the kind of information they need, with an emphasis on need, to be free and self-governing. And that requires that the information is true and that the information is verified, but it also requires that the information is rather comprehensive or proportional. One typical example of that might be the coverage of crime: if the news media only report when crimes are committed and when crime levels are increasing, but never report if or when crime levels are decreasing, it might be true, it might be verified, but it's not giving an accurate picture of the overall societal development. So from one perspective, you could say that the news media are part of the solution: by producing high quality journalism that is truthful, that is verified, and by checking information that is floating around and debunking false and misleading information, they might be part of the solution. But the problem with the news media is, of course, that the news media are organizations where journalism is just one kind of content. Journalism is produced within news media that are commercial businesses, which means that they need to keep up revenues and keep down costs. At the same time, high quality journalism is costly. They're also in the marketplace for attention, where there is fiercer competition than ever, meaning that the news media need to target people not just with information they need, but also with information they want, that they find intuitively appealing, and that goes both for the selection of news and for the framing of news. And we know that there are certain news values that are heavily employed by the news media in order to attract attention. Negativity is one typical example, where we have a negativity bias on several different levels.
To begin with, we as individuals pay more attention to negative information, and we remember negative information better than positive information. So there's a negativity bias when the media cover politics and society, which then feeds into our own negativity biases. Another thing that the news media do, because it attracts attention, is to focus on conflicts and polarizing topics, oftentimes based on the notion that there are two sides to a story, giving voice to the more extreme voices and squeezing out the more moderate voices, which also gives a false impression. They do not do this out of some kind of evil intention; it follows from the structural constraints that journalists are faced with. Then the third aspect that I want to highlight is that news media are trust-seeking. They want to be trusted, both because they see it as a sign that they are doing something good, but also because they need legitimacy, and high trust is an important part of that. The problem here, of course, in the context that we are living in now, is that we have an increasing prevalence of mis- and disinformation. We have digital and social media where there's a larger amount of information floating around that either is not controlled, or is outright false, or is misleading. And oftentimes this information comes from political alternative media, or it comes from politicians. And then the news media, on the one hand, are supposed to provide people with the information they need to be free and self-governing. On the other hand, they have a need to be trusted across the board, and are extremely afraid of being seen as partisan, which puts them in a problematic situation when, for example, a party leader or one of the major parties says something that is false and misleading. If they challenge that outright, they are likely to be perceived as biased. So what do they often do?
They often fall back on what I call a narrow conception of truthfulness, where their emphasis is on making sure that they correctly quote whoever it is they are interviewing or referring to. So truthfulness is reduced to "it's true that person X said Y", rather than a broader conception of truth, where they also check whether that which person X said is true. By doing this, they can appear neutral, and they can say that they provide correct information, but still, the news media might function as a channel of false and misleading information, and that is where the news media become part of the problem. And we did a study a few years ago within the knowledge resistance program where we looked at the spread and dissemination of false and misleading information, and reached the conclusion that most of the time, people get to know false and misleading information through major news media. The major news media might highlight this information in order to debunk it, but it's still through major news media that it reaches people. And sometimes it's because false and misleading information might have a certain news value. For example, conspiracy theories; QAnon, to take one example. How many people knew about QAnon before major mainstream news media started covering it? They didn't cover it sympathetically, but still, they covered it and people got to know about it. Or take another example: there have been a lot of studies of the number of times Donald Trump has lied. We don't have the equivalent in most other countries, but still, the news media feel that powerful sources need to be quoted and can't be challenged all the time, because then the news media will be seen as partial, and that is when the news media become part of the problem.

Åsa Wikforss 13:29
But it's a tremendous challenge right now. I mean, if you have politicians who are happy to lie, then what are news media supposed to do? And we saw, when Trump first came on the stage, how American newspapers really struggled during the election year 2016 with how to report these things that Trump said that were false. You have to sort of be much more willing to call this what it is, but it also puts you in this dilemma, because then immediately, well, "the New York Times is against Trump, saying that he's not telling the truth". That's a serious dilemma.

Jesper Strömbäck 14:03
And we see that in the US. I mean, if you look at media trust in the US, it's way higher among those leaning to the left or the liberal part of the scale, or who vote for the Democrats. There are huge differences. Some of this we could see before Trump.

Åsa Wikforss 14:20
Well, there's a sharp escalation from 2016 on. So what's striking about our times is not that there's falling trust in newspapers or in science, it's that there is polarized trust. So there's increased trust on one side of the political scale and falling trust on the other. And we see the same pattern in Sweden, by the way, especially when it comes to public service.

Jesper Strömbäck 14:41
And that creates a challenge, then. Should the news media try to win back those who have lost trust, or should they accept that there might be a conflict between truthful news reporting and trust? And I would argue that truthful news reporting must be the most important goal all the time, regardless of the consequences. But I know from conversations with journalists that they are struggling with how to handle this, and they see it as problematic that there are certain groups, mainly on the right wing, that trust them less.

Natalie von der Lehr 15:20
Åsa, you have written quite a few books on democracy. What about the influence of mis- and disinformation on democracies?

Åsa Wikforss 15:27
It's so terribly harmful, because we need knowledge to be free and self-governing, right? So it's harmful on many levels. The first fundamental level, of course, is just the manipulation of our political preferences before an election. That's a very direct way of harming democracy, because when my preferences are manipulated, I'm not free and self-governing, I'm being played with, and then democracy doesn't work. So that's a fundamental level. We see that now in the EU before the EU elections in June: we know there's a lot of disinformation around that is designed, a lot of it AI-driven, to manipulate the outcome of the election. So that's one problem. Then there are state actors who drive propaganda and disinformation of a certain sort meant to harm democracy, and a good way to do that is to make sure that everybody in a democracy gets really angry at each other; you create disagreements that become unmanageable, unbridgeable. And they're very skilled, these actors: they identify some issue where there is a disagreement in a society, and then they use disinformation and propaganda. It's called the wedge technique. You use a wedge to turn this initial disagreement into an unmanageable controversy. In the US, it's been a lot about Black Lives Matter and such. In Sweden, the burning of the Koran was a classic example of that. Of course, it made the Muslim populations very upset that this was going on, and then when some of them misbehaved, and there was this burning of police cars, the people who don't like the Muslims could get extremely upset, and so on. So that's a technique that's used to harm democracy, because it attacks our ability to handle disagreement. Democracy, you can say, is a way to handle disagreement peacefully, to accept that we have different views and different values, and yet we're going to live together, and we have to compromise. And sometimes you win and sometimes I win, and that's okay.
When that fundamental idea is not working anymore, which is the goal of this kind of propaganda, when you start looking at your political opponents not as people you have a different view from, but as people who are your enemies, who are morally corrupt or don't have legitimate claims to power, then it all falls apart. I mean, what I always say is that we have disagreements on facts, on how the world is, and disagreements on values, you know, on how the world should be. What is a good society? Is freedom more important than equality? Whatever. We will always have differences in values, and that's okay. We can manage that normally, and democracy is meant to handle precisely that. But now we also have an inability to handle factual disagreements, disagreements that we normally know how to handle, because what you have to do when you have a factual disagreement is you check: you go out and get more evidence and do some more science, and then the evidence comes in and we can settle the disagreement. When we have factual disagreements that we can't settle, then you get political disagreements that are unmanageable, and that's precisely what's going on. This is where the dis- and misinformation does so much harm.

Natalie von der Lehr 18:43
How do you know who to trust? A climate of trust and trustworthiness is essential to counteract misinformation. Maria Baghramian, Professor of Philosophy at University College Dublin and University of Oslo, puts her research into the context of what sciences can do against misinformation.

Maria Baghramian 19:03
So we heard many very interesting papers on the question of what science and scientists and journalists, etc., can do against misinformation: teaching younger people, teaching about science, teaching about how to avoid fake news or disinformation and misinformation. I'm taking a slightly different approach to this. I don't deny the value of what is being done in teaching about science and about the avoidance of misinformation, but what I'm trying to convince people to look at is how to create an atmosphere where science, where good science, can be fully trusted, and where journalism can be trusted. So I'm putting the onus, the pressure, on scientists and on journalists or media people to be trustworthy, rather than just to expect trust. For that I'm arguing, in my talk here at SCAS but also previously, that we need to create a climate of trust, a general atmosphere where certain forms of trustworthiness are taken for granted. And I think that's a doable project. It's not too idealistic. It actually can be done with some level of investment. I don't mean just financial investment, but also investment of goodwill by scientists, by journalists, etc. And I'm sure that will be there, because without trust from the public, science will not survive, journalism will not survive, and democracy will not survive. So we do need to create trustworthy atmospheres, trustworthy climates, as I call them, for the well-functioning of democracy at the larger scale, but also of its institutions, its contributors, its experts and its communicators. That's the scientists, science advisory boards, institutions that are basing their policies on science, and also journalists and media people.

Natalie von der Lehr 21:05
So do you have any example of how this can be achieved?

Maria Baghramian 21:09
I make a distinction in my own work and in the work of the project I was leading, called PERITIA, a project on policy, expertise and trust that was funded by the European Commission. In that project, we put forward certain markers for identifying trustworthiness. And broadly speaking, those markers fall into two different groupings. First, there are practical markers: for instance, what sort of training the experts or the scientists have, what sort of performance record they have, what they have published, whether they are recognized by their peers, etc. And then we have normative markers. Those are ethical considerations, value considerations. So trustworthy experts, in our view, in the view of our project, have certain ethical or value orientations as well. For instance, they try to perform at all times with integrity. They try to be free of bias. They try to be inclusive and not to be dismissive of other opinions, what we call intellectual humility. They have to try to be open to criticism and respond to criticism. So once these traits are accepted as markers of trustworthiness, the next stage is what to do practically to establish that. Some of them are easy: if you are a good scientist, then you have good training, and you try to make sure that you are following the scientific method. The same with journalists: you try to be up to date with all the technological requirements, etc. The ethical markers, the value markers, or the normative markers, as we call them, are a bit more difficult. How do you show that you are trustworthy in the sense of having integrity or in the sense of being open to other people? So we have used a few experimental measures in our project PERITIA. And one of them that I do try to encourage others to use is the system of citizen assemblies, which are common in many European countries, but we are using them in a very limited sense, and call them deliberative mini-publics.
These are one-day gatherings of experts (a scientist, a journalist and a member of an NGO) on one single topic with a segment of the general population, around 40 people. And we set up discussions, but in a very controlled manner, not an anything-goes discussion, but one with specific parameters, and we find that this is one of the ways in which we can encourage closeness between general populations and scientists and experts and journalists. And I think that's one of the things that I would encourage others, governments as well, to adopt as a way of bringing experts and non-experts together in establishing the trustworthiness of the experts.

Natalie von der Lehr 24:07
But trust and trustworthiness, my assumption is that it takes time to build this up.

Maria Baghramian 24:13
Unfortunately, as you say, it takes time to build up trust, and it takes very little to break trust, and once trust is broken, it's practically impossible to reinstate. So that's why trust is so tricky, but it's also essential. However, there are some shortcuts: putting people together under conditions where they can speak to each other openly, where they can approach each other without fear of criticism, without fear of being looked upon as somehow inferior, etc., and exercise what is known as testimonial justice rather than injustice. Those conditions allow the beginnings of the flourishing of trust. But we have to be careful: if we do things wrong, then trust is broken, and then it will become even more difficult to establish trust.

Natalie von der Lehr 25:07
While I was listening to the talks at this symposium, I was thinking about this meme that you sometimes see on social media: "At the start of every disaster movie, there is a scientist being ignored".

Maria Baghramian 25:18
I don't know whether that's always true. In the 21st century, more than ever, governments are listening to scientists for policy advice. Why is that? Well, because there is such a highly advanced level of technology operating at governmental level, at policy level, etc., that you cannot base your policies on anything but some scientific information. We saw that during covid: without the input of scientists, there could not have been any policies. The same now with AI. The question is which scientists you are listening to, and how you are turning those scientific inputs into policy measures. That's the trickier part. So how science advice normally happens in European countries and in the United States, despite what movies show, is not one scientist advising correctly or incorrectly; it's a scientific panel, and it's the scientific panels that have to be trustworthy, rather than the individuals. And it's important for scientific panels to include non-scientists as well. Actually, that is another point of our recommendations, and some European countries already have that as part of their policy advice mechanisms, that you have non-scientists there. Now, climate change was a case where for quite a long time, science advice was being ignored, and it's still under threat. And that's not one scientist being ignored, but 99.9% of scientists working in the area being ignored. Why is that? Not because of stupidity, not because of ignorance, but because of vested interests, and politicians thinking in terms of the short term rather than the long term.

Natalie von der Lehr 27:09
You talked about both scientific institutions and scientists themselves being trustworthy, working for trust. Which is more important, the individual or the institution?

Maria Baghramian 27:21
So my own orientation towards the question is institutional, working on questions of public trust, rather than personal, individual, one to one trust. When I talk about creating a climate of trust, I'm thinking about a climate in institutional terms, in collective terms, public trust. But then institutions, committees, etc, consist of individuals. If you have one untrustworthy individual in your committee, and you don't notice the presence of such an untrustworthy person, then you are endangering the work of your committee. So we cannot rule out the individuals, but my orientation is that a climate of trust will focus on institutions and policies and policy making that are trustworthy.

Natalie von der Lehr 28:15
There are different kinds of tools and toolkits that can help us decide whether to trust the information given to us. Some of them are more technical, for example the detection of manipulated pictures and videos, whereas some are designed to use our own cognitive abilities to help us see through frauds in everyday life. Christopher Chabris is a cognitive scientist and professor at the Geisinger Research Institute in Pennsylvania in the US, and has, together with Daniel Simons, published the book "Nobody's Fool", which has just come out in Swedish translation. He tells us more about his book and how not to be fooled.

Christopher Chabris 28:53
"Nobody's Fool" is a book about why we are vulnerable to deceptions of all kinds. It takes a look at the individual, at our cognitive habits, some of our default tendencies, at our patterns of thinking and decision making, in light of the fact that there are, unfortunately, lots of people and organizations nowadays trying to misinform us, disinform us, scam us, you know, take our money through fraud, and cheat us in a variety of other ways. And in the book, we look at a wide variety of examples of different kinds of cons and scams and frauds and hold them up against these ideas about how we think, and see, hopefully, how the fraud and deception work. And then, in light of that, what can we do to reduce our risk of being a victim?

Natalie von der Lehr 29:45
So in your book, you partly talk through our habits and why we get fooled, but you also present some solutions. So what can we do not to be fooled?

Christopher Chabris 29:55
Well, it depends a lot on the situation. So one thing that really governs every situation is being aware of the patterns in which people try to fool us. One of them is time pressure. And why does time pressure work? Why is it such a ubiquitous element of scams? Part of the reason is that what the scammers are doing is trying to get us to focus on the information they're providing to us, and that's because one of our best, most powerful cognitive abilities is that when we focus on something, we can process it in much more detail, much more thoroughly, much more deeply than if we're not focusing on it. That's apparent if you just think about following a basketball game or a soccer game or a football game or something like that: it would just be a blur of uninterpretable nonsense if we weren't able to focus on the players and the ball and shift our focus among them, and so on. That's visual attention. But the same thing applies in other ways too. We focus on what's in front of us. We read what's in front of us. We think about what's in front of us. It seems very important at that time, and if scammers can get us to hurry up, we might not stop and think about whether there's any other relevant information that might be useful, any other checks we should do, and they're using the time pressure to take advantage of our ability to focus. So one thing we need to do is always try to ask ourselves, before making an important decision: what information am I missing, or what information is not being shown to me? Why should you make a decision based only on what you're seeing from one source? I personally don't like being sold things by salespeople, because are they really going to tell you about all the different options, including what else you could spend your money on besides what's in the store? That's something you should be thinking about.
But we rarely actually do, and certainly, the salesperson doesn't want you to think about what you could be doing with those 300 euros or whatever, besides spending it in their store. So asking what other information is missing is critical. At an even more practical level, I think the chances that you're going to be making a really bad decision by ignoring a time-pressure offer and just walking away are very low. All of the countdown clocks on offers, all the deadlines and so on, are meant to get us not to think about alternatives. And there are very few situations in life, I think, where you're truly making a bad decision, like negative expected value for your life, by just ignoring those offers. There are some cases where you might get a job offer and you're told you have to decide within 72 hours or something like that, because we have other candidates that we want to offer it to and we don't want to miss out, and so on. But you'll know when you're in that situation. When it's someone you don't know telling you that you have a certain short amount of time to do something, you should probably just hang up. It'll still be there tomorrow, and even if it's a little more expensive tomorrow, probably the money you save by not buying the stuff you didn't really need the day before is going to make up for the extra you pay for the stuff you really want by waiting.

Natalie von der Lehr 32:46
So is there anything particularly interesting that you discovered while working on the book?

Christopher Chabris 32:51
Here's one: there's an interesting pattern that people find especially persuasive, and it comes up in science fraud as well as in healthcare and medical fraud, which is the idea that a tiny intervention can have a huge effect. And what do I mean by that? Imagine there was a single shot you could get injected with, and it would prevent you from getting sick with a certain disease for the rest of your life. So one minute of pain, 80 years free of a certain disease, wouldn't that be great? Well, of course, that's a vaccine, which we are given in childhood or adolescence, and sometimes we never even need a booster shot. It's just one and done. Or even an annual flu shot, it's once a year. Flu shots have different reputations and popularity in different countries, but even an annual vaccine can provide a lot of benefit when you consider the downside of the disease itself. Antibiotics are another great example: things that people used to routinely die of can now be cured by taking one pill a day for seven days. Imagine that. That's great. But unfortunately, the discovery of those kinds of things is rare; that's not normal science. Sometimes we do have those amazing discoveries, even in behavioral science, but they're very rare. Yet that pattern is so appealing to us that you get the phenomenon of snake oil salesmen, where someone claims to have invented or discovered a product, or brought it back from some foreign place or some ancient tradition. And not only does it cure this one thing, but it cures 10 or 20 or 30 or 40 other things, and you only have to take it once, and it has no side effects, and sometimes you can't even perceive it, which, it turns out, is because there's almost nothing actually there.
It's in too small a dose or concentration to have any effect at all. So in the book we call this concept potency: one of the hooks that people use when they're trying to sell us something we shouldn't want. And that something could be a policy change; it doesn't have to be a drug. It's most obvious in the case of drugs, but it could be a policy change, like: if only we would change our classrooms to work this way instead of that way, that would solve our whole problem of people not learning enough math and science, of our national scores going down, and of media literacy being terrible. It's never that simple, but that pattern is, I think, in some ways built into our thinking, and it's not stupid. One point we make in the book is that it's not irrational to have these habits and behaviors, and it's certainly not irrational to think that the best possible thing would be the smallest intervention with the biggest effects. That's great, right? But we're not calibrated to how rare those things are. So we are much too credulous when people offer those kinds of solutions: scientific papers claiming them are too likely to get published without good enough evidence, people are too likely to spend their money on them, and governments are too likely to invest huge amounts in trying to implement them. So beware of the potency trap: thinking that a tiny intervention with a huge effect is likely to be real. Occasionally it is, but most often not.

Natalie von der Lehr 35:57
To round off, I asked the organizers, Kathrin Glüer-Pagin, Thomas Nygren and Dan Larhammar, to look ahead. We have heard a lot during the last three days. What do you think are the biggest challenges right now?

Kathrin Glüer-Pagin 36:14
The thing we see more and more of, even here in Sweden, is that all sorts of mis- and disinformation get mixed up with the politicization of certain issues, even though we don't have as much fact polarization in Sweden as in the United States. But we do see rising levels of what is called affective polarization. That's when you start disliking your political opponents, or people who have the opposite political identity from your own, and you start finding them despicable or disgusting. The more we have of that, the more fact polarization we will also see in the end. And if you look at what this looks like with respect to certain issues, for instance in the United States, I think that is really, really scary, because then you lose the educated, knowledgeable citizenry that you need for a working democracy. That, I think, is the biggest challenge right now.

Thomas Nygren 37:23
This is part of the same thing. But with science in mind: if we have wicked problems like climate change, and people stop taking them as seriously as they should, that's certainly a very big problem. Any hot topic today is highly problematic, because it's harder to see the core of the problem; we're all getting distracted the whole time from seeing the problem itself and then finding common solutions to it, because everything comes up for debate. The hesitancy and the inability of society to solve really big problems, whether in health, finance, or climate, we are seeing these problems all over the place. So these are democratic problems across the board. It's hard for me to say where we see the worst problem; it's bad on every level. And we didn't bring it up that much, but people get tricked. Somebody steals their money. People are, at the very least, intellectually violated because somebody is making up videos of them. This is happening all the time, everywhere, at the moment. So it's one big problem with multiple smaller problems in it.

Dan Larhammar 38:47
And with those big problems, one major challenge is to reach out with information about how our brains work, how we are easily misled by others, and how we sometimes want to be misled through wishful thinking. If we could equip everyone with more critical thinking, that would be very good consumer protection. And of course, to keep a democracy alive, citizens need good, reliable information so they can make informed decisions.

Thomas Nygren 39:23
If we have a hot topic and people are well informed about it, then they're not as easily misled; it's hard to fool someone who knows a lot about something. But then we need to have all these things in place. I also see a great big problem in the big platform companies, and there is a willingness now to turn this all into an individual problem, when I see that many of the solutions should be at the international level and at the corporate level, where you should have protections for citizens in place, and they are not in place at the moment. It's coming, it's getting better: in 2015 no platform company would say that it had any sort of responsibility for what was being published. Now some are taking on publishing responsibilities, while other platforms are no longer taking responsibility. But at least there's a discussion. Maybe the most vital problem is getting the bigger actors to take responsibility. And in this case it's also very much a matter for politicians to act, take responsibility, and talk based on facts. If you don't talk based on facts, if you lie - shame on you. And I think shame is something that we should reintroduce into society a bit more.

Kathrin Glüer-Pagin 40:45
We must see to it that this doesn't go the way the climate discussion went, turning it into an individual problem where everybody is basically left on their own and feels the burden of dealing with it themselves, because it is systemic. It is big. There's a lot of big money, and authoritarian states and whatnot, behind much of the most dangerous misinformation, and that demands institutional and larger solutions. It's not just that we need to train all the kids to be critical thinkers. We need to do that, definitely, but we need other things too.

Natalie von der Lehr 41:27
What are you taking with you from these three days?

Thomas Nygren 41:30
I'm taking away the range of research that is ongoing. And I was also intrigued by the many conversations arising from different viewpoints. Whenever there was a presentation, there would be five or ten people with different backgrounds lining up to ask questions. I see that, in this case, interdisciplinarity can be a very good thing, and that's something we should stick to. I am seeing some tendencies for misinformation research to become a set of clusters: here you have the scholars doing just the theoretical work, lining up by themselves; there you have one type of psychologists doing their thing; and over there the media scholars doing theirs. We need the silos, but every now and then we also need the silos to meet and interact. So I think one of the big things I'm seeing at this conference is how good it can be if you are a limited number of people, all interested and all coming from different backgrounds. That can really make science move forward.

Natalie von der Lehr 42:52
Thank you for listening to SCAS Talks and this episode in our series SCAS Talks Spotlight, focusing on the symposium "Sciences Against Misinformation", which was held at SCAS in April 2024. The symposium was an event within the natural sciences program of SCAS and was funded by the Erling Persson Foundation and the Knut and Alice Wallenberg Foundation. The organizers were Kathrin Glüer-Pagin, Dan Larhammar and Thomas Nygren. I would like to thank the organizers, and also Åsa Wikforss, Jesper Strömbäck, Maria Baghramian and Christopher Chabris, for talking to me. Thanks also to the participants for many enlightening chats during the lunch and coffee breaks. Talking and listening to each other is the key to knowledge, an important factor in building resistance to misinformation. Our regular episodes of SCAS Talks feature the research of current and former scholars from a wide variety of disciplines, reflecting the multi- and interdisciplinary research environment at SCAS. Tune in if you are not a regular listener already. SCAS Talks is available on Podbean, Spotify, Apple Podcasts and most podcast apps. And as always, we are very happy if you recommend SCAS Talks to your colleagues and friends. Thanks for listening, and bye for now.

Transcribed by https://otter.ai