What: Expert panel discussion on the effect of fake news on media relations
When: April 26, 2023, 2 PM to 3 PM EST
Who: Panelists include:
- April Kaull, Executive Director of Communications at West Virginia University
- Professor Stephan Lewandowsky, Chair in Cognitive Psychology, School of Psychological Science at the University of Bristol
- Dr. Stefka Hristova, Associate Professor of Digital Media at Michigan Tech
- Dr. Brian Southwell, Senior Director of the Science in the Public Sphere Program in the Center for Communication Science at RTI International
Where: Newswise Live Events Zoom Room (link will be given once you register)
Details:
We are forming a panel to discuss misinformation and how it affects media relations. For the last two years, we have been looking at how Newswise can tackle issues around spreading and consuming fake news.
Reporters are tasked with combating misinformation and disinformation but can't do it alone. We would like to involve our community of members and researchers to discuss and investigate how we, in communications, can find ways to support them.
Transcript:
Jessica Johnson: Good afternoon, everyone, and thank you for joining us today to discuss misinformation in relation to communication specifically. My name is Jessica Johnson, and I'm the CEO here at Newswise. We organized this panel to examine what role we in communications could, should, and shouldn't have regarding misinformation. At Newswise, we began integrating Google Fact Check into our service about four to five years ago. Some of you may have seen calls for experts to respond to examples of misinformation in the media. So we wanted to discuss this issue with the panel of experts as well as with you, the communications professionals, and look at misinformation more broadly and consider our roles in this ever-growing issue. I hope everyone will participate and ask questions; you can chat me directly or message the whole group, and I will share your questions. Ask anytime, you don't need to wait until the end.
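For readers curious what integrating Google's fact-checking data can look like in practice, here is a minimal sketch using Google's Fact Check Tools claims:search endpoint. This is an illustration only, not Newswise's actual integration; the sample query and the GOOGLE_API_KEY environment variable are placeholders you would supply yourself.

```python
# Minimal sketch: querying Google's Fact Check Tools API for claim reviews.
# Illustrative only -- not Newswise's actual integration. GOOGLE_API_KEY and
# the sample query are placeholders.
import os
import requests

FACT_CHECK_URL = "https://factchecktools.googleapis.com/v1alpha1/claims:search"

def search_claims(query: str, api_key: str, page_size: int = 5) -> list[dict]:
    """Return fact-checked claims matching `query`, with publisher and rating."""
    resp = requests.get(
        FACT_CHECK_URL,
        params={"query": query, "pageSize": page_size, "key": api_key},
        timeout=10,
    )
    resp.raise_for_status()
    results = []
    # Each claim can carry several reviews from different fact-checkers.
    for claim in resp.json().get("claims", []):
        for review in claim.get("claimReview", []):
            results.append({
                "claim": claim.get("text"),
                "publisher": review.get("publisher", {}).get("name"),
                "rating": review.get("textualRating"),
                "url": review.get("url"),
            })
    return results

if __name__ == "__main__":
    for hit in search_claims("climate change", os.environ["GOOGLE_API_KEY"]):
        print(f'{hit["rating"]:<20} {hit["publisher"]}: {hit["claim"]}')
```

A service could surface these textual ratings alongside expert commentary, which is roughly the shape of the integration Jessica describes.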
And to begin, I would like to introduce the panelists. To start, Dr. Stefka Hristova, who is the Associate Professor of Digital Media at Michigan Technological University. Then we have April Kaull, the Executive Director of Communications at West Virginia University, and Professor Stephan Lewandowsky, the Chair in Cognitive Psychology at the School of Psychological Science at the University of Bristol. And last but definitely not least, Dr. Brian Southwell is Senior Director of the Science in the Public Sphere Program in the Center for Communication Science at RTI International. So, thank you all for joining today. I would like to begin by asking you all, and this is for everyone: is there a role that institutional communications offices can play in addressing misinformation, and in what ways? Maybe we'll start off with Dr. Southwell?
Dr. Brian Southwell: Great, thanks, Jessica. And thanks, everybody, for gathering. I know this has been a topic that's been in the headlines quite a bit, but I think it behooves us all to talk with a little more nuance about these issues, and to think practically about what we can do. Generally, I think some of my colleagues here on the panel likely have some good ideas toward directly addressing misinformation. But I also want to make sure that we don't overlook a really important step here, which is the role that you all have in establishing and maintaining trust with audiences, and giving those audiences reasons to listen to your organization before expecting them to listen to you. We often think about addressing misinformation that's out there in a defensive way, but some of the advance planning for that really needs to have happened much earlier, in which you have established the credibility of your organization, but you've also reached out and found ways to engage with that audience in a preventative way. So that when a crisis does arise, they know to turn to you, and it's not out of the blue for you to be articulating certain ideas. So that's one thing I would suggest: everything we can do to think concretely about, take seriously, and be humble about the lack of trust that might exist in some circumstances, those are good steps. I'm not sure what else other colleagues might have to say.
Jessica Johnson: Yes, please, April Kaull.
April Kaull: Sure. So I would first just reiterate what Brian said: I think relationships are really the first key, and that takes time. And it takes, to his point, that buildup of trust, because you have to show your audiences a track record, that if you say it, it is true, and that they can count on you to be their source for consistent and truthful information. I also think that it's really important to set clear expectations with your audiences about what you can and can't do and what they can expect from you, especially when a crisis situation pops up. During a crisis is not the time to let people know what you can and can't say, although that is definitely a time when you'll need to reiterate it. But I think having those expectations set early and then reinforced consistently is very helpful, because we all know that in the absence of information, that vacuum becomes filled with misinformation, rumor, speculation, and then you find yourself on defense as opposed to offense, which is where none of us as communicators want to be when we're dealing with a crisis situation. And I would say a crisis situation doesn't always have to be something negative; it can be something positive, but nonetheless something that you don't expect that you might be dealing with in the moment. And so having a lot of that built in advance is really crucial. I know for us, we have those relationships that we've built within our own internal structures, so that everyone within our organization knows what their role is going to be and how we're going to coordinate and share information with each other. So that then, from an external perspective, we can move more quickly and nimbly to address those questions and issues that are coming up for that external set of audiences. And that's really been key for us.
Jessica Johnson: Thank you. Professor Lewandowsky, could you share some thoughts as well?
Professor Stephan Lewandowsky: Sure, and I'll focus mainly on the academic context, which is the one I work in. Within academia, the communications officers of universities are very important, and I can only underscore what April said about the importance of having relationships, through them, with journalists. But the other thing that is very important is to make sure that the message that goes out through a university communications office actually reflects what the science is about. There's often a tension between scientists trying to be nuanced and careful and what communications officers would like to say, and you have to negotiate that. And that is generally true for any communication: there is the imperative of nuance on the one hand, to be accurate, but on the other hand the public and the media just cannot handle all that. And that, to me, is the crucial thing to negotiate.
Jessica Johnson: And what are some ways to do that? Are there specific approaches you can recommend for communicators?
Prof. Stephan Lewandowsky: Well, that's the million-dollar question, isn't it? In most situations, whether it's politics or science, you have to simplify to get a soundbite out there, to make it manageable and comprehensible. And I don't think there's a magic bullet; you just have to exercise judgment and be careful. Now, the one thing I would not do is highlight the uncertainty and lead with that, which, funnily enough, is what scientists do. If you ask them about climate change, they'll say, oh yeah, here's one of the things we don't know, and then ten minutes later they're telling you, oh, what we know for sure is that it's fossil fuel emissions that are causing climate change. But leading with the nuance and the uncertainty is not a very good communication strategy. I think you go right out with what you know for sure, and then add the qualifications.
Jessica Johnson: Great. That's very helpful, thank you. Dr. Hristova, how would you respond to this: is there a role that institutional communications offices can play in addressing misinformation?
Dr. Stefka Hristova: Absolutely, I think that it needs to be a proactive role. And I wanted to speak a little bit to genre conventions, to thinking about how we construct a genre that's rooted in evidence and avoids some of the pitfalls that are associated with mis- or disinformation. I wanted to speak to the importance of using visual evidence that is from the moment or from the site of inquiry, rather than relying on generic photographs or stock photography, as a way to boost the credibility of this particular form of communication. So thinking about information presented both visually and textually as being anchored in evidence, away from emotional appeals and away from generic representations of events, would be one way to establish and maintain the credibility of the source.
Jessica Johnson: Right, yeah, very good point. Well, I think we'll talk more about multimedia and how that is used further in the discussion. So next, April: how can communications offices assist reporters in addressing issues of misinformation? That was one thing that you mentioned just now; maybe you can give more specifics on how that can be done?
April Kaull: Sure, and I want to go quickly back to something that Stephan said and springboard from that. We were talking about building that trust and those relationships internally, and that is very important when you're dealing with faculty and researchers across your campus, those who are engaged in scholarship, and helping to share that work in an effective way. One of the things that we do is consistently engage with our faculty directly, to help them understand the approach that we take to sharing their work, so that they understand what we're doing and what our role is. And one of the key things that we try to share with them is that their work is valued. Our job is not to try to dumb it down, for example, which is something that you often hear, people saying, we're trying to make sure an eighth-grade audience understands this. Instead, the approach that we have taken and had a lot of success with is having faculty view us more as translators. Our role is to help translate the work of academics, faculty, staff, and researchers who live in their world of scholarship 24/7 to an audience that is not as familiar with the terminology, the processes, and that environment. So we try to really focus in on that idea of translating their work for a lay audience, and they seem to respond to that a lot better. That, in turn, builds trust between them and us, so that as we're working on announcements, and then dealing with questions that may arise once that communication is out, we're much better equipped to deal with that on the external side. I wanted to share that in case it's helpful for others. But specifically as it relates to dealing with reporters and journalists, I know this is going to sound a little bit like a broken record, but it really does come back to building relationships, and doing that early and consistently. I don't know how many times I've been working with a reporter on a story and they've shared back just a paragraph to fact-check: did I interpret this portion of the interview accurately? I want to make sure that I'm sharing this with the proper context. And they wouldn't feel as comfortable doing that if we didn't have a relationship, a rapport, and a trust. Conversely, if I see that there is a mistake, or an edit is necessary, and I have that relationship with a reporter, I feel much better about reaching out and saying, hey, I noticed this, and I'm happy to connect you back with the researcher, or here, let me provide you with some additional information to correct that or to provide some more clarity. And I have found that when you do that, it is much easier to work through whether there was a misunderstanding of information or something else. I rarely, if ever, have encountered an intentional piece of misinformation in reporting; it is almost always some sort of misunderstanding or a need for additional context. And that really is best served by having really good relationships with those reporters, and that means doing the work to maintain those relationships.
Jessica Johnson: Thank you. To move on, Dr. Hristova, I was wondering if you could tell us whether communications offices need to worry about how to protect their presidents, directors, and researchers from being targets of disinformation or malinformation, which is a term I learned from you. I don't know if everyone attending this session is familiar with it, so maybe you could explain more about that as well.
Dr. Stefka Hristova: Thank you. I wanted to make sure that we are speaking about the spectrum of information that may not be accurate. Misinformation is information that's unintentionally shared and happens to be false, maybe a misunderstanding. Disinformation, however, is a whole separate branch of inaccurate information: it's information that's intentionally false and is designed to cause harm. So think about questions that come up around trolling, around information sources that are paid to distribute disinformation in order to have a harmful effect on a research agenda, or an attack against a particular university or school. I think that we need to have proactive ways of approaching that. Often the intention behind disinformation is making money, so spreading information that's inaccurate; third-party actors in foreign countries are paid to spread disinformation. And this is a whole separate factor in the ecosystem of communication from engaging with reporters and trying to tell a story. And we're increasingly encountering malinformation, which is information that is generated by AI or bots, which is not even constructed by a human agent, but is often tagged with hashtags that implicate institutions. You will have bots generating information, some of it nonsensical, that is tagged, for example, with UN or climate change, right? UN climate change. So it tries to damage institutions' reputations by linking them to nonsensical, algorithmically generated malinformation. So I think that as we're thinking about the communication toolkit, we need to think about how we proactively report and maybe try to counter both the mal- and disinformation in the media landscape. What are the tools? What are the teams that need to do this particular work of dealing with disinformation or malinformation?
Jessica Johnson: Thank you. Would anyone else on the panel like to address that?
Prof. Stephan Lewandowsky: Yeah, can I just connect to that? I agree, and we'll probably talk about this later, but pre-empting attacks is always advisable where possible, and very often you know ahead of time that it's going to happen. If you publish something that is remotely controversial, or that is bound to elicit concern among some people, if it's about climate change or GMOs or vaccinations, then it can be very helpful, and I think actually essential, to anticipate the blowback and also have a plan for how you're going to deal with it and how you're going to protect your employees or your officers, executives, whatever. And I certainly do that whenever I have a paper that I know ahead of time will elicit some negative commentary in certain sectors. So anticipation.
Jessica Johnson: Okay, and are there specific things that you do to prepare that you could mention? Or is it just a matter of expecting it?
Prof. Stephan Lewandowsky: Well, it's expecting it and trying to anticipate what the counter-arguments may be. Some of the responses are almost robotic: there will be some opposition expressed as, don't tell me what to do, for example. Don't tread on me. If you talk about vaccinations, you will get that pushback. So it helps if you have a Twitter thread ready that you can just pump out in response. And when you have media coverage in the news, you can put in some pre-emptive information saying, well, now, some people might object to this on that basis, but actually we have thought of that, and then you say x, y, and z, whatever it is, to preempt it. That tends to be reasonably successful.
Jessica Johnson: Okay, thank you. Dr. Southwell, I wanted to know your thoughts on whether, if an institution has expertise in misinformation or disinformation, they should debunk it, always or only sometimes? And should there be guidelines to judge when to debunk something?
Dr. Brian Southwell: Yeah, this is a great question, Jessica. I think the short answer is that I don't think we always need to recommend that misinformation be directly addressed. And I say that with respect for the limited time and resources that organizations have; I think there are judgments that can be made. That said, there is a considerable amount of misinformation which is problematic, and I think we can make those judgments based on a couple of thoughts. The first is whether or not there's a reasonable case that the misinformation that exists is likely to cause harm. And if it's likely to cause direct harm, a life-or-death situation, for example, then we ought to, from a public health standpoint, address it. I also think we can make judgments about the prevalence of misinformation. Now, of course, it's true that some things in the remote corners of the internet may explode in popularity and take off over time. But by and large, just because something is posted somewhere doesn't mean that lots of people are attending to it. I say all this from a pragmatic standpoint, because I think sometimes it's easy for us to be up in arms and absolutist about any misinformation anywhere being really problematic, and I just think practically that's not useful. Organizations have lots of other things that they're trying to address and attend to. There are circumstances when it certainly makes sense to spend time or resources on it, but I don't think it makes sense to be in defensive mode all the time, either; there are instances when you can proactively try to set the public discourse agenda, generally. Some of these points, actually, were ones that we raised a few years ago with the National Cancer Institute: we put a workshop together and published an essay on considerations for when to respond, and on the fact that all misinformation is not equal. I'm gonna put that link in the chat, just in case somebody's interested. Yep.
Jessica Johnson: Yeah, that'd be great. Maybe, Prof. Lewandowsky, you can add to that if you have any other thoughts. But also, the next question was for you: how is debunking perceived by the public, especially on scientific topics?
Prof. Stephan Lewandowsky: Okay. Yeah, thanks. I do want to connect to that, because I agree with Brian: you cannot keep up with everything, right? There's this cartoon of a guy late at night in front of a computer, and the wife says, why don't you go to bed, and he says, oh, somebody is wrong on the internet. You know, that's what the internet is about. Most of it is wrong; you cannot keep up with all that. And so my recommendation would be to consider the importance of the misinformation, but also see if you can track its trajectory, try to get a sense, on Twitter or wherever, of how popular this really is. Because very often things come and then they go, because the meme is just not good enough to go viral, and then don't waste your time, don't draw attention to it. If you have the capability of monitoring this stuff, then I would say: if it doesn't gain traction, don't go there.
Now, the public's view of debunking? That's a very interesting question. First of all, we know from a lot of experiments now, with tens of thousands of participants, that debunking works, kind of, most of the time. What I mean by that is that if you correct a specific falsehood in public, the people who are exposed to the correction will adequately and appropriately adjust their belief in that information. They are actually responsive to corrections, which is good news. The problem, however, is that that doesn't necessarily mean they stop relying on the misinformation. And this is the fundamental problem of the basic cognition of corrections: people form narratives in their heads. And if you then tell them, hey, part of that narrative is wrong, then they'll happily say, oh gosh, that's too bad, and they'll tell you they believe it less, and they do believe it less. But five minutes later, if they have to think about the narrative, well, that information is still there, because if you weeded it out, you would have a gap in the story. That's the problem with misinformation: it lingers even if people are responsive to your correction. So at the level of actual cognition, debunking kind of works, but you've got to be careful, because the misinformation may linger. Now, in terms of political perceptions, this is very interesting, because during the last few weeks, it hasn't been that long, there has been a surprising amount of activity in a lot of different sectors, including the academic literature, that is trying to reframe the concern with misinformation as exaggerated, or a moral panic, or reflecting a political agenda. And there is a Republican member of Congress who has used Freedom of Information requests against certain universities in the United States, I think Washington and Stanford and maybe some others, I don't know them all, to request details of what he calls research into so-called misinformation. Now, that is a problem, because what this is trying to do is to further erode the notion that we can actually agree on what's true or false. And that, I think, is a fundamental problem of contemporary politics: the populist conception of truth does not allow for independent evidence, but is based on intuition and authenticity. And that is happening in many countries, and we have to be very concerned about it, I think, because this goes beyond some false information; it goes to the heart of the possibility of even being able to establish what we should share as a common truth.
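Professor Lewandowsky's earlier suggestion, to track a claim's trajectory and only debunk if it gains traction, can be made concrete with a small monitoring sketch. The example below uses Twitter's v2 "tweet counts" endpoint as it existed around the time of this panel; access tiers have since changed, and the bearer token, sample query, and the doubling heuristic are all illustrative assumptions rather than anything the panelists prescribed.

```python
# Minimal sketch of traction monitoring: before debunking a claim, check
# whether chatter about it is actually growing. Uses Twitter's v2 tweet-counts
# endpoint (availability depends on your API access tier); the bearer token,
# query, and threshold are placeholders.
import os
import requests

COUNTS_URL = "https://api.twitter.com/2/tweets/counts/recent"

def daily_mention_counts(query: str, bearer_token: str) -> list[int]:
    """Return tweets-per-day counts for the past week matching `query`."""
    resp = requests.get(
        COUNTS_URL,
        params={"query": query, "granularity": "day"},
        headers={"Authorization": f"Bearer {bearer_token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return [bucket["tweet_count"] for bucket in resp.json()["data"]]

def is_gaining_traction(counts: list[int], factor: float = 2.0) -> bool:
    """Crude heuristic: flag the claim if the latest day exceeds the prior average by `factor`."""
    if len(counts) < 2:
        return False
    baseline = sum(counts[:-1]) / len(counts[:-1])
    return counts[-1] > factor * max(baseline, 1.0)

if __name__ == "__main__":
    counts = daily_mention_counts('"5g causes covid"', os.environ["TWITTER_BEARER_TOKEN"])
    print("Gaining traction, consider debunking:", is_gaining_traction(counts))
```

The design point is the one made in the discussion: the decision to respond is gated on measured prevalence, not on the mere existence of the claim.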
Jessica Johnson: Interesting. Yeah, thank you. I didn't know that. And yes, I agree. It is putting doubt into everything, right?
Prof. Stephan Lewandowsky: Precisely. We can choose our alternative facts and believe whatever we want, and the moment you have that, of course, the strongest and loudest voices carry the day. That's a problem, I think, if you're trying to understand reality.
Jessica Johnson: Yes. So following up on that, and going into more specific best practices for debunking misinformation and disinformation: we've already talked a little bit about developing relationships with reporters to address misinformation or disinformation, but are there best practices for debunking, and for when it should be addressed, say, if it's affecting people's health, or if it rises to a level that should be addressed?
Dr. Brian Southwell: Yeah, I can grab this. Generally speaking, there are a couple of considerations here. First, I don't think most people who believe or hold on to misinformation or disinformation have an active sense of it as such. In other words, this is a tricky topic, because we are putting a label on something as being factually inaccurate, but for the most part, it would be particularly irrational for people to hold on to something that they knew to be false. And so part of the trick of the correction is leaving some room for face-saving, and being careful not to shame and blame people, because all of us, first of all, are vulnerable at some point in our lives to believing misinformation, by nature of being human, by nature of the way that we process information. It's possible to put somebody into a corner and suggest that they've been wrong in a way that doesn't allow them socially to come back, and that I think is really problematic. The other aspect of this that I think is really important to note is that there are various ways of being wrong, various ways of being false. You might find it easier to rhetorically correct, or directly address, explicitly false information or claims. But some of what's problematic out there is a matter of lying by omission; it's a matter of conveniently leaving out certain caveats and certain context, and that's a little bit tougher to correct. Dr. Lewandowsky and others will probably have more to say about this, but we think, from the standpoint of cognitive architecture, that if you're trying to correct something that wasn't already in somebody's head, if you're trying to correct an omission, that's harder, because there's maybe less for there to be traction on. If you can point out that a particular statement was explicitly wrong for these five reasons, you might have an easier time of it than saying, well, the reason that was wrong is because it left out this whole other part of the story; filling in that story later can be a little bit more difficult. So just being aware of what you're up against, in terms of the nature of the misinformation that you're attempting to correct, that can be important to keep in mind too.
Jessica Johnson: That's a very good point, how to deal with omissions of information. April, what do you think are best practices for debunking disinformation?
April Kaull: So I think that disinformation of the variety we've been talking about, an omission, or something that lacks context, or that has been framed from a perspective based in someone's opinion, that's one thing, and I think a lot of the tools we've already heard about can be really effective. I will tell you what frankly keeps me up at night: deep fakes, and the emergence of technology to change the actual video, image, speech, and comments of people, especially when it comes to social media or other platforms like that. For example, making it appear that your university president or your researcher has said x, y, and z, when in fact they have not; it is video of them, but it has been manipulated by technology in a way that makes it almost impossible to decipher that those are not, in fact, the words of that person. I think that is really the next level of what we're going to be up against when it comes to disinformation, and having to think about how we battle that is what I'm really worried about in the coming years. We've already started to see it in some areas; fortunately, I haven't had to deal with it outside of some really bad photoshopping, but I think we're all going to be faced with it. And that puts trusted journalists and media in the same boat as us, because we are all consuming that same information, and it becomes very difficult, if not impossible, to determine whether something is true.
Jessica Johnson: Yes, thank you for mentioning that. Dr. Hristova, since you are specifically in digital media, what do you see, building off of what April was talking about?
Dr. Stefka Hristova: So I wanted to point to three types of relationships that I think will become increasingly important as we're moving into this new landscape of AI-generated or bot-propagated mis- and disinformation. I think that we need to have very explicit links between communications and research: communications officers working with researchers on misinformation, working with social media labs whose job is to monitor developments in mis- and disinformation. That will be one important bridge that needs to be strengthened as we enter this new technological landscape. The second point I wanted to make is about engaging with the technological media giants themselves: continuing to press for labels that flag misinformation or disinformation, that allow us to signal bot accounts, and that designate deep fakes, because I think there's a responsibility on those media platforms to do some of the tech-savvy work of allowing us to signal or label or report information that is disingenuous and not factually true. And the third piece of this is continuing to be involved in policy and regulation, because as we encounter this new technological landscape, it will be really important to put broader policies in place and to educate legislators about the ways in which information can now be safeguarded, in order to find points of evidence and points of truth as we move forward together. So I think we need to have an assemblage, a team, that has policy aspects, academic and cultural aspects, and also technological experts and expertise, in order to address the problems of mis- and disinformation as they become more and more complex.
Jessica Johnson: Thank you. Professor Lewandowsky, can it be harmful to fact-check and debunk disinformation or misinformation, let's say disinformation, because we're giving more attention to these sources? When would you say those times are?
Prof. Stephan Lewandowsky: That's a real problem. There was concern, myself included, we were all concerned about ten years ago, about the possibility that by correcting something you might actually strengthen that memory in people's heads, ironically, the so-called backfire effect. Now, backfire effects do exist, but they are far less pervasive than we first thought, and at the moment, on average, in the vast majority of cases, I'm not worried about that; in fact, we can debunk successfully. And I'll put a link in the chat right now to the page of resources on my homepage, which points to all sorts of material for communicators on debunking strategies. So I'm not worried about debunking things for the most part. Now, the one exception is the one I already mentioned: it's not worth your while to deal with things that no one knows about. Why would you debunk a myth that no one has ever heard of? There's no point, and indeed, in that case, you might draw more attention to it than is warranted. But by and large, I'm not concerned about that at all.
Jessica Johnson: Okay, thank you. April, is it helpful to address misinformation by putting out a lot of factual information in order to drown out the false conversations? I'd like to ask everyone about this, so I'll start with April.
April Kaull: So I think maybe it was Brian who mentioned earlier, you're never going to be able to combat all of the dis- and misinformation that's out there; that's like a really bad game of whack-a-mole. So I don't think that's effective. By and large, I think it's much more realistic to target specific groups, as you prioritize: what is the most important mis- or disinformation to address? And then which audiences who are engaged in receiving or believing that information do you need to reach, and how best can you reach them? At a macro level, that might be utilizing your main university social media accounts to target specific audiences who are engaged in a particular conversation or debate about content on a particular platform; at a micro level, it might be that you reach out individually to an influencer who you've built a relationship with, and engage that person to help you fight that battle. Because one of the important things for us to remember is, if we as institutions and groups try to do this all by ourselves, we're just one communications group, for example. But if we can form partnerships, and again, relationships, with people who can help us carry that water, or at least help share in carrying that water, not only does that take some of the burden off of us, but it also helps with those audiences, because there's already a relationship and a trust built between that influencer and that group. And so if you can have some of those helpers identified and enlist them for assistance, I think that's super helpful in these circumstances. But no, I don't think you can just try the drown-it-out method, because you'll never succeed with that.
Jessica Johnson: Great, thank you. It seems like something that would be nice to be able to do, but yes, possibly difficult to achieve. Dr. Southwell, please, I would love to hear your thoughts.
Dr. Brian Southwell: Yeah, well, I'll just build on some of what April just raised. I think it's really crucial to think concretely about instances in which there's apparently a bit of a vacuum in terms of public discourse. In those moments, being unafraid to offer content and ideas is not a bad strategy. If there's a topic people are hungry for, that they seem to be interested in, and there is a stunning lack of official information coming out, that's a real recipe for people to turn to other sources. But generally, the notion of flooding the landscape for the sake of it is a bit cynical; at best, what you're going to accomplish is widespread distraction, in ways that, for most of the missions of the organizations that folks on the line represent, is not really what you're trying to do. If you're trying to cultivate a relationship with audiences, think concretely about what it is they need to live their lives, what it is we need as a society to be focused on. There's a lot of work to be done there. Sometimes it'll be a matter of directly addressing misinformation that's getting in the way of public discourse that needs to happen now on key issues of the day. Take Dr. Lewandowsky's earlier example of climate change: gosh, it's a central issue of our lives, and so it really does matter in instances when false information is crowding out what we really need to know in order to make decisions. On other topics it's probably less crucial. But really, don't lose sight of why your organization exists, what it is you're trying to do in terms of serving audiences and developing information for them, and then focus on that. Because this goes back to, I think, the only way through the landscape that is possible now, and that will come by virtue of artificial intelligence and deep fakes: the only way through is for people to know where to turn for credible information. If we're in a world where we could see April appearing in a manipulated video saying something different from what she actually said, which is unfortunately possible, the counterbalance to that is knowing that we can go to her .edu site and find something there. Similarly with the media outlets that exist and have credibility, hopefully, unless you're in a position of those being mimicked; I think that would hopefully be addressed in a quick amount of time. There are places where people are going to be able to turn the channel, type in a certain site, go to a news outlet, and we've got to think in those ways. We've got to think about what we can establish as the safe ground that people can return to in the storm. Think about it as in a moment of crisis: you have a spot in the parking lot where you're supposed to gather. Similarly, do we have those pre-established before these moments, so communities know, here's the site that you go to and that we can trust? And if missteps are taken and you lose credibility and trust in those sites, that's a real problem, and it's damaging; it means you're going to have to take the long road to building that back up. So it's one of the reasons we need to take seriously the responsibility we have for doing the best job we can when people are turning to those outlets.
Jessica Johnson: Thank you. Dr. Hristova and Professor Lewandowsky, would you like to add anything?
Dr. Stefka Hristova: Absolutely, I wanted to point back to genre specificity in the shifting landscape. As we're moving towards ways in which AI can generate information, we understand that the conventions of the current ChatGPT-4 models are ones in which we are working with generalized statements, very formulaic expressions. And in terms of visuals, we have moved into a culture that uses and relies on stock photography, on the generic. So I think we may want to be thinking about grounding information back in the specific, both visually and textually, as a genre that emerges to be connected and linked back to evidence, in the face of this generically oriented mis- and disinformation landscape.
Jessica Johnson: Interesting. Professor Lewandowsky, what are your thoughts?
Prof. Stephan Lewandowsky: Yeah, well, by the way, that's a fascinating idea, and particularly apt, to go away from the generic to the more local context. That's very interesting. I also want to comment on what April said earlier about deep fakes and all that stuff keeping her awake at night. Yes, I think there is something to be really concerned about. However, my hope is that this is solvable technologically. I mean, you can tamper-proof a video and have some algorithm that verifies that it hasn't been edited, and the same goes for other media: you can fingerprint this stuff, so that if you edit it, you can tell it's been manipulated. To achieve that, however, does require regulation and a policy response, and I think it's time to talk about that, because if we want to deal with misinformation, then you have to address the sort of structures that are in place now that allow it. I'm going to put another link in the chat in case anybody is interested in this in depth; that's a report that I wrote with a team of people for the European Commission about the relationship between technology and democracy, and how social media can alter democracy. And I think we have to talk a lot more about clever ways in which we can regulate, for example by mandating the development of some sort of fingerprinting mechanism to deal with deep fakes, which I think, from what I've heard, is technologically possible. But there are other things that are connected to that, like the role of algorithms on Facebook or Twitter or whatever, and the implication of them favoring information that makes people angry. As we know, on Facebook, anger ranks things up in your newsfeed: you see things that made other people angry, and if something made other people happy, you're never going to see it as much. And somehow it seems strange that you would want a society that gets angrier and angrier. So there are all sorts of things we have to address there, and I would think the conversation in the next couple of years will increasingly go in that direction. Certainly here in Europe there is a lot happening to change that and to hold the platforms accountable, and it's a very exciting development that I think is pointing in the right direction.
Jessica Johnson: Thank you. It sounds, from what you're saying, Professor Lewandowsky, like there may even be ways to eliminate misinformation or disinformation or deep fakes. Is that something that could be done?
Prof. Stephan Lewandowsky: No, no, I don't think you can ever eliminate it, and I'm definitely not advocating censoring misinformation. Hungary just passed a law against disinformation, and that scares me, because of what it means for democracy. This is not about censorship; it's simply about identifying when something has been manipulated.
Jessica Johnson: But there might be tools, though, to more easily understand whether something has been manipulated, which in effect could kind of eliminate it? It sounds appealing; that would be the goal, right?
Prof. Stephan Lewandowsky: Yeah, exactly, that's what I'm talking about. If there's a Photoshopped image up there, it should come with a little warning for people that says, oops, that's been manipulated. I see no reason why that shouldn't be the case, because who wants to be manipulated? It makes no sense to me. Of course, in some contexts it's a lot of fun, like satire; there are images out there that are a lot of fun and are Photoshopped, but you might as well put a label on them that says, hey, this has been Photoshopped, or at least have the metadata embedded so you can tell. I don't see that as interfering with the dissemination of anything, and in my mind it's the same with regulating algorithms: it's not about specific content, it is not about saying this content is better than that. It is about making sure that the attributes of the content that are being played with by an algorithm are in the public interest and not anger-evoking. I think that's a reasonable expectation in a democracy.
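To make the fingerprinting idea in this exchange concrete, here is a minimal sketch of the underlying mechanism, assuming a simple keyed-hash scheme: the publisher records a fingerprint of a media file at release time, and any later edit to the bytes fails verification. Real provenance standards such as C2PA use cryptographically signed manifests and are far more elaborate; the file paths and secret key below are placeholders, not any panelist's actual tooling.

```python
# Minimal sketch of media fingerprinting for tamper detection. A publisher
# computes a keyed fingerprint at release; anyone holding the key and the
# published fingerprint can detect whether the bytes were later altered.
# Real schemes (e.g. C2PA) use public-key signatures so verifiers need no
# shared secret; paths and key here are placeholders.
import hashlib
import hmac

def fingerprint(path: str, key: bytes) -> str:
    """Return a keyed SHA-256 fingerprint of the file's contents."""
    digest = hmac.new(key, digestmod=hashlib.sha256)
    with open(path, "rb") as f:
        # Stream in chunks so large video files don't load into memory at once.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_unmodified(path: str, key: bytes, published_fp: str) -> bool:
    """True if the file still matches the fingerprint recorded at publication."""
    return hmac.compare_digest(fingerprint(path, key), published_fp)

# Usage: the publisher computes and posts the fingerprint once...
#   fp = fingerprint("press_video.mp4", key=b"publisher-secret")
# ...and any edited copy later fails verification:
#   is_unmodified("downloaded_copy.mp4", b"publisher-secret", fp)  # False if altered
```

This illustrates why Lewandowsky frames it as a policy question rather than a research question: the cryptography is routine, and the hard part is mandating that platforms attach and surface such provenance signals.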
Dr. Brian Southwell: You know, just to build on that point: some of what we've seen in the public landscape has been a matter of organizations promoting content because there's money in it, right? There's an incentive to cynically attract audiences by virtue of emotional, evocative content, for example, as we just talked about. But I think we've been overlooking another opportunity, frankly, which is the interest that many people have in credible, verified information as well. I think we've seen, without naming names, a bit of a debacle in that regard, in terms of some of the platforms opting to attempt to monetize their credibility label, and really taking away something that there was a fair amount of audience interest in, a heuristic that many people used. And so I think outlets that get that right, that can regularly and predictably provide verification, there is an audience to be built around that. That's an opportunity even if you just think about it from a market standpoint; there could be people who go further. If you had easy-to-use, not cumbersome, verification tools around content, so that some of what you're seeing is understood to be verified in some direct way, I think you might see there being an audience for that. A large-scale audience.
Jessica Johnson: For example, research, like publications? Is that the kind of thing you mean, directly?
Dr. Brian Southwell: Well, all of the above. I mean, it's thinking about the assurance that a video hasn't been unfairly edited. Certainly we assume that's the case with a peer-reviewed journal and that process; of course, that could on some level be impersonated, but that's ultimately the ideal and the hope. But I think, similarly, all through the last few years we have rediscovered the societal value of capital-J Journalism, right? In a landscape where it's easy to crank out content, there are certain types of content, true content, that are more valuable for certain types of decision-making than others, and I think that's something we discovered during the dark days of the pandemic, for example: there's quite a bit of value in certain organizations getting the story right and having the latest information, and that's only done through the hard work of journalists and others who work in that kind of framework.
Jessica Johnson: Yes, thank you. So, we're getting close to the end, and I just want to check in with everyone quickly. What do you think about the future? Do you think that mis-, dis-, and malinformation is going to become a much larger problem, or will it be reduced by technology? What should people be expecting? Anyone want to chime in?
Prof. Stephan Lewandowsky: Wow. I'm only going to say: never mind the technology. What matters is politics; the future will be determined by choices we make at the political and policy level. The technology can go one way or the other, because it has the potential to make things worse or better, and it's up to us to guide it through policy.
Dr. Brian Southwell: I was just going to point out that, as much as this is a headline-level issue, we have been dealing with disinformation for a very long time, and that's important to keep in mind. It's been possible to manipulate images for a very long time; in fact, if you turn to the 19th century, a lot of what was printed then was problematic in different ways. So this has been a long-standing issue. It's also the case, and this might sound a bit strange to say in a conversation like this, but I think the others would agree, that we have to remember that the vast majority of our information environment is not what we would label misinformation. It is a crucial problem, but it's not somehow everything that's out there, and I don't think we're going to get to that point, because people need information functionally; it has to be useful. And the thing that gives me some hope is that much misinformation does not have what we would think of as predictive value. Ultimately, what people want is information that's going to tell them what's going to happen in a month or a year, and when misinformation turns out not to be true, some of it is going to fall away. So I think we're going to see constant churn, and we're going to have continued problems to deal with, but there are reasons to be somewhat hopeful, too, I think.
Jessica Johnson: Okay, thank you. Dr. Hristova.
Dr. Stefka Hristova: I just wanted to say that, in thinking about the shifting landscape, it will be really important to maintain a web of connections, teams that include policy experts, academics, journalists, and communicators working together to address the shifting landscape, and to insist on regulation and regulatory structures that will allow us to be effective communicators as part of a larger team that includes all these different and shifting pieces. So yes, team building that includes all these different pieces will be a way to reduce mis-, mal-, and disinformation in the future.
Jessica Johnson: Thank you. April?
April Kaull: I would just say that I also am hopeful. I think that this is just the next chapter in the evolution of a lot of what we have dealt with throughout history. There are new tools, there are new platforms, and things move faster; all of those are certainly challenges that perhaps we have not dealt with in exactly this way before, but we figured out how to deal with the challenges that got us to this point, and I think we'll evolve with the evolution as we move forward. So I would just encourage people not to be discouraged and to keep engaging. Don't step away from engagement; don't step away from building those relationships and trying to share information to further knowledge, because if we do that, that just creates a wider vacuum for other people to fill. So stay in the game, stay active, and don't be discouraged.
Jessica Johnson: Thank you so much. I think it's been a great conversation, and I really appreciate everyone's participation. That's all the time we have for today, so I'd like to thank all of our panelists, Dr. Hristova, April Kaull, Professor Lewandowsky, and Dr. Southwell. Thank you so much.