Friday, March 31, 2023
In this new AI age, misinformation can run rampant
On March 15, Newswise hosted an expert panel on how artificial intelligence and chatbots are changing the landscape of journalism and the transfer of knowledge (watch the video and read the transcript here). Panelists included Sercan Ozcan, Associate Professor of Innovation & Technology Management at the University of Portsmouth; Jim Samuel, Associate Professor of Public Informatics at Rutgers University-New Brunswick; and Alan Dennis, Professor of Information Systems at Indiana University. We learned that AI tools can do exciting things to assist us as science writers and communicators. How awesome is it to have a program, given just a few carefully worded prompts, summarize a study you might be struggling to put into coherent words? ChatGPT can help with some of the routine work of a journalist or science communicator: searching for and gathering information, and possibly even putting that information into a first draft.
But with that benefit come significant challenges. The biggest is sussing out the bullshit (bullshit is a technical term, according to panelist Alan Dennis. Honest!). Artificial intelligence in the form of large language models (LLMs) such as ChatGPT gives you information that looks very realistic, as if a real person wrote it. But this is an illusion. Sercan Ozcan refers to the deceptive output of chatbots such as ChatGPT as “hallucinations.”
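For science writers who want to experiment with that kind of routine assistance while keeping the panel's warnings in mind, here is a minimal sketch of asking a chat model for a plain-language summary of a study abstract. It assumes the OpenAI Python library (the pre-1.0 ChatCompletion interface that was current in early 2023), your own API key, and a placeholder abstract; the model name and prompt wording are illustrative only, not a recipe endorsed by the panelists.

# Minimal sketch: asking a chat model to draft a plain-language summary of a study.
# Assumes the OpenAI Python library (pre-1.0 interface) and your own API key.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder; use your own credentials

abstract = """Paste the study abstract you are struggling to summarize here."""

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system",
         "content": "You are an assistant helping a science writer summarize research accurately."},
        {"role": "user",
         "content": f"Summarize this abstract in two plain-language sentences:\n\n{abstract}"},
    ],
    temperature=0.2,  # keep the wording conservative
)

draft = response["choices"][0]["message"]["content"]
print(draft)  # a first draft only; verify every claim against the paper itself

A low temperature keeps the wording conservative, but it does not prevent hallucinations; treat whatever comes back as a starting point to be checked against the original paper, not a finished summary.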
The creation of misinformation is what worries Alan Dennis the most. “Deep fakes (artificial videos of real people) and other tools like it [are] going to change everything, particularly for journalism because we’ve created digital puppets of several different celebrities and I can make them say anything that I want them to say.”
Deep fakes are one thing, but what about the dangers of media outlets relying on AI to generate news content? Panelist Jim Samuel says that “we need to treat AIs as some kind of very smart, but inexperienced and probably not very, not comprehensively knowledgeable teenager.” The output that AI produces requires supervision. Samuel says that we [as educators, media, and science communicators] have a responsibility to educate the public so that people can develop internal mechanisms for dealing with misinformation.
On April 26, 2023 at 2 pm EDT, Newswise will host an expert panel on fake news and how it affects media relations. Reporters are tasked with combating misinformation and disinformation, but they can’t do it alone. We would like to involve our community of members and researchers in a discussion about how we, in communications, can find ways to support them. Both communicators and researchers are invited to participate in this panel. Join us by registering.