New systems for text and image generation can support many types of behavior and goals that are only now being considered, said John Behrens, director of technology initiatives for Notre Dame’s College of Arts and Letters.

“While many of us may look at ChatGPT and think, ‘Wow, it can write a 500-word essay or help me be more efficient,’ others may look at it and say, ‘I’m really shy and this can help me learn how to write or discuss topics with my friends in new ways,’” he said.

Although these systems offer many clear benefits to society, we unfortunately do not yet know when they work well enough to be used appropriately, or how people should interact with them, Behrens noted.

“For example, is it appropriate for a young person to interact with a chatbot if the software is so human-sounding that the young person becomes emotionally attached and vulnerable?” he asked. “The fundamental issue is that the technology and its applications are evolving faster than those in the social sciences, the humanities and the arts can keep up.”

The first step in addressing the many emerging concerns is educational, said Behrens, a former vice president of AI development for Pearson.

“Artificial intelligence is a type of software, and the more people treat it that way — rather than as some robotic being — the better off we will be,” he said. “But we need to support education at all levels to get there. The questions society is facing because of AI are not only ethical but involve all the liberal arts: What are the economic impacts? What are the psychological impacts? What questions does this human-like fluency in language raise for issues of philosophy and theology?

“Notre Dame has a unique opportunity to bring to bear the full range of the liberal arts to help society tackle these issues.”