
What does it mean to be human in an age of advanced technology?
Exploring ethical implications of AI in government, business, the military and academia.

By Andrew Clark
While the topic of artificial intelligence (AI) has been around for decades, its recent and continuing advances now seem to be happening at warp speed. From health care to manufacturing to the financial sector, the use of AI technology is gaining momentum across industries.
In higher education, AI is both a tool to be embraced and a challenge not to be ignored. Since 1989, when Salve Regina University’s Ph.D. program in humanities was launched, doctoral students and faculty have been exploring a complicated question: “What does it mean to be human in an age of advanced technology?” Recently renamed the Doctor of Humanities and Technology, the program has stayed ahead of the curve by examining the advancement of technology and its ethical implications in government, the military, academia and business.
“The name change makes apparent the area of research that our program has been concentrating on since its inception,” said Dr. Troy Catterson, associate professor of philosophy and graduate program director for the Ph.D. “It has always been about the humanities in dialogue with the sciences and technology. We have changed the name so that those who are interested in pursuing the highest levels of such a dialogue will now know that it is being pursued here.”
Two recent publications from Dr. Sean O’Callaghan, associate professor of religious and theological studies and former doctoral program director, and Ph.D. candidate Russell Suereth showcased Salve’s commitment to exploring the ever-evolving field of AI.
Co-authored by O’Callaghan and published in March by Baker Academic Press, “AI Shepherds and Electric Sheep: Leading and Teaching in the Age of Artificial Intelligence” examines the role of AI in politics, economics, medicine, law, the military and many other areas. According to O’Callaghan, he and his co-author felt there was a lack of books available to inform spiritual leaders about the nature of AI and to give them the tools to discuss it with their congregations or students.
“AI has a huge role in decision-making,” said O’Callaghan. “It often augments human decision-making. For the first time in human history, we have a non-human entity either making decisions for us or with us. So, it is all pervasive and we need to take notice of it, or we will sleepwalk into a world we don't understand and haven't had a role in constructing.”
For O’Callaghan, AI is an attractive research subject. He has always firmly believed that faith needs to be relevant to the age in which it exists; if it is not, he says, it has nothing to offer and no answers to give. O’Callaghan began with that premise and then explored how faith and AI could enter into dialogue with each other.
“I looked at different definitions and, while you have to start with the technical definitions, the ones proposed by technologists and scientists, I quickly realized that AI technology starts in the research laboratories, but then it shapes the world we inhabit,” he said.
In December 2024, Suereth published “Caring in AI: Considering the LIDA model” in World Scientific Research, a peer-reviewed journal that covers a range of technology topics. His article examines whether caring can be designed into an artificial intelligence system. The research considers a specific AI design known as the Learning Intelligent Decision Agent (LIDA) model and describes both the cognitive cycle and the global workspace within it.
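For readers curious about those terms, a rough illustration may help. The toy sketch below, written in Python, is not LIDA itself and is not drawn from Suereth’s article; the class names, the violates_care check and the sample data are hypothetical. It only illustrates the two ideas named above, a global workspace in which the most salient percept wins a competition for attention, and a cognitive cycle that perceives, attends and then selects an action, here with a simple caring safeguard filtering the candidate actions.

    # Illustrative toy sketch (not the LIDA codebase): one simplified "cognitive
    # cycle" in the spirit of a global-workspace architecture, with a caring
    # safeguard that filters candidate actions before one is selected.
    from dataclasses import dataclass

    @dataclass
    class Percept:
        content: str
        salience: float  # how strongly this percept competes for attention

    @dataclass
    class GlobalWorkspace:
        broadcast: Percept | None = None

        def compete(self, percepts: list[Percept]) -> None:
            # The most salient percept wins and is "broadcast" to the rest of
            # the system, loosely following global workspace theory.
            self.broadcast = max(percepts, key=lambda p: p.salience, default=None)

    def violates_care(action: str) -> bool:
        # Hypothetical safeguard: reject any action flagged as potentially harmful.
        return action.startswith("harmful_")

    def cognitive_cycle(percepts: list[Percept],
                        candidate_actions: dict[str, list[str]]) -> str:
        """One simplified perceive -> attend -> select-action cycle."""
        workspace = GlobalWorkspace()
        workspace.compete(percepts)                   # attention phase
        if workspace.broadcast is None:
            return "wait"
        options = candidate_actions.get(workspace.broadcast.content, ["wait"])
        safe_options = [a for a in options if not violates_care(a)]  # caring filter
        return safe_options[0] if safe_options else "ask_human"

    # Example: a care-setting scenario in which the safeguard rules out the
    # harmful option and the agent alerts a nurse instead.
    percepts = [Percept("patient_calls_for_help", 0.9), Percept("routine_reading", 0.2)]
    actions = {"patient_calls_for_help": ["harmful_ignore", "alert_nurse"]}
    print(cognitive_cycle(percepts, actions))  # prints: alert_nurse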
“For the readers, one takeaway is that human caring can be complex,” said Suereth. “Yet there are simple and limited forms of caring that we already perform today in hospital and health care situations. Another takeaway is that these simple forms of caring can be designed as a safeguard in AI systems to help ensure that they won’t harm people.”
“From my personal viewpoint, I feel that the article highlighted a couple of avenues of caring,” Suereth continued. “First, it highlighted that both AI systems and human individuals must have some degree of caring to function properly in our world. Second, it highlighted that I care deeply about this matter of caring in AI systems and about caring in general in our human relations.”
Suereth, who has a background as a software developer and an interest in the humanities, is fascinated by AI. He sees the field as a combination of software engineering and human connection.
“It's hard for me to stay away from AI,” he said. “I find that it's quite intriguing. For me, caring is a vital aspect of our everyday human lives, and it should also be for any model or machine that has human characteristics.”
Salve’s humanities and technology doctoral program challenges students to consider the essential link between two seemingly disparate disciplines. “In my view, the question of the relationship between technology and what it means to be human is more pressing than ever before,” said Catterson.
“The objective is to evaluate what it means to be human in an age of advanced technology, to demonstrate how culture and the liberal arts blend with and inform science and technological innovation—and to pursue that in a scholarly, academic environment,” said O’Callaghan.
Featured image by Getty Images/Blue Planet Studio