AI’s Role In Health, Unanswered Ethical Questions Take Center Stage At Triangle AI Summit

(The Chronicle, Max Tendler) — Hundreds poured into the Washington Duke Inn Friday for the Triangle Artificial Intelligence Summit, Duke’s second annual forum focused on engagement with AI technology in the region.

Hosted by Provost Alec Gallimore, the symposium took place a week after he announced a new initiative that aims to increase conversation about the technology and make the University a leader in the field.

“We’re not just reacting to the evolving field of artificial intelligence,” Gallimore said in an introductory speech. “We are actively shaping its future.”

The summit — organized by Duke Learning Innovation and Lifetime Education, Duke Libraries, Duke Community Affairs and the School of Nursing — was structured around four “pillars” drawn from the initiative: trustworthy and responsible AI, advancing discovery with AI, life with AI and sustainability in AI.

Attendees began with an overview from New York Times reporter Cade Metz on the history and functions of AI, then heard from leaders in the AI space in a series of panels framed around the four pillars. The event also featured a showcase of AI projects from across the Triangle and the perspectives of undergraduate participants in Duke’s Code+ summer program.

New AI initiatives at Duke

Tracy Futhey, vice president of information technology and chief information officer, discussed a number of the University’s plans to invest in AI in support of its educational mission — including constructing a data center with an energy demand large enough to power “the entire town of Carrboro.”

She said her office is working with Facilities Management to assess the feasibility of the project, focusing on “how we build a next-generation data center that will be energy efficient [and] … sustainable for the environment.” She noted that Duke is investigating liquid cooling technology as a cost-effective way to reduce the facility’s climate impact, as data centers are known for their high energy consumption.

The new center is projected to be online in 18 to 24 months.

Futhey listed several AI programs that will be available to community members in the upcoming academic year, including ChatGPT-4o and new software called DukeGPT.

She also promoted AI training partnerships between Duke and other universities in the state, noting that “there is more that needs to be done than any one institution can do.”

“Our goal here is to have North Carolina be the number one place for AI research and education,” Futhey said.

AI’s uncertain role in health

Moderator Nicoleta Economou-Zavlanos, assistant professor of biostatistics and bioinformatics and director of Duke Health AI evaluation and governance, began her panel with a seemingly simple question: “What does trustworthy and responsible AI mean to you?”

“I’ll just be honest with you all; I have no clue,” said Jun Yang, Bishop-MacDermott family professor of computer science. “And I don’t think I’m alone.”

He was not. As the panel turned to the use of AI in health systems, Robert Califf, former commissioner of food and drugs, Trinity ’73 and School of Medicine ’78, expressed concern that AI tools tested in one health care setting often cannot be deployed in another, a problem he said comes down to barriers to sufficiently validating those tools.

“I don’t know of a single health system in the country that can actually do what needs to be done to validate AI,” he said.

Califf also said that hospitals often integrate AI for financial reasons, making “patient well-being … a minor part of the equation.”

He pointed to the Department of Health and Human Services’ recent Make America Healthy Again report, which has been accused of being produced with AI that generated false information, as an example of a failing national approach to integrating AI into health systems responsibly.

Steve Kearney, medical director at the Cary-based international tech company SAS Institute, said AI’s development has to “move at the speed of trust” but maintained that responsibility for deciding how software products should be integrated into health care lies with practitioners.