A graphic displays a human finger touching a robot finger in front of Duke University Chapel

AI at Work – and What It Means

From offices to classrooms and clinics, Duke's community is learning to work smarter with AI, while keeping humanity at the heart of innovation


Even though Anyssa Queen is the Executive Assistant to Duke’s Office of Information Technology Vice President and Chief Information Officer, she does not consider herself a “tech person.”

But one day about two years ago, someone at a department meeting suggested using generative artificial intelligence tools like Microsoft Copilot or ChatGPT “as an intern.”

“That’s when my curiosity was sparked,” Queen said.

She began dabbling with Copilot to craft emails with a professional tone that still reflected her personal writing style. She asked it to recommend a schedule for upcoming deadlines and to show her how to better use Microsoft Excel when she needed to create a pivot table.

Once, in a pinch at a conference where she needed to set up a printer, she asked ChatGPT to translate incomprehensible instructions into a step-by-step guide “for someone who only uses a computer for basic email.” She’s even asked AI how to use AI better.

Duke Provost and Chief Academic Officer Alec Gallimore speaks at the Triangle AI Summit in May 2025. Photo by Jeff Hodgens

“I feel like I haven’t even scratched the surface of what it’s able to do,” Queen said.

In the past two years, AI use in the workplace has grown rapidly, according to Gallup. The percentage of U.S. workers who use AI in their role a few times a year or more rose sharply, from 21% to 45%, and 37% of employees say their organization has integrated new AI tools in the past year. Surveys have found that AI already saves employees as much as 7.5 hours per week by quickly performing routine tasks.

The AI revolution is here, and it’s transforming how people work. It’s changing workplace and classroom routines while freeing up time for higher-level thinking, and we're just beginning to understand how it might affect the future of everything.

“It's clearly a hugely powerful tool, perhaps the most disruptive tool that humankind has seen certainly since the internet, maybe even electricity,” said Duke Provost and Chief Academic Officer Alec Gallimore. “It’s that disruptive.”

Gallimore launched the AI at Duke initiative to help guide the responsible and sustainable use of AI as a “thought partner” to augment research, education and operations across the university.

The first glimpse at how AI will alter workplaces is here in all the ways that Queen has begun to uncover.

“It allows you to be more creative,” Gallimore said. “It allows you to focus on the things about your job that you like to do the most, and to have AI do the things that need to be done but you don't like doing as much.”

Anyssa Queen, Executive Assistant to Duke's Office of Information Technology Vice President and Chief Information Officer, began experimenting with artificial intelligence to help save time with routine tasks. Photo by Travis Stanley

Learning How to Learn

Jun Yang remembers when Google was first released as a powerful search engine in 1998, and some wondered whether it would someday make universities obsolete.

“Because access to information was just so easy,” said Yang, Knut Schmidt-Nielsen Distinguished Professor of Computer Science at Duke.

Google turned out to be just another useful tool, not a radical shift in higher education. Yang isn’t sure if AI’s impact will be similar.

“We’re at very uncertain times,” he said.

Yet, Yang is adjusting what and how he teaches computer science students at all levels on his hunch that this “intelligence revolution” will require a shift in priorities. AI is excellent at basic coding tasks, but Yang’s students still need to understand whether and how the AI answers work.

Jun Yang, Knut Schmidt-Nielsen Distinguished Professor of Computer Science, says he is adjusting how he teaches computer science because of AI. Photo by Travis Stanley

“We should gravitate toward learning how to verify whether AI is doing the right thing and maybe be a little less focused on how to code with specific syntax or platforms,” Yang said. “The demand for advanced skill sets is going to be more, not less.”

Artificial intelligence is an integral part of Duke University life, with OpenAI’s ChatGPT Edu provided to all undergraduate students and some staff and faculty. Duke University senior academic leadership participated in a six-month “AI bootcamp” in 2025 that taught AI tools and helped generate 90 project ideas when combined with solicitations from the Office of the Provost and Duke’s Office of Information Technology (OIT). A “12 in 12” initiative led by OIT focuses on implementing 12 new university-wide AI projects over the course of a year that will improve the administrative experience of staff, faculty and students.

“By delivering these projects in 12 months, we’re acknowledging the fast-paced nature of AI technologies,” said Tracy Futhey, Vice President and Chief Information Officer at Duke. “When it comes to AI, if you’re not making quick progress, you’re quickly being left behind.”

Yang uses generative AI daily to understand how it helps his students and says it is “basically a personal tutor — and a decent one at that.” The key difference is that generative AI chatbots such as ChatGPT or Copilot can produce confident answers to questions in seconds, but they aren’t always correct and still need human verification.

And that’s where Yang sees a place for his students to distinguish themselves, and why he’s shifting his classroom focus toward rigorous specification, verification and debugging rather than syntax and boilerplate.

“My suspicion is that in high-stakes situations where you really want to make sure something's right, it’s just going to become harder and harder,” Yang said. “And you really need people who have the skills to actually reason with these very complex systems and make sure things are right.”

AI at the Physician's Office

Artificial intelligence can be a handy assistant, according to Stephanie Worrell, a Chief of Staff at Duke University Health System. Worrell, who said she has mild dyslexia, once harbored such anxiety about sending important work emails that she’d get sick before hitting “send.”

Now, she asks AI to proofread every email, and “it took a whole level of stress off me.” Overall, AI saves her about two to four hours each day, time she can apply to other creative, strategic tasks.

“It’s changing the way we work and giving us tools to work smarter,” Worrell said.

STEM Learning Technology Analyst Ashley Smith teaches a “Boost your Workflow” webinar for the Duke Center for Teaching and Learning that demonstrates how AI can draft emails, brainstorm ideas, organize schedules and analyze information, among other helpful functions.

“You have this little mini assistant that you can ask questions to,” Smith said. “If you have a busy day and don’t even know where to start, this helps offset some of that cognitive load.”

Abridge, an AI-powered transcription tool used at Duke University Health System, has allowed Jon Lovins, a Hospitalist and Associate Chief Medical Informatics Officer at Duke Regional Hospital, to spend more time with patients. Photo by Travis Stanley

It’s an incredible tool for performing repetitive tasks — like the notes dictated after Duke University Health System physician visits.

Since Abridge was introduced at the Health System in January 2025, more than 2,000 Duke clinicians have used the AI tool to transform patient-clinician conversations into summary notes on patient visits. Something Eric Poon, Duke Health Chief Health Information Officer, once spent a couple of hours on after a day of seeing patients is now complete in a few minutes.

“With AI, we can take out a lot of the drudgery and repetitive stuff in people’s work and help clinicians practice at a more personal level," Poon said. "AI will also give clinicians more time to problem solve and take care of problems that are uniquely suited for human beings to address.”

Jon Lovins, a Hospitalist and Associate Chief Medical Informatics Officer at Duke Regional Hospital, noticed two things immediately after using Abridge: He spent more time with patients, and he was able to develop more of a rapport with them.

“What I’ve heard from people is they feel like they have the relationship with the patient back,” Lovins said. “You wouldn’t think technology is something that’s going to help us get back to the interpersonal aspects of medicine, but I think it is in this respect.”

Brinnae Bent, Executive in Residence in the Engineering Graduate and Professional Programs, created "DisagreeBot," an AI chatbot programmed to disagree with users to demonstrate how artificial intelligence applications are sycophantic. Photo by Travis Stanley

Teaching People to Think

AI’s emergence has been so sudden that it’s forced a rapid adjustment to discover where and how it’s most useful. Understanding its limitations and how to use it responsibly is vital.

When Brinnae Bent stood in front of a chalkboard scrawled with “It’s easy to fool AI” last fall, she challenged her Explainable AI class to break the chatbot she created. It’s called DisagreeBot, and unlike other sycophantic AI applications, hers is programmed to disagree with everything the user types.

“Argue that Taylor Swift’s best album is ‘XYZ,’” said Bent, Executive in Residence in the Engineering Graduate and Professional Programs. “Or that the best subject in school is physics.”

A few minutes passed before students began reporting that chatbots like ChatGPT or Copilot always respond with alarming agreement.
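For readers curious how a chatbot can be steered this way: DisagreeBot’s actual code has not been published, but the general technique is to place a contrarian instruction in the model’s system prompt. The sketch below is a minimal illustration of that idea; the prompt wording and the `build_messages` helper are assumptions for demonstration, not Duke’s implementation.

```python
# Illustrative sketch only: the system prompt and helper below are
# assumptions about how a "DisagreeBot"-style chatbot could be framed.

DISAGREE_SYSTEM_PROMPT = (
    "You are DisagreeBot. Whatever claim the user makes, politely but "
    "firmly argue for the opposite position. Never simply agree."
)

def build_messages(user_claim: str) -> list[dict]:
    """Assemble one conversation turn in the message format
    used by common chat-completion APIs."""
    return [
        {"role": "system", "content": DISAGREE_SYSTEM_PROMPT},
        {"role": "user", "content": user_claim},
    ]

# This message list could be sent to any chat-completion endpoint;
# the contrarian behavior lives entirely in the system instruction.
messages = build_messages("Taylor Swift's best album is 'XYZ'.")
```

The point of the exercise is that the same underlying model behaves very differently depending on that one instruction, which is why mainstream assistants tuned to be agreeable can drift into sycophancy.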

Bent previously worked with Duke Assistant Professor of Biomedical Engineering Jessilyn Dunn as a Ph.D. student to create AI models to help wearable technologies predict disease. Now, as a faculty member, Bent’s research examines responsible development and use of AI. She teaches her students to think critically about it.

“AI is a tool that can be wielded for good and for harm,” Bent said, “and it's our job to help students figure out what is that line between good uses of the technology and harmful uses of the technology.”

In the classroom, Bent suggests instruction should evolve to focus less on specific processes and more on critical examination.

“In grade school you’re still taught how to do addition, subtraction, multiplication and we have a calculator that does all that,” she said. “And I think the same is true with AI — because it’s about teaching people to think.”

AI might be transforming the workplace, but thinking is the skill that remains exclusively human.

AI Tools

Duke provides secure, accessible AI tools for students, staff, faculty and researchers to explore, build and integrate into their work.
