Texas physical therapy professor is using AI to teach students how to handle difficult patients
AUSTIN (KXAN) -- When you think of artificial intelligence, you probably think of Siri and self-driving cars. But one Texas professor is bringing it into her classes to help physical therapy students practice patient interaction and communication skills.
Dr. Kaelee Brockway, PT, DPT, EdD, is a professor at the College of Rehabilitative Sciences at the University of St. Augustine for Health Sciences. Her specialties are cardiovascular and pulmonary chronic disease management and geriatrics.
She explained that she's a board-certified geriatric specialist whose primary patient population consists of individuals with advanced heart and lung diseases, but her full-time job is teaching at the university.
Brockway has spent about a year developing AI roleplay chatbots to "give students kind of a lens into what they might run into out in the world," she said. The bots are used to teach students communication skills and expose them to real-life challenges, such as managing chronic pain with empathy.
"Sometimes we have to create environments where [students] get a chance to practice dealing with more difficult situations, so that they can practice that now when they're with us, and learn better techniques to do that, as opposed to running into that for the first time out in practice with a real person," Brockway explained.
Brockway has developed four AI chatbots for one of her courses, and she's in the process of creating four or five more. The bots are based on real experiences from Brockway's own clinical practice. Each chatbot emulates one of those scenarios and is programmed to guide students toward a specific goal.
"So, this is a very specialized style of communication," Brockway said.
Brockway explained two of the chatbots she uses.
One is "EMT Bart," a biased emergency responder who dismisses a patient's fall, writing it off as an overdose. The scenario was based on an experience Brockway had while working with a complex patient, she said.
"I had seen him for a few visits and knew him fairly well at that point. And when I got to this visit, he was not himself, and he reported to me that he had had a fall and hit his head, and I was seeing that something was very wrong with him," Brockway said. "So I called for emergency services and was told by the EMT that, 'Oh, he's just overdosing on his pain medication. He'll be fine,' and it had to fall onto me as the leading provider in that scenario that has a clinical doctorate degree to practice my autonomy by insisting that my patient get the care that they needed."
Brockway said that interaction places students in the position of the leading provider, who must persuade the EMT to take the patient to the hospital for further medical care.
Another one is an interaction with "Mr. George," an AI patient experiencing intense pain to the point that he refuses to get out of bed. The students have to play out a scenario through messaging back and forth with Mr. George, where the goal is to get him to consent to treatment. Brockway said the students are being taught to recognize his discomfort and offer appropriate pain management and treatment options while maintaining empathetic communication.
Brockway demoed the interaction with Mr. George, showing how the scenario plays out if it "goes well" and if it "goes poorly."
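Brockway's bots run inside her university's learning management platform, so their exact configuration isn't public. As a rough illustration only, a scenario-driven roleplay bot of this kind is often built by pairing a persona prompt with a goal the student must reach; the sketch below is an assumption about how such a setup could look, not Brockway's actual system. The "Mr. George" name and goal come from the article; every field name and function is hypothetical.

```python
# Hypothetical sketch of a scenario-driven roleplay chatbot configuration.
# Only the "Mr. George" scenario details come from the article; the
# structure and code are illustrative assumptions, not Brockway's system.

def build_system_prompt(scenario: dict) -> str:
    """Assemble the in-character instructions an LLM would receive."""
    return (
        f"You are {scenario['persona']}. {scenario['backstory']} "
        f"Stay in character at all times. "
        f"The student's goal is to {scenario['student_goal']}. "
        f"Only soften your stance if the student {scenario['success_condition']}."
    )

mr_george = {
    "persona": "Mr. George, a patient in severe pain",
    "backstory": "You are refusing to get out of bed because moving hurts.",
    "student_goal": "obtain your consent to treatment",
    "success_condition": (
        "acknowledges your discomfort and offers appropriate "
        "pain-management and treatment options"
    ),
}

prompt = build_system_prompt(mr_george)
print(prompt)
```

The success condition encodes what Brockway described, that students must recognize the patient's discomfort and respond empathetically before the scenario "goes well."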
The interactions students have with the bots are saved once the conversation is complete, then submitted for Brockway to review. She said the AI system is a feature built into the university's learning management software and runs on its own large language model, meaning it's a "closed loop": the information it's fed stays where it's put.
"We can put things into this, and they stay where we put them," Brockway said. "They don't go into a larger language model that then influences external communication, which is nice, because this is interacting with student information, this is graded material... Under FERPA regulations, it keeps it secure and it keeps it safe, and the students can also feel very secure knowing that this will never be used externally."
Brockway uses the chatbots in lower-level courses for students who don't yet have field practice, so they can get exposure to this specific type of communication and practice those skills before needing them in real life. The exercises fall in the second term of a seven-term doctoral program, she said.
"Why would you choose AI to do this when you could have a human do this and practice actual human interaction? We are working with the generation of students who did their undergraduate education during COVID -- they are not used to jumping into interpersonal communication in a face-to-face way without having practiced some of that prior," Brockway said.
"So, it helps them, in a very safe space, engage with these very difficult situations in a way that it's very closed loop. They don't have to worry about an actual human who might get angry at them, or something like that," Brockway said. "They can practice in a way that will also garner them feedback."
A simple way of understanding Brockway's AI chatbot "patients" is to think of them like non-player characters in video games. The students are the protagonists, and they have to interact with the NPCs in order to advance the plot.
Later on in their education, the students practice the scenarios again with real people playing the parts, instead of the chatbots.
"It's kind of a bridge to actual patient interaction, but because they're so early in their education, we're just not quite there yet. So we do similar scenarios later on in their education with standardized patients -- that's a human actor playing that role. But this is kind of like a preparatory activity for that," Brockway said.