In the medical field, "pajama time" refers to the time physicians spend updating patient charts after hours.
Artificial intelligence is helping to reduce that unpaid, overtime work — which can lead to burnout — by taking notes and creating charts during appointments.
Ambient AI operates quietly in the background during appointments, freeing physicians to interact with patients. It often takes the form of scribe technology that handles notetaking and patient charting.
“We provide an ambient listening technology between the patient and provider,” said Michael Draugelis, Geisinger’s associate vice president of artificial intelligence.
“The provider can have more focused time with the patient and less time looking at the computer and typing away, as well as capturing information that the physician otherwise would have to spend what they call pajama time updating,” Draugelis said.
The Wright Center is in the trial phase of implementing Suki, an ambient AI technology, across its locations. So far, feedback from physicians involved in the trial has been positive.
Dr. Jignesh Sheth, The Wright Center’s senior vice president and chief medical and information officer, sees it as a huge time-saver.
“One of the biggest pieces of that documentation burden comes from charting,” he said. “Charting time [that] can take anywhere from 15 to 20 minutes is [now] down to like five. You're saving 75% on documentation. Then times by how many patients you see, 10 patients a day, and you're saving 15 minutes of patient time, 150 minutes. So that's like almost two-and-a-half hours saved on charting time.”
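Sheth's back-of-the-envelope math can be checked with a quick calculation. All figures below come from his quote, not from independent data; note his 75% figure holds at the 20-minute end of his range (from 15 minutes, the savings is closer to 67%).

```python
# Sanity check of the charting time savings Dr. Sheth describes.
# Every number here is from his quote; none are independently measured.

old_minutes_per_chart = 20   # upper end of the 15-20 minute range he cites
new_minutes_per_chart = 5    # charting time with the AI scribe
patients_per_day = 10

saved_per_chart = old_minutes_per_chart - new_minutes_per_chart   # 15 minutes
percent_saved = saved_per_chart / old_minutes_per_chart * 100     # 75%
saved_per_day = saved_per_chart * patients_per_day                # 150 minutes

print(f"{percent_saved:.0f}% less charting time per note")
print(f"{saved_per_day} minutes ({saved_per_day / 60:.1f} hours) saved per day")
```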
Support, not replacement
FOBO, or fear of becoming obsolete, is a common hesitation about AI use. Healthcare providers emphasize that the use of AI in their facilities is not about replacing the work that doctors do but enhancing it.
“AI systems [are] a component to the solution,” Draugelis said. “It's being integrated into a health system with providers making decisions that have interventions that have a certain efficacy. They're not always 100% going to solve the problem. We just believe it's the best thing to do.”
Suki CEO Punit Soni said his company’s technology has high accuracy rates.
“We use a metric called suggestion acceptance rate to understand the percent of suggestions that are accepted, without changes, by the clinicians — our suggestion acceptance rate is between 93%-94%,” Soni wrote in a statement to WVIA. “We have a high bar for acceptance, so notes that are changed in any way by the user, including formatting, are counted as not accepted.”
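The metric Soni describes is simple to define. A minimal sketch of how such a rate could be computed (the function and variable names are ours for illustration, not Suki's actual implementation):

```python
# Hypothetical illustration of a "suggestion acceptance rate" as Soni
# describes it: a note counts as accepted only if the clinician signs it
# off with NO edits at all -- even a formatting tweak marks it as rejected.

def suggestion_acceptance_rate(drafts, finals):
    """Fraction of AI-drafted notes the clinician accepted verbatim."""
    accepted = sum(
        1 for draft, final in zip(drafts, finals)
        if draft == final  # any change, including whitespace, fails this check
    )
    return accepted / len(drafts)

# Toy example: the clinician lightly reformatted one of three drafts,
# so only two count as accepted.
drafts = ["BP 120/80.", "No acute distress.", "Follow up in 2 weeks."]
finals = ["BP 120/80.", "No acute distress.", "Follow-up in 2 weeks."]
print(f"{suggestion_acceptance_rate(drafts, finals):.0%}")
```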
Although Suki’s technology has proven reliable so far, The Wright Center requires users to check the AI’s work before submitting it.
“You have to fact check it every single time,” Sheth said. “That's one of the requirements at The Wright Center, that the clinicians have to 100% sign [off] on every section of the note that was created, not just the whole note.”
Dr. Devang Gor, Lehigh Valley Health Network’s chair of radiology-diagnostic medical imaging and diagnostic radiology for neuroradiology, said physicians are using AI incorrectly if they rely on it to make diagnoses.
“We know that we are not exclusively depending on the technology to make a diagnosis or to make a finding,” Gor said. "We are using the technology to aid us or to direct our attention on something which could possibly be wrong, so that we can take care of patients sooner.”
Improved patient satisfaction, outcomes
Sheth said patients are happy to have more face-to-face time with their doctors since The Wright Center started using AI.
“We're not busy with our head dug into the computer,” Sheth said. “We are instead facing the patient. We're talking to the patient, the encounters actually are quicker.”
To make sure the AI correctly picks up physician-patient conversations, doctors at The Wright Center speak more deliberately, which also makes instructions clearer to patients.
“When we talk to our patients, we repeat the things more than once, which helps in two ways, because now you're having the patient listen to the instruction twice,” Sheth said.
Patients also can benefit from early detection software that allows them to be treated quickly.
At LVHN, AI helps physicians detect strokes early. While doctors train to know the signs of stroke, the technology can help them detect those signs sooner.
“We started using automated detection of large vessel occlusion in the brain using a company called Razor AI, and that became like the centerpoint of our stroke care,” Gor said. “We use that for early detection and automatic detection, but also for communication facilitation, which helps actually provide faster care to our patients in stroke.”
Geisinger uses proactive care management to flag patients who are more susceptible to certain diseases, so they can get tested early for conditions they may be predisposed to.
“We can look for using these more advanced machine learning, traditional machine learning methods to kind of find the needle in the haystack, the patient that may be at higher risk for [say] breast cancer,” Draugelis said. “We want to really reach out to them to make sure they're coming in for their treatments.”
Assisting in radiology, other uses
Gor thinks using AI in radiology — which uses medical imaging to diagnose and treat patients — just makes sense.
He has helped LVHN’s radiology department implement AI that assists with their findings. One example is technology that detects asymptomatic pulmonary embolisms, which can be life-threatening.
“We have an algorithm which detects pulmonary embolism in patients who are asymptomatic,” Gor said. “When we get this alert, we notify the provider, and it actually helps us treat that pulmonary embolism on an outpatient basis, rather than getting the patient admitted. It's sort of treating their disease before the patient develops symptoms.”
He’s seen physicians in his department use it to reinforce their own work, while also prioritizing which patients get care first.
“I have most of my department actually using it for double checking their work and prioritizing patient care, taking care of the patients who are sicker first,” Gor said.
Geisinger's facilities use risk detection to get care to rapidly deteriorating patients quickly.
“Imagine you're a patient in the health system, and you have monitoring occurring,” Draugelis said. “We can turn on algorithms that can fuse hundreds of different variables and can find patients that are deteriorating and may need to be seen by what we call a rapid response team and potentially transfer [them] to the ICU. We found that these algorithms can help close that gap of time and get patients the care they need that would give them a chance at a better outcome.”
But beyond ambient and early detection AI, hospitals are using AI in various other ways throughout their operations.
The Wright Center uses AI for meeting and email summarization.
And Geisinger uses AI to improve staff scheduling based on potential demand.
“We can use forecasting algorithms to understand what the burden in the emergency department might be, so we can make sure we're staffing it correctly,” Draugelis said.
Criticisms and cautions
Another common concern with AI is inaccuracy. Sheth gave an example of how Suki can fabricate context when it misses words.
“When the microphone wasn't facing the patient and it did not capture a certain part of the conversation, it will fill in the gap and just make it up,” he said. “It will not leave a sentence half baked. It will complete the sentence whether it makes sense or not. To a person who is not in the room, that sentence will make perfect sense, but someone who's in there will be like, ‘That's not what I was talking about.’”
That’s why he suggests doctors using the technology state their words precisely and repeat them if needed.
“It's a matter of just getting used to being more precise, being more specific, being more clear when you're talking to the patient,” Sheth said.
As an AI officer at Geisinger, Draugelis sees the lack of regulation and uniformity as challenges to the industry.
“How do we create a system or process that we can kind of agree is safe, fair, effective and transparent?” he asked. “So I think a challenge in our industry right now is each health system is kind of defining that for themselves.”
It can be difficult to find AI technology that complies with HIPAA (the Health Insurance Portability and Accountability Act) and ensures the protection of patient information.
“At The Wright Center, we value and respect patient privacy more than anything else, and we have taken numerous steps to safeguard that,” Sheth said. “I would recommend anyone trying to use any AI products should definitely go through the same exercise, even though it's time consuming.”
Despite the potential problems that come with the new territory, LVHN wants to stay ahead of the curve on the latest AI trends.
“We are in the preface phase of the story of AI,” Gor said. “The book is going to be written very, very fast. We [LVHN] want to be writing the book.”