AI Use Grows at Hospitals

Device: UCLA Health is using robots in surgery.

No longer just characters in your favorite science fiction thriller, robots and artificial intelligence are already playing an increasingly significant role at local hospitals, and many stakeholders expect their importance within the health care industry to expand dramatically in coming years.

According to a Morgan Stanley report released in August, 94% of the health care companies surveyed said they are now employing artificial intelligence or machine learning in some capacity. That Morgan Stanley report also projected the health care industry’s average estimated budget allocation for AI and ML will increase from 5.7% in 2022 to 10.5% this year.

“Every major academic medical center in the country is busy developing, evaluating and implementing AI for the clinic,” said Jason Moore, who chairs the Department of Computational Biomedicine at Cedars-Sinai. “Every single one.”

From autonomous robots that transport critical medications to AI systems capable of accurately predicting emergency room wait times and the probability of life-threatening infections in cancer patients, Los Angeles-area hospitals are making use of robotics and artificial intelligence in a variety of ways – not only to positively impact patient care but also to improve the efficiency of their businesses.    

“The human mind and brain are amazing, but there are examples in which technology can do things people may not be able to,” said Omkar Kulkarni, the chief transformation and digital officer at Children’s Hospital Los Angeles. “Technology can pick up on little details that just the nature of the human mind may not remember or may have forgotten to incorporate. … So efficiency and effectiveness – I think those are the two real values we’re going to see out of AI and robotics.”

High spending expected

The global health care market is poised to spend $20.9 billion on artificial intelligence this year, according to a MarketsandMarkets study released in January. The same report also forecast spending on artificial intelligence within the global health care space will soar to more than $148 billion by 2029.
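
For context, those two figures imply a compound annual growth rate of roughly 48%; a quick back-of-the-envelope calculation, assuming the $20.9 billion baseline refers to 2024:

```python
# Back-of-the-envelope growth rate implied by the two MarketsandMarkets figures,
# assuming the $20.9 billion baseline refers to 2024 and the $148 billion
# forecast to 2029, i.e. a five-year span.
start, end, years = 20.9, 148.0, 5
cagr = (end / start) ** (1 / years) - 1
print(f"Implied compound annual growth rate: {cagr:.1%}")  # roughly 48% per year
```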

Still, Cedars-Sinai’s Moore cautioned that while there has been terrific excitement about AI’s potential within the health care industry, actual implementation of many of those technologies has been somewhat slow. 

“To demonstrate that an AI is truly effective and safe for health care is a heavy lift,” Moore explained. “Just validating that an AI model works in a patient population takes time and resources and experience, and not all AI models are going to validate and be shown to be clinically useful.”

Moore said there are also companies pitching AI technologies for clinical care within the health care space that just aren’t very well thought out, which has led to further delay.

“But I think the process should be slow because we’re talking about patients’ lives,” Moore said. “You don’t want to put something that could harm patients into clinical practice, and so that’s why everybody’s being so careful. But at the same time, everybody’s ramping up to do what they need to do to move AI into the clinic.”

City of Hope

City of Hope’s Executive Director of Applied AI and Data Science Nasim Eftekhari manages a team of 20 scientists who focus on applying artificial intelligence to the hospital’s business operations, research and real-time clinical care.

One of her team’s accomplishments that Eftekhari is particularly proud of is a real-time clinical decision support tool that uses AI to predict sepsis infections in cancer patients who’ve received bone marrow transplants.

“When these patients get sepsis, which is a very bad infection that finds its way into their bloodstream, they don’t present with normal signs and symptoms of sepsis,” Eftekhari said, explaining that after a bone marrow transplant the patient’s immune system is often very weak or compromised entirely.

“They don’t have a fever because they have no immune system,” she continued. “Oftentimes, this goes unnoticed, and clinicians have a very hard time of saying if someone is at risk of sepsis. Unfortunately, because they don’t have an immune system, when these patients get sepsis, oftentimes they have a very little chance of staying alive. A lot of times when you hear someone had a bone marrow transplant and died, unfortunately, it’s because of sepsis. And sepsis, in general, is the leading cause of death in the hospital in the U.S.”

Eftekhari said City of Hope’s sepsis prediction model uses AI to monitor patients and pick up on subtle changes in their vitals in real time while also examining what she described as “hundreds of different data elements,” including the patient’s health history, details of their disease and medications they’re taking.

If a patient’s data patterns suggest a rising risk of sepsis, Eftekhari said, the AI tool sends an alert notification to doctors and nurses that features a quantified risk figure and typically provides hours of crucial lead time to intervene.
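
City of Hope has not published the model’s internals, but in rough outline such a real-time scoring-and-alerting loop might look like the sketch below; the model call, feature names and threshold are illustrative placeholders, not the hospital’s actual system.

```python
# Minimal sketch of a real-time sepsis-risk alerting loop of the kind described
# above. The model call, feature names and alert threshold are illustrative
# placeholders, not City of Hope's actual system.
from dataclasses import dataclass

@dataclass
class PatientSnapshot:
    patient_id: str
    features: dict  # latest vitals plus history, disease details, medications, etc.

def predict_sepsis_risk(snapshot: PatientSnapshot) -> float:
    """Stand-in for a trained model's probability output (0.0 to 1.0)."""
    # A real system would call something like model.predict_proba(snapshot.features).
    return 0.0

ALERT_THRESHOLD = 0.8  # illustrative cutoff for notifying clinicians

def monitor(snapshots: list[PatientSnapshot]) -> None:
    """Score each patient's latest data and alert when risk crosses the threshold."""
    for snap in snapshots:
        risk = predict_sepsis_risk(snap)
        if risk >= ALERT_THRESHOLD:
            # The alert carries the quantified risk so clinicians can intervene early.
            print(f"ALERT: patient {snap.patient_id} sepsis risk {risk:.0%}")
```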

“We have shown we are decreasing (intensive care unit) admissions because of sepsis, and sepsis mortality is being decreased over time since the model went live,” Eftekhari said. “We’ve shown we have improved patient care, and we have saved lives.”

Antelope Valley Medical Center

A few years ago, Antelope Valley Medical Center’s Director of Pharmacy Services Le Du first heard about the idea of a robot that could autonomously deliver medications throughout a hospital.

“We all thought it was a cool concept, and it was an interesting idea,” Du said about her initial reaction. “But we never thought in a million years that we would have robots here delivering our meds.”

Fast forward to today, and Antelope Valley Medical Center now has two robots that deliver medications from the pharmacy to designated nursing locations throughout the hospital; they also transport patient specimens to the lab for testing.

In use now at hospitals across the country, Moxi is a 4-foot-tall robot designed by Austin-based Diligent Robotics Inc. that Antelope Valley Medical Center leases on a subscription-based model and began using for medication and lab deliveries late last year.

“The biggest benefit of having Moxi is really cutting down the amount of time that our (pharmacy) staff spends on deliveries,” Du explained. “The same is true for our nurses. They’re there to take care of the patient, and frequently they were interrupted and had to come down to the pharmacy to pick things up. Having the Moxi around has been really helpful in cutting down the amount of time needed to go pick up meds and labs.”

Outfitted with a secure storage bin that opens only with a security badge, Moxi operates with a system of sensors and moves about autonomously through a hospital setting; it can even make deliveries on multiple floors.

“Moxi actually has this arm extension that’s sort of like a finger at the end of the arm, and when it reaches the elevator, it actually maneuvers its arm and pushes the elevator button,” Du said. “So that’s how it uses the elevator.”   
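
Diligent Robotics has not detailed Moxi’s software here, but the delivery workflow Du describes can be sketched roughly as follows; the class and method names are hypothetical stand-ins, not the company’s actual interface.

```python
# Hypothetical sketch of the multi-floor delivery workflow Du describes: a
# badge-secured bin, autonomous navigation, and an elevator-button step when
# the destination is on another floor. Class and method names are illustrative,
# not Diligent Robotics' actual interface.
class DeliveryRobot:
    def __init__(self, current_floor: int = 1):
        self.current_floor = current_floor

    def navigate_to(self, location: str) -> None:
        print(f"Navigating to {location} (onboard sensors handle obstacles en route)")

    def use_elevator(self, target_floor: int) -> None:
        self.navigate_to("elevator bank")
        print(f"Pressing the button with the arm extension, riding to floor {target_floor}")
        self.current_floor = target_floor

    def open_secure_bin(self, badge_id: str) -> None:
        print(f"Badge {badge_id} verified; secure storage bin unlocked")

def deliver(robot: DeliveryRobot, destination: str, floor: int, badge_id: str) -> None:
    """One delivery run from the pharmacy to a designated nursing location."""
    if floor != robot.current_floor:
        robot.use_elevator(floor)
    robot.navigate_to(destination)
    robot.open_secure_bin(badge_id)  # staff badge in to retrieve meds or load specimens

deliver(DeliveryRobot(current_floor=1), "5 West nursing station", floor=5, badge_id="RN-1234")
```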

Cedars-Sinai

Cedars-Sinai officials are excited about a free, patient-facing generative AI app called CS Connect that the hospital launched about six months ago.

“The AI behind the app is a chatbot, like ChatGPT,” Moore explained. “What it does is interact with the patient, asks them questions. The patient answers the questions about what symptoms they’re having, what health issue they’re having, and the chatbot understands what the patient’s saying and can then help the patient schedule an appointment here at Cedars-Sinai to see a doctor.”

Moore said the AI technology in the CS Connect app is also helping the hospital’s physicians by analyzing what the patient told them, generating a summary for the doctor and even providing some potential diagnoses. 

“So when the doctor meets with the patient, they’ve already read the summary,” Moore said. “They’ve already read the potential diagnoses, and they’re not starting from ground zero, so they can have a much more informed encounter with the patient and get to the problem much more quickly.”
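
Cedars-Sinai has not detailed CS Connect’s architecture, but the two roles Moore describes (symptom intake that ends in scheduling, and a pre-visit brief for the physician) map onto a structure like the hedged sketch below, where call_llm stands in for whatever generative model the app actually uses.

```python
# Hedged sketch of the two roles Moore describes for CS Connect's AI: a symptom
# intake chat that ends with scheduling, and a pre-visit brief with possible
# diagnoses for the physician. `call_llm` is a placeholder for whatever
# generative model the app actually uses; the prompts are illustrative.
def call_llm(prompt: str) -> str:
    """Stand-in for the underlying chat model."""
    return "..."

def intake_chat(patient_messages: list[str]) -> str:
    """Ask follow-up questions about symptoms, then offer to book an appointment."""
    transcript = "\n".join(patient_messages)
    return call_llm(
        "You are a patient-intake assistant. Clarify the patient's symptoms and "
        "health issue, then help them schedule an appointment with a doctor.\n"
        + transcript
    )

def physician_brief(transcript: str) -> str:
    """Summarize the intake conversation and list potential diagnoses to consider."""
    return call_llm(
        "Summarize this intake conversation for the physician and list potential "
        "diagnoses, so the visit does not start from ground zero.\n" + transcript
    )
```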

A longtime artificial intelligence expert, Moore covers a lot of territory in his own research at Cedars-Sinai, but his team’s work using machine learning to combat AI bias certainly stands out.

“AI models often don’t treat all patients equally,” Moore said. “They don’t treat men and women equally. They don’t treat Black and white patients equally or rich-versus-poor patients equally. The reason for that is there are biases in health care – discrimination biases that then can create biased data.”

As an example, Moore pointed to a recent study that found Black patients wait, on average, 40 minutes longer in the emergency room than white patients.

“If patients who need urgent attention in the emergency room are waiting 40 minutes longer, or more, then their health outcomes are going to be different – they’re going to have worse outcomes,” Moore said. “So if you’re building an AI model to predict patient outcomes after emergency room visits, that model’s going to be biased. … AI doesn’t know any different, right? It’s just working with the data that’s there. But if the underlying data’s biased because there’s a discriminatory practice at the health care level, that’s going to show up in the AI models as a bias.”
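
One standard way research teams surface that kind of problem is a subgroup audit: compare a model’s error rates across patient groups and look for gaps. A minimal sketch, with purely illustrative placeholder data:

```python
# Minimal sketch of a subgroup audit, one standard way to surface the kind of
# bias Moore describes: compare a model's error rates across patient groups.
# The records below are purely illustrative placeholders.
from collections import defaultdict

def subgroup_error_rates(records):
    """records: iterable of (group, predicted_outcome, actual_outcome) tuples."""
    errors, counts = defaultdict(int), defaultdict(int)
    for group, predicted, actual in records:
        counts[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {group: errors[group] / counts[group] for group in counts}

# If upstream care is unequal (for example, longer waits for one group), the
# model's errors typically fall unevenly across groups as well.
example = [("A", 1, 1), ("A", 0, 0), ("B", 1, 0), ("B", 0, 0)]
print(subgroup_error_rates(example))  # {'A': 0.0, 'B': 0.5}
```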

UCLA Health

It’s not uncommon for Dr. Mark Girgis, the director of UCLA Health’s Robotic Surgery program, to encounter patients who are concerned by the idea of being operated on by a robot.

“The thing I tell patients, and what they need to realize, is the robot is not autonomous,” Girgis said. “It is 100% controlled by the surgical provider.”

An oncology specialist, Girgis said he’s conducted “thousands” of robotic surgeries over his career, and he also tries to encourage skeptical patients to think of the robots he works with as tools.

“These tools of the robot just happen to not be straight,” Girgis said. “They happen to have wrists and elbows associated with them. So the functionality of these tools are much greater and are controlled by my hands through a computer to the robot. But they are still controlled by my hands.”

Girgis said robotic surgery was introduced in the early 2000s and has been an integral component of patient care at UCLA Health for more than seven years. About 80% of the surgeries Girgis conducts today are done using a robot, and his work often focuses on oncology procedures involving the pancreas, liver, stomach and esophagus. But robotic surgery at UCLA Health is used for a range of specialties, including thoracic, cardiac and neurologic procedures.

“Cardiac surgeons are doing valve replacements robotically now,” Girgis said. “That’s something that was unimaginable for many years.”

Girgis made it clear that robotic surgery, which makes use of 3-D cameras and much less invasive incision sites, provides a host of positives for patients.

“The benefit is really enhanced recovery,” he said. “There’s less pain, less trauma to the tissues in the abdominal wall, less blood loss and shorter hospitalizations, while not compromising the conduct of the operation.”

Girgis noted that UCLA Health makes use of the dominant robotic surgery platform available now in the U.S. – da Vinci surgical systems made by the Sunnyvale-based tech company Intuitive.

“There are other platforms out there, and eventually those will be introduced,” he said. “I’m confident the cost of these instruments and the cost of the platform are going to significantly go down as more competition is introduced. … Just like everything else: When you encourage competition, it drives price in a favorable way for hospital systems.”

Children’s Hospital Los Angeles

Emergency room visits are never any fun, but they can be especially difficult for children.

“It’s tough to wait when you’ve got your child – or perhaps multiple children who are with you – and you have no idea how long you’re going to wait,” said Omkar Kulkarni, the chief transformation and digital officer at Children’s Hospital Los Angeles. “So what do you do? You walk up to the nurse and say, ‘How long is it going to be?’”

Kulkarni said Children’s Hospital Los Angeles went live with an AI-driven app called MyVisit at the beginning of this year, created in large part to answer that very question.

Making use of AI to analyze current, historical and even contextual data – meaning the tech considers whether a patient came in for an urgent issue or something less severe – the MyVisit app provides families with real-time estimates, according to Kulkarni.

“Not of their entire time in the ED,” he clarified. “But how long you’ll be waiting for the next thing. ‘You’ll be waiting this much time to see a nurse,’ and ‘You’ll be waiting this much time to see a doctor.’ The accuracy of this thing is over 95%, which is amazing.”
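
Children’s Hospital Los Angeles has not published MyVisit’s model, but an estimator built on the three kinds of data Kulkarni lists might be structured roughly like the sketch below; the feature names and the toy heuristic standing in for the trained model are illustrative only.

```python
# Hypothetical sketch of a next-step wait estimator built on the three kinds of
# data Kulkarni lists: current queue state, historical throughput, and visit
# context (acuity). The feature names and the toy heuristic standing in for the
# trained model are illustrative only, not CHLA's actual MyVisit implementation.
from dataclasses import dataclass

@dataclass
class EDVisitContext:
    patients_ahead: int            # current: how many patients are queued now
    median_wait_last_hour: float   # historical: recent throughput, in minutes
    acuity_level: int              # contextual: 1 = most urgent, 5 = least urgent

def estimate_next_step_waits(ctx: EDVisitContext) -> dict:
    """Stand-in for trained models; a real system would call model.predict here."""
    # Purely illustrative heuristic so the sketch produces output.
    base = ctx.median_wait_last_hour * (0.5 + ctx.patients_ahead / 10) * (ctx.acuity_level / 3)
    return {"nurse": round(base * 0.4), "physician": round(base)}

waits = estimate_next_step_waits(EDVisitContext(patients_ahead=6, median_wait_last_hour=35.0, acuity_level=3))
print(f"Estimated wait: {waits['nurse']} min to see a nurse, {waits['physician']} min to see a physician")
```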

Kulkarni noted MyVisit has a number of other features, including insight about why certain labs have been ordered or what certain test results mean, as well as perspective about care providers. The app is available in a range of different languages.

“What’s amazing is that it’s been used incredibly well,” Kulkarni said. “We’re seeing about 70% of the families in the emergency department use this app … with lots of families using it in Spanish and other languages.”
