Even before rural hospitals began scaling back services and shutting down, getting consistent medical care was challenging for people living in remote areas. A new program to improve access conjures Knight Rider crossed with Northern Exposure: a future in which rural health care professionals extend their knowledge on the fly with help from the vehicle they travel in.
The Advanced Research Projects Agency for Health, or ARPA-H, is investing in the development of such a mobile clinic, with the University of Michigan leading one of two large teams designing and building its AI component.
“We want to bring the hospital to the house, or to the church parking lot—whether that’s in Michigan’s Upper Peninsula or in the middle of Indiana—where the nearest medical center that performs the care the patient needs might be two hours away,” said Jason Corso, director of the AI project and U-M professor of robotics and electrical engineering and computer science.
A mobile clinic would reduce the cost of care by requiring fewer permanent buildings in its service area and enabling physician assistants and nurses to perform more advanced procedures with coaching from the AI agent, Corso said. The AI agent would also be able to learn the needs of the clinic operators and patients, so manually customizing the software would not be necessary.
Funded with up to $25 million, the U-M-led AI team brings together an extensive group of hospital specialists, rural practitioners and engineers, organized into three subteams. Together, they represent eight universities and the research and development company RTX BBN Technologies. The AI agent will be one piece of ARPA-H’s five-part program to prototype the equipment needed for this mobile medical clinic.
Other parts of the program aim to link up data sources within the clinic, worn by the patient and held in the patient’s health record; develop a miniaturized CT scanner for mobile 3D imaging; and build a prototype mobile clinic. The team will eventually test the AI agent in the mobile clinic, which is expected in the program’s third year. Until then, they will use a stationary clinic, equipped like the proposed van, to assess how well the agent meets the needs of patients and clinicians.
The clinic AI builds on earlier work led by Corso designing AI agents that provide intelligent guidance across different scenarios. Cooking is an excellent test bed, he said, because it involves a set of raw materials and instruments, a sequence of events and skillful techniques—and it’s reasonably low-stakes. His team then extended the strategies used to upskill a home cook, applying them to guide soldiers through lifesaving battlefield medicine. For the current project, the team anticipates the AI agent working alongside a family doctor or nurse practitioner, for instance—someone with a lot of foundational knowledge but without the training or experience of specialists.
The team breaks the work down into many pieces. The technical team, strong in computer science, will build models capable of representing medical tasks, what’s happening in the van and with the patient, and how the patient and the generalist are doing. The goal is an AI agent that can not only observe the generalist’s actions and walk them through unfamiliar tasks, but also recognize when something unexpected has happened and adjust accordingly.
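To make that idea concrete, here is a minimal sketch in Python of the kind of step-tracking loop such an agent might run. It assumes a separate perception system that maps what the cameras see to a step index; the `Procedure` class, the step text and the prompts are invented for illustration and are not part of the team’s actual design.

```python
from dataclasses import dataclass, field

@dataclass
class Procedure:
    name: str
    steps: list[str]                      # ordered step descriptions
    completed: set[int] = field(default_factory=set)

def next_prompt(proc: Procedure, observed_step: int | None) -> str:
    """Return guidance text given the step index the perception system
    believes the clinician just performed (None = nothing recognized)."""
    remaining = set(range(len(proc.steps))) - proc.completed
    expected = min(remaining, default=None)
    if expected is None:
        return f"{proc.name}: all steps complete."
    if observed_step is None:
        return f"Next: {proc.steps[expected]}"
    if observed_step == expected:
        proc.completed.add(observed_step)
        nxt = min(set(range(len(proc.steps))) - proc.completed, default=None)
        follow = proc.steps[nxt] if nxt is not None else "procedure complete"
        return f"Confirmed: {proc.steps[observed_step]}. Next: {follow}."
    # Something unexpected happened -> surface it instead of continuing silently
    return (f"Noticed '{proc.steps[observed_step]}' out of order; "
            f"expected '{proc.steps[expected]}'. Please confirm before continuing.")

# Toy usage -- the steps are illustrative, not clinical guidance
iv_line = Procedure("Start IV line", ["Clean the site", "Insert catheter", "Secure and flush"])
print(next_prompt(iv_line, observed_step=0))
print(next_prompt(iv_line, observed_step=2))   # out-of-order action gets flagged
```

The point of the sketch is the last branch: rather than assuming the plan is being followed, the agent compares what it observes against what it expects and asks for confirmation when the two disagree.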
Part of that capability would be recognizing the emotional state of the humans, such as the generalist becoming stressed if the patient’s condition worsens dramatically. Collaborators in nursing will bring expertise in reading people and calibrating the assistance they provide. With their input, the AI agent may learn to alter the way that it delivers information in tense situations.
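As a rough illustration of what altering delivery in tense situations could look like, the sketch below varies how much supporting detail accompanies an instruction based on an estimated stress score. The score, thresholds and phrasing are assumptions made for this example only; how the project would actually estimate or respond to stress is not described here.

```python
def deliver(instruction: str, details: list[str], stress_score: float) -> str:
    """Compress guidance under high estimated stress; elaborate when calm.

    stress_score: 0.0 (calm) to 1.0 (highly stressed), e.g. inferred from
    voice and video cues -- how it is computed is outside this sketch.
    """
    if stress_score > 0.7:
        # Terse, imperative phrasing only
        return instruction
    if stress_score > 0.4:
        return f"{instruction} ({details[0]})" if details else instruction
    # Calm: include the full rationale and supporting detail
    return instruction + " " + " ".join(details)

print(deliver("Apply direct pressure to the wound.",
              ["Use sterile gauze if available.", "Reassess bleeding after two minutes."],
              stress_score=0.85))
```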
The medical team and systems integration team will gather the data set used to train the models that will power the AI. That task includes assessing biases in the data that could lead to inaccurate diagnoses and treatments. The medical team will also provide rich guidance on how to perform medical tasks in areas such as cardiac and trauma care.
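One common form such a bias assessment can take is a subgroup audit: comparing error rates across patient groups in a labeled dataset. The sketch below illustrates that idea with pandas; the column names, toy records and disparity threshold are assumptions for illustration, not details of the project’s actual data or methods.

```python
import pandas as pd

def subgroup_error_rates(df: pd.DataFrame, group_col: str,
                         label_col: str = "diagnosis",
                         pred_col: str = "predicted") -> pd.Series:
    """Fraction of incorrect predictions within each subgroup."""
    errors = df[label_col] != df[pred_col]
    return errors.groupby(df[group_col]).mean().sort_values(ascending=False)

def flag_disparities(rates: pd.Series, tolerance: float = 0.2) -> list[str]:
    """Flag subgroups whose error rate exceeds the overall mean by `tolerance`."""
    baseline = rates.mean()
    return [group for group, rate in rates.items() if rate > baseline + tolerance]

# Toy records -- a real audit would use held-out clinical data
records = pd.DataFrame({
    "rurality": ["urban", "urban", "rural", "rural", "rural"],
    "diagnosis": ["MI", "none", "MI", "MI", "none"],
    "predicted": ["MI", "none", "none", "MI", "MI"],
})
rates = subgroup_error_rates(records, "rurality")
print(rates)
print("Flagged:", flag_disparities(rates))
```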
The systems integration team and technical team will build the prototype AI agent, called VIGIL for Vectors of Intelligent Guidance in Long-Reach Rural Healthcare. Then the systems integration and medical teams will iterate on it through testing in clinical settings, addressing pain points identified by medical professionals and patients.
The technical team includes:
- U-M: Corso (PI; physical AI), Anhong Guo (interactive systems), Andrew Owens (multimodal AI), Mike Oelke (project management)
- Colorado State University: Nathaniel Blanchard (computer vision), Nikhil Krishnaswamy (AI language), Sarath Sreedharan (AI physical systems), Bruce Draper (computer vision)
- Stevens Institute of Technology: Enrique Dunn (3D environment modeling)
- Northeastern University: Ehsan Elhamifar (task and recovery monitoring)
- University of Pennsylvania: Alison Marie Pouch (medical image analysis)
- Purdue University: Jeffrey Siskind (AI physical systems)
- University of Rochester: Chenliang Xu (multimodal AI)
The medical team includes:
- U-M: Prashant Mahajan (emergency medicine), Kevin Ward (emergency medicine), Milisa Manojlovich (nursing), Donald Likosky (outcomes assessment), Francis Pagani (cardiac surgery)
- University of Pennsylvania: Pouch (medical image analysis), Emily Mackay (cardiac anesthesia)
- Central Michigan University: Sethu Reddy (endocrinology), Alison Arnold (rural medicine), Steve Vance (rural medicine)
The systems team includes:
- RTX BBN: Brian van Voorst (lead scientist and research fellow), Muntaha Samad (principal engineer of research)
Guo is an assistant professor of computer science and engineering; Owens is an assistant professor of electrical and computer engineering; Ward and Mahajan are professors of emergency medicine; Manojlovich is a professor of systems, populations and leadership; Likosky and Pagani are professors of cardiac surgery.