The open-source platform RCareWorld provides a realistic simulation of home caregiving scenarios by combining accurate avatars of people with motor disabilities, homes with different levels of accessibility modifications, and caregiving robots. The simulator lets users design new robotic caregiving scenarios and program existing robots to perform caregiving tasks, without requiring expensive robots or human volunteers.
Ruolin Ye, a doctoral student in the field of computer science, announced the platform in her presentation, “RCareWorld: A Human-centric Simulation World for Caregiving Robots,” at the IEEE/RSJ International Conference on Intelligent Robots and Systems, in October.
The paper received the Best RoboCup Paper Award from the RoboCup Federation, which hosts an annual competition of autonomous service robots in a home caregiving environment.
“There are a lot of barriers to entry to this field,” said Tapomayukh Bhattacharjee, assistant professor of computer science in the Cornell Ann S. Bowers College of Computing and Information Science, who led the project. Bhattacharjee cites the challenges of connecting with people with mobility limitations and other clinical stakeholders, the need for an institutional review board to approve studies involving close contact with humans, and the cost of the robots.
“You need continuous feedback from the stakeholders – the care-recipients who would potentially use this technology, the caregivers and health care professionals – to know whether the technology we’re developing is going to translate from the lab to real homes one day,” he said.
Bhattacharjee runs the EmPRISE Lab, one of a handful of labs designing robots that assist with physical caregiving, in which the robot touches the person, such as for feeding, bathing or dressing. The lab also works on solutions for social caregiving: verbal support such as medicine reminders or instructions for exercise.
An estimated 190 million people worldwide have conditions that impair their ability to move and function; assistive robotics has the potential to give these individuals more independence while reducing the burden on caregivers. But currently, no caregiving robots are widely available for home use. Through RCareWorld, Bhattacharjee’s team hopes to provide the basic tools needed to design and program these robots.
The simulator is the first of its kind in that it aims to realistically simulate caregiving scenarios through six true-to-life human avatars that move and behave like people with motor impairments, such as different levels of spinal cord injury, a brainstem stroke or cerebral palsy. Each avatar has a specific range of motion and muscle strength based on clinical data collected from individuals with motor impairments. In future versions, the researchers plan to expand the number of disabilities represented.
RCareWorld also includes robots commonly used for research in home environments, and researchers can import their own robot models. In the simulator, roboticists can test navigation and manipulation algorithms and access data from multimodal sensors on the robots. They can also use the virtual reality interface to enter the simulator and control both the robots and the human avatars.
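For readers curious about what programming against the simulator might look like, the sketch below shows a hypothetical Python session: loading a home and an avatar with a particular impairment profile, importing a custom robot model, and running a simple control loop that reads sensor data and applies a user-supplied algorithm. The module name rcareworld and every class and method shown (CareEnv, load_robot, get_observation, step) are illustrative assumptions rather than the platform's documented API.

    # Hypothetical sketch only: the module "rcareworld" and the names CareEnv,
    # load_robot, get_observation and step are illustrative assumptions, not
    # the platform's documented interface.
    from rcareworld import CareEnv

    def my_policy(obs):
        """Placeholder for a user-supplied navigation/manipulation algorithm."""
        return [0.0] * 7  # e.g., hold a 7-joint arm still

    # A home environment paired with an avatar whose range of motion and
    # strength reflect a specific motor-impairment profile.
    env = CareEnv(home="apartment_accessible", avatar="spinal_cord_injury_c4")

    # Import a custom robot model instead of one of the built-in robots.
    robot = env.load_robot("my_mobile_manipulator.urdf")

    # Control loop: read multimodal sensor data (e.g., RGB-D images, force
    # readings), compute an action, and advance the simulation.
    for _ in range(1000):
        obs = robot.get_observation()
        env.step(my_policy(obs))

In practice, the same loop could drive either a built-in robot or an imported model, and the virtual reality interface could take the place of my_policy when a human operator controls the robot or avatar directly.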
Robots and avatars can interact in 16 different homes with more than 200 rooms. The homes have three levels of modifications for accessibility, ranging from no modifications to homes that are completely barrier-free for people with disabilities, with assistive devices such as stair lifts, hospital beds and patient lift slings.
“We are giving people a variety of tools that are necessary to come up with behaviors for physical caregiving or social caregiving scenarios,” Bhattacharjee said.
Using algorithms learned in the simulator, one robot successfully performed a real-world sponge-bathing task; another coached volunteers, wearing a virtual reality headset and gloves, through an exercise routine.
The first version of RCareWorld will be freely available for use starting in early 2023; Bhattacharjee said he has already received interest from colleagues.
Co-authors on the paper include Rajat Kumar Jenamani, a doctoral student in the field of computer science, and Vy Nguyen ’23, along with Wenqiang Xu, Haoyuan Fu and Cewu Lu of Shanghai Jiao Tong University, and Katherine Dimitropoulou of Columbia University.