Why I Want To Build Personal Health Care Companions Like Baymax

Last updated 12/27/14


This is a more complete version of a commentary published on the CNN Opinion pages.

Is Disney's Big Hero 6 a children's movie about an endearing marshmallow-like inflatable robot, or an astute prediction of the future that should be carefully analyzed by engineers, investors, and policy-makers? I will try to sort out accurate futurology from wishful thinking, since to some extent, Big Hero 6 is my fault.

In 2011, one of the co-directors of Big Hero 6, Don Hall, visited my lab at Carnegie Mellon University and saw our work on inflatable arms. We were exploring this technology with the idea that "soft safe robots" would someday feed, dress, and groom our parents, and then ourselves, when we got old. From this visit, Don created his vision of Baymax, an inflatable robot designed to be a personal health care companion (a radically different vision from the giant shape-shifting Baymax robot in the original Marvel Comics version of Big Hero 6).

"You are my patient. I am responsible for your care."

In the Disney movie, Baymax scans for illness and injury, comforts the hurt, and gently nags teenagers to be safe and take better care of themselves. Baymax can speak and understand speech, can see and display information on his skin, and can feel and generate different tactile impressions, such as changing skin temperature. The movie does not address taste and smell, but as a healthcare robot it is important that Baymax can smell (analyze chemicals in the air), and as an inflatable robot that can exhaust air, it would be easy for him to spray scents as an olfactory display.

The movie showed how a personal health care companion robot could interact with humans, and this vision will drive interface design for a long time. Users can improve the robot and program new behaviors, as well as customize its shape and mechanics. Baymax can use his arms and tools (like Scotch tape) to maintain and repair himself, can be compactly and automatically stored in a suitcase, and comes out when needed.

"My brother built Baymax to help people."

At some level roboticists hate movies like this, because the movies create expectations that are currently impossible to meet. Star Trek's Data, C-3PO from Star Wars, and the Terminators are not yet close to being built. I want to convince you that in the case of personal health care companions, some parts of Baymax are already available and others are just around the corner.

I am motivated by personal experience, by the clear and pressing need to help caregivers, many of whom are spouses or family members who are themselves older adults, and by technological opportunity. Technology might not only relieve the load on caregivers and enable older adults to live in their own homes longer, but could also provide new opportunities for frequent screening for diseases such as cancer and dementia, and for monitoring drug timing, efficacy, side effects, and interactions. With the coming retirement of the Baby Boomers, who had fewer children, combined with increased family mobility, society will face the challenge of more and more older adults wanting to remain independent as they age, with fewer and fewer family caregivers to help them do so. Technology can and must play a key role in meeting this challenge.

"Your neurotransmitter levels are rising steadily - the treatment is working."

My father was ultimately diagnosed with colon cancer because an annual blood test indicated an iron deficiency. Unfortunately, annual screening is not frequent enough. My brother died from prostate cancer in his fifties. I want to know why we can't screen monthly for cancer, or even daily to find the right dosage for drugs such as antidepressants, where balancing therapeutic effect against undesirable side effects is often difficult.

Caregiving for a family member is a tremendous load, and if technology could just allow a caregiver to take a few hours off with peace of mind, that would make a tremendous difference. Enabling older adults to live in their own homes longer would make them happier and the economy stronger. I see technology available now to meet these challenges.

Current smartphones combined with cloud computing can deliver useful sensing, diagnosis, baby-sitting for adults, and cognitive assistance. We are close to making useful robot servants, using traditional metal robotics, that can help people with disabilities and older adults gain more control over their lives and more dignity, and live more independently or independently for longer.

One challenge that needs additional basic research is how to make robot caregivers that can touch and physically interact with people who need physical care, ranging from helping them move from bed to chair to changing their diapers. Currently many professional caregivers are forced to quit due to the physical toll of moving patients. A personal health care companion can take on some of this load.

Any robot in the home needs to be fail-safe, avoiding injury to pets, overly friendly and malicious small children, and the people it is caring for. I believe the only way to achieve the level of safety required is to make new types of robots that are extremely lightweight, and one way to achieve that goal is inflatable robots. Lightweight inflatable devices are strong enough and tough enough to lift cars and houses, and protect NASA probes landing on Mars.

"You are my patient. Your health is my only concern."

Perhaps most importantly, a personal health care companion can provide a single 24/7 point of continuity for a user, in a world where patients are often confronted with a bewildering array of specialists and caregivers who are constantly changing. In some sense, Baymax is an intelligent, natural, and portable interface to one's electronic health records.

"I will scan you now."

Baymax shows an impressive ability (for a robot) to see what is going on, and can scan a person to detect and diagnose illness or injury. What about robots and other devices in real life? Focusing on what is available now: MRI and CT scanners are still too large and expensive to be part of a personal health care companion. Ultrasound scanners, which require physical contact with the patient, are now the size of cell phones and could be built into a robot hand or a portable scan tool.

Specialized handheld devices can detect specific injuries, such as infrared absorption scanners that can detect brain hematomas (Infrascanner 2000). It may be that real-life Baymaxes need to carry a set of scanners and choose a diagnostic test based on a preliminary assessment, as real-life doctors do.

A real-life Baymax will also have color vision and be able to create "depth" images that accurately map the shape of objects. Inexpensive thermal imaging is becoming available, so abnormal skin temperature, an indicator of poor blood flow, impaired circadian rhythms, and other disorders, will be detectable; thermal imaging will also greatly simplify robot perception of humans and what they are doing.

Making up for the missing sci-fi scanner, it turns out people will wear a wide range of devices that provide health information and therapy. Fitbit, Jawbone, FuelBand, and similar wearable monitoring devices are surprisingly popular. Available physiological and behavioral measures include heart rate and cardiac electrical signals, body and skin temperature, steps taken, stairs climbed, distance traveled, estimated calories burned, sleep patterns and quality, eating habits, and more general activity patterns over the day. Wearable cardiac event recorders and stimulators, and handheld heart monitors such as the Heart Check PEN, are also available.

Interestingly, some of these devices go beyond reporting measurements and deviations from norms or targets: they nag or "nudge" their users about water consumption, taking breaks, and getting enough sleep. Changing behavior is one of the more difficult challenges for personal health care companions, and it is encouraging that there may be effective ways to do it without the device ending up discarded. It may be that people accept nagging from machines better than they accept it from other people.
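To make this concrete, here is a minimal sketch of the kind of rule-based nudging a wearable or companion robot could perform: compare today's measurements against the user's own baseline and turn deviations into gentle reminders rather than raw numbers. The field names, thresholds, and messages below are invented for illustration, not taken from any real device.

```python
# Illustrative sketch only: a threshold-based "nudge" rule of the kind a
# wearable or a companion robot might run. All field names, thresholds,
# and messages are invented for this example, not taken from any real device.

def nudge_messages(today, baseline):
    """Compare today's measurements against a personal baseline and
    return gentle reminders instead of raw numbers."""
    messages = []
    if today["water_ml"] < 0.8 * baseline["water_ml"]:
        messages.append("You have been drinking less water than usual today.")
    if today["sleep_hours"] < baseline["sleep_hours"] - 1.0:
        messages.append("You slept less than usual last night. An earlier bedtime might help.")
    if today["steps"] < 0.5 * baseline["steps"]:
        messages.append("You have been less active than usual. How about a short walk?")
    return messages

# Example: a baseline learned over previous weeks, and one day's measurements.
baseline = {"water_ml": 2000, "sleep_hours": 7.5, "steps": 8000}
today = {"water_ml": 1200, "sleep_hours": 6.0, "steps": 3000}

for message in nudge_messages(today, baseline):
    print(message)
```

Comparing against a personal baseline rather than a population norm matters here, because what counts as "less than usual" differs from person to person.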

"Blood type is AB negative, cholesterol levels are low ... body temperature is low, blood pressure is elevated - you appear to be distressed."

Future wearable devices may be supplemented by swallowed or implanted devices which have access to body fluids, possibly in dental fillings and crowns (access to saliva and breath), and internally (access to blood). Astronauts have taken pills that measure and radio out their core temperature (a concern during space walks). There are camera pills that observe the intestinal tract. Implantable devices are being developed that can perform blood tests.

"Your emotional state has improved."

Measurements can also be taken by the environment. The Aware Home, a house we built at Georgia Tech, includes permanently installed cameras and other sensors to be aware of inhabitants' activities and to predict their needs and intent.

"Don't scan me!"

An issue with measurements by a personal health care companion, a house, or wearable or implanted devices is privacy. Older adults often hide their accidents from caregivers and family, and reject monitoring technology if they think it will be used to spy on them, for good reason: changed behavior and accidents can lead to forced loss of independence, increased constraints, and potentially a move to assisted living or a nursing home "for their own good." Many accidents occur in bathrooms and bedrooms, sensitive areas for monitoring.

We need to establish the principle that the user owns the data about themselves and controls its use, not a doctor or other caregiver, hospital or nursing home, insurer, manufacturer, or service provider. We got this wrong with credit cards, phones, and car engines, and now our financial, communication, location, and driving information is sold and used in ways we don't know about and wouldn't necessarily agree to. One way to protect data against government subpoena is to periodically delete it, and only retain summary information. Transmitting data to or storing data in another computer or the cloud makes it vulnerable to hacking and surveillance, so designers will need to think carefully about how to build systems that can be trusted with intimate data.
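As a rough sketch of the "retain only summaries" idea, the following illustrates how a device might keep compact daily statistics locally and discard raw measurements after a short retention window. The data structures, the one-week window, and the heart-rate example are assumptions made for this illustration, not a description of any existing system.

```python
# Minimal sketch of a "summarize then delete" retention policy, assuming
# raw measurements are kept only briefly on the device and only compact
# summary statistics survive. All details here are invented for illustration.

from datetime import datetime, timedelta
from statistics import mean

RETENTION = timedelta(days=7)  # keep raw samples for one week, then discard them

def summarize_and_prune(raw_samples, now):
    """raw_samples: list of (timestamp, heart_rate) tuples stored on the device.
    Returns (summary, kept): a compact summary for today plus only the raw
    samples that are still within the retention window."""
    kept = [(t, hr) for (t, hr) in raw_samples if now - t <= RETENTION]
    todays = [hr for (t, hr) in kept if t.date() == now.date()]
    summary = {
        "date": now.date().isoformat(),
        "samples": len(todays),
        "mean_heart_rate": round(mean(todays), 1) if todays else None,
        "max_heart_rate": max(todays) if todays else None,
    }
    return summary, kept  # older raw samples are simply dropped

# Example: synthetic heart-rate samples taken every 5 hours over 200 hours.
now = datetime(2014, 12, 27, 22, 0)
raw = [(now - timedelta(hours=h), 60 + h % 30) for h in range(0, 200, 5)]
summary, kept = summarize_and_prune(raw, now)
print(summary)
print(len(raw), "raw samples ->", len(kept), "retained")
```

In a design like this, only the summary would ever leave the device, and only if the user chooses to share it; the raw samples never need to.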

Who interprets the data? It is important to realize that older adults sometimes view others (including doctors and family members) as adversaries, intent on taking away the car keys or institutionalizing them. Here, automated agents can help interpret the data, advise the user in simple cases, help the user decide what they are willing to share, and indicate when outside help is absolutely necessary. Computer and car-engine monitoring software can already do this. If older adults do not feel they own the device and the data, many will not use it.

Interestingly, my experience is that few people object to sensors taking measurements if they are mounted on a robot, because it is "obvious" the robot needs to see, hear, and feel.

Other issues include how we establish "standards of care" and handle malpractice claims against automated monitoring systems. Who gets sued if a medical condition is not detected, is misdiagnosed, or is not handled correctly? What does the FDA have to approve, and what can be used without government approval?

"Does it hurt when I touch it?"

I am also motivated by my experience with my grandparents. My grandfather had ALS, and my grandmother was not able to help him up when he slid out of his chair or otherwise ended up on the floor. She would call my family, and I would drive over and be her robot. She provided the brains, and I provided the muscle. I want to be replaced by an actual robot.

We and others have made tremendous progress in programming robots to do useful errands with user supervision. This progress can be seen in the work on Personal Robotics at CMU and in the DARPA Robotics Challenge. Building a robot caregiver that touches, dresses, feeds, grooms, cleans, lifts, steadies, or otherwise physically interacts with a user is much harder. Baymax is an example of a lightweight inflatable robot that is fail-safe and can safely physically interact with people. Our website, build-baymax.org, describes our efforts to build real-life soft and safe personal health care companions.

"If I only had a brain."

The biggest challenge in building Baymax is building a brain capable of useful human-robot interaction. Apple's Siri, Amazon's Echo, and similar question-answering agents demonstrate recent progress in this area of artificial intelligence, and could be the basis of a real-life Baymax as well. Quality human-robot interaction matters: my other grandmother, who had become blind, was uninterested in early reading machines because the voices were not gentle or soothing.

We can also take advantage of the patterns of our lives, which robots can already learn, as well as the close human-robot symbiosis that comes from pairing a human with a personal device or robot. The human learns to help the robot as much as or more than the robot learns to help the human. Robots can also learn from other robots (after the data is anonymized).

I see an immediate future in which our phones or pocket computers communicate with sensors in our shoes, clothes, wristbands, jewelry, and possibly our teeth and the insides of our bodies to track and improve our personal health on a minute-by-minute basis. In the further future I see physical assistance by safe robots. Such personal health care companions will enable a large step forward in preventive medicine and lower health care costs by addressing issues early, when they are less expensive and less painful to treat. The way to begin inventing this future is to build a real-life Baymax.


More info: build-baymax.org