A dexterous robot arm that can automatically feed people forkfuls of food has been developed by researchers in the US.
Experts studied how real people use forks to feed each other in order to teach the robot the best way to go about its task.
The arm automatically adjusts both the force it uses and the angle at which it spears items to best pick up and deliver mouthfuls of food – regardless of size or texture.
‘Being dependent on a caregiver to feed every bite, every day, takes away a person’s sense of independence,’ said roboticist Siddhartha Srinivasa.
‘Our goal with this project is to give people a bit more control over their lives.’
To address this, Professor Srinivasa and his colleagues at the University of Washington set out to develop a self-guided feeding system which can feed its user whatever they want to eat, whenever they want it.
The solution they have designed, the Assistive Dexterous Arm (ADA), is a nimble automaton which – along with its controlling computer – can be attached to a user’s wheelchair.
ADA is guided in part by its on-arm camera and tactile sensor.
When activated, the robot simulates real human motions in order to identify, pick up and deliver bite-sized food items using its fork, adapting its technique depending on the morsel in question.
‘When we started the project we realised there are so many ways that people can eat a piece of food depending on its size, shape or consistency. How do we start?’, said paper co-author Tapomayukh Bhattacharjee.
To answer that question, the team set up an experiment to study how people eat various common foods.
Volunteers were observed using a special fork – which contained sensors to measure how much force was being applied to it – to pick up and feed different pieces of food to a mannequin.
The snacks the researchers served up in the tests were picked to have a range of consistencies, covering everything from hard carrots to soft banana slices as well as produce with tough skins but soft insides like grapes and tomatoes.
‘There’s a universe of types of food out there, so our biggest challenge is to develop strategies that can deal with all of them,’ Dr Srinivasa said.
People use different fork techniques to pick up different types of food, the researchers noted.
For items like carrots and grapes, for example, they used wiggling motions to increase the force applied and successfully spear each mouthful.
Softer foods, in contrast, required skewering at an angle, the researchers noticed, to ensure the items didn’t slip back onto the plate.
The researchers also observed that the actions of picking up a piece of food on a fork and then feeding it to someone are closely connected.
For example, volunteers would often spear pieces of food in a particular place, or from a particular angle, so that it ended up orientated in a way that was easy to eat.
‘You can pick up a carrot stick by skewering it in the centre of the stick, but it will be difficult for a person to eat,’ said Dr Bhattacharjee.
‘On the other hand, if you pick it up on one of the ends, and then tilt the carrot toward someone’s mouth, it’s easier [for them] to take a bite.’
Next, the researchers applied what they had learnt to teach ADA to do the feeding itself.
Comparing various approaches, they found that the robot needed to adjust its fork technique just like humans – varying the angle and force used – in order to be able to pick up different types of food.
To deliver an adaptable feeding strategy, the researchers combined two different algorithms within the ADA’s controlling computer, which they designed to be mounted under the seat of the user’s wheelchair.
An object-detection algorithm – dubbed ‘RetinaNet’ – is used first to scan the diner’s plate and identify the types of food which have been served.
A second piece of software, ‘SPNet’, then analyses each piece of food to determine the best way for the bot to pick it up on its fork.
For example, the algorithm might guide ADA to skewer a soft piece of banana in the middle, but from an angle so it stays on the fork.
In contrast, faced with a long stick of carrot, SPNet could instead tell the robot to spear the vegetable at one of its two ends.
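The two-stage pipeline described above – detect each item on the plate, then pick a skewering strategy suited to it – can be illustrated with a minimal sketch. The class names, strategy rules, and data structures below are purely hypothetical stand-ins for illustration; the real system uses trained neural networks (RetinaNet and SPNet), not hand-written rules:

```python
# Hypothetical sketch of ADA's two-stage feeding pipeline:
# stage 1 detects food items, stage 2 chooses a skewering strategy
# per item. All names and rules here are illustrative stand-ins for
# the trained RetinaNet and SPNet models described in the article.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str           # food class reported by the detector
    bbox: tuple          # (x, y, width, height) in image coordinates

@dataclass
class SkewerPlan:
    position: str        # where on the item to spear it
    angle_deg: float     # fork tilt away from vertical
    force: str           # qualitative force level

def plan_skewer(item: Detection) -> SkewerPlan:
    """Toy stand-in for SPNet: map a food class to a fork strategy."""
    if item.label == "carrot":
        # Long, hard items: spear near one end with extra force,
        # so the piece can be tilted toward the mouth for a bite.
        return SkewerPlan(position="end", angle_deg=0.0, force="high")
    if item.label == "banana":
        # Soft items: skewer the centre, but at an angle so the
        # piece doesn't slip back onto the plate.
        return SkewerPlan(position="center", angle_deg=30.0, force="low")
    # Default: vertical skewer at the centre with moderate force.
    return SkewerPlan(position="center", angle_deg=0.0, force="medium")

def feed_plan(detections):
    """Chain stage 1 (detections) into stage 2 (per-item plans)."""
    return [(d.label, plan_skewer(d)) for d in detections]

# Example plate: output of a (pretend) detection pass.
plate = [Detection("carrot", (10, 10, 60, 12)),
         Detection("banana", (80, 40, 30, 30))]
for label, plan in feed_plan(plate):
    print(label, plan.position, plan.angle_deg, plan.force)
```

The design point the sketch captures is the separation of concerns: the detector only answers “what is on the plate?”, while the strategy model answers “how should this particular item be picked up?” – which is why the two can be developed and swapped independently.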
Once ADA has picked up the food, facial tracking helps it guide the morsel into the user’s mouth at the best angle for taking a bite.
When the robot is no longer needed – either when ADA detects an empty plate, or is told that the user has finished eating – it can stow its fork on the side of the wheelchair and fold away.
The researchers have now teamed up with the Taskar Center for Accessible Technology to improve the robot based on feedback from both patients residing in assisted living facilities, and their carers.
‘Ultimately, our goal is for our robot to help people have their lunch or dinner on their own,’ Srinivasa said.
He added, however, that the point is not to replace caregivers but to help empower them.
‘With a robot to help, the caregiver can set up the plate, and then do something else while the person eats.’
Full details of the robot are described in a pair of papers, one published in the journal IEEE Robotics and Automation Letters and a second article that was presented at the recent ACM/IEEE International Conference on Human-Robot Interaction in South Korea.