Researchers at the University of Georgia have developed a wearable AI engine that can help visually impaired people navigate the world around them.
Housed in a backpack, the system uses a camera concealed in a vest jacket to detect traffic signs, crosswalks, curbs and other common obstacles.
Users receive audio directions and advisories through a Bluetooth-enabled earphone, while a battery in a fanny pack provides roughly eight hours of power.
Intel, which supplied the processing hardware for the prototype, says the system is superior to other high-tech visual-assistance tools, which ‘lack the depth perception necessary to facilitate independent navigation.’
Jagadish Mahendran, an AI developer at the University of Georgia’s Institute for Artificial Intelligence, was inspired to create the system by a visually impaired friend.
‘I was struck by the irony that, while I have been teaching robots to see, there are many people who cannot see and need help,’ he said.