Scientists are creating exoskeleton legs capable of thinking and making control decisions on their own, using onboard cameras and sophisticated AI technology.
The framework combines deep-learning AI and computer vision to mimic how able-bodied people walk: by seeing their surroundings and adjusting their movements accordingly.
“We are giving robotic exoskeletons vision so they can control themselves,” said Brokoslaw Laschowski, a Ph.D. candidate in systems design engineering who leads a University of Waterloo research project called ExoNet.
Motor-powered exoskeleton legs already exist, but users must control them manually through smartphone applications or joysticks.
“That can be inconvenient and cognitively demanding,” said Laschowski.
“Every time you want to perform a new locomotor activity, you have to stop, take out your smartphone and select the desired mode.”
To address that limitation, the scientists fitted exoskeleton users with wearable cameras and are now optimizing AI software to process the video feed and accurately recognize stairs, doors, and other features of the surrounding environment.
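The article does not describe ExoNet's actual model, but environment recognition of this kind is typically a frame-by-frame image classification step. Below is a minimal sketch of that step in Python; the class labels, the confidence threshold, and the stubbed-out classifier are illustrative assumptions, not the project's real code.

```python
# Hypothetical environment-recognition loop. A real system would run a
# trained convolutional network on each camera frame; here the network
# is stubbed so the surrounding pipeline can be shown end to end.

TERRAIN_CLASSES = ["level-ground", "stairs-up", "stairs-down", "door", "obstacle"]

def classify_frame(frame):
    """Stand-in for a deep-learning classifier over one camera frame.

    Returns a probability per terrain class. Here we fake a distribution
    in which the model is confident it sees an upward staircase.
    """
    return {"level-ground": 0.05, "stairs-up": 0.85,
            "stairs-down": 0.05, "door": 0.03, "obstacle": 0.02}

def recognize_environment(frame, threshold=0.5):
    """Return the most likely terrain class, or None if the model is unsure.

    Deferring to the user below a confidence threshold avoids switching
    locomotion modes on a misclassification.
    """
    probs = classify_frame(frame)
    label = max(probs, key=probs.get)
    return label if probs[label] >= threshold else None

print(recognize_environment(frame=None))  # -> stairs-up
```

The confidence threshold reflects a design choice the article implies: when the vision system is uncertain, it should fall back to user control rather than act autonomously.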
The next phase of the ExoNet research project will involve sending instructions to the motors so that robotic exoskeletons can climb stairs, avoid obstacles, or take other appropriate actions based on analysis of the user’s current movement and the upcoming terrain.
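That decision step, fusing the recognized terrain with the user's current movement to pick a locomotion mode, could be sketched as a simple transition table. This is a hypothetical illustration of the idea, not ExoNet's controller; all mode names and rules here are assumptions.

```python
# Hypothetical mode-selection step: map (upcoming terrain, current
# movement) to the exoskeleton action to send to the motors.

def select_mode(terrain, current_mode):
    """Choose the next locomotion mode from terrain and current movement."""
    transitions = {
        ("level-ground", "walking"): "walking",
        ("stairs-up", "walking"): "stair-ascent",
        ("stairs-down", "walking"): "stair-descent",
        ("obstacle", "walking"): "obstacle-avoidance",
    }
    # For any pair not in the table, keep the current mode: when in
    # doubt, the safe default is to change nothing.
    return transitions.get((terrain, current_mode), current_mode)

print(select_mode("stairs-up", "walking"))  # -> stair-ascent
print(select_mode("door", "walking"))       # -> walking
```

A production controller would of course gate such transitions on gait phase and safety checks; the table only shows the shape of the terrain-to-action mapping the article describes.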
“Similar to autonomous cars that drive themselves, we are designing autonomous exoskeletons that walk for themselves,” Laschowski said.
The scientists are also working to improve the energy efficiency of motors for robotic exoskeletons by using human movement to self-charge the batteries.