The Brain and Behaviour Lab, Imperial College London
United Kingdom

Device: Gaze-controlled semi-autonomous wheelchairs for urban mobility

The team’s device combines low-cost gaze-based intention decoding with autonomous driving technology that can be installed on any existing joystick-operated electric wheelchair. The technology allows users to drive by eye, without needing to interact with a user interface: the wheelchair simply navigates towards wherever the user looks. This enables users to give high-level commands (“get me to the table in the kitchen”) instead of tediously manoeuvring around obstacles; the corresponding low-level actions are handled by a semi-autonomous artificial intelligence. This empowers users to engage in natural communication and interaction with their environment while navigating the wheelchair.
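The shared-control idea described above — the user expresses *where* to go with their gaze while the system handles *how* to get there safely — can be illustrated with a minimal sketch. All names, signatures, and parameters here are hypothetical and not taken from the team's actual software; it simply shows one plausible split between gaze-based intention decoding and an autonomous safety layer.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    """A gaze fixation, normalised to [-1, 1] in both axes (hypothetical format)."""
    x: float  # horizontal: -1 = far left, +1 = far right
    y: float  # vertical: +1 = looking far ahead, 0 or below = near/down

def decode_intention(gaze: GazeSample, max_speed: float = 1.0):
    """Map a gaze sample to (linear, angular) velocity commands.

    Looking far ahead drives forward; looking left or right steers.
    This is an illustrative mapping, not the lab's decoding algorithm.
    """
    linear = max(0.0, gaze.y) * max_speed   # only forward motion from gaze
    angular = -gaze.x * max_speed           # look left (x < 0) -> turn left (positive)
    return linear, angular

def arbitrate(user_cmd, obstacle_clearance: float, min_clearance: float = 0.5):
    """Shared control: scale forward speed down as obstacle clearance shrinks.

    The semi-autonomous layer overrides the user only as much as safety requires.
    """
    linear, angular = user_cmd
    scale = min(1.0, max(0.0, obstacle_clearance / min_clearance))
    return linear * scale, angular

# Looking straight ahead with 0.25 m clearance (min 0.5 m): forward speed is halved.
cmd = arbitrate(decode_intention(GazeSample(x=0.0, y=1.0)), obstacle_clearance=0.25)
```

In a real system the intention decoder would run on eye-tracker data and feed a path planner rather than raw velocities, but the division of labour is the same: the user supplies intent, the autonomy supplies obstacle avoidance.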