Brain and Behavior Lab, Imperial College London
Gaze-based semi-autonomy for wheelchairs
The team aims to develop a low-cost, gaze-based intention-decoding interface that can be installed on any existing joystick-operated electric wheelchair. Our technology allows users to drive by eye, without the need to interact with a conventional "user interface": they simply navigate by looking where they want to go. Users give high-level driving intentions ("get me out of the room") rather than fiddling with complicated manoeuvres themselves; the low-level navigation is handled by a semi-autonomous AI. The system consists of a laptop and off-the-shelf sensors, making it a low-cost upgrade for any powered wheelchair.
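To illustrate the idea, here is a minimal sketch of the control flow described above: a decoded gaze fixation is mapped to a high-level driving intention, and a semi-autonomous planner expands that intention into low-level motion steps. This is a hypothetical toy, not the lab's implementation; the dwell-time rule, the intention names, and the planner steps are all illustrative assumptions.

```python
# Hypothetical sketch of gaze-based intention decoding with a
# semi-autonomous planner. Names and thresholds are illustrative,
# not taken from the actual system.

from dataclasses import dataclass
from typing import Optional, List


@dataclass
class GazeFixation:
    x: float           # gaze target in the chair's frame, metres
    y: float
    duration_s: float  # how long the user dwelt on the point


def decode_intention(fix: GazeFixation,
                     dwell_threshold_s: float = 0.8) -> Optional[str]:
    """Map a gaze fixation to a high-level driving intention.

    A dwell longer than the threshold is treated as a deliberate
    target selection; shorter glances are ignored, so casually
    looking around does not move the chair (an assumed dwell rule).
    """
    if fix.duration_s < dwell_threshold_s:
        return None                 # just looking, not commanding
    if fix.y > 2.0:
        return "leave_room"         # distant target: navigate out
    return "go_to_point"            # nearby target: local goal


def plan(intention: str, fix: GazeFixation) -> List[str]:
    """Stub planner: expand an intention into motor-level steps.

    In a real system this is where obstacle avoidance and path
    planning take over, so the user never steers manually.
    """
    if intention == "leave_room":
        return ["align_with_door", "drive_through_door"]
    return [f"drive_to({fix.x:.1f}, {fix.y:.1f})"]


# Example: a deliberate one-second fixation on a distant point.
fix = GazeFixation(x=0.5, y=3.0, duration_s=1.0)
intention = decode_intention(fix)
steps = plan(intention, fix) if intention else []
print(intention, steps)
```

The key design point the sketch mirrors is the split of responsibility: the user supplies only the *what* (a gaze-selected goal), while the semi-autonomous layer supplies the *how* (the manoeuvres to get there).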