Startup OxSight uses computer vision algorithms and cameras to make the world visible again for the visually impaired.
Stephen Hicks and his team developed algorithms that replicate our natural visual interpretation process.
To see, you need more than eyes. "Even when someone is losing their sight, they still have a good brain that's trying to understand and pick up clues from objects, if given enough input," says Stephen Hicks, a research fellow in neuroscience at the University of Oxford. This mechanism means partially sighted people can be helped to see, even as their eyesight worsens. To make that possible, Hicks's startup, OxSight, is building augmented reality glasses that render the physical world visible, even to the visually impaired.
The sense we experience as vision is the outcome of a constant jigsaw-assembling process in our brain: the eyes only need to pick up specific visual tidbits (colour, contrast, dimensions), and the occipital and parietal lobes will make sense of the overall picture. Having observed this through his research, Hicks teamed up with fellow Oxford computer-vision scientist Philip Torr to create OxSight, a spinout that launched in March 2016. The pair designed augmented-reality glasses that let partially sighted people make sense of their surroundings by spotlighting specific visual cues and overlaying them on the lenses in real time.
Using computer-vision algorithms and cameras, OxSight's glasses can increase image contrast, highlight specific visual features or create cartoonish representations of reality, depending on the eye condition they're being used to compensate for. "For instance, if you have tunnel vision and issues with colour perception, they'd emphasise colours," explains Hicks, 43. "If you have got glaucoma and your vision is blurry, the glasses will enhance the salience of certain objects."
Hicks says OxSight's biggest technical challenge was tweaking the computer-vision software so that it could run on very little computing power. "We've optimised the system for particular use cases so that it could work on a mobile phone's graphics processor," he says. (The glasses run on Android.) Aesthetics are harder to crack. "They need to look like regular sunglasses: visually impaired people won't tolerate something that makes them stand out," he adds.
OxSight won a £500,000 Google Global Impact Award in 2015, and raised £2 million from angel investor Jiangong Zhang in 2016; its device is scheduled for release in late 2017. The company is still trialling the glasses with several people across the UK. Pilot users, who have conditions such as glaucoma, retinitis pigmentosa or diabetes, report that the glasses help them avoid obstacles, see blurry faces clearly again and read from slides. Hicks is pleased with the results: "Most pilot users find them to be life-changing."