Our solution has three stages:
- Mapping the functional visual field
- Increasing awareness of what it's like living with macular degeneration (MD)
- Warping a real-time camera feed to move images out of your blind spot
Built using Pixi.js and html2canvas.
- Place the phone in the Google Cardboard viewer and navigate to here.
- Press start and keep your eyes fixed on the yellow dots in the center of the screen.
- Tap the screen whenever you see a white dot.
We use a recursive algorithm that tests progressively smaller grid cells, homing in on the boundary between where you can detect the white dot and where you cannot.
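The refinement step can be sketched roughly as follows. This is a minimal illustration, not the app's actual code: it assumes a `detect(x, y)` callback (hypothetical name) that reports whether the user tapped for a dot shown at that position. Cells whose corners give mixed results straddle the boundary and get subdivided; uniform cells are skipped.

```javascript
// Illustrative sketch of the recursive grid refinement (hypothetical
// names; the real app drives this with user taps, not a callback).
function mapBoundary(detect, x0, y0, size, minSize, out = []) {
  const corners = [
    detect(x0, y0),
    detect(x0 + size, y0),
    detect(x0, y0 + size),
    detect(x0 + size, y0 + size),
  ];
  const seen = corners.filter(Boolean).length;
  // Cell is entirely visible or entirely blind: no boundary inside it.
  if (seen === 0 || seen === 4) return out;
  // Fine enough: record this cell's center as a boundary point.
  if (size <= minSize) {
    out.push({ x: x0 + size / 2, y: y0 + size / 2 });
    return out;
  }
  // Mixed results: the boundary crosses this cell, so subdivide.
  const half = size / 2;
  mapBoundary(detect, x0, y0, half, minSize, out);
  mapBoundary(detect, x0 + half, y0, half, minSize, out);
  mapBoundary(detect, x0, y0 + half, half, minSize, out);
  mapBoundary(detect, x0 + half, y0 + half, half, minSize, out);
  return out;
}
```

Only cells crossing the boundary are refined, so the number of dots shown to the user grows with the boundary's length rather than the field's area.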
Results are stored locally on the phone and saved as a PDF.
When using a Google Cardboard, go to here and try walking around (currently works best on Android).
(Tap the screen to see our solution, which warps the image so that objects are no longer in your blind spot.)
Built using HTML5 canvas, WebRTC, and WebGL.
Go here and tap the screen to toggle the warping effect, or go to here to see a pre-recorded demo.
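The warp itself boils down to a radial remap of coordinates. As a rough sketch (hypothetical parameters, not the app's exact shader): given a circular blind spot at `(cx, cy)` with radius `r`, content that would land inside the blind spot is pushed radially outward into the ring between `r` and a `falloff` radius, beyond which the image is untouched. A fragment shader applies the inverse of this mapping per output pixel; the forward mapping in plain JavaScript:

```javascript
// Illustrative forward warp (a sketch, not the production shader):
// returns where a source point ends up after the blind-spot push.
function warpPoint(x, y, cx, cy, r, falloff) {
  const dx = x - cx, dy = y - cy;
  const d = Math.hypot(dx, dy);
  // Outside the falloff radius (or exactly at the center): unchanged.
  if (d === 0 || d >= falloff) return { x, y };
  // Linearly remap distances [0, falloff] -> [r, falloff], so content
  // that was hidden by the scotoma reappears in the ring around it.
  const dNew = r + (d / falloff) * (falloff - r);
  const s = dNew / d;
  return { x: cx + dx * s, y: cy + dy * s };
}
```

At `d = falloff` the mapping is the identity, so the warp blends seamlessly into the unwarped periphery.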
Other possible options to improve vision:
- Adding zoom similar to GiveVision
- Instead of warping, increasing the contrast/exposure in that area
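The contrast/exposure option could be sketched as a simple per-channel transform applied to pixels inside the mapped region (illustrative values and names, not a tested implementation): contrast scales the distance from mid-grey, exposure is a multiplicative gain.

```javascript
// Sketch of the contrast/exposure alternative, applied per channel
// (values in [0, 255]). Parameters here are examples, not tuned.
function enhance(value, contrast, exposure) {
  const boosted = (value - 128) * contrast + 128; // contrast around mid-grey
  const exposed = boosted * exposure;             // exposure gain
  return Math.max(0, Math.min(255, Math.round(exposed)));
}
```

In practice this would run in the same WebGL pass as the warp, gated by the visual-field map from the first stage.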