My father lost his sight a number of years ago. Even after several laser procedures and full surgeries to stop his retina from detaching and blood from seeping into his eyes, he is more than 90 percent blind in one eye and almost completely blind in the other.
Before I start talking about this project, I want to preface it all with a major shout-out to the fantastic doctors at St. Michael's Hospital in Toronto, who tried so unbelievably hard to save as much of his remaining sight as they could.
Everyone there is a hero in my book.
There was a CNIB (Canadian National Institute for the Blind) event a few years ago where a number of medical companies were showing off the latest tech to help people suffering from vision loss. The most fascinating item there, in my opinion, was a product by eSight: a pair of glasses that enhance the limited vision of the wearer. They do this using a camera mounted on the front of the glasses, with screens in place of transparent lenses to display what the camera sees. That lets them zoom and boost the brightness and other aspects of what the wearer is seeing. The glasses themselves are controlled by a small device that clips to the wearer's belt.

My father's vision problems make everything look incredibly dark, blurry, and nearly colorless. He can only read if the light is just right and the text is absolutely massive and held inches from his face. And speaking of faces, he hasn't seen a face that looked like more than a fuzzy outline since 2009.
Naturally, I thought this product was perfect, so I enquired about the price, thinking that SURELY it would only cost a couple thousand dollars. If that had been the case, I would have purchased a pair right then and there. When they told me the actual price – $15,000 USD – I was floored. Then I saw this on their website:
Oh, you can’t afford it? Yeah, we totally believe that one day you won’t have to pay out of pocket for this thing. But until then, we’re still going to charge you as much as it costs to buy a goddamn car in order to see again. But hey, we’ll help you do fundraisers and stuff so that you can give us all that money for our insanely overpriced product. Because we’re nice like that.
I’ve been furious about this for quite a while.
Now I’m all about capitalism. I don’t think the people at eSight are necessarily terrible for charging this much. I know how much it costs to do research and development, to get approvals, and so on. And as much as I like to rag on eSight for their insane pricing, they are doing something incredible that is helping people who need it. So I do have a lot of respect for them.
But I also hate them, because that price tag was just fucking unacceptable to me.
So I thought, “Screw those guys, I’ll build my own. I’ve got a spare Arduino lying around. I’ll buy a cell phone camera and a couple of small LCDs, and just 3D print the frames and a case for the controller on the belt.” Then I won a Pebble watch at the 2015 Toronto NASA SpaceApps Challenge and started playing around with building apps for it. Around the same time, I got my hands on a Google Cardboard just because I thought the idea of cheap VR was fascinating. And then it somehow all came together as one solid idea.
I realized that I could use the camera from an iPhone or iPod touch, stick the device in a Google Cardboard, and achieve essentially the same effect as eSight's glasses. And with the Pebble watch paired to the phone/iPod, I could use the watch as the controller.
So I wrote some software that takes the viewfinder feed from the camera and splits it into two screens (which turned out to be a bit of a challenge), then uses what are called “lookup images” to filter the camera feed, letting me adjust brightness, contrast, and color saturation. As near as I can tell, this is the same method Instagram uses to filter your photos. And zooming was just a simple act of cropping the video view.
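The lookup-table idea and the crop-zoom are both simple to sketch outside of an app. Here's a minimal illustration in Python with NumPy, assuming an 8-bit RGB frame; the function names are mine, not from the actual app, and a real lookup image would encode the whole color mapping rather than just a brightness shift:

```python
import numpy as np

def build_brightness_lut(offset):
    """Build a 256-entry lookup table that shifts brightness by `offset`.
    A real lookup image is the same idea: precompute output for every input."""
    return np.clip(np.arange(256) + offset, 0, 255).astype(np.uint8)

def apply_lut(frame, lut):
    """Remap every pixel through the table -- one cheap array lookup each,
    which is why this technique is fast enough for live video."""
    return lut[frame]

def crop_zoom(frame, factor):
    """Zoom by keeping only the central 1/factor region of the frame."""
    h, w = frame.shape[:2]
    ch, cw = int(h / factor), int(w / factor)
    top, left = (h - ch) // 2, (w - cw) // 2
    return frame[top:top + ch, left:left + cw]

frame = np.full((120, 160, 3), 100, dtype=np.uint8)   # stand-in camera frame
brighter = apply_lut(frame, build_brightness_lut(40)) # every pixel 100 -> 140
zoomed = crop_zoom(frame, 2.0)                        # 60x80 center crop
```

Because the per-pixel work is a single table lookup, brightness, contrast, and saturation adjustments all cost the same regardless of how complex the mapping is.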
As with the Google Cardboard SDK, splitting the view on the phone means each eye takes in the visual information from one side, and the brain combines the two into a single image.
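Since both eyes should see the same camera view (unlike stereo VR, there's no need for two different perspectives), the split amounts to shrinking the frame to half width and placing two copies side by side. A rough sketch, again in NumPy, using a naive every-other-column downsample where a real app would do proper resampling:

```python
import numpy as np

def stereo_duplicate(frame):
    """Show the same image to both eyes: shrink the frame to half width
    (crude every-other-column downsample) and tile two copies left/right."""
    half = frame[:, ::2]            # half-width copy of the frame
    return np.hstack([half, half])  # left-eye view | right-eye view

frame = np.zeros((100, 200, 3), dtype=np.uint8)  # stand-in camera frame
split = stereo_duplicate(frame)    # same 100x200 size, duplicated halves
```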
The Pebble watch app was surprisingly easy to develop: all it has to do is display a menu and relay commands back to the phone. I’m going to create an Apple Watch companion app as well. I also realized that I could add commands to capture photos and record video of whatever the wearer is seeing, so those features are next.
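What makes the watch side so simple is that the watch never touches the video at all; it just sends small command codes, and the phone owns all the state. A hedged sketch of that phone-side dispatcher (the command names, step sizes, and limits here are all made up for illustration; the real watch-to-phone messages go over Pebble's messaging layer):

```python
# Hypothetical command codes -- in a real Pebble app these would be
# app-defined message keys, not these Python constants.
CMD_ZOOM_IN, CMD_ZOOM_OUT, CMD_BRIGHTER, CMD_CAPTURE = range(4)

class GlassesController:
    """Phone-side state: the watch only relays which menu item was picked."""

    def __init__(self):
        self.zoom = 1.0        # 1.0 = no zoom
        self.brightness = 0    # LUT brightness offset

    def handle(self, cmd):
        """Apply one command from the watch; returns an action name
        for commands (like photo capture) that trigger one-off work."""
        if cmd == CMD_ZOOM_IN:
            self.zoom = min(self.zoom + 0.5, 8.0)   # clamp at illustrative max
        elif cmd == CMD_ZOOM_OUT:
            self.zoom = max(self.zoom - 0.5, 1.0)
        elif cmd == CMD_BRIGHTER:
            self.brightness = min(self.brightness + 10, 100)
        elif cmd == CMD_CAPTURE:
            return "capture-photo"
        return None

ctrl = GlassesController()
ctrl.handle(CMD_ZOOM_IN)       # zoom goes from 1.0 to 1.5
```

Keeping the protocol this thin is also what makes an Apple Watch companion app easy: only the menu UI changes, while the command codes and phone-side logic stay the same.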
So now I’m trying to figure out what to do with this project. The hardware involved would cost about $320 if you bought it all on Amazon.com, plus whatever – if anything – I decide to charge for the software that makes it all work. I went to a fantastic Hack’n’Tell event in March and got amazing feedback from the hackers and developers there. I’m thinking a Kickstarter might be next, but I suppose I have to figure out a business model first. I haven’t even come up with a proper name for it. DigitalGlasses was just a practical temporary name, so I should probably come up with something catchier. Expect more updates soon!