Project Overview:

Human-computer interaction today is extremely cumbersome. To get results for even a simple query while on the go, you must pull out your mobile device, unlock it, open the relevant application, and then search for the data you need. Imagine a world where that tedious process is no more. Imagine a world where all this data is present right in front of your eyes at the flick of your fingers. That is what we are trying to achieve with Glass: a platform that sits on your eye and delivers the content you want, when you want it, with the least possible friction.

Compared to products already on the market, Glass is distinguished by its support for gesture recognition and, above all, by its aim to be cheaper.

Project Specific Success Criteria:

  1. An ability for the phone to communicate with the microcontroller by sending and receiving data.
  2. An ability of the microcontroller to buffer images and display them on the screen.
  3. An ability to switch screens on the OLED to display different types of information, for instance, switching from weather to news.
  4. An ability for the microcontroller to read gesture/touch interface inputs in order to accept or dismiss image notifications on the screen.
  5. An ability for the Android app to sweep notifications and gather application data for the Glass screen.
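Criterion 2 hinges on the microcontroller keeping a local image buffer that it composes before pushing pixels to the OLED. As a rough illustration, here is a minimal sketch of such a framebuffer in C. The 128x64 resolution, the page-oriented byte layout (each byte holding 8 vertical pixels, as on common monochrome OLED controllers), and the function names are all assumptions for illustration, not the project's actual driver code.

```c
#include <stdint.h>
#include <string.h>

/* Assumed display geometry; a real driver would match the chosen OLED. */
#define OLED_WIDTH  128
#define OLED_HEIGHT 64

/* One bit per pixel, grouped into 8-pixel-tall "pages" of bytes. */
static uint8_t framebuffer[OLED_WIDTH * OLED_HEIGHT / 8];

/* Set or clear a single pixel in the local buffer. */
void oled_set_pixel(int x, int y, int on)
{
    if (x < 0 || x >= OLED_WIDTH || y < 0 || y >= OLED_HEIGHT)
        return;                          /* ignore out-of-range writes */
    uint8_t *byte = &framebuffer[(y / 8) * OLED_WIDTH + x];
    if (on)
        *byte |= (uint8_t)(1u << (y % 8));
    else
        *byte &= (uint8_t)~(1u << (y % 8));
}

/* Clear the buffer before composing a new screen, e.g. when
 * switching from the weather view to the news view (criterion 3).
 * The real firmware would then stream the buffer to the display
 * over SPI or I2C. */
void oled_clear(void)
{
    memset(framebuffer, 0, sizeof framebuffer);
}
```

Buffering locally and sending the whole frame at once keeps the display update atomic, so a half-drawn notification is never visible while the phone is still streaming data.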
Midterm Presentation

Project Video
shay.ecn.purdue.edu\477grp8