
Virtual Imaging Peripheral for Enhanced Reality Aaron Garrett, Ryan Hannah, Justin Huffaker, Brendon McCool.




1 Virtual Imaging Peripheral for Enhanced Reality Aaron Garrett, Ryan Hannah, Justin Huffaker, Brendon McCool

2 Project Overview Our project, code-named Virtual Imaging Peripheral for Enhanced Reality (VIPER), is an augmented/virtual reality system. It tracks a handheld unit's location and perspective and uses this information to position a camera in a virtual environment. Through an LCD screen on the handheld unit, the user sees the virtual environment from the camera's location, as if the handheld unit were a window into the virtual world. As the user moves the handheld unit around a table-top-sized environment, the unit's actual and virtual perspectives change, allowing different viewing angles of the virtual space.
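The "window into the virtual world" idea above amounts to mapping the tracked handheld pose onto a virtual camera pose. A minimal sketch, assuming a hypothetical scale factor and a forward vector out of the handheld's front face (the names and scale here are illustrative, not from the project):

```python
import numpy as np

# Hypothetical scale: 1 m of handheld motion maps to 10 m in the virtual scene.
TABLE_TO_WORLD_SCALE = 10.0

def virtual_camera_pose(handheld_pos_m, handheld_forward):
    """Map the tracked handheld pose to a virtual camera pose.

    handheld_pos_m: position relative to the table-top origin (metres)
    handheld_forward: vector out of the handheld unit's front face
    Returns (camera position, a point the camera looks at).
    """
    pos = np.asarray(handheld_pos_m, dtype=float)
    fwd = np.asarray(handheld_forward, dtype=float)
    fwd = fwd / np.linalg.norm(fwd)        # normalize the view direction
    cam_pos = TABLE_TO_WORLD_SCALE * pos   # scale table motion into the scene
    cam_target = cam_pos + fwd             # look along the front-face normal
    return cam_pos, cam_target
```

The returned position/target pair is what a renderer (e.g. on the BeagleBoard) would feed into a look-at view matrix each frame.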

3 Project-Specific Success Criteria
1. An ability to communicate time-stamp data over RF between the base unit and the handheld unit.
2. An ability to display images on the LCD display.
3. An ability to estimate the angle and position of the handheld unit with respect to an origin point using accelerometer, gyroscope, compass, visual, and ultrasonic data.
4. An ability to find the angular displacement of the handheld unit's front face relative to the IR beacon origin using the mounted camera.
5. An ability to find the distance from the base to the handheld unit using an ultrasonic emitter and receiver.
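Criterion 5 is a time-of-flight measurement: if the RF link from criterion 1 synchronizes clocks, the handheld unit can compute range from the ultrasonic pulse's travel time. A minimal sketch (the function name and the one-way-timing assumption are illustrative):

```python
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at roughly 20 °C

def ultrasonic_distance(emit_time_s: float, receive_time_s: float) -> float:
    """Distance from base to handheld unit via one-way time of flight.

    Assumes the RF time stamps (criterion 1) synchronize the two clocks,
    so the emit time stamp is meaningful at the receiver.
    """
    time_of_flight = receive_time_s - emit_time_s
    if time_of_flight < 0:
        raise ValueError("receive time precedes emit time")
    return SPEED_OF_SOUND_M_S * time_of_flight
```

For example, a pulse that takes 10 ms to arrive implies a range of about 3.43 m.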

4 Block Diagram

5 Project Functions
• Main: tracking the location of the handheld unit with IMUs, using a Kalman filter to calculate the unit's location in virtual space and recalibrating to account for accumulating error.
• Minor: displaying the unit's location in virtual space on an LCD screen using the BeagleBoard's graphics capabilities.
• These are the functions that will be evaluated for patent liability analysis.
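The main function's predict/correct structure can be sketched in one dimension: dead-reckon from the accelerometer (which drifts), then correct with an absolute measurement such as the ultrasonic range. This is a hypothetical simplification; the actual filter fuses the full 9-DOF IMU, compass, camera, and ultrasonic data, and the noise values here are illustrative.

```python
import numpy as np

def kalman_step(x, P, accel, z_range, dt, q=0.05, r=0.1):
    """One predict/correct cycle of a 1-D position/velocity Kalman filter.

    x: state vector [position, velocity]; P: 2x2 state covariance
    accel: accelerometer reading (control input)
    z_range: absolute range measurement that corrects IMU drift
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity motion model
    B = np.array([0.5 * dt**2, dt])        # how acceleration enters the state
    H = np.array([[1.0, 0.0]])             # we directly measure position only
    Q = q * np.eye(2)                      # process (model) noise
    R = np.array([[r]])                    # measurement noise

    # Predict: integrate the accelerometer; error accumulates here.
    x = F @ x + B * accel
    P = F @ P @ F.T + Q

    # Correct: the absolute measurement pulls the estimate back.
    y = z_range - H @ x                    # innovation (measurement residual)
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P
```

Run repeatedly, the correction step keeps the position estimate bounded even though pure accelerometer integration would drift, which is exactly the recalibration role described above.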

6 Possible Literal Infringements
• Based on the main function of our project, the closest possible literal infringement we could find was Patent No. US6480152.
• This patent describes a GPS system that tracks location and recalibrates to eliminate accumulated error.

7 Possible Infringements under the Doctrine of Equivalents
• Our project does substantially the same thing.
• Whether it does so in substantially the same way remains to be seen.
• This product uses 6 DOF, while ours uses 9.
• While our project is mainly virtual reality, this product does augmented reality.
• The main difference comes more from our minor function than from our main function.

8 Possible Infringements under the Doctrine of Equivalents
• Another example where our project does substantially the same thing.
• A better argument could be made here for "substantially the same way."
• Like the other Vuzix product, it uses 6 DOF as opposed to our 9.
• Our minor function is also very similar to this product's.
• For both products, whether an LCD versus video glasses counts as "substantially the same way" could be debated.

9 Possible Actions
• For literal infringement, the only course of action may be to buy a license or pay royalty fees.
• For infringement under the doctrine of equivalents, since it is mainly the minor function that could be infringing, we must first determine whether Vuzix holds patents on those functions.
• If so, the functions can be eliminated from the design.
• If not, then action might not be necessary.

10 Questions?

