Our project, VIPER (Virtual Imaging Peripheral for Enhanced Reality), is an augmented/virtual reality system that tracks a handheld unit's location and perspective to provide an immersive experience. The user views the virtual environment through an LCD screen, which acts as a window into the virtual world. Key functionalities include RF data communication, image display on the LCD, and precise location estimation using IMUs. Our system is designed around existing patents, relying on a distinctive 9 DOF approach, and aims to offer a competitive solution in the AR/VR space.
Virtual Imaging Peripheral for Enhanced Reality Aaron Garrett, Ryan Hannah, Justin Huffaker, Brendon McCool
Project Overview Our project, code-named Virtual Imaging Peripheral for Enhanced Reality (VIPER), is an augmented/virtual reality system. It will track a handheld unit's location and perspective and use this information to place a camera in a virtual environment. Through an LCD screen on the handheld unit, the user will see the virtual environment from the camera's location, as if the handheld unit were a window into the virtual world. As the user moves the handheld unit around a tabletop-sized environment, the unit's actual and virtual perspectives change, allowing different viewing angles of the virtual space.
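The "window into the virtual world" idea amounts to mapping the tracked pose of the handheld unit onto a virtual camera. A minimal sketch of that mapping is below; the coordinate convention (yaw about the vertical axis, pitch about the lateral axis) and the function name are illustrative assumptions, not taken from the VIPER design documents.

```python
import numpy as np

def virtual_view(position_m, yaw_rad, pitch_rad):
    """Return (camera_position, forward_vector) for the virtual camera,
    given the tracked pose of the handheld unit.

    position_m: handheld position in meters relative to the base station.
    yaw_rad, pitch_rad: orientation angles from the IMU/compass fusion.
    The axis convention here is an assumption for illustration only.
    """
    forward = np.array([
        np.cos(pitch_rad) * np.cos(yaw_rad),
        np.cos(pitch_rad) * np.sin(yaw_rad),
        np.sin(pitch_rad),
    ])
    return np.asarray(position_m, dtype=float), forward
```

A renderer would then draw the scene from `camera_position` looking along `forward_vector`, so moving the handheld unit moves the virtual viewpoint in lockstep.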
Project-Specific Success Criteria • An ability to communicate time stamp data over RF between the base unit and the handheld unit. • An ability to display images on the LCD display. • An ability to estimate the angle and position of the handheld unit with respect to an origin point using accelerometer, gyroscope, compass, visual, and ultrasonic data. • An ability to find the angular displacement of the handheld unit's front face relative to the IR beacon origin using the mounted camera. • An ability to find the distance from the base to the handheld unit using an ultrasonic emitter and receiver.
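The ultrasonic ranging criterion reduces to a time-of-flight calculation. Since the RF link carries time stamps, a one-way measurement (RF marks the emission time, the receiver marks arrival) is plausible; the sketch below assumes that one-way scheme, and the constant and function name are illustrative.

```python
SPEED_OF_SOUND_M_PER_S = 343.0  # dry air at roughly 20 degrees C

def one_way_distance_m(time_of_flight_s: float) -> float:
    """Distance from base to handheld unit for a one-way ultrasonic
    pulse, where emission time is known via the RF time stamp."""
    return SPEED_OF_SOUND_M_PER_S * time_of_flight_s
```

For a pulse-echo (round-trip) design the result would instead be halved, since the sound travels the distance twice.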
Project Functions • Main: tracking the location of the handheld unit with IMUs, using a Kalman filter to calculate the unit's location in virtual space and recalibrating the data to account for accumulating error. • Minor: displaying the unit's location in virtual space on an LCD screen using the BeagleBoard's graphics capabilities. • These are the functions that will be evaluated for patent liability analysis.
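The Kalman-filter recalibration in the main function can be sketched as a minimal one-dimensional constant-velocity filter: IMU dead reckoning drifts, and each absolute position measurement (e.g. from the ultrasonic/IR beacon fix) corrects the accumulated error. The state layout and noise values below are illustrative assumptions, not the project's actual 9 DOF filter.

```python
import numpy as np

def kalman_step(x, P, z, dt, q=0.01, r=0.05):
    """One predict/update cycle of a 1-D constant-velocity Kalman filter.

    x: state vector [position, velocity]; P: 2x2 state covariance;
    z: absolute position measurement; dt: time step in seconds;
    q, r: process and measurement noise levels (illustrative values).
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
    H = np.array([[1.0, 0.0]])              # we observe position only
    Q = q * np.eye(2)                        # process noise covariance
    R = np.array([[r]])                      # measurement noise covariance

    # Predict: propagate state and covariance forward one step.
    x = F @ x
    P = F @ P @ F.T + Q

    # Update: the measurement residual corrects accumulated drift.
    y = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P
```

Run once per sensor reading: predictions carry the state between measurements, and each update pulls the estimate back toward the absolute fix, which is the recalibration behavior described above.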
Possible Literal Infringements • Based on the main function of our project, the closest possible literal infringement we could find is U.S. Patent No. 6,480,152. • This patent describes a GPS system that tracks location and recalibrates to eliminate accumulated error.
Possible Infringements under the Doctrine of Equivalents • Our project does substantially the same thing. • Whether it does so in substantially the same way remains to be seen. • This product uses 6 DOF, while ours uses 9. • While our project is mainly virtual reality, this product does augmented reality. • The main difference comes more from our minor function than from our main function.
Possible Infringements under the Doctrine of Equivalents • Another example where our project does substantially the same thing. • A better argument could be made here for "substantially the same way." • Like the other Vuzix product, it uses 6 DOF as opposed to our 9. • Our minor function is also very similar to this product's. • For both products, whether an LCD and video glasses count as "substantially the same way" could be debated.
Possible Actions • For literal infringements, the only course of action may be to buy a license or pay royalty fees. • For infringement under the doctrine of equivalents, since it is mainly the minor function that could be infringing, we must first determine whether Vuzix holds patents on those functions. • If so, the functions can be eliminated from the design. • If not, then no action may be necessary.