
Next Generation 4-D Distributed Modeling and Visualization of Battlefield


Presentation Transcript


  1. Next Generation 4-D Distributed Modeling and Visualization of Battlefield. Avideh Zakhor, UC Berkeley, June 2001

  2. Participants • Avideh Zakhor (UC Berkeley) • Bill Ribarsky (Georgia Tech) • Ulrich Neumann (USC) • Pramod Varshney (Syracuse) • Suresh Lodha (UC Santa Cruz)

  3. Battlefield Visualization • A detailed, timely and accurate picture of the modern battlefield is vital to the military • Many sources of info to build the “picture”: • Archival data, roadmaps, GIS and databases: static • Sensor information from mobile agents at different times and locations: dynamic • Multiple modalities: fusion • How to make sense of all this without information overload?

  4. Major Challenges • Disparate/conflicting sources of information resulting in large volumes of data • Data sources are inherently uncertain: resulting models are also uncertain • Data and models, together with associated uncertainty, need to be visualized on mobile devices with limited capability • All of this is time varying and dynamic • Goal: make decisions and take actions

  5. Mobile AR Visualization (system diagram): sensor inputs (laser, lidar, radar, camera, GPS, maps, gyroscope) feed 3D model extraction and the visualization database; mobiles with augmented reality sensors drive model updates and decision making, with uncertainty carried through each stage

  6. Research Agenda • Model construction and update: • Radar, laser, cameras, roadmaps, satellite photos, GPS, gyros, speedometer, INS • Augmented reality sensor tracking and registration required • Mobile, real-time visualization, interaction and navigation within the database: • Augmented reality sensor tracking and registration required • Uncertainty processing and visualization: • Sensors, algorithms, models • Fusion used in all of the above

  7. Visualization Pentagon (diagram): five linked components, namely 4D Modeling/Update, Tracking/Registration, Visualization Database, Information Fusion, and Uncertainty Processing/Visualization

  8. Model Construction • Develop a framework for fast, automatic and accurate 3D model construction for urban areas • Models must be: • Easy, fast, and automated to compute; • Compact to represent; • Easy to render; • Amenable to photo-realistic flythroughs • Strategy: • Fusion of multiple data sources: intensity, range, heading, speedometer, panoramic cameras • Incorporate a priori models, e.g. aerial photos and digital roadmaps

  9. 3D Modeling: • Close-range modeling: • ground based vehicle with multiple sensors • Far-range modeling: • Aerial/satellite imagery • Fusion of close range and far range info at multiple levels: • Data and models.

  10. Ground Level Based Model Construction • Laser range scanners • Digital roadmaps • Aerial photos

  11. Ground level based 3D model construction • 3D point cloud from 2D scans • Mesh generation • Texture Mapping
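
As a rough illustration of the first step in this pipeline, the sketch below (Python/NumPy, with a hypothetical scan format and pose convention, not the project's actual code) converts one 2D laser scan plus a vehicle pose into world-frame 3D points; accumulating such scans along the drive path yields the point cloud that is then meshed and texture-mapped.

```python
import numpy as np

def scan_to_points(ranges, angles, pose):
    """Convert one 2D laser scan into world-frame 3D points.

    ranges: (N,) range measurements from a vertically mounted 2D scanner (m)
    angles: (N,) beam angles within the scan plane (rad)
    pose:   (x, y, z, heading) of the vehicle at scan time, e.g. from
            GPS/odometry (hypothetical pose format, for illustration only).
    """
    x, y, z, heading = pose
    # Points in the scanner's own (vertical) plane: cross-track and height.
    local = np.stack([np.zeros_like(ranges),            # along-track
                      ranges * np.cos(angles),          # cross-track
                      ranges * np.sin(angles)], axis=1) # height
    # Rotate the scan plane by the vehicle heading, then translate.
    c, s = np.cos(heading), np.sin(heading)
    R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    return local @ R.T + np.array([x, y, z])

# Toy usage: one synthetic scan of a wall 5 m to the side of the vehicle.
angles = np.linspace(-np.pi / 4, np.pi / 4, 90)
ranges = 5.0 / np.cos(angles)
cloud = scan_to_points(ranges, angles, pose=(0.0, 0.0, 2.0, 0.0))
```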

  12. Far Range Modeling. Figure: example of an aerial image and the extracted disparity map of man-made structures

  13. Tracking, Registration, & Autocalibration • Needed for augmentation or visualization • Mobile people with sensors provide textures and data for visualizations… • Where are they? Where are they looking? What are they doing? • Tracking provides position and orientation • Tracking is possible with landmarks in view • Autocalibration allows tracking in regions near or beyond landmarks; it can also improve/refine models • Sensor fusion (GPS, vision, gyroscopes) adds more tracking information for wider areas
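
For intuition about the sensor-fusion point, here is a minimal 1-D complementary-filter sketch that blends integrated gyroscope rate with an absolute heading fix from vision landmarks or GPS. It is only a sketch under simplifying assumptions, not the project's actual tracker, and the weighting constant is a hypothetical tuning value.

```python
def fuse_orientation(gyro_rates, vision_yaws, dt, alpha=0.98):
    """Blend integrated gyro rate (smooth, but drifts) with an absolute
    heading fix from vision landmarks or GPS (noisy, but drift-free).
    alpha is a hypothetical tuning weight on the gyro path."""
    yaw = vision_yaws[0]          # initialize from the first absolute fix
    fused = []
    for rate, vis in zip(gyro_rates, vision_yaws):
        yaw = alpha * (yaw + rate * dt) + (1.0 - alpha) * vis
        fused.append(yaw)
    return fused

# Toy usage: a constant 0.1 rad/s turn observed by both sensor paths.
headings = fuse_orientation(gyro_rates=[0.1] * 50,
                            vision_yaws=[0.1 * 0.02 * k for k in range(50)],
                            dt=0.02)
```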

  14. Mobile Testbed

  15. Registration Using Mutual Information. Figures: original IRS and RADAR image pair, and the RADAR image after registration
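
A bare-bones version of the similarity measure behind this kind of registration can be written in a few lines; the sketch below (Python/NumPy, illustrative only) estimates mutual information from a joint histogram of two co-sized images, which a registration search would maximize over candidate transforms.

```python
import numpy as np

def mutual_information(img_a, img_b, bins=64):
    """Estimate mutual information between two co-sized images from their
    joint histogram. A registration search would transform img_b over
    candidate shifts/rotations and keep the transform maximizing this
    score (interpolation and search strategy omitted for brevity)."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of img_a intensities
    py = pxy.sum(axis=0, keepdims=True)   # marginal of img_b intensities
    nz = pxy > 0                          # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# Toy usage: a noisy copy of an image scores higher than an unrelated one.
rng = np.random.default_rng(0)
a = rng.random((128, 128))
aligned = mutual_information(a, a + 0.05 * rng.random((128, 128)))
unrelated = mutual_information(a, rng.random((128, 128)))
```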

  16. Temporal Aspects • Time-critical applications: • Sequential methods for information fusion • Tradeoff between speed and accuracy of results (QoS) • Fuse information arriving at different times: • Perform independent time-based calculations, followed by fusion/integration

  17. Fusion of Asynchronous Information (diagram): observations from Source 1 and Source 2 arrive at times t1 and t2, are projected to a common time tnew, fused, and passed to the decision maker
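
The sketch below illustrates the projection-then-fusion idea in its simplest scalar form: each source's older observation is projected forward to the common time tnew under a constant-rate assumption, with its variance inflated accordingly, and the projected estimates are then combined by variance weighting. The numbers and the constant-velocity model are purely illustrative, not the project's fusion algorithm.

```python
import numpy as np

def project(value, rate, t_obs, var, t_new, rate_var):
    """Project an older scalar observation (e.g. one target coordinate)
    forward to the common time t_new under a constant-rate assumption;
    the variance grows with the projection interval."""
    dt = t_new - t_obs
    return value + rate * dt, var + rate_var * dt ** 2

def fuse(estimates):
    """Variance-weighted fusion of independently projected estimates."""
    values, variances = zip(*estimates)
    weights = np.array([1.0 / v for v in variances])
    return float(np.dot(weights, values) / weights.sum()), float(1.0 / weights.sum())

# Toy usage: Source 1 reported at t1 = 0 s, Source 2 at t2 = 2 s; fuse at tnew = 5 s.
e1 = project(value=10.0, rate=1.0, t_obs=0.0, var=0.5, t_new=5.0, rate_var=0.01)
e2 = project(value=12.5, rate=1.0, t_obs=2.0, var=0.3, t_new=5.0, rate_var=0.01)
fused_value, fused_variance = fuse([e1, e2])
```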

  18. Mobile Visualization • Synchronized databases of dynamic data in collaborative environments • Peer-to-peer mode • Sharing geo-spatial data, annotations, triggering events • Mobility: • Development of mobile visualization capability using a laptop version of the global geospatial system • Gesture recognition capability • New hierarchical volume rendering technique applied to weather and uncertainty. Figure: real-time acquisition and insertion into dynamic, synchronized databases
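
As a toy illustration of sharing geo-spatial annotations between peers, the sketch below defines a timestamped annotation record and a "newest wins" merge rule for keeping local databases synchronized. The field names and merge policy are assumptions for illustration, not the system's actual protocol.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class GeoAnnotation:
    """One shared item: a geo-referenced annotation with a timestamp so
    each peer can apply a 'newest wins' rule when synchronizing. Field
    names and the merge policy are assumptions for illustration."""
    peer_id: str
    lat: float
    lon: float
    text: str
    timestamp: float

def merge(local_db, incoming):
    """Keep the newer annotation for each (peer, location) key."""
    key = (incoming.peer_id, incoming.lat, incoming.lon)
    if key not in local_db or incoming.timestamp > local_db[key].timestamp:
        local_db[key] = incoming

# Toy usage: serialize for broadcast, then merge into a peer's local database.
note = GeoAnnotation("unit-7", 37.0, -122.0, "possible obstruction", time.time())
wire = json.dumps(asdict(note))
db = {}
merge(db, GeoAnnotation(**json.loads(wire)))
```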

  19. Uncertainty Computation and Visualization

  20. Uncertainty in Visualizations of Terrains • Terrain visualization is important in a battlespace environment • Terrain compression is needed due to: • real-time interaction • limited screen space • progressive transmission requirements • How much uncertainty is introduced by data compression? • Can we trust compressed terrain images?
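
One simple way to quantify the uncertainty that compression introduces is to compare elevations before and after simplification. The sketch below computes RMS and maximum absolute error for a toy 1-D terrain profile rebuilt from every other sample; both the error measures and the toy data are illustrative choices, not the project's metric.

```python
import numpy as np

def compression_error(original, simplified):
    """RMS and maximum absolute elevation error introduced by terrain
    simplification; either measure could be mapped to color or glyphs
    in the visualization."""
    diff = original - simplified
    return float(np.sqrt(np.mean(diff ** 2))), float(np.abs(diff).max())

# Toy usage: drop every other sample of a 1-D profile and rebuild it by
# linear interpolation, then measure the error that the compression added.
x = np.linspace(0.0, 10.0, 201)
terrain = np.sin(x) + 0.1 * np.sin(7.0 * x)
coarse = np.interp(x, x[::2], terrain[::2])
rms_error, max_error = compression_error(terrain, coarse)
```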

  21. Monterey Bay terrain. Panels: original; unconstrained compression at 20% and 70%; constrained compression at 70% and 80%

  22. Target Uncertainty Visualization in 2D. Panels: galaxy and diffuse-color renderings. Speed: mean 1.0, std dev 0.2; direction: mean 0 degrees, std dev 45 degrees
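
Using the speed and direction distributions quoted on the slide, the sketch below samples candidate target displacements; the scatter of the samples, or its covariance, is the kind of spread a glyph or diffuse-color rendering would convey. The one-time-unit horizon and the Gaussian sampling are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Distributions quoted on the slide: speed ~ N(1.0, 0.2), direction ~ N(0 deg, 45 deg).
speed = rng.normal(1.0, 0.2, size=1000)
direction = np.deg2rad(rng.normal(0.0, 45.0, size=1000))

# Sampled target displacement after one (illustrative) time unit; the scatter
# of (dx, dy), or its covariance, is the spread that an uncertainty rendering
# of the target would convey.
dx = speed * np.cos(direction)
dy = speed * np.sin(direction)
spread = np.cov(np.stack([dx, dy]))
```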

  23. Transitions (1) • Established close relationships with both government agencies and industry: • Government agencies • AFRL Information Directorate, TRADOC Analysis Center • Office of Naval Research (ONR) • NASA (Houston Space Center) • DARPA • CERTIP (Center for Emergency Response Technology, Instrumentation, and Policy), • Industry: • HRL, Boeing, Raytheon, General Electric, HP • Olympus America • Niagara Mohawk Power Corp, ICT • Carrier Corp, Andro consulting, Kitware

  24. Publications (1) • Hassan Foroosh, “Conservative Optical Flow Fields”, submitted to International Journal of Computer Vision, October 2000. •  Hassan Foroosh, “A General Motion Model based on The Theory of Conservative Fields”, submitted to International Conference on Computer Vision and Pattern Recognition, May 2001. •  Hassan Foroosh, “A Closed-Form Solution For Optical Flow By Imposing Temporal Constraints”, accepted in International Conference on Image Processing, January 2001. • Hassan Foroosh and Avideh Zakhor, “Virtual Cities From Aerial Images By Fusion of Segmentation And Stereo Vision in a Bayesian Framework”, submitted to International Conference on Computer Vision and Pattern Recognition, May 2001. • Hassan Foroosh and Avideh Zakhor, “Projective Rectification Using a Single Homography”, submitted to International Conference on Computer Vision and Pattern Recognition, May 2001. • Christian Frueh and Avideh Zakhor, “Fast 3D Model Generation in Urban Environments”, accepted in Conf. on Multisensor Fusion and Integration for Intelligent Systems, February 2001.

  25. Publications (2) • Christian Frueh and Avideh Zakhor, “3D model generation for cities using aerial photographs and ground level laser scans”, submitted to International Conference on Computer Vision and Pattern Recognition, May 2001. • N.L. Chang and A. Zakhor, “A Multivalued Representation for View Synthesis”, in Proceedings of the International Conference on Image Processing, Kobe, Japan, October 1999, vol. 2, pp. 505-509; Also submitted to International Journal of Computer Vision, January 2000 • T.Y. Jiang, William Ribarsky, Tony Wasilewski, Nickolas Faust, Brendan Hannigan, and Mitchell Parry,“Acquisition and Display of Real-Time Atmospheric Data on Terrain,” to be published, Proceedings of Eurographics-IEEE Visualization Symposium 2001. • Mitchell Parry, Brendan Hannigan, William Ribarsky, T.Y. Jiang, and Nickolas Faust, “Hierarchical Storage and Visualization of Real-Time 3D Data,” to be published, Proceedings of SPIE Aerosense 2001. • Mitchell Parry, William Ribarsky, Chris Shaw, Tony Wasilewski, Nickolas Faust, T.Y. Jiang, and David Krum, “Acquisition, Management, and Use of Large-Scale, Dynamic Geospatial Data,” submitted to IEEE Computer Graphics & Applications

  26. Publications (3) • J. W. Lee, S. You, and U. Neumann, “Large Motion Estimation for Omnidirectional Vision,” IEEE Workshop on Omnidirectional Vision, with CVPR, June 2000 • J.W. Lee, U. Neumann, “Motion Estimation with Incomplete Information Using Omni-Directional Vision,” IEEE International Conference on Image Processing (ICIP'00), Vancouver, Canada, September 2000 • S. You, U. Neumann, “Fusion of Vision and Gyro Tracking for Robust Augmented Reality Registration,” IEEE VR2001, pp. 71-78, Yokohama, Japan, March 2001 • B. Jiang, U. Neumann, “Extendible Tracking by Line Auto-Calibration,” submitted to ISAR 2001 • Antonio Hasbun, Pramod K. Varshney, Kishan G. Mehrotra and C.K. Mohan, “Uncertainty Introduced by Quantization in Image Processing Applications,” under preparation • Kishan G. Mehrotra, C.K. Mohan and Pramod K. Varshney, “Theory and Applications of Random Sets: A Review,” under preparation

  27. Publications (4) • Suresh K. Lodha, Krishna M. Roskin, and Jose Renteria, “Hierarchical Topology Preserving Compression of 2D Terrains,” submitted for publication to Computer Graphics Forum • Suresh K. Lodha and Arvind Verma, “Spatio-Temporal Visualization of Urban Crimes on a GIS Grid,” Proceedings of the ACM GIS Conference, November 2000, ACM Press, pp. 174-179 • Suresh K. Lodha, Jose Renteria and Krishna M. Roskin, “Topology Preserving Compression of 2D Vector Fields,” Proceedings of IEEE Visualization 2000, October 2000, pp. 343-350 • Christopher Campbell, Michael Shafae, Suresh K. Lodha and D. Massaro, “Multimodal Representations for the Exploration of Multidimensional Fuzzy Data,” submitted for publication to Behavior Research Methods, Instruments, and Computers

  28. Publications (5) • Q. Zhang and P. K. Varshney, “Decentralized M-ary Detection via Hierarchical Binary Decision Fusion,” Information Fusion, Vol. 2, pp. 3-16, March 2001 • Hua-mei Chen and Pramod K. Varshney, “A pyramid approach for multimodality image registration based on mutual information,” Proceedings of the 3rd International Conference on Information Fusion, vol. I, pp. MoD3 0-15, Paris, July 2000 • Hua-mei Chen and Pramod K. Varshney, “Generalized partial volume interpolation for image registration based on mutual information,” presented at the 2000 Western New York Image Processing Workshop, University of Rochester, October 13, 2000 • Hua-mei Chen and Pramod K. Varshney, “A cooperative search algorithm for mutual information based image registration,” Proc. of Aerosense 2001, Orlando, April 2001 • Andrew L. Drozd et al., “Towards the development of multisensor integrated display systems,” to be presented at Fusion 2001, Montreal, Aug. 2001 • David Krum, William Ribarsky, Chris Shaw, Larry Hodges, and Nickolas Faust, “Situational Visualization,” submitted to ACM Virtual Reality Software and Technology

  29. Cross Collaboration

  30. Outline of Talks • U.C. Berkeley, “3D Model Construction for Urban Environments” • USC, “Tracking and Visualization with Models and Images” • Georgia Tech, “Mobile Visualization in a Dynamic, Augmented Battlespace” • Syracuse, “Uncertainty Processing and Information Fusion for Visualization” • U.C. Santa Cruz, “Uncertainty Quantification and Visualization: Terrains and Targets”

  31. 3D Model Construction for Urban Environments • Close-range modeling: • ground based vehicle with multiple sensors • Far-range modeling: • Aerial/satellite imagery • Fusion of close range and far range info at multiple levels: • Data and models.
