Long-Wave Infrared and Visible Image Fusion for Situational Awareness

Presentation Transcript


  1. Long-Wave Infrared and Visible Image Fusion for Situational Awareness
     Nathaniel Walker

  2. Agenda
     • What is image fusion?
     • Applications
     • System-level considerations
     • Image fusion algorithms
     • Image quality metrics
     • Further research

  3. What is Image Fusion?
     • Combine data from multiple sensors into a single image
       – Visible, Image Intensified (I2), Near Infrared (NIR), Short-Wave Infrared (SWIR), Medium-Wave Infrared (MWIR), Long-Wave Infrared (LWIR), X-Ray
     • Enhance the capabilities of the human visual system
       – ‘See’ outside the visible spectrum
       – All-weather visibility

  4. Applications
     • Surveillance and Targeting
     • Navigation
     • Satellites
     • Guidance/Detection Systems

  5. System-Level Considerations
     • Parallax
     • Optical alignment
     • Image registration (a minimal registration sketch follows below)
     • Sensor pixel resolution
     • Color vs. grayscale – spectral resolution can be lost in fusion
     • Human factors
       – Presentation of IR data
       – Realism of displayed data (superposition, contrast reversal)
       – Preserving relative intensity across the scene
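Not part of the original slides: a minimal registration sketch, assuming OpenCV/NumPy and two grayscale frames already resampled to the same pixel grid (`visible` and `lwir` are placeholder names). ECC alignment is one common option; cross-spectral registration often works better on edge/gradient content than on raw intensities, since the two bands can show contrast reversal.

```python
import cv2
import numpy as np

def register_lwir_to_visible(visible, lwir):
    """Estimate an affine warp aligning the LWIR frame to the visible frame.

    Inputs are assumed to be single-channel uint8 images of identical size.
    """
    # ECC on raw intensities can fail across bands (contrast reversal), so a
    # Laplacian/edge image is used here as a crude cross-spectral feature.
    vis_f = cv2.Laplacian(visible, cv2.CV_32F)
    lwir_f = cv2.Laplacian(lwir, cv2.CV_32F)

    warp = np.eye(2, 3, dtype=np.float32)  # initial guess: identity transform
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 200, 1e-6)
    # Trailing (mask, gaussFiltSize) arguments follow the OpenCV 4.x binding.
    _, warp = cv2.findTransformECC(vis_f, lwir_f, warp,
                                   cv2.MOTION_AFFINE, criteria, None, 5)

    h, w = visible.shape
    # WARP_INVERSE_MAP resamples the LWIR frame into visible-frame coordinates.
    return cv2.warpAffine(lwir, warp, (w, h),
                          flags=cv2.INTER_LINEAR + cv2.WARP_INVERSE_MAP)
```

A global warp like this only addresses registration; parallax between physically separated apertures is range-dependent and cannot be removed by a single image-wide transform.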

  6. Image Fusion Algorithms (Zhang, Blum 1999)
     • Weight-based combinations of the two sources (see the fusion sketch below)
       – Linear combination
       – General loss of contrast
     • Feature extraction
       – High-pass filtering or edge detection
     • Maximizing image quality metrics
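Not part of the original slides: a minimal sketch of the two algorithm families named above, assuming NumPy/OpenCV and registered, same-size uint8 grayscale inputs (`visible` and `lwir` are placeholder names; the weighting and gain parameters are illustrative).

```python
import cv2
import numpy as np

def fuse_weighted(visible, lwir, alpha=0.5):
    """Weight-based fusion: pixelwise linear combination of the two sources.

    Simple, but averaging washes out contrast wherever the two bands disagree
    (the 'general loss of contrast' noted on the slide).
    """
    return cv2.addWeighted(visible, alpha, lwir, 1.0 - alpha, 0.0)

def fuse_highpass_inject(visible, lwir, sigma=3.0, gain=1.0):
    """Feature-extraction fusion: inject high-pass (edge/detail) LWIR content
    into the visible image instead of averaging whole frames."""
    lwir_f = lwir.astype(np.float32)
    lowpass = cv2.GaussianBlur(lwir_f, (0, 0), sigma)
    detail = lwir_f - lowpass                     # high-frequency LWIR structure
    fused = visible.astype(np.float32) + gain * detail
    return np.clip(fused, 0, 255).astype(np.uint8)
```

With alpha = 0.5 the weighted version halves the local contrast of any feature present in only one band; the high-pass variant avoids this by keeping the visible image as the base and adding only LWIR detail.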

  7. Image Quality Metrics
     • Mostly done by subjective evaluation
     • ‘Optimal’ methods are task- and application-dependent
     • Two classes of quantitative metrics (Chen, et al. 2005); a sketch of a few of these follows below
       – Analysis of the fused image: standard deviation (measure of contrast), entropy (measure of information content), SNR
       – Comparison of the fused image to the source images: cross-entropy, objective edge-based measure, universal index-based measure
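Not part of the original slides: a sketch of a few of the quantitative metrics listed above, assuming 8-bit grayscale NumPy arrays. The cross-entropy uses one common histogram-based (relative-entropy) formulation; the edge-based and universal-index measures from Chen et al. are not reproduced here.

```python
import numpy as np

def _histogram(img, bins=256):
    """Normalized gray-level histogram, treated as a probability distribution."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 255))
    return hist / hist.sum()

def std_contrast(img):
    """Standard deviation of gray levels: a simple proxy for contrast."""
    return float(np.std(img))

def entropy(img):
    """Shannon entropy in bits: a proxy for information content."""
    p = _histogram(img)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def cross_entropy(source, fused, eps=1e-12):
    """Histogram cross-entropy between a source image and the fused image
    (relative-entropy form); lower values indicate the fused image preserves
    more of that source's gray-level statistics."""
    p = _histogram(source)
    q = _histogram(fused)
    return float(np.sum(p * np.log2((p + eps) / (q + eps))))
```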

  8. Further Research
     • Concentrate on grayscale fusion algorithms for effective communication of spectral information to the viewer
     • Sensor assumptions
       – Perfect optical alignment and image registration
       – Same pixel resolution and field of view (FOV)
     • Compare quantitative image quality metrics to subjective image evaluation for situational awareness
     • Focus on human factors for injecting infrared content into a visible-spectrum image
       – What approach adds value without causing distraction or removing detail?

  9. References
