
EE 4780: Introduction to Computer Vision




  1. EE 4780: Introduction to Computer Vision Introduction

  2. EE 4780 • Instructor: Bahadir K. Gunturk • Office: EE 225 • Email: bahadir@ece.lsu.edu • Tel: 8-5621 • Office Hours: MW 10:00 – 12:00

  3. EE 4780 • We will learn the fundamentals of digital image processing and computer vision. • Lecture slides, problem sets, solutions, study materials, etc. will be posted on the class website. [www.ece.lsu.edu/gunturk/EE4780] • A textbook is not required. • References: • Gonzalez and Woods, Digital Image Processing, Prentice-Hall, 2/e. • Forsyth and Ponce, Computer Vision: A Modern Approach, Prentice-Hall. • Duda, Hart, and Stork, Pattern Classification, John Wiley & Sons, 2001. • Shapiro and Stockman, Computer Vision, Prentice-Hall. • Horn, Robot Vision, MIT Press, 1986.

  4. Grading Policy • Your grade will be based on • Problem Sets: 30% • Midterm: 30% • Final: 40% • Problem Sets • Mini projects: Theoretical problems and MATLAB assignments • 4-5 Problem Sets • Individually or in two-person teams

  5. Digital Image Acquisition • A sensor array captures the image: when photons strike, electron-hole pairs are generated at the sensor sites. • The electrons generated are collected over a certain period of time. • The number of electrons is then converted to a pixel value. (Pixel is short for picture element.)
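A minimal sketch (in Python, with made-up numbers) of the last step, converting collected electron counts to 8-bit pixel values; the full-well capacity and the Poisson-distributed counts are assumptions for illustration only:

    import numpy as np

    FULL_WELL = 20000  # assumed full-well capacity (max electrons per sensor site)
    electrons = np.random.poisson(lam=5000, size=(4, 4))   # toy electron counts
    pixels = np.clip(np.round(electrons / FULL_WELL * 255), 0, 255).astype(np.uint8)
    print(pixels)                                           # quantized 8-bit pixel values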

  6. Digital Image Acquisition • Two types of quantization: • There are a finite number of pixels (spatial resolution). • The amplitude of each pixel is represented by a finite number of bits (gray-scale resolution).

  7. Matrix Representation of Images • A digital image can be written as a matrix
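As a small illustration (not taken from the slide), a grayscale image is simply a matrix of intensity values, here built with NumPy:

    import numpy as np

    # A 3x3 grayscale image: one 8-bit intensity value per pixel.
    image = np.array([[  0,  64, 128],
                      [ 64, 128, 192],
                      [128, 192, 255]], dtype=np.uint8)
    print(image.shape)   # (3, 3): 3 rows (height) x 3 columns (width)
    print(image[0, 2])   # intensity of the pixel at row 0, column 2 -> 128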

  8. Image Size

  9. Image Resolution

  10. Bit Depth – Grayscale Resolution: 8 bits, 7 bits, 6 bits, 5 bits

  11. Bit Depth – Grayscale Resolution: 4 bits, 3 bits, 2 bits, 1 bit
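A minimal sketch of how such a comparison can be produced: reduce an 8-bit image to k bits by dropping the least significant bits (an illustration, not necessarily how the slide images were generated):

    import numpy as np

    def reduce_bit_depth(img_8bit, k):
        # Keep only the k most significant bits of each 8-bit pixel.
        shift = 8 - k
        return (img_8bit >> shift) << shift

    img = np.arange(256, dtype=np.uint8).reshape(16, 16)   # toy gradient image
    print(np.unique(reduce_bit_depth(img, 3)).size)        # 8 gray levels at 3 bits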

  12. Digital Color Images

  13. Video • A video adds a third index to the image: each sample is specified by a vertical position, a horizontal position, and a frame number.
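For example (a sketch with assumed index names and sizes), a grayscale video can be stored as a 3-D array indexed by frame, row, and column:

    import numpy as np

    video = np.zeros((30, 480, 640), dtype=np.uint8)   # 30 frames of 480x640 pixels
    t, y, x = 10, 100, 200                             # frame number, vertical, horizontal
    print(video[t, y, x])                              # intensity at that space-time sample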

  14. Why do we process images? • To facilitate their storage and transmission • To prepare them for display or printing • To enhance or restore them • To extract information from them • To hide information in them

  15. Image Processing Example • Image Restoration: an original image, its blurred version, and the result restored by a Wiener filter.
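A minimal sketch of the idea using scikit-image (the box-blur PSF and the bundled test image are assumptions, not the example from the slide):

    import numpy as np
    from scipy.signal import convolve2d
    from skimage import data, restoration

    image = data.camera() / 255.0    # bundled test image, scaled to [0, 1]
    psf = np.ones((5, 5)) / 25.0     # assumed 5x5 box-blur point spread function
    blurred = convolve2d(image, psf, mode='same', boundary='symm')
    restored = restoration.wiener(blurred, psf, balance=0.01)   # Wiener deconvolution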

  16. Image Processing Example • Noise Removal: a noisy image denoised by a median filter.
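A minimal sketch with synthetic salt-and-pepper noise (not the image on the slide):

    import numpy as np
    from scipy.ndimage import median_filter

    rng = np.random.default_rng(0)
    image = np.full((100, 100), 128, dtype=np.uint8)   # flat gray test image
    noise = rng.random(image.shape)
    noisy = image.copy()
    noisy[noise < 0.05] = 0                            # pepper
    noisy[noise > 0.95] = 255                          # salt
    denoised = median_filter(noisy, size=3)            # 3x3 median removes the outliers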

  17. Image Processing Example • Image Enhancement: histogram equalization.
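Histogram equalization maps each gray level through the normalized cumulative histogram; a minimal sketch for an 8-bit image:

    import numpy as np

    def histeq(img):
        hist = np.bincount(img.ravel(), minlength=256)
        cdf = hist.cumsum() / img.size                 # normalized cumulative histogram
        lut = np.round(255 * cdf).astype(np.uint8)     # gray-level mapping
        return lut[img]

    img = (np.random.rand(64, 64) * 100).astype(np.uint8)   # low-contrast toy image
    print(histeq(img).max())                                 # contrast stretched up to 255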

  18. Image Processing Example • Artifact Reduction in Digital Cameras: the original scene, the image captured by a digital camera, and the image processed to reduce artifacts.

  19. Image Processing Example • Image Compression: original image (64 KB), JPEG compressed to 15 KB, and JPEG compressed to 9 KB.
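A minimal sketch of JPEG compression at different quality settings with Pillow (the file names and quality values are illustrative, not those from the slide):

    from PIL import Image

    img = Image.open('original.png').convert('RGB')   # assumed input file
    img.save('compressed_q75.jpg', quality=75)        # lower quality -> smaller file, more artifacts
    img.save('compressed_q25.jpg', quality=25)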

  20. Image Processing Example • Object Segmentation: edges of the “Rice” image detected using the Canny filter.
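A minimal sketch of Canny edge detection with scikit-image (using a bundled test image rather than the “Rice” image):

    from skimage import data, feature

    image = data.camera()
    edges = feature.canny(image, sigma=2.0)     # boolean edge map
    print(edges.sum(), "edge pixels found")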

  21. Image Processing Example • Resolution Enhancement

  22. Image Processing Example • Watermarking: a hidden message and a secret key are used to generate a watermark, which is embedded in the original image to produce the watermarked image.
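The slide does not specify the embedding scheme; as one simple illustration, a least-significant-bit (LSB) watermark can be embedded and recovered like this (all names and values are assumptions):

    import numpy as np

    def embed_lsb(image, bits):
        # Overwrite the least significant bit of the first len(bits) pixels.
        flat = image.ravel().copy()
        flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
        return flat.reshape(image.shape)

    image = np.random.randint(0, 256, (64, 64), dtype=np.uint8)    # toy cover image
    message = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)   # hidden bits
    watermarked = embed_lsb(image, message)
    recovered = watermarked.ravel()[:message.size] & 1
    print(np.array_equal(recovered, message))                      # True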

  23. Image Processing Example • Face Recognition: a face captured in surveillance video is searched for in a database.

  24. Image Processing Example • Fingerprint Matching

  25. Image Processing Example • Segmentation

  26. Image Processing Example • Texture Analysis and Synthesis: a photo, a computer-generated texture, and a repeated pattern.

  27. Image Processing Example • Face detection and tracking http://www-2.cs.cmu.edu/~har/faces.html

  28. Image Processing Example • Face Tracking

  29. Image Processing Example • Object Tracking

  30. Image Processing Example • Virtual Controls

  31. Image Processing Example • Visually Guided Surgery

  32. Cameras • The first camera was invented in the 16th century. • It used a pinhole to focus light rays onto a wall or translucent plate. • Take a box, prick a small hole in one of its sides with a pin, and then replace the opposite side with a translucent plate. • Place a candle on the pinhole side, and you will see an inverted image of the candle on the translucent plate.

  33. Pinhole Camera Model • If the pinhole were really reduced to a point, exactly one light ray would pass through each point in the image plane. • In reality, each point in the image plane collects light from a cone of rays. • In addition, real cameras are equipped with lenses. • Still, the pinhole model is an acceptable approximation of the imaging process.

  34. Pinhole Cameras • Pinhole too big: many directions are averaged, blurring the image. • Pinhole too small: diffraction effects blur the image.

  35. Perspective Projection • Far objects appear smaller than close ones, because all rays pass through a single focal point (the pinhole).

  36. Perspective Projection • Perspective projection equations
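For reference (the equations themselves were in the slide image), the standard pinhole projection of a scene point (x, y, z), expressed in camera coordinates with f' the distance from the focal point to the image plane, is

    x' = f'\,\frac{x}{z}, \qquad y' = f'\,\frac{y}{z}

so image coordinates shrink as the depth z grows, which is why far objects appear smaller.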

  37. Cameras With Lenses • Most cameras are equipped with lenses. • There are two main reasons for this: • To gather light. For an ideal pinhole, a single light ray would reach each point in the image plane. Real pinholes have a finite size, so each point in the image plane is illuminated by a cone of light rays. The larger the hole, the wider the cone and the brighter the image, but also the blurrier the picture. Shrinking the pinhole produces sharper images, but reduces the amount of light and may introduce diffraction effects. • To keep the picture in sharp focus while gathering light from a large area.

  38. Geometric Optics • Ignoring diffraction, interference, etc., the behavior of lenses is dictated by the laws of geometric optics: • Light travels in straight lines in homogeneous media. • When a light ray is reflected from a surface, this ray, its reflection, and the surface normal are coplanar, and the incident and reflection angles are identical. • When a ray passes from one medium to another, it is refracted according to Snell's law, n1 sin θ1 = n2 sin θ2, where n1 and n2 are the refraction indices of the two media and θ1, θ2 are the angles between the rays and the surface normal.

  39. Paraxial Geometric Optics • In paraxial (or first-order) geometric optics, the angles between all light rays going through a lens and the normals to the lens surfaces are small. • When the angles are small, their sines and tangents are, to first order, equal to the angles themselves: sin θ ≈ θ and tan θ ≈ θ.

  40. Thin Lenses • For a thin lens surrounded by a vacuum (refraction index = 1), the thin-lens equation is 1/z' − 1/z = 1/f, where z is the (signed) distance to the object, z' is the distance to its image, and the focal length is f = R / (2(n − 1)), with R the radius of the spherical lens surfaces and n the refraction index of the lens.

  41. Real Lenses • A better approximation of real (thick) lenses can be obtained with a thick-lens model.

  42. Real Lenses • The thin-lens assumption is not exact, and the small-angle approximations are not always valid. • Rays therefore do not focus at a single point: this is spherical aberration. Spherical aberration can be eliminated completely by designing aspherical lenses.

  43. Real Lenses • The index of refraction is a function of wavelength. • Light at different wavelengths therefore follows different paths, producing chromatic aberration.

  44. Real Lenses Chromatic Aberration

  45. Real Lenses • Special lens systems (achromatic doublets) using two or more pieces of glass with different refractive indices can reduce or eliminate this problem. However, even these lens systems are not completely perfect and can still lead to visible chromatic aberrations, especially at wide angles. • A third-order lens model may help quantify the aberration.

  46. Real Lenses • A stop, inserted to reduce spherical aberrations, gives rise to barrel distortion and pincushion distortion.

  47. Real Lenses • Barrel Distortion & Pincushion Distortion: distorted and corrected examples. http://www.vanwalree.com/optics/distortion.html http://www.dpreview.com/learn/?/Image_Techniques/Barrel_Distortion_Correction_01.htm
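One common way to model such radial distortion (not necessarily the method used in the linked tutorials) is a polynomial in the squared radius; a minimal sketch with made-up coefficients:

    import numpy as np

    def radial_distort(x, y, k1=-0.2, k2=0.02):
        # Map undistorted normalized coordinates to distorted ones; the sign and
        # size of k1, k2 control barrel- vs. pincushion-like behavior (the values
        # here are made up for illustration).
        r2 = x**2 + y**2
        factor = 1 + k1 * r2 + k2 * r2**2
        return x * factor, y * factor

    # Correcting an image then amounts to resampling it at the distorted
    # locations predicted for each output pixel.
    xd, yd = radial_distort(np.array([0.5]), np.array([0.5]))
    print(xd, yd)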

  48. Real Lenses • Vignetting effect in a two-lens system: the shaded part of the beam never reaches the second lens, so brightness drops toward the image periphery.

  49. Real Lenses • Optical vignetting example at two f-numbers. Left: f/1.4. Right: f/5.6.

  50. Real Lenses • Lens flare; a lens hood may help prevent flares.
