
A Real-Time Multi-Sensor 3D Shape Surface Measurement System Using Fringe Analysis






Presentation Transcript


  1. General Engineering Research Institute A Real-Time Multi-Sensor 3D Shape Surface Measurement System Using Fringe Analysis By Mohammad Al Sa’d www.megurath.org

  2. Introduction
  • General background
  • Functional requirements of the system
  • Stages of the 3D surface reconstruction process
  • Specifications of the system
  • Hardware design of the system
  • Software design of the system
  • Results

  3. Background (1/2): Fringe Pattern Profilometry
  • Optical, non-contact 3D surface measurement method
  • Fringe generation: laser interference or structured-light projection (see the sketch below)
  • Measurement precision: from 1 µm, depending on the optical resolution of the fringes
    • Fringe width and optical quality (depth of field, camera resolution and display resolution)
    • Light wavelength
  • Applications:
    • Inspection of components during the production process (turbine blades, circuit boards)
    • Reverse engineering (CAD data from existing objects)
    • Documenting objects of cultural heritage
    • Medical applications: live measurement of human body shape
  [Slide figure: a projector casts a fringe pattern onto the object; a camera captures the projected pattern on its image plane]
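As an illustrative aside (not part of the original slides), a sinusoidal structured-light fringe pattern can be generated as in the sketch below; the resolution, fringe period and gamma value are arbitrary assumptions.

```python
import numpy as np

def make_fringe_pattern(width, height, period_px, gamma=1.0):
    """Generate an 8-bit sinusoidal fringe pattern for structured-light projection.

    width, height : projector resolution in pixels
    period_px     : fringe period in projector pixels (controls the fringe width)
    gamma         : optional gamma pre-correction for the projector
    """
    x = np.arange(width)
    # Intensity varies sinusoidally along x and is constant along y.
    intensity = 0.5 + 0.5 * np.cos(2.0 * np.pi * x / period_px)
    pattern = np.tile(intensity, (height, 1)) ** (1.0 / gamma)
    return np.round(255.0 * pattern).astype(np.uint8)

# Example: a 1400 x 1050 pattern with a 16-pixel fringe period.
pattern = make_fringe_pattern(1400, 1050, period_px=16)
```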

  4. Background (2/2): Metrology Guided Radiotherapy
  • Radiation therapy has been used for more than 100 years in the treatment of cancer.
  • The goal is to destroy the cancer cells with minimal radiation damage to the surrounding healthy cells.
  • Pre-treatment stages:
    • 3D planning models are created (CT, MR or others) to accurately guide treatment.
    • Radiation treatment sessions are planned and radiation doses are calculated (dosimetry).
  • Treatment stages:
    • The radiation beam is shaped to precisely hit the target (site of the tumour).
    • Radiation is delivered from multiple angles, using the controlled gantry and patient table.
    • Treatment is repeated over multiple sessions.
  • Any small movement (such as breathing) or change in the patient's body between successive sessions compromises the goal of the treatment.
  [Slide figure: rotating gantry]

  5. Functional Requirements
  • Spatial resolution: at least 100 x,y points
  • Field of view: at least 400 mm × 400 mm × 400 mm
  • Measurement error (accuracy): not to exceed ±1 mm (according to the tolerance of the dosimetric models used in radiotherapy planning)
  • Dynamic real-time measurement: at least five measurements per second (to detect small movements, such as breathing)
  [Slide figure: 400 mm × 400 mm × 400 mm measurement volume with X, Y and Z axes]

  6. 3D Surface Reconstruction Stages (1/5)

  7. 3D Surface Reconstruction Stages (2/5): Fringe Profilometry Analyses
  • Spatial fringe analysis techniques (the modulation phase is generated from a single input image):
    • Fourier Profilometry (see the sketch below)
    • Windowed Fourier Profilometry: a processing window passes through the image to find the phase at the centre pixel, using forward and inverse FFTs.
    • Wavelet Profilometry: the phase is generated from the wavelet transform of the image, line by line.
  • Temporal fringe analysis techniques (the modulation phase is generated from multiple input images – at least three):
    • Phase-Stepping Profilometry: a least-squares method is used to extract the phase.
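As a hedged illustration of the single-image (spatial) approach, the sketch below shows the core of Fourier Transform Profilometry in Python/NumPy: band-pass filtering the fringe spectrum around the carrier and taking the angle of the inverse transform. The row-wise 1D FFT, rectangular filter and carrier-removal step are simplifying assumptions, not the system's exact algorithm.

```python
import numpy as np

def ftp_wrapped_phase(fringe_image, carrier_freq, bandwidth):
    """Fourier Transform Profilometry: extract the wrapped modulation phase
    from a single fringe image.

    fringe_image : 2D array; fringes are assumed to run vertically (carrier along x)
    carrier_freq : carrier frequency of the fringes, in cycles per pixel
    bandwidth    : half-width of the band-pass filter, in cycles per pixel
    """
    cols = fringe_image.shape[1]
    freqs = np.fft.fftfreq(cols)                      # frequency axis along x

    # 1D FFT of every row (the carrier lies along the x direction).
    spectrum = np.fft.fft(fringe_image, axis=1)

    # Band-pass filter: keep only the +1 order around the carrier frequency,
    # suppressing the DC term and the -1 order.
    mask = (np.abs(freqs - carrier_freq) < bandwidth).astype(float)
    analytic = np.fft.ifft(spectrum * mask[np.newaxis, :], axis=1)

    # The angle of the filtered signal is the wrapped phase, still containing
    # the linear carrier ramp; remove the ramp and re-wrap into (-pi, pi].
    wrapped = np.angle(analytic)
    carrier_phase = 2.0 * np.pi * carrier_freq * np.arange(cols)
    return np.angle(np.exp(1j * (wrapped - carrier_phase)))
```

For a 16-pixel fringe period, `carrier_freq` would be 1/16; the result still needs phase unwrapping and height calibration, as described in the following slides.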

  8. 3D Surface Reconstruction Stages (3/5): Phase Unwrapping
  • Removes the phase ambiguity (the wrapped phase is only known modulo 2π).
  • Types:
    • Path-dependent unwrappers:
      • Schafer unwrapping algorithm
    • Path-independent unwrappers:
      • Goldstein's branch-cut algorithm
      • Quality-guided path-following algorithm
      • Flynn's minimum-discontinuity algorithm
      • Preconditioned conjugate gradient (PCG) algorithm
      • Lp-norm algorithm
      • Reliability-ordering algorithm
      • Synthesis algorithm
  • The algorithms differ in speed and robustness (a minimal 1D example follows below).
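To make the idea concrete, here is a minimal 1D path-dependent unwrapper (Itoh-style). It only illustrates the principle; it is not the Goldstein or reliability-ordering algorithm used by the system.

```python
import numpy as np

def unwrap_1d(wrapped):
    """Minimal path-dependent 1D phase unwrapper.

    Whenever the difference between neighbouring samples exceeds pi, a multiple
    of 2*pi is added or subtracted so the phase becomes continuous. Robust 2D
    unwrappers (Goldstein, reliability ordering, ...) follow the same idea but
    choose the integration path carefully to avoid noise and discontinuities.
    """
    wrapped = np.asarray(wrapped, dtype=float)
    diffs = np.diff(wrapped)
    # Wrap each difference back into (-pi, pi], then re-integrate.
    corrected = (diffs + np.pi) % (2.0 * np.pi) - np.pi
    return np.concatenate(([wrapped[0]], wrapped[0] + np.cumsum(corrected)))

# Example: a linear phase ramp wrapped into (-pi, pi] is recovered exactly.
true_phase = np.linspace(0.0, 20.0, 200)
wrapped = np.angle(np.exp(1j * true_phase))
assert np.allclose(unwrap_1d(wrapped), true_phase, atol=1e-6)
```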

  9. 3D Surface Reconstruction Stages (4/5): Absolute Height Calibration
  • The unwrapped phase map is converted to real-world heights.
  • Height calibration process:
    • The triangulation spot (embedded inside the fringe pattern) is detected.
    • The unwrapped phase value at the spot's (x,y) location is subtracted from the unwrapped phase map (to generate a relative phase map).
    • The resulting phase map is linked to real-world heights via interpolation (using the height calibration volume); a minimal sketch of this step follows below.
  • Traversal (XY) calibration:
    • Compensates for the geometric distortions introduced by the optics and perspective.
    • X,Y world coordinates are generated for a number of height steps to build the traversal calibration volumes.
    • X,Y world coordinates are then retrieved for each pixel by interpolation, depending on that pixel's height value.
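The sketch below shows a minimal phase-to-height step, assuming the calibration volume is stored as a stack of relative phase maps recorded with a flat board at known heights; this array layout and the per-pixel linear interpolation are illustrative assumptions, not the system's actual implementation.

```python
import numpy as np

def phase_to_height(rel_phase, calib_phases, calib_heights):
    """Convert a relative unwrapped phase map to heights by per-pixel
    interpolation through a height-calibration volume.

    rel_phase     : (H, W) relative unwrapped phase map of the scene
    calib_phases  : (N, H, W) relative phase maps of a flat board at N known heights
    calib_heights : (N,) heights (in mm) at which the board was recorded
    """
    calib_heights = np.asarray(calib_heights, dtype=float)
    H, W = rel_phase.shape
    heights = np.empty((H, W), dtype=float)
    for i in range(H):
        for j in range(W):
            # np.interp needs the sample points in increasing order.
            samples = calib_phases[:, i, j]
            order = np.argsort(samples)
            heights[i, j] = np.interp(rel_phase[i, j],
                                      samples[order], calib_heights[order])
    return heights
```

A real-time implementation would vectorise this lookup rather than looping over pixels; the loop is kept here only for clarity.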

  10. Specifications of the System (1/3): Deliverables (so far!)
  • Speed: 8 Hz (Fourier Profilometry with Goldstein's unwrapper); 5 Hz (Fourier Profilometry with the reliability-ordering unwrapper)
  • Field of view: (X,Y,Z) = 400 mm × 500 mm × 400 mm
  • Spatial resolution: 262,144 x,y points
  • Multiple sensors: coverage of around 270°
  • Measurement error: accuracy around ±0.5 mm
  • Pre-processing techniques: noise reduction and gamma correction
  • Catalogue of measurement techniques: ability to select different algorithms
  • User interaction and multi-user modes: GUI, with normal and advanced modes (for both metrology experts and normal users)
  • Various operating modes: online (real-time measurements) and offline (pre-saved images and videos)
  • 3D visualisation and 2D plotting
  • Various image saving choices
  [Slide figure: three sensors (Sensor 1–3) arranged around the treatment couch]

  11. Specifications of the System (2/3): Program snapshots

  12. Specifications of the System (3/3): Program snapshots

  13. Hardware Design of the System (1/3): Hardware Configuration
  [Slide diagram: three sensors (Sensor 1–3), each with its own sensor processing unit, connected through a synchronisation unit to the main control & processing unit]

  14. Hardware Design of the System (2/3): Sensor Components – Projector
  • Canon XEED SX60 (LCoS projector)
  [Slide figure: comparison of a conventional LCD projector and an LCoS projector]

  15. Hardware Design of the System (3/3): Sensor Components – Camera
  • Prosilica GE1380, using GigE technology:
    • Progressive-scan CCD
    • 20 fps @ 1360 × 1024
    • 35 fps @ 512 × 512
    • Direct image registration to the system memory via a compatible Gigabit Ethernet port
    • Up to 100 m cable length
  [Slide diagram: GigE camera controller broadcasting frames to the Fourier, triangulation and phase-stepping processing paths]

  16. Software Design of the System: Software Configuration – Processing Core
  • Multithreaded processing framework (a minimal threading sketch follows below):
    • Input thread: project the fringe pattern and grab frames from the camera into the input buffer
    • Processing thread: apply the measurement, unwrapping and calibration techniques
    • Output thread: stream/save/display results from the output buffer, according to the user's preferences
  [Slide diagram: grabbed camera images → input buffer → processing thread → output buffer → display and/or storage]
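A minimal sketch of this producer/consumer structure in Python; the talk does not describe the system at code level, and the functions grab_frame, measure and deliver are hypothetical placeholders for the camera, measurement pipeline and output stages.

```python
import queue
import threading

input_buffer = queue.Queue(maxsize=8)    # grabbed frames waiting to be processed
output_buffer = queue.Queue(maxsize=8)   # results waiting to be displayed/stored

def input_thread(grab_frame, stop):
    """Grab frames from the camera and push them into the input buffer."""
    while not stop.is_set():
        input_buffer.put(grab_frame())

def processing_thread(measure, stop):
    """Apply the measurement, unwrapping and calibration pipeline to each frame."""
    while not stop.is_set():
        output_buffer.put(measure(input_buffer.get()))

def output_thread(deliver, stop):
    """Display and/or store results according to the user's preferences."""
    while not stop.is_set():
        deliver(output_buffer.get())

def run_pipeline(grab_frame, measure, deliver):
    stop = threading.Event()
    workers = [
        threading.Thread(target=input_thread, args=(grab_frame, stop), daemon=True),
        threading.Thread(target=processing_thread, args=(measure, stop), daemon=True),
        threading.Thread(target=output_thread, args=(deliver, stop), daemon=True),
    ]
    for w in workers:
        w.start()
    return stop  # the caller sets this event to shut the pipeline down
```

The bounded queues decouple the three stages, so a slow display or disk write does not stall frame grabbing; a production system would add clean shutdown and error handling.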

  17. Results (1/3): Static Object Measurement – One Sensor

  18. Results (2/3): Static Object Measurement – Multi-Sensor
  [Slide figure: three sensors (Sensor 1–3) arranged around the treatment couch]

  19. Results (3/3): Moving Object Measurement – One Sensor

  20. Thank You!
