General Engineering Research Institute
A Real-Time Multi-Sensor 3D Shape Surface Measurement System Using Fringe Analysis
By Mohammad Al Sa’d
www.megurath.org

Introduction
- General background
- Functional requirements of the system
- Stages of the 3D surface reconstruction process
- Specifications of the system
- Hardware design of the system
- Software design of the system
- Results

Background (1/2): Fringe Pattern Profilometry
- An optical, non-contact 3D surface measurement method.
- Fringe generation: laser interference or structured light projection; in this system the patterns are generated by computer and displayed by a projector (a sketch of the pattern generation follows).
- Measurement precision: from 1 μm, depending on the optical resolution of the fringes (fringe width and optical quality: depth of field, camera resolution and display resolution) and on the light wavelength.
- Applications:
  - Inspection of components during the production process (turbine blades, circuit boards)
  - Reverse engineering (CAD data from existing objects)
  - Documenting objects of cultural heritage
  - Medical applications: live measurement of human body shape
[Diagram: projector, object and camera, showing the projected pattern and the image plane]
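As a purely illustrative sketch (not the project's own code; the resolution and fringe pitch below are arbitrary assumptions), a computer-generated sinusoidal fringe image for the projector can be built like this:

```python
# Illustrative sketch: build an 8-bit vertical sinusoidal fringe image
# I(x, y) = 0.5 + 0.5*cos(2*pi*x/pitch + shift) for display by the projector.
# Image size and fringe pitch are arbitrary assumptions.
import numpy as np

def make_fringe_pattern(width=1024, height=768, pitch_px=16, phase_shift=0.0):
    x = np.arange(width)
    row = 0.5 + 0.5 * np.cos(2.0 * np.pi * x / pitch_px + phase_shift)
    pattern = np.tile(row, (height, 1))          # same fringe profile on every row
    return np.round(255.0 * pattern).astype(np.uint8)

fringes = make_fringe_pattern()                  # ready to send to the display/projector
```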

Background (2/2): Metrology Guided Radiotherapy
- Radiation therapy has been used for more than 100 years to treat cancer. The goal is to destroy the cancer cells with minimal radiation damage to the surrounding healthy cells.
- Pre-treatment stages:
  - 3D planning models are created (CT, MR or others) to accurately guide treatment.
  - Radiation treatment sessions are planned and radiation doses are calculated (dosimetry).
- Treatment stages:
  - The radiation beam is shaped to precisely hit the target (the site of the tumour).
  - Radiation is delivered from multiple angles, using the controlled rotating gantry and patient table.
  - Treatment is repeated over multiple sessions.
- Any small movement (such as breathing) or change in the patient's body between successive sessions compromises the goal of the treatment.

Functional Requirements
- Field of view (X, Y, Z): at least 400 mm × 400 mm × 400 mm.
- Spatial resolution: at least 100 x,y points.
- Measurement error (accuracy): not to exceed ±1 mm (according to the tolerance of the dosimetric models used in radiotherapy planning).
- Dynamic real-time measurement: at least five measurements per second (to detect small movements, such as breathing).
- Patients are asked to stay as still as possible; nevertheless, their bodies move because of breathing.

3D Surface Reconstruction Stages (1/5)
1. Fringe profilometry analysis
2. Phase unwrapping
3. Calibration

3D Surface Reconstruction Stages (2/5): Fringe Profilometry Analysis
- Spatial fringe analysis techniques (the modulation phase is generated from a single input image):
  - Fourier Profilometry: forward FFT, filtering out the fringe lobe, then inverse FFT to obtain the calculated phase (see the sketch after this list).
  - Windowed Fourier Profilometry: a processing window passes through the image to find the phase at the centre pixel using forward and inverse FFTs.
  - Wavelet Profilometry: the phase is generated from the wavelets of the image, line by line.
- Temporal fringe analysis techniques (the modulation phase is generated from multiple input images, at least three):
  - Phase-Stepping Profilometry: a least-squares method is used to extract the phase.
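A minimal NumPy sketch of those Fourier profilometry steps (forward FFT, filter out the fringe lobe, inverse FFT, phase); the carrier frequency and filter width here are assumptions for illustration, not the system's actual parameters:

```python
# Illustrative sketch of single-image Fourier profilometry:
# forward FFT -> band-pass filter around the fringe carrier -> inverse FFT -> phase.
import numpy as np

def fourier_phase(image, carrier_px=16, band=0.5):
    """Return the wrapped modulation phase of one fringe image (vertical fringes)."""
    spectrum = np.fft.fftshift(np.fft.fft2(image.astype(float)))
    h, w = image.shape
    fx = np.fft.fftshift(np.fft.fftfreq(w))       # horizontal spatial frequencies (cycles/pixel)
    fy = np.fft.fftshift(np.fft.fftfreq(h))
    FX, FY = np.meshgrid(fx, fy)
    f0 = 1.0 / carrier_px                         # assumed carrier frequency of the fringes
    # Keep only the positive first-order lobe around the carrier frequency.
    mask = (np.abs(FX - f0) < band * f0) & (np.abs(FY) < band * f0)
    filtered = np.fft.ifft2(np.fft.ifftshift(spectrum * mask))
    return np.angle(filtered)                     # wrapped phase in (-pi, pi]
```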

3D Surface Reconstruction Stages (3/5): Phase Unwrapping
- Removes the 2π phase ambiguity of the wrapped phase (the basic idea is sketched after this list).
- Types:
  - Path-dependent unwrappers:
    - Schafer unwrapping algorithm
  - Path-independent unwrappers:
    - Goldstein's branch cut algorithm
    - Quality-guided path following algorithm
    - Flynn's minimum discontinuity algorithm
    - Preconditioned conjugate gradient (PCG) algorithm
    - Lp-norm algorithm
    - Reliability ordering algorithm
    - Synthesis algorithm
- The algorithms differ in speed and robustness.
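The system itself uses the 2D unwrappers listed above; purely for illustration, the underlying 1D principle they generalise (Itoh's method, equivalent to NumPy's np.unwrap) looks like this:

```python
# Illustrative sketch of 1D phase unwrapping: wrap successive differences back
# into (-pi, pi], then re-integrate to remove the 2*pi jumps.
import numpy as np

def unwrap_1d(wrapped):
    diffs = np.diff(wrapped)
    diffs_wrapped = (diffs + np.pi) % (2.0 * np.pi) - np.pi
    return np.concatenate(([wrapped[0]], wrapped[0] + np.cumsum(diffs_wrapped)))
```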

3D Surface Reconstruction Stages (4/5): Calibration
- Absolute height calibration: the unwrapped phase map is converted to real-world heights. Height calibration process (a sketch of the per-pixel lookup follows this list):
  1. The triangulation spot (embedded inside the fringe pattern) is detected.
  2. The unwrapped phase value at the spot's (x,y) location is subtracted from the unwrapped phase map (to generate a relative phase map).
  3. The relative phase map is linked to real-world heights via interpolation (using the height calibration volume).
- Traversal (XY) calibration: compensates for the geometric distortions introduced by the optics and perspective.
  1. X,Y world coordinates are generated for a number of height steps to build the traversal calibration volumes.
  2. X,Y world coordinates are then retrieved via interpolation, according to the height value of each pixel.
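A minimal sketch of the per-pixel phase-to-height step; the variable names and the assumption that phase increases monotonically with height are mine, not taken from the system:

```python
# Illustrative sketch: map a relative unwrapped phase map to real-world heights
# by interpolating the phase recorded at known calibration heights.
# Assumes calib_phase[:, i, j] increases monotonically with calib_height.
import numpy as np

def phase_to_height(rel_phase, calib_phase, calib_height):
    """rel_phase:    (H, W)    relative unwrapped phase map
       calib_phase:  (N, H, W) unwrapped phase recorded at N known height steps
       calib_height: (N,)      heights of the calibration planes in mm"""
    h, w = rel_phase.shape
    heights = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            heights[i, j] = np.interp(rel_phase[i, j],
                                      calib_phase[:, i, j], calib_height)
    return heights
```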

Specifications of the System (1/3): Deliverables (so far!)
- Speed: 8 Hz (using Fourier profilometry and Goldstein's unwrapper); 5 Hz (using Fourier profilometry and the reliability ordering unwrapper).
- Field of view: (X, Y, Z) = 400 mm × 500 mm × 400 mm.
- Spatial resolution: 262,144 x,y points.
- Multiple sensors: coverage area of around 270°.
- Measurement error: accuracy of around ±0.5 mm.
- Pre-processing techniques: noise reduction and gamma correction.
- Catalogue of measurement techniques: ability to select different algorithms.
- User interaction and multi-user modes: a GUI to interact with the user; normal and advanced modes (for both metrology experts and normal users).
- Various operating modes: online (for real-time measurements) and offline (for pre-saved images and videos).
- 3D visualisation and 2D plotting.
- Various image saving choices.
[Diagram: three sensors (Sensor1, Sensor2, Sensor3) arranged around the treatment couch]

Specifications of the System (2/3) Program snapshots

Specifications of the System (3/3) Program snapshots

Hardware Design of the System (1/3): Hardware Configuration
[Diagram: a main control & processing unit and a synchronisation unit connected to three sensors (Sensor1, Sensor2, Sensor3), each with its own sensor processing unit]

Hardware Design of the System (2/3): Sensor Components: Projector
- Canon XEED SX60, an LCoS projector (compared on the slide with a conventional LCD projector).

Hardware Design of the System (3/3): Sensor Components: Camera
- Prosilica GE1380 GigE camera:
  - Progressive scan CCD
  - 20 fps @ 1360 × 1024; 35 fps @ 512 × 512
  - Direct image registration to the system memory via a compatible Gigabit port
  - Up to 100 m cable length
[Diagram: controller broadcasting the phase-stepping, triangulation and Fourier patterns over GigE to the camera]

Software Design of the System: Software Configuration (Processing Core)
- Multithreaded processing framework (a sketch follows):
  - Input thread: projects patterns and grabs frames.
  - Processing thread: applies the measurement, unwrapping and calibration techniques.
  - Output thread: streams, saves or displays the results.
- Pipeline: grabbed images from the camera → input thread → input buffer → processing thread → output buffer → output thread → display and/or store according to the user preferences.
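A minimal Python sketch of that three-thread, two-buffer pipeline (the real system is a separate application; grab_frame, measure and deliver are placeholder callables standing in for the camera grab, the fringe analysis chain and the output stage):

```python
# Illustrative sketch of the input -> processing -> output pipeline with
# bounded buffers between the threads.
import queue
import threading

input_buffer = queue.Queue(maxsize=8)      # grabbed images waiting to be processed
output_buffer = queue.Queue(maxsize=8)     # measured surfaces waiting to be delivered

def input_loop(grab_frame):
    while True:
        input_buffer.put(grab_frame())                    # project pattern and grab a frame

def processing_loop(measure):
    while True:
        output_buffer.put(measure(input_buffer.get()))    # analysis, unwrapping, calibration

def output_loop(deliver):
    while True:
        deliver(output_buffer.get())                      # stream / save / display the result

def start_pipeline(grab_frame, measure, deliver):
    for target, arg in ((input_loop, grab_frame),
                        (processing_loop, measure),
                        (output_loop, deliver)):
        threading.Thread(target=target, args=(arg,), daemon=True).start()
```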

Results (1/3) Static Object Measurement – One Sensor

Results (2/3) Static Object Measurement – Multi-Sensor
[Figures: measurements from Sensor1 and Sensor2; sensor arrangement around the treatment couch]

Results (3/3) Moving Object Measurement – One Sensor

Thank You!