Electro-optical 3D Metrology
Gregory Avady, Ph.D.
Copyright © Gregory Avady. All rights reserved.

Overview
3D Metrology

Purpose:
– Find the target's coordinates or orientation in 3D space

Resources:
– One or more optoelectronic sensors (cameras)
– Some knowledge of the target, such as: geometry (shape), color, brightness, approximate location
Typical Block Diagram

[Diagram: N sensors, each with its own coordinate frame (X_i, Y_i, Z_i, origin O_i), view the target in the world frame (X, Y, Z, O). The sensors feed an acquisition device, which feeds a processor; the processor drives the output and a control device. An optional laser designator illuminates the target.]
System Classifications

Target profile:
– Cooperative
– Non-cooperative

Illumination type:
– Active systems (using a laser designator)
– Passive systems

Sensor type:
– 2D sensors (standard cameras)
– 1D sensors (linear CCD cameras)
– Single-element sensors (photodiodes)

Metrology objective:
– Range or angular position
– 3D measurement
– Orientation measurement
Cooperative Configuration

– Known target geometry (including the distances between reference marks)
– Some reference marks are under system control (each mark can be turned on or off at any time)

Typical procedure:
– For each mark:
  - Activate the mark
  - Acquire an image
  - Measure the mark's 3D coordinates
– Calculate the target orientation
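The cooperative procedure above can be sketched in code. This is a minimal illustration, not the author's implementation: the `Mark` class and the `acquire`/`locate_3d` callbacks are hypothetical stand-ins for the real hardware interfaces.

```python
# Minimal sketch of the cooperative measurement loop (illustrative only).

class Mark:
    """A controllable reference mark that the system can switch on and off."""
    def __init__(self, mark_id):
        self.mark_id = mark_id
        self.lit = False
    def on(self):
        self.lit = True
    def off(self):
        self.lit = False

def measure_cooperative(marks, acquire, locate_3d):
    """Activate each mark in turn, acquire images, and measure its 3D position."""
    coords = {}
    for m in marks:
        m.on()                        # exactly one mark lit -> identification is trivial
        images = acquire()            # one image per sensor
        coords[m.mark_id] = locate_3d(images)
        m.off()
    return coords                     # target orientation is then fit to the known geometry

# Usage with dummy acquisition/measurement callbacks:
marks = [Mark(i) for i in range(3)]
coords = measure_cooperative(marks,
                             acquire=lambda: ["image"],
                             locate_3d=lambda imgs: (0.0, 0.0, 0.0))
```

Because only one mark is lit at a time, the hardest step of the non-cooperative case (mark identification) disappears by construction.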
Non-Cooperative Configuration

– Known target geometry (including the distances between reference marks)

The most difficult steps are:
– Identifying the reference marks, using:
  - The mutual location of the marks
  - The previous target and/or mark locations
– Finding corresponding marks across all camera images
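One way to act on "mutual location of the marks" is to match observed marks to model marks by their pairwise-distance signatures. The sketch below is a hypothetical brute-force illustration (practical for a handful of marks), not a method from the slides.

```python
# Hypothetical sketch: identify marks by matching pairwise distances
# against the known target geometry (brute force over permutations).
from itertools import permutations
import math

def identify_marks(model, observed):
    """Return the model-index permutation that best explains the observed marks.

    model, observed: lists of 2D (or 3D) point tuples of equal length.
    """
    def dists(pts):
        # ordered list of all pairwise distances, acting as a signature
        return [math.dist(p, q) for p in pts for q in pts]
    obs_sig = dists(observed)
    best, best_err = None, float("inf")
    for perm in permutations(range(len(model))):
        sig = dists([model[i] for i in perm])
        err = sum((a - b) ** 2 for a, b in zip(sig, obs_sig))
        if err < best_err:
            best, best_err = perm, err
    return best

# Usage: a 3-4-5 triangle of marks observed in shuffled order.
model = [(0.0, 0.0), (3.0, 0.0), (0.0, 4.0)]
observed = [(0.0, 4.0), (0.0, 0.0), (3.0, 0.0)]
assignment = identify_marks(model, observed)
```

A scalene mark layout (all mutual distances distinct) makes the assignment unambiguous; symmetric layouts would need the previous-location cue mentioned above to break ties.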
Illumination Types

Active system:
– External target illumination
– If the laser designator's orientation is known, one 2D sensor (or two 1D sensors) may be removed

Passive system:
– Only the target's own features are used
– The system is not detectable from the target
Sensor Types

2D scanning sensors (standard cameras):
– Most flexible
– Provide more information than any other sensor type

1D scanning sensors (linear CCD cameras):
– Highest resolution in one direction and, as a result, the highest measurement accuracy
– Fastest (fewer total pixels)
– Require special cylindrical optics
– High probability of "false parallax" when multiple reference marks are present

Single-element sensors (photodiodes):
– Least expensive
– Low accuracy
1D Scanning Sensors

[Diagram: cylindrical optics with focal length F image the scene onto a linear array of individual pixels; coordinate axes X, Y, Z shown.]
Single-Element Sensor Concept

[Diagram: light from the object passes through the optics and a half mirror to the active sensor; a mirror and a variable-density filter route a reference beam to the reference sensor.]
Active System Concept

[Diagram: a laser designator (frame X_d, Y_d, Z_d, origin O_d) projects a reference mark onto the target surface, which is observed by a 2D camera (frame X_c, Y_c, Z_c, origin O_c).]
Analytical Description

For sensor #i, eleven calibration coefficients are required: a_i1, a_i2, …, a_i11.

These coefficients define the dependence between a point's 2D coordinates on sensor #i and the object's 3D coordinates.

Here: [X*_ai, Y*_ai] are point #a's 2D coordinates on sensor #i (N is the number of sensors).
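The projection formula itself did not survive extraction from the slide. An eleven-coefficient model relating 3D object coordinates to 2D sensor coordinates matches the classical Direct Linear Transformation (DLT), so the sketch below assumes that standard form; treat the exact formula as a reconstruction, not a quotation of the slide.

```python
# Assumed standard 11-coefficient DLT projection model:
#   X* = (a1*X + a2*Y + a3*Z + a4) / (a9*X + a10*Y + a11*Z + 1)
#   Y* = (a5*X + a6*Y + a7*Z + a8) / (a9*X + a10*Y + a11*Z + 1)

def project(a, X, Y, Z):
    """Project a 3D point onto a sensor given its 11 DLT coefficients a[0..10]."""
    w = a[8] * X + a[9] * Y + a[10] * Z + 1.0     # common denominator
    x = (a[0] * X + a[1] * Y + a[2] * Z + a[3]) / w
    y = (a[4] * X + a[5] * Y + a[6] * Z + a[7]) / w
    return x, y
```

With a9 = a10 = a11 = 0 the denominator is 1 and the model degenerates to an affine camera; nonzero values introduce perspective division.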
Analytical Description (Direct Task)

The following system of linear equations is used to calculate the 3D coordinates of a selected point on the object (N is the number of sensors).

Here: [X*_ai, Y*_ai] are point #a's 2D coordinates on sensor #i.
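Under the 11-coefficient DLT assumption made above, each sensor observation contributes two equations that are linear in the unknown 3D coordinates, and the point is recovered by least squares over all N sensors. This is a sketch of that direct task, not the slide's own derivation.

```python
# Least-squares triangulation under the assumed DLT model. Each sensor with
# coefficients a[0..10] and observation (x*, y*) contributes two linear
# equations in (X, Y, Z):
#   (a1 - a9*x*) X + (a2 - a10*x*) Y + (a3 - a11*x*) Z = x* - a4
#   (a5 - a9*y*) X + (a6 - a10*y*) Y + (a7 - a11*y*) Z = y* - a8

def triangulate(observations):
    """observations: list of (a, x_star, y_star) over N sensors -> (X, Y, Z)."""
    # Accumulate the 3x3 normal equations A^T A x = A^T b.
    AtA = [[0.0] * 3 for _ in range(3)]
    Atb = [0.0] * 3
    for a, xs, ys in observations:
        rows = [
            ([a[0] - a[8] * xs, a[1] - a[9] * xs, a[2] - a[10] * xs], xs - a[3]),
            ([a[4] - a[8] * ys, a[5] - a[9] * ys, a[6] - a[10] * ys], ys - a[7]),
        ]
        for r, b in rows:
            for i in range(3):
                Atb[i] += r[i] * b
                for j in range(3):
                    AtA[i][j] += r[i] * r[j]

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    d = det3(AtA)
    solution = []
    for k in range(3):                    # Cramer's rule on the 3x3 system
        mk = [row[:] for row in AtA]
        for i in range(3):
            mk[i][k] = Atb[i]
        solution.append(det3(mk) / d)
    return tuple(solution)

# Usage: two orthogonal affine sensors viewing the point (2, 3, 4).
cam1 = [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0]   # measures (X, Y)
cam2 = [0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0]   # measures (Z, Y)
X, Y, Z = triangulate([(cam1, 2.0, 3.0), (cam2, 4.0, 3.0)])
```

At least two sensors (four equations) are needed so that the three unknowns are fully constrained; extra sensors simply over-determine the system and average out noise.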
Analytical Description (Inverse Task)

The following equations are used for sensor #i calibration, i.e., for calculating the coefficients a_i1, a_i2, …, a_i11 (N is the number of sensors; M is the number of calibration data points).

Here:
[X*_ai, Y*_ai] – point #a's 2D coordinates on sensor #i
[X_a, Y_a, Z_a] – point #a's 3D coordinates
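Again assuming the standard DLT form, the inverse task is linear in the eleven unknowns: each calibration point with known 3D coordinates and measured 2D image coordinates contributes two rows of a design matrix, and the coefficients are found by least squares over all M points (M ≥ 6 for a unique solution). The helper below builds those rows; it is an illustration of the structure, not the slide's formula.

```python
# Assumed DLT calibration equations, linear in a1..a11 for each point:
#   X a1 + Y a2 + Z a3 + a4 - X x* a9 - Y x* a10 - Z x* a11 = x*
#   X a5 + Y a6 + Z a7 + a8 - X y* a9 - Y y* a10 - Z y* a11 = y*

def calibration_rows(X, Y, Z, xs, ys):
    """Two design-matrix rows (with right-hand sides) for one calibration point."""
    row_x = [X, Y, Z, 1, 0, 0, 0, 0, -X * xs, -Y * xs, -Z * xs]
    row_y = [0, 0, 0, 0, X, Y, Z, 1, -X * ys, -Y * ys, -Z * ys]
    return (row_x, xs), (row_y, ys)

# Sanity check: rows dotted with a true coefficient vector reproduce the
# observed image coordinates (here an affine camera with x* = X, y* = Y).
a_true = [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0]
(rx, bx), (ry, by) = calibration_rows(2.0, 3.0, 4.0, 2.0, 3.0)
```

Stacking 2M such rows and solving the over-determined system (e.g., via normal equations) yields the eleven coefficients for one sensor.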
Multi-Channel Processing Procedure

For each channel:
– Acquire an image
– Filter the acquired image
– Find the center of gravity of every mark in the image
– Identify each mark, using:
  - The mutual location of the reference marks
  - The previous target/mark locations (if known)

Then:
– Find corresponding marks across the camera images
– Calculate each mark's 3D coordinates
– Calculate the target's orientation
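The "center of gravity" step above is the intensity-weighted centroid of a mark's pixels, which gives a sub-pixel position estimate. A minimal sketch (the image is represented as a plain 2D list of pixel intensities):

```python
# Intensity-weighted centroid ("center of gravity") of a bright mark.

def center_of_gravity(image):
    """Return the (row, col) centroid of a 2D list of pixel intensities."""
    total = sum_row = sum_col = 0.0
    for r, row in enumerate(image):
        for c, value in enumerate(row):
            total += value
            sum_row += r * value      # weight each row index by intensity
            sum_col += c * value      # weight each column index by intensity
    return sum_row / total, sum_col / total

# Usage: a mark spread across two pixels lands between them.
centroid = center_of_gravity([[0, 0, 0],
                              [0, 4, 4],
                              [0, 0, 0]])
```

In practice the computation would be restricted to a thresholded region around each mark so that background pixels do not bias the centroid.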