

3D Photorealistic Modeling Process

Different Sensors
– Scanners: local coordinate system
– Cameras: local camera coordinate system
– GPS: global coordinate system

Coordinate Systems
– Individual local scanner coordinates (one per scan)
– Object coordinate system (a single coordinate system aligning all scans)
– Camera coordinate system (one per photograph)
– Global coordinates

Scanner Coordinate System
– Each scan has its own local coordinate system (axes X, Y, Z)
– Not necessary to level the scanner

Camera Coordinate System
– Each photograph has its own coordinates (axes X, Y, Z)
– Units: mm or pixels

Putting It Together
– From individual scan coordinates to object coordinates
– From object (or global) coordinates to camera coordinates
– From object coordinates to global coordinates

Individual Coordinates to Object Coordinates (1/2)
Traditional survey approach:
– Level the scanner and set up a backsight.
– Knowing the scanner location and the backsight angle, transform each point into the object coordinate system (usually global).
– Advantage: easy to set up; one step from local to global coordinates.
– Disadvantage: problems when generating mesh models.
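As an illustrative sketch (not the presenters' code): for a leveled scanner, the transform reduces to a heading rotation about Z, derived from the backsight angle, plus a translation to the surveyed scanner position. The function name and argument conventions here are hypothetical.

```python
import numpy as np

def scanner_to_object(points, scanner_xyz, backsight_azimuth_rad):
    """Transform leveled-scanner points into the object frame.

    Assumes the scanner was leveled, so the only remaining rotation
    is a heading rotation about the Z axis.
    points: (N, 3) array in the scanner's local frame.
    scanner_xyz: surveyed scanner position in the object frame.
    """
    c, s = np.cos(backsight_azimuth_rad), np.sin(backsight_azimuth_rad)
    Rz = np.array([[c, -s, 0.0],
                   [s,  c, 0.0],
                   [0.0, 0.0, 1.0]])
    # Rotate about Z, then translate to the scanner's surveyed position.
    return points @ Rz.T + np.asarray(scanner_xyz, dtype=float)
```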

From Individual Coordinates to Object Coordinates (2/2)
Use mesh alignment techniques (Polyworks):
– No need to level the scanner.
– Requires overlap with common features; alignment minimizes the distance between overlapping scans.
[Figure: a transformation T between two scan coordinate frames, sc1 and sc2]
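The rigid best-fit at the core of such alignment can be sketched with the Kabsch/SVD method, here assuming corresponding points are already matched; a full ICP pipeline (as in tools like Polyworks) would alternate nearest-neighbor matching with this step. This is an illustrative sketch, not Polyworks' actual algorithm.

```python
import numpy as np

def rigid_align(src, dst):
    """Best-fit rotation R and translation t mapping src onto dst.

    src, dst: corresponding (N, 3) point sets.
    Returns (R, t) minimizing sum ||R @ src_i + t - dst_i||^2
    (Kabsch method via SVD).
    """
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - src_mean, dst - dst_mean
    # Cross-covariance, then SVD.
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t
```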

From Object to Camera (1/2)
Two approaches:
– Polynomial fit (rubber sheeting): low accuracy; no need to know the camera's intrinsic parameters.
– Projection transform (pinhole model): high accuracy.

From Object to Camera (2/2)
1. From object to camera coordinate system (pinhole model).
2. Perspective projection to convert to image coordinates (u, v; in pixels or mm).
Six unknowns (the exterior orientation), assuming a known focal length f. The problem is nonlinear and needs initial values.
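A minimal sketch of the two steps above, assuming square pixels, a principal point (cx, cy), and a focal length f expressed in pixels (all parameter names here are hypothetical):

```python
import numpy as np

def project(X_world, R, t, f, cx, cy):
    """Project (N, 3) object-space points to pixel coordinates (u, v).

    Step 1: object -> camera frame, Xc = R @ Xw + t (pinhole model).
    Step 2: perspective division, u = f*x/z + cx, v = f*y/z + cy.
    """
    Xc = X_world @ R.T + t
    x, y, z = Xc[:, 0], Xc[:, 1], Xc[:, 2]
    return np.stack([f * x / z + cx, f * y / z + cy], axis=1)
```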

Camera Calibration
Correct lens distortion:
– Radial distortion
– Tangential distortion
– Calculate f, k1, k2, p1, p2 in the lab for each lens.
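A sketch of the standard radial-plus-tangential (Brown) distortion model with radial coefficients k1, k2 and tangential coefficients p1, p2, applied to normalized image coordinates. This is an assumed form, not necessarily the exact model used in the presenters' lab calibration.

```python
def distort(xn, yn, k1, k2, p1, p2):
    """Apply radial (k1, k2) and tangential (p1, p2) lens distortion
    to normalized image coordinates (x/z, y/z), Brown's model."""
    r2 = xn * xn + yn * yn
    radial = 1.0 + k1 * r2 + k2 * r2 * r2
    xd = xn * radial + 2.0 * p1 * xn * yn + p2 * (r2 + 2.0 * xn * xn)
    yd = yn * radial + p1 * (r2 + 2.0 * yn * yn) + 2.0 * p2 * xn * yn
    return xd, yd
```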

Example of the Calibration (Canon 17 mm)
1) Radial distortion
2) Tangential distortion
3) Complete model

Example
Iteration = 8. The output lists the residuals for each control point (pts 51, 50, 2034, 2010) and the solved orientation parameters omega, phi, kappa, X, Y, Z. (The numeric values did not survive transcription.)

Bundle Adjustment Adjust the bundle of light rays to fit each photo

Bundle Adjustment (2/2)
Sample output for photos 7734 and 7735: for each photo, the observed image coordinates (U, V) per point number, followed by the solved exterior orientation (omega, phi, kappa, X, Y, Z). (The numeric values did not survive transcription.)
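The quantity being minimized can be sketched as a reprojection residual per observed image point; a full bundle adjustment stacks these residuals over all points and photos and solves the resulting nonlinear least-squares problem (distortion terms are omitted here for brevity, and all parameter names are illustrative):

```python
import numpy as np

def reprojection_residual(uv_obs, Xw, R, t, f, cx, cy):
    """Residual between an observed image point (u, v) and the
    reprojection of its 3D point.  Bundle adjustment minimizes the
    sum of squares of these residuals over every point in every photo."""
    Xc = R @ Xw + t  # object -> camera frame
    uv_pred = np.array([f * Xc[0] / Xc[2] + cx,
                        f * Xc[1] / Xc[2] + cy])
    return uv_obs - uv_pred
```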

From Object to Global (1/2)
7-parameter conformal transformation, where
m11 = cos(phi) * cos(kappa)
m12 = -cos(phi) * sin(kappa)
m13 = sin(phi)
m21 = cos(omega) * sin(kappa) + sin(omega) * sin(phi) * cos(kappa)
m22 = cos(omega) * cos(kappa) - sin(omega) * sin(phi) * sin(kappa)
m23 = -sin(omega) * cos(phi)
m31 = sin(omega) * sin(kappa) - cos(omega) * sin(phi) * cos(kappa)
m32 = sin(omega) * cos(kappa) + cos(omega) * sin(phi) * sin(kappa)
m33 = cos(omega) * cos(phi)
and s is the scale factor.
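The matrix elements above, together with the scale factor s and a translation T, give the full transform X' = s * M @ X + T. This can be sketched directly (function and variable names are illustrative):

```python
import numpy as np

def rotation_matrix(omega, phi, kappa):
    """Photogrammetric rotation matrix M(omega, phi, kappa),
    element-for-element as listed on the slide."""
    so, co = np.sin(omega), np.cos(omega)
    sp, cp = np.sin(phi), np.cos(phi)
    sk, ck = np.sin(kappa), np.cos(kappa)
    return np.array([
        [cp * ck,                 -cp * sk,                 sp],
        [co * sk + so * sp * ck,   co * ck - so * sp * sk, -so * cp],
        [so * sk - co * sp * ck,   so * ck + co * sp * sk,  co * cp],
    ])

def conformal_7p(points, s, omega, phi, kappa, T):
    """7-parameter conformal (Helmert) transform: X' = s * M @ X + T,
    applied to an (N, 3) array of points."""
    M = rotation_matrix(omega, phi, kappa)
    return s * (points @ M.T) + np.asarray(T, dtype=float)
```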

Transform to Global (2/2)
Object coordinates to GPS. The solution converged in 5 iterations, giving the scale, the rotations omega, phi, kappa, and the X, Y, Z translations, followed by X, Y, Z residuals for points 1 through 5 (e.g. 0.006 for the final Z residual). (Most numeric values did not survive transcription.)

REDUCTION TO THE ELLIPSOID
Earth radius R = 6,372,161 m (20,906,000 ft), measured from the Earth's center.
S = D x R / (R + h), where h = N + H, so
S = D x R / (R + N + H)
Here D is the measured horizontal distance, H the orthometric height, N the geoid undulation, and S the distance reduced to the ellipsoid.

REDUCTION TO GRID
Sg = S (geodetic distance) x k (grid scale factor)
(The slide's numeric example did not survive transcription.)

REDUCTION TO ELLIPSOID
S = D x [R / (R + h)]
D = measured horizontal distance (value not preserved in the transcript)
R = 6,372,162 m (mean radius of the Earth)
h = H + N = 158 m + (-24 m) = 134 m (ellipsoidal height)
S = D x [6,372,162 / (6,372,162 + 134)]
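Using only the values that survived in the slide (R = 6,372,162 m, H = 158 m, N = -24 m), the reduction factor can be reproduced. The measured distance D is not recoverable from the transcript, so this sketch computes the factor only:

```python
# Reduction to the ellipsoid with the slide's surviving values.
R = 6_372_162.0   # mean Earth radius, metres
H = 158.0         # orthometric height, metres
N = -24.0         # geoid undulation, metres
h = H + N         # ellipsoidal height: 134 m
factor = R / (R + h)  # multiply any measured horizontal distance by this

def reduce_to_ellipsoid(D):
    """S = D * R / (R + h): geodetic distance on the ellipsoid."""
    return D * factor
```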

COMBINED FACTOR
CF = ellipsoidal reduction x grid scale factor (k)
Sg = CF x D
(The slide's numeric values did not survive transcription.)

Surface Generation
– Through the merge process in Polyworks
– Through surface fitting in GoCad
– Through direct triangulation (Delaunay triangulation, TIN)

Surface Cleaning (in Polyworks)
The single most time-consuming part of the entire process (about 90% of the time):
– Filling holes (caused by scan shadows)
– Correcting bad triangles

Summary