Target Precision Determination and Integrated Navigation. By Professors Dominick Andrisani and James Bethel, and Ph.D. students Aaron Braun, Ade Mulyana, and Takayuki Hoshizaki. Purdue University, West Lafayette, IN. NIMA Meetings, December 11-12, 2001.

Purposes of this talk: To provide an overview of the results of the Purdue Motion Imagery Group, which is studying precision location of ground targets from a UAV (this work started January '01). To suggest that integrated navigators have to be re-optimized with regard to allowable errors in position and orientation of the aircraft for the problem of locating ground targets. To build a case for a new class of aircraft navigators that use imagery to improve aircraft navigation accuracy. To build a case for a new class of target locators that integrate aircraft navigation and target imagery to improve the accuracy of both aircraft location and target location.

Note to the Audience: This summary talk will be short on mathematics, procedural details, and numerical results. It will be long on the ideas and concepts that we have identified as important in improving the accuracy of target location from a UAV. Our final report will be available through contract sponsor Dave Rogers in early April. Papers documenting the details of our work and work-in-progress can be found at our project web site. (The site is password protected; contact Dave for the password.)

Objectives of the Purdue Motion Imagery Group: To study location of both target and aircraft using motion imagery with multiple ray intersections, inertial sensors, and GPS, and to do this with as few simplifying assumptions as possible. To determine which sources of error contribute most to errors in locating a ground target. To determine an error budget that will guarantee a CEP90 of 10 feet.

Multiple Ray Intersections to Define Target Location
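As a rough illustration of the idea behind this slide: each image frame defines a sighting ray from the sensor toward the target, and the target location can be estimated as the point closest, in a least-squares sense, to all of the rays. The sketch below is illustrative only (the ray representation, solver, and example geometry are assumptions), not the group's photogrammetric formulation.

```python
# Illustrative least-squares intersection of sighting rays (not the group's
# photogrammetric solution). Each ray is a camera position p_i and a unit
# pointing direction d_i; the target estimate minimizes the summed squared
# perpendicular distance to all rays.
import numpy as np

def intersect_rays(origins, directions):
    """Return the point closest (least squares) to all rays (origins[i], directions[i])."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(origins, directions):
        d = d / np.linalg.norm(d)           # ensure unit direction
        P = np.eye(3) - np.outer(d, d)      # projector onto the plane normal to d
        A += P
        b += P @ p
    return np.linalg.solve(A, b)

# Assumed example: two sightings of the same ground point from different aircraft positions.
origins = [np.array([0.0, 0.0, 3000.0]), np.array([500.0, 0.0, 3000.0])]
directions = [np.array([300.0, 0.0, -3000.0]), np.array([-200.0, 0.0, -3000.0])]
print(intersect_rays(origins, directions))   # approximately [300, 0, 0]
```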

Our overall target location problem. [Block diagram: an aircraft model (trajectory, time, and turbulence inputs, with errors) drives the aircraft motion; the GPS satellite constellation (processing mode; antenna number and location; errors) and the INS (position, attitude, rates; errors) feed a filter that produces the aircraft position and attitude estimate and its uncertainty; this is transformed to sensor position, attitude, and uncertainty and combined with sensor parameters, image acquisition parameters, a site model, and synthetic image generation in the imaging system; target tracking and multi-image intersection then yield the target coordinates and their uncertainty (CE90), plus a graphic animation.]

Covariance Analysis. [Same overall block diagram as the previous slide.] Given the covariance of zero-mean errors, find the target position covariance (CEP90) using linear methods. Problem: errors are not always zero mean.
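As a rough illustration of how a circular-error figure can be obtained from a target position covariance, the sketch below draws zero-mean Gaussian samples from an assumed 2x2 horizontal covariance and reports the 90th-percentile miss radius (CE90). The covariance values are placeholders, and this Monte Carlo shortcut stands in for, rather than reproduces, the group's linear covariance analysis.

```python
# Illustrative Monte Carlo shortcut for CE90 from a horizontal position covariance.
import numpy as np

def ce90_from_covariance(P_horizontal, n_samples=200_000, seed=0):
    """90th-percentile horizontal miss distance for zero-mean Gaussian errors."""
    rng = np.random.default_rng(seed)
    errors = rng.multivariate_normal([0.0, 0.0], P_horizontal, size=n_samples)
    radii = np.linalg.norm(errors, axis=1)
    return np.percentile(radii, 90.0)

P = np.array([[25.0, 5.0],     # ft^2, illustrative values only
              [5.0, 16.0]])
print(f"CE90 = {ce90_from_covariance(P):.1f} ft")
```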

Covariance Analysis (see references by Aaron Braun): Rigorous sensor modeling is important in determining target location. Aircraft orientation accuracy and aircraft position accuracy are both important to target location accuracy. The relative importance of the various error sources to the CEP90 is being determined. See references by Professor James Bethel: Sometimes errors are not zero mean but biased; the best example of this is GPS positioning. Quoted GPS accuracy reflects the sum of the bias and random components. Biased aircraft positioning will lead to biased target positioning, and covariance analysis will not show this fact.

Position Accuracy vs. Orientation Accuracy

Stand-Alone Target Precision Calculator

Frames, with Position Bias

Today's Integrated Inertial Navigator (Inertial + GPS). [Same overall block diagram as before.]

Status of our work on an Integrated Inertial Navigator: Ade Mulyana and Taka Hoshizaki have completed the development of an integrated navigator of this form. Results show that improving the GPS subsystem produces a significant improvement in aircraft position accuracy. Results also show that improving the inertial navigation subsystem produces a significant improvement in aircraft orientation accuracy. Since both aircraft position and orientation are important in targeting, careful re-optimization of the INS and GPS systems is required for the ground targeting scenario. Our error budget will help in this re-optimization. See references by Mulyana and Hoshizaki.

Local Frame Position Errors: (true) − (estimated). [Plot: dx, dy, dz errors in meters versus time in seconds.] GPS performance directly affects position errors. Covariance and nominal trajectory data from 200–300 s are passed to the imagery analysis.

Local Frame Euler Angle Errors: (true) − (estimated). [Plot: droll, dpitch, dyaw errors in radians versus time in seconds.] INS accuracy helps orientation accuracy.

Proposed Imaging Navigator (Inertial + GPS + Imagery). [Same overall block diagram as before.] Do all of this simultaneously for improved accuracy in aircraft positioning. The targets may include one or more known control points; known control points improve aircraft accuracy.

Status of our work on the Imaging Navigator: A fully integrated nonlinear Imaging Navigator will be developed under a subsequent contract. Preliminary analysis by Andrisani using greatly simplified models and linear analysis is encouraging. Flying over known control points improves aircraft position accuracy; this is a standard INS update technique. Flying over stationary objects on the ground should minimize the effects of velocity biases and rate-gyro biases in the inertial navigator, which should improve aircraft position and orientation accuracy.

Proposed Integrated Target Locator (Inertial + GPS + Imagery). [Same overall block diagram as before.] Do all of this simultaneously for improved accuracy in target positioning. The targets may include one or more known control points; known control points improve target accuracy.

Status of our work on the Integrated Target Locator: A nonlinear Integrated Target Locator will be developed under a subsequent contract. Preliminary analysis by Andrisani using greatly simplified models and linear analysis is encouraging. Flying over known control points improves target position accuracy. Flying over stationary objects on the ground should minimize the effects of velocity biases and rate-gyro biases in the inertial navigator, which should improve target position accuracy.

Simplified Integrated Target Locator. Hypothesis: Given a combined estimator of aircraft position and target position capable of imaging an unknown target and a known control point, if the control point enters the field of view of the imaging system, the accuracy of the simultaneous estimate of aircraft position and unknown target position will be significantly improved.

Technical Approach: Use a linear, low-order simulation of a simplified linear aircraft model, and a simple linear estimator, to gain insight into the problem with a minimum of complexity. A control point of known location enters the field of view of the image processor only during a limited time window.

Linear Simulation: Flyover trajectory. [Figure: the aircraft flies from its initial position at time = 0 s to its final position at time = 200 s at a nominal speed of 100 ft/sec, with data every 0.1 s (i.e., every 10 ft); the camera (focal plane, f = 150 mm) always looks down; the unknown target is always visible, while a control point of known location is visible only during a limited time window; measurements are aircraft position (ft), image coordinate x (microns), and range R (ft).]
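One plausible form of the image-coordinate measurement in a flyover simulation like this, assuming an idealized nadir-pointing pinhole camera: the horizontal offset between target and aircraft maps to an image coordinate scaled by focal length over altitude. Only the focal length (f = 150 mm) comes from the slide; the altitude, unit handling, and function below are assumptions.

```python
# Assumed nadir-pointing pinhole camera model for the image-coordinate measurement.
FT_TO_M = 0.3048

def image_coordinate_microns(x_target_ft, x_aircraft_ft, altitude_ft, focal_length_m=0.150):
    """Image x-coordinate (microns) of a ground point for a camera looking straight down."""
    offset_m = (x_target_ft - x_aircraft_ft) * FT_TO_M
    altitude_m = altitude_ft * FT_TO_M
    x_image_m = focal_length_m * offset_m / altitude_m   # similar triangles
    return x_image_m * 1.0e6                             # metres -> microns

# Placeholder geometry (target 300 ft ahead of the aircraft, assumed 20,000 ft altitude)
print(image_coordinate_microns(x_target_ft=300.0, x_aircraft_ft=0.0, altitude_ft=20000.0))
```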

Nominal Measurement Noise in the Simulation: σ(aircraft position) = 1 ft; σ(image coordinate) = 7.5 microns; σ(range) = 1 ft.

State Space Model. Linear state equation: x(j+1) = Φ(j, j−1) x(j) + v(j) + w(j). Nonlinear measurement equation: z(j) = h(x(j)) + u(j). Initial condition: x(0) = x_0 (Gaussian), where v(j) is a known input, w(j) is Gaussian white process noise, and u(j) is Gaussian white measurement noise.

The Kalman Filter State Estimator: initialize, predict one step, measurement update.
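A minimal sketch of the estimator named on this slide, written for the state-space model above (linear dynamics, nonlinear measurement), is one predict/update cycle of a discrete extended Kalman filter. Initialization uses x(0) = x_0 and its covariance; the function signature and variable names are illustrative, not the authors' implementation.

```python
# Sketch of one predict/update cycle for the model above: linear dynamics
# x(j+1) = Phi x(j) + v(j) + w(j), nonlinear measurement z(j) = h(x(j)) + u(j).
# Initialize with x(0) = x0 and covariance P0, then call ekf_step per measurement.
import numpy as np

def ekf_step(x, P, Phi, v, Q, z, h, H_jac, R):
    # Predict one step
    x_pred = Phi @ x + v                      # v(j) is the known input
    P_pred = Phi @ P @ Phi.T + Q              # Q: process-noise covariance

    # Measurement update
    H = H_jac(x_pred)                         # Jacobian of h at the prediction
    innovation = z - h(x_pred)
    S = H @ P_pred @ H.T + R                  # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_new = x_pred + K @ innovation
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```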

Tabulated Results

Residuals of the Kalman Filter, σ(aircraft) = 100 ft. [Plot: aircraft position residual (ft), Target 1 position error (ft), and Target 2 position error (ft) versus time; no measurement is available over part of the run.]

Estimated State − Actual State, σ(aircraft) = 100 ft. [Plot: aircraft position error (ft), Target 1 position error (ft), and Target 2 position error (ft) versus time; the control point has a major impact where it is in view.]

Expanded time scale for Estimated State − Actual State. [Plot: aircraft position error (ft), Target 1 position error (ft), and Target 2 position error (ft); the control point has a major impact where it is in view.]

Estimated State − Actual State, σ(aircraft) = 10 ft. [Plot: aircraft position error (ft), Target 1 position error (ft), and Target 2 position error (ft) versus time.]

Expanded time scale, σ(aircraft) = 10 ft. [Plot: aircraft position error (ft), Target 1 position error (ft), and Target 2 position error (ft); the control point has a major impact where it is in view.]

Estimated State − Actual State, σ(aircraft) = 1 ft. [Plot: aircraft position error (ft), Target 1 position error (ft), and Target 2 position error (ft) versus time; the control point has little or no impact.]

Expanded time scale, σ(aircraft) = 1 ft. [Plot: aircraft position error (ft), Target 1 position error (ft), and Target 2 position error (ft); the control point has little impact.]

Estimated State − Actual State, σ(range) = 10 ft. [Plot: aircraft position error (ft), Target 1 position error (ft), and Target 2 position error (ft) versus time; the control point has no impact here.]

Expanded time scale, σ(range) = 10 ft. [Plot: aircraft position error (ft), Target 1 position error (ft), and Target 2 position error (ft); the control point has no impact here.]

Two Useful Scenarios. [Diagram:] (1) An "Imaging Navigator" with camera #1 on target #1, plus INS and GPS, gives improved aircraft position; an image-based target locator using camera #2 on target #2 then combines the aircraft and target data to give improved target position. (2) An "Integrated Target Locator" uses one camera to simultaneously or sequentially track two targets, plus INS and GPS, to give improved aircraft and target positions.

Conclusions: 1. Both aircraft position accuracy and orientation accuracy strongly affect the accuracy of target location. 2. Accuracy specifications for position and orientation in integrated inertial navigators should be re-optimized for the problem of achieving the desired accuracy in target location; our error budget to achieve a 10 ft CEP90 should help in this re-optimization. 3. Regarding our proposed "Integrated Target Locator," when the measurement noise on aircraft position is large (σ(aircraft) >> 1 ft), the sighting of a known control point significantly improves both the aircraft position accuracy and the unknown target position accuracy. This suggests that flying over control points is tactically useful. 4. The dramatic improvement in aircraft position estimation suggests that a new type of navigator, the "Imaging Navigator," should be developed. This navigator would integrate INS, GPS, and an image processor looking at known or unknown objects on the ground; one or two cameras might be used.

References
Presented at the Motion Imagery Geolocation Workshop, SAIC Signal Hill Complex, 10/31/01:
1. Dominick Andrisani, Simultaneous Estimation of Aircraft and Target Position With a Control Point.
2. Ade Mulyana and Takayuki Hoshizaki, Simulation of Tightly Coupled INS/GPS Navigator.
3. James Bethel, Error Propagation in Photogrammetric Geopositioning.
4. Aaron Braun, Estimation Models and Precision of Target Determination.
Presented at the Motion Imagery Geopositioning Review and Workshop, Purdue University, 24/25 July 2001:
1. Dominick Andrisani, Simultaneous Estimation of Aircraft and Target Position.
2. Jim Bethel, Motion Imagery Modeling Study Overview.
3. Jim Bethel, Data Hiding in Imagery.
4. Aaron Braun, Estimation and Target Accuracy.
5. Takayuki Hoshizaki and Dominick Andrisani, Aircraft Simulation Study Including Inertial Navigation System (INS) Model with Errors.
6. Ade Mulyana, Platform Position Accuracy from GPS.

Related Literature
B.H. Hafskjold, B. Jalving, P.E. Hagen, and K. Grade, "Integrated Camera-Based Navigation," Journal of Navigation, Vol. 53, No. 2.
Daniel J. Biezad, Integrated Navigation and Guidance Systems, AIAA Education Series.
D.H. Titterton and J.L. Weston, Strapdown Inertial Navigation Technology, Peter Peregrinus, Ltd.
A. Lawrence, Modern Inertial Technology, Springer.
B. Stietler and H. Winter, Gyroscopic Instruments and Their Application to Flight Testing, AGARDograph No. 160, Vol. 15.
A.K. Brown, "High Accuracy Targeting Using a GPS-Aided Inertial Measurement Unit," ION 54th Annual Meeting, June 1998, Denver, CO.

Structure of Simulation: Tightly Coupled INS/GPS. [Block diagram: the UAV's IMU drives the navigation (Nav) block, which outputs position, velocity, orientation, and covariance; GPS receiver measurements are differenced against the navigation solution and processed by the Kalman filter, whose INS bias corrections and position, velocity, orientation, and covariance corrections are fed back.]

Simplified IMU Model: (sensor output) = (sensor input) + bias + white noise, applied to all accelerometer outputs and rate-gyro outputs. The bias is a Markov process with time constant tc = 60 s for all sensors.
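A small sketch of this sensor error model, under stated assumptions: the output equals the input plus a first-order Gauss-Markov bias (time constant 60 s, as on the slide) plus white noise. The sample rate and noise magnitudes are placeholders, not the LN-100G or LN-200 values from the error budget.

```python
# Sketch of the slide's sensor model: output = input + bias + white noise, with the
# bias a first-order Gauss-Markov process (time constant tc = 60 s as stated).
import numpy as np

def simulate_sensor(u_true, dt=0.01, tc=60.0, sigma_bias=1e-3, sigma_noise=1e-2, seed=0):
    """Corrupt a true input time history the way the simplified IMU model does."""
    rng = np.random.default_rng(seed)
    u_true = np.asarray(u_true, dtype=float)
    bias = np.zeros(len(u_true))
    for k in range(1, len(u_true)):
        # Discrete first-order Gauss-Markov bias propagation
        bias[k] = bias[k - 1] * np.exp(-dt / tc) + sigma_bias * np.sqrt(dt) * rng.standard_normal()
    white = sigma_noise * rng.standard_normal(len(u_true))
    return u_true + bias + white
```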

GPS Receiver Model: Pseudorange = |(satellite position) − (platform position)| + (pseudorange-equivalent clock bias, a random walk) + (normally distributed random noise). Pseudorange rate = (range rate along the line of sight) + (pseudorange-rate-equivalent clock drift, a random walk) + (normally distributed random noise).
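The measurement model can be sketched as follows, assuming the standard tightly coupled form: pseudorange equals geometric range plus the pseudorange-equivalent clock bias plus noise, and pseudorange rate equals the line-of-sight range rate plus the clock drift plus noise. Noise levels and names are illustrative.

```python
# Sketch of the GPS measurement model described on the slide.
import numpy as np

def pseudorange(r_platform, r_satellite, clock_bias, sigma=1.0, rng=np.random.default_rng(0)):
    geometric_range = np.linalg.norm(r_satellite - r_platform)
    return geometric_range + clock_bias + sigma * rng.standard_normal()

def pseudorange_rate(r_platform, v_platform, r_satellite, v_satellite,
                     clock_drift, sigma=0.1, rng=np.random.default_rng(0)):
    los = r_satellite - r_platform
    los = los / np.linalg.norm(los)                   # line-of-sight unit vector
    range_rate = los @ (v_satellite - v_platform)     # relative velocity along the LOS
    return range_rate + clock_drift + sigma * rng.standard_normal()
```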

Kalman Filter: Error Dynamics. The 17-state Kalman filter estimates orientation angle errors, velocity errors, position errors, gyro biases, accelerometer biases, and clock bias and drift.
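For reference, one possible index layout for a 17-state error vector with these groupings is sketched below; the slide gives only the groupings, so the ordering and per-group counts (3 + 3 + 3 + 3 + 3 + 2 = 17) are an assumption consistent with the stated total.

```python
# Assumed index layout for the 17-state error vector (ordering is illustrative).
ERROR_STATE_INDEX = {
    "orientation_errors": slice(0, 3),    # roll, pitch, yaw errors
    "velocity_errors":    slice(3, 6),
    "position_errors":    slice(6, 9),
    "gyro_biases":        slice(9, 12),
    "accel_biases":       slice(12, 15),
    "clock_bias":         15,
    "clock_drift":        16,
}
```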

Kalman Filter: Output Equation. The measurements (GPS pseudoranges and pseudorange rates) are related to the error states through a linear output equation with additive random measurement noise.

Initial Error Conditions. [Table: initial errors and initial covariance values.]

Error Source Specifications: INS. [Table: two levels of INS are used for the simulation, the LN-100G (good) and the LN-200 IMU (worse); for each, the accelerometer bias and white noise (sqrt(PSD)) and the rate-gyro bias and white noise (sqrt(PSD), in deg/hr/sqrt(Hz)) are specified.]

Error Source Specifications: GPS. [Table: two levels of GPS receiver are used for the simulation, Receiver 1 (good) and Receiver 2 (worse); for each, the pseudorange noise (m), pseudorange-rate noise (m/s), clock-bias white noise (PSD), and clock-drift white noise (PSD) are specified.]

Satellite Geometry during the Simulation

Nominal Trajectory. [Figure: the local frame (x, y, z) is defined relative to the ECEF frame (Xecef, Yecef, Zecef) by x = Zecef, y = −Yecef, z = Xecef; axes in meters.]
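The axis mapping stated on this slide translates directly into code; a trivial sketch:

```python
# Local-frame definition from the slide: x = Zecef, y = -Yecef, z = Xecef.
import numpy as np

def ecef_to_local(r_ecef):
    """Map an ECEF vector (Xecef, Yecef, Zecef) into the local (x, y, z) frame."""
    x_e, y_e, z_e = r_ecef
    return np.array([z_e, -y_e, x_e])

# Example: a point on the ECEF X axis lies along the local z axis.
print(ecef_to_local(np.array([1.0, 0.0, 0.0])))   # -> [0. 0. 1.]
```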

Local Frame Velocity Errors: (true) − (estimated). [Plot: velocity errors versus time in seconds.] GPS performance directly affects velocity errors.