Terrain Relative Navigation for Pinpoint Landing using Cubesats


Terrain Relative Navigation for Pinpoint Landing using Cubesats
Mars Cubesat Workshop
Swati Mohan, Andrew Johnson, Nikolas Trawny
November 21, 2014
Copyright 2014 California Institute of Technology. Government sponsorship acknowledged. For planning and discussion purposes only. Cleared for unlimited release (CL#14-4997).

LVS is an extension of MER-DIMES

MER-DIMES (2004) was the first use of descent images for navigation during EDL. DIMES tracked features between successive descent images (first pair tracking, second pair tracking) to estimate horizontal velocity.

TRN is an extension of MER-DIMES that estimates position for accurate landing: it automatically matches features in a descent image to landmarks in a reference map to obtain a position fix, reducing position knowledge error from its before-TRN level to a much smaller after-TRN level.

Heritage: Champollion, MER, NMP/ST9, Mars Technology Program.

[Figure: descent image features matched to reference-map landmarks; position knowledge error before and after TRN.]

For Planning and Discussion Purposes Only.
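The map-matching step described above can be sketched in a few lines: locate a known landmark template inside a descent image by normalized cross-correlation, then convert the pixel offset into a horizontal position fix. This is an illustrative sketch only; the function names and the simple pinhole-camera scaling are assumptions, not the flight algorithm.

```python
import numpy as np

def ncc_match(image, template):
    """Exhaustive normalized cross-correlation; returns best (row, col)."""
    th, tw = template.shape
    t = template - template.mean()
    tnorm = np.sqrt((t * t).sum())
    best, best_rc = -np.inf, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            patch = image[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p * p).sum()) * tnorm
            if denom == 0:
                continue
            score = (p * t).sum() / denom
            if score > best:
                best, best_rc = score, (r, c)
    return best_rc

def pixel_to_horizontal_offset(pix_offset, altitude_m, focal_px):
    """Pinhole model: horizontal metres per pixel of image offset."""
    return np.array(pix_offset, dtype=float) * altitude_m / focal_px

# Toy example: the "landmark" is a patch cut from a synthetic descent image.
rng = np.random.default_rng(0)
image = rng.random((64, 64))
template = image[20:36, 30:46].copy()
row, col = ncc_match(image, template)
offset_m = pixel_to_horizontal_offset((row - 20, col - 30),
                                      altitude_m=2000.0, focal_px=512.0)
print(row, col, offset_m)  # exact match -> (20, 30) and zero offset
```

A flight implementation would search many templates against a georeferenced map and fuse the resulting fixes, but the core operation is this correlation peak search.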

LVS TRN Implementation

The LVS TRN estimator runs in two stages:
- VISINAV batch initialization: TRON coarse matches (coarse-match image templates against the map) feed a batch update that removes the large initial position uncertainty (50 m 1-σ).
- VISINAV extended Kalman filter: TRON fine matches (fine-match image templates) feed EKF updates that improve position accuracy (20 m 1-σ).

IMU data propagates the state between camera images (Image 1 through Image 10); each image triggers either the batch update or an EKF update. Measurement residuals are monitored over time against an outlier threshold, and any residual exceeding the threshold is rejected as an outlier.

[Figure: timeline of IMU propagation steps and image updates; residual-versus-time plot showing the outlier threshold.]

For Planning and Discussion Purposes Only.

Prototype LVS Block Diagram

The LVS Compute Element connects over a compact PCI backplane and contains:
- LEON3 flight processor (RTEMS OS): batch-init estimator, extended Kalman filter, image processing control, homography generation, outlier rejection, data sequencing, and a data logger.
- Virtex 5 FPGA (AMBA bus): homography warp, image normalize, FFT correlation, interest operator, spatial correlation, sensor timing, camera interface, and IMU interface.
- SDRAM holding the reference map.

Sensors: the LVS IMU supplies accelerations and angular rates; the LVS camera supplies grayscale images. The lander GN&C provides the initial state (p+perr, q, v+verr, A)M_B and receives the position correction (M_B) plus data products.

For Planning and Discussion Purposes Only.
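The FFT-correlation stage that the FPGA accelerates can be sketched in numpy: correlate a mean-removed template against a (notionally homography-warped) image in the frequency domain and take the correlation peak. Sizes and names here are illustrative assumptions, not the flight design.

```python
import numpy as np

def fft_correlate(image, template):
    """Locate template in image via frequency-domain cross-correlation."""
    t = np.zeros_like(image)
    t[: template.shape[0], : template.shape[1]] = template - template.mean()
    # Correlation theorem: corr = IFFT( FFT(image) * conj(FFT(template)) )
    spec = np.fft.fft2(image - image.mean()) * np.conj(np.fft.fft2(t))
    corr = np.fft.ifft2(spec).real
    return np.unravel_index(np.argmax(corr), corr.shape)

# Toy example: the template is a subimage, so the peak is its location.
rng = np.random.default_rng(1)
img = rng.random((128, 128))
r, c = fft_correlate(img, img[40:56, 70:86])
print(r, c)
```

Doing the multiply in the frequency domain turns an O(N^2) spatial search into two FFTs and a pointwise product, which is why this stage maps naturally onto FPGA hardware.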

Implementation on a Cubesat

Hardware
- Compute element: Existing TRN boards are larger than Cubesat scale, and scaling them down to Cubesat size requires time and money. The JPL-developed IRIS radio includes a Cubesat-scale Virtex 5 implementation, and a deep-space Cubesat processor is in development at JPL Section 349.
- IMU: Current TRN designs use an LN200 or MIMU. Cubesat-scale IMUs with similar performance exist, for example the BCT MEMS gyro and the KVH FOG gyro.
- Camera: Current TRN designs use a 1024x1024 array with a 90 deg field of view. Cubesat-scale detectors with similar performance exist, for example the KAI-04022 (2048x2048). Optics would need to be designed for each mission.

Software
- TRN algorithms have been implemented and field tested. Mission-specific development still needs to occur: porting to the mission processor, interfacing with the mission GN&C, and obtaining and implementing maps for the mission landing site.

For Planning and Discussion Purposes Only.
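A quick way to compare the detector options above is ground sample distance: metres of terrain per pixel at nadir for a square detector of a given field of view. The altitude used below is an illustrative assumption, not a mission value.

```python
import math

def gsd_m(altitude_m, fov_deg, n_pixels):
    """Ground footprint of one pixel at nadir for a square detector."""
    swath = 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)
    return swath / n_pixels

alt = 2000.0  # assumed altitude during map matching
print(gsd_m(alt, 90.0, 1024))   # baseline 1024x1024 TRN camera, ~3.9 m/px
print(gsd_m(alt, 90.0, 2048))   # 2048x2048 Cubesat-scale detector, ~2.0 m/px
```

At the same field of view, the 2048x2048 detector halves the ground sample distance, so matching the baseline camera's performance is plausible even after mission-specific optics are designed.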

Possible Mission Applications

Technology advancement
- Technology demonstration of pinpoint landing
- Technology demonstration of hazard detection

Science
- Precision placement of probes (seismometers, weather stations, etc.)
- Mapping of Phobos or Deimos

Landing site support
- Precursor probes to confirm "worthiness" of potential landing site targets

And many more…!

For Planning and Discussion Purposes Only.