University of Pennsylvania 1 GRASP Control of Multiple Autonomous Robot Systems
Vijay Kumar, Camillo Taylor, Aveek Das, Guilherme Pereira, John Spletzer
GRASP Laboratory

University of Pennsylvania 2 GRASP Multiple Autonomous Robots
Hybrid Systems Approach to Robot Software
- modes as behaviors
- composition of modes
Cooperative Control of Multiple Robots
- cooperative manipulation
- formation control
- tracking
- pursuit
Human interaction
- visualization

University of Pennsylvania 3 GRASP Vision for Multi-Robot Teams
- Mobile platforms for deploying cameras into an environment
- The case for cameras
  - Small
  - Cheap
  - Passive
  - Low power
- Uses for imagery
  - Visualization of remote environments
  - Obtaining information about targets
    - Position, level of activity, etc.
  - Basis for convenient human-robot interfaces

University of Pennsylvania 4 GRASP Visualization of Remote Environments
- Registered omnidirectional images can be used to visualize remote scenes

University of Pennsylvania 5 GRASP Visualizing the scene
- The scene can be interactively explored and/or revisited with a new camera trajectory specified by the user

University of Pennsylvania 6 GRASP GRASP Laboratory

University of Pennsylvania 7 GRASP View Synthesis with Quasi-Sparse Correspondences
- Dense correspondences can be difficult to obtain due to:
  - Occluded regions
  - Homogeneous image regions
- Strategy
  - Focus on accurately reproducing the motion of edges in the scene
  - Use interpolation to estimate the motion of the other points (see the sketch below)
- Basis for visualization in MARS 2020
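Below is a minimal sketch of the interpolation step, assuming the quasi-sparse edge correspondences are already available as matched 2-D points with known displacements; the function name and the use of scipy.interpolate.griddata are illustrative choices, not the original GRASP implementation.

```python
import numpy as np
from scipy.interpolate import griddata

def dense_flow_from_edges(edge_pts, edge_flow, h, w):
    """Interpolate a dense motion field from sparse edge correspondences.

    edge_pts  : (N, 2) array of (x, y) edge locations in the reference view
    edge_flow : (N, 2) array of the motion of those edges toward the novel view
    Returns an (h, w, 2) displacement field for every pixel.
    """
    ys, xs = np.mgrid[0:h, 0:w]
    grid = np.column_stack([xs.ravel(), ys.ravel()])
    channels = []
    for k in range(2):
        # Linear interpolation inside the convex hull of the edge points,
        # nearest-neighbour fill for pixels outside it.
        lin = griddata(edge_pts, edge_flow[:, k], grid, method='linear')
        nn = griddata(edge_pts, edge_flow[:, k], grid, method='nearest')
        channels.append(np.where(np.isnan(lin), nn, lin))
    return np.stack(channels, axis=-1).reshape(h, w, 2)
```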

University of Pennsylvania 8 GRASP Novel view movies

University of Pennsylvania 9 GRASP Freespace Reasoning
- We can reason about the structure of space by considering the union of the freespace volumes induced by a collection of triangulated disparity maps.
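As a rough illustration of freespace reasoning, the sketch below carves a 2-D occupancy grid from range measurements along sampled rays; the system described here reasons about 3-D freespace volumes from triangulated disparity maps, so this is only a simplified, assumed-form example.

```python
import numpy as np

def carve_freespace(grid_shape, cell_size, cam_xy, angles, depths):
    """Mark grid cells as free space along rays from one camera pose.

    grid_shape : (rows, cols) of the occupancy grid
    cell_size  : metres per cell
    cam_xy     : (x, y) camera position in metres
    angles     : (K,) ray directions in radians
    depths     : (K,) measured range along each ray (e.g. from a disparity map)
    """
    free = np.zeros(grid_shape, dtype=bool)
    for ang, d in zip(angles, depths):
        # Sample the ray from the camera up to (but not past) the hit point.
        for r in np.arange(0.0, d, cell_size / 2):
            x = cam_xy[0] + r * np.cos(ang)
            y = cam_xy[1] + r * np.sin(ang)
            i, j = int(y / cell_size), int(x / cell_size)
            if 0 <= i < grid_shape[0] and 0 <= j < grid_shape[1]:
                free[i, j] = True
    return free
```

The union over all views is then just the logical OR of the per-view masks.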

University of Pennsylvania 10 GRASP Results with 3D reasoning

University of Pennsylvania 11 GRASP Multi-Eyed Stereo Systems
- Locations of targets and objects in the environment can be deduced from image measurements acquired by the robots
- The robot team can effectively be viewed as a multi-eyed stereo system
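To make the multi-eyed stereo idea concrete, here is a hedged sketch of recovering a planar target position from bearing measurements taken by several robots; the least-squares formulation and names are assumptions for illustration, not the GRASP implementation.

```python
import numpy as np

def triangulate_bearings(robot_xy, bearings):
    """Least-squares target position from bearing measurements.

    robot_xy : (N, 2) robot positions in the world frame
    bearings : (N,) absolute bearing from each robot to the target (radians)
    Each bearing constrains the target to a line through the robot;
    their intersection is found in the least-squares sense.
    """
    a = np.column_stack([np.sin(bearings), -np.cos(bearings)])
    b = (np.sin(bearings) * robot_xy[:, 0]
         - np.cos(bearings) * robot_xy[:, 1])
    target, *_ = np.linalg.lstsq(a, b, rcond=None)
    return target

# Two robots observing the same target:
# triangulate_bearings(np.array([[0., 0.], [4., 0.]]),
#                      np.array([np.pi/4, 3*np.pi/4]))  # -> approximately (2, 2)
```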

University of Pennsylvania 12 GRASP Sensor Planning and Control
- An interesting property of these robot teams is that estimates for various parameters of interest are obtained by combining measurements from multiple, distributed sensors
- We could choose to view our team as a multi-eyed stereo system where the eyes can be moved
- Question
  - Given that the sensor platforms are mobile, how should they be deployed in order to produce the best estimates?

University of Pennsylvania 13 GRASP Theoretical Framework
- $C_R$ denotes the robot configuration space; $\rho \in C_R$ denotes an element of this configuration space
- $C_W$ denotes the feature configuration space; $\omega \in C_W$ denotes an element of this configuration space
- $z$ denotes the measurements obtained by the robot team
(Figure: two robots at $[x_1, y_1, \theta_1]^T$ and $[x_2, y_2, \theta_2]^T$ observing a target at $[x_t, y_t]^T$ in the $x$-$y$ plane.)
In this example, $\rho = [x_1, y_1, \theta_1, x_2, y_2, \theta_2]^T$, $\omega = [x_t, y_t]^T$, and $z = [z_1, z_2]^T$.

University of Pennsylvania 14 GRASP Optimization Problem
Given this terminology, one can define a quality function
$$Q(\rho) = \int_{C_W} \mathrm{Err}(\rho, \omega)\, P(\omega)\, d\omega$$
which reflects the expected error in estimating the feature state from a given robot configuration $\rho$.
- Objective: choose the robot configuration that minimizes this expected error, $\rho^* = \arg\min_{\rho \in C_R} Q(\rho)$
- $\mathrm{Est}$: a function that provides an estimate of the feature state given the robot configuration and the sensor measurements
- $\mathrm{Err}$: a function that returns the expected error between the estimate returned by $\mathrm{Est}$ and the actual feature state for a particular robot configuration $\rho$; this will depend upon our model of sensor errors
- $P(\omega)$: a probability density function on $C_W$

University of Pennsylvania 15 GRASP Computational Approach
- The optimization problem of minimizing $Q(\rho)$ is typically difficult to solve analytically
- Particle filtering approach: approximate $P(\omega)$ by a set of weighted samples $(\omega_j, w_j)$, where $\omega_j$ is a single sample from $C_W$ and $w_j$ is a weight reflecting the probability of $\omega_j$ representing the state $\omega$
- The integral can then be approximated by a tractable summation, $Q(\rho) \approx \sum_j w_j\, \mathrm{Err}(\rho, \omega_j)$
- The resulting function is typically piecewise continuous in $\rho$ and can be optimized using standard techniques
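A numerical sketch of this approximation is given below: the integral over $C_W$ is replaced by the weighted sum over particles, and the resulting objective is handed to a derivative-free optimizer. The error model is left as a user-supplied callable, and scipy.optimize.minimize is an illustrative stand-in for "standard techniques", not the original implementation.

```python
import numpy as np
from scipy.optimize import minimize

def make_quality(particles, weights, err):
    """Build the particle approximation of Q(rho).

    particles : (M, d_w) samples omega_j drawn from the feature space C_W
    weights   : (M,) weights w_j (assumed to sum to 1)
    err       : callable err(rho, omega) giving the expected estimation
                error for one robot configuration / feature state pair
    """
    def quality(rho):
        # Q(rho) ~= sum_j w_j * Err(rho, omega_j)
        return sum(w * err(rho, om) for om, w in zip(particles, weights))
    return quality

def plan_configuration(rho0, particles, weights, err):
    """Pick the robot configuration that minimizes the approximate Q."""
    q = make_quality(particles, weights, err)
    # Derivative-free method, since Q is only piecewise continuous in rho.
    result = minimize(q, rho0, method='Nelder-Mead')
    return result.x
```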

University of Pennsylvania 16 GRASP Implementation Example
(Figure: target position $\omega$, the particle set $\{(\omega_i, w_i)\}$, estimated particle positions, and particle disparities.)

University of Pennsylvania 17 GRASP Integrating Sensing and Control
(Block diagram: Sense, Predict, Particle Filter, Q Optimization, Sensor Planner, with the particle set $\{(\omega_i, w_i)\}$ passed from the filter to the planner.)
- Piggyback on the particle filtering approach used for sensing to obtain the particle set $\{(\omega_i, w_i)\}$
- The framework offers a complementary relationship between sensing and control
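A schematic sketch of this loop is shown below, assuming the planner from the previous sketch and placeholder callables for the sensing and particle-filter steps; the structure (sense, predict, reweight, re-plan) is the point, and the function names are hypothetical.

```python
def sensing_and_control_loop(rho, particles, weights, steps,
                             sense, predict, reweight, err):
    """One possible reading of the sense / predict / optimize cycle.

    rho       : current robot team configuration
    particles : feature-state hypotheses omega_i
    weights   : particle weights w_i
    sense     : callable returning measurements z for configuration rho
    predict   : callable propagating particles through the target motion model
    reweight  : callable updating weights from measurements (particle-filter step)
    err       : error model used by the planner (see the previous sketch)
    """
    for _ in range(steps):
        z = sense(rho)                                   # Sense
        particles = predict(particles)                   # Predict target motion
        weights = reweight(particles, weights, z, rho)   # Particle-filter update
        rho = plan_configuration(rho, particles, weights, err)  # Q optimization / sensor planner
    return rho, particles, weights
```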

University of Pennsylvania 18 GRASP Tracking Targets

University of Pennsylvania 19 GRASP Tracking Targets cont’d

University of Pennsylvania 20 GRASP Handling Obstacles

University of Pennsylvania 21 GRASP Technology Transfer and Transition
Robot hardware and software
- University of Colorado
- Oklahoma State University
- Georgia Tech
Evolution Robotics
- Jim Ostrowski
DoD Programs
- MARS Teams (2020)