RoboCup 2016 - KSL Design and implementation of vision and image processing core Academic Supervisor: Dr. Kolberg Eli Mentors: Dr. Abramov Benjamin & Mr. Amsalem Rafi

Presentation transcript:

RoboCup KSL: Design and implementation of vision and image processing core. Academic Supervisor: Dr. Kolberg Eli. Mentors: Dr. Abramov Benjamin & Mr. Amsalem Rafi. Students: Hen Shoob & Assaf Rabinowitz.

Vision Team Goals  The Vision module is responsible for image processing. Its main goal is to detect meaningful objects: the ball, the goal and the white lines. The implementation uses functions from the OpenCV image processing library [1].  In the 1st semester, we were responsible for Goal Detection.  Goal Detection provides mandatory input for the other cores of the robot, such as the brain and the localization.  The localization core's estimate of where the robot is located on the field, and the brain's decision on the kick direction, both depend on whether and where the goal is detected relative to the robot. [1] G. Bradski and A. Kaehler, Learning OpenCV: Computer Vision with the OpenCV Library, O'Reilly Media, October 2008.

Calibration Tool  First, we needed to distinguish green (field) and white (goal post) objects from other objects in a given image, which requires configuring the HSV ranges of white and green [2]. To do so, we developed a calibration tool.  The calibration tool adapts our image processing to the colors of the current environment: the robot defines the green and white color spectra.  The user clicks on green/white pixels in the original image and the tool saves the configuration for each color. [2] R. Gonzalez and R. Woods, Digital Image Processing, Third Edition, Pearson Education, 2008.
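The click-to-calibrate logic can be sketched roughly as follows. This is an illustrative reconstruction, not the team's actual tool: the class and function names are hypothetical, and the real implementation would use OpenCV's HSV conversion on camera frames. Each clicked pixel widens the stored min/max HSV range for its color.

```python
import colorsys

def hsv_of(rgb):
    """Convert an (r, g, b) pixel in 0-255 to HSV in OpenCV-like units."""
    r, g, b = (c / 255.0 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return (h * 179, s * 255, v * 255)  # OpenCV stores hue in 0-179

class ColorCalibrator:
    """Accumulates clicked pixels and keeps a min/max HSV range per color."""
    def __init__(self):
        self.ranges = {}  # color name -> (lower_hsv, upper_hsv)

    def add_click(self, color, rgb_pixel):
        hsv = hsv_of(rgb_pixel)
        lo, hi = self.ranges.get(color, (hsv, hsv))
        self.ranges[color] = (
            tuple(min(a, b) for a, b in zip(lo, hsv)),
            tuple(max(a, b) for a, b in zip(hi, hsv)),
        )

cal = ColorCalibrator()
cal.add_click("green", (30, 180, 40))   # a clicked field pixel
cal.add_click("green", (50, 200, 60))   # another field pixel
lo, hi = cal.ranges["green"]
```

The saved `(lo, hi)` pair can then be fed directly to a threshold such as OpenCV's `inRange` to produce the binary mask used by the detection pipeline.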

Goal Detection Algorithm  Input: Raw image from the robot's camera.  Output: A data structure containing information about the goal, such as whether the goal is detected (full/partial detection), the center of the goal, etc.
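The output structure might look like the following sketch. The type and field names here are hypothetical (the slides only say the structure holds the detection state, the goal center, "etc."):

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional, Tuple

class Detection(Enum):
    NONE = 0
    PARTIAL = 1   # e.g. only one post visible
    FULL = 2      # both posts visible

@dataclass
class GoalInfo:
    """Hypothetical output of the goal detection algorithm."""
    detection: Detection
    center: Optional[Tuple[int, int]] = None      # pixel coordinates
    left_post: Optional[Tuple[int, int]] = None
    right_post: Optional[Tuple[int, int]] = None

g = GoalInfo(detection=Detection.FULL, center=(320, 140),
             left_post=(250, 150), right_post=(390, 150))
```

Downstream cores (brain, localization) would consume this structure rather than the raw image.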

Goal Detection Algorithm – cont.  The algorithm finds objects in the input image that may be the goal's posts and selects the most relevant ones.  An object is considered a post candidate if it satisfies the following conditions:  White: First, we apply a simple threshold to the HSV transform of the given image. The result is a binary image in which only white objects remain white.  Vertical: We perform a vertical erosion on the thresholded image to remove horizontal white objects; only vertical white objects are left.  Rectangle-shaped: We surround each remaining object with a minimum-area rectangle. We check the ratio between the rectangle area and the white-object area, and eliminate any rectangle that does not satisfy the threshold ratio.  Straight-angled: From the robot's point of view, the posts are orthogonal to the field's plane, so we check that each rectangle's angle is close to zero.  Inter-edge ratio: We eliminate any rectangle that is not characterized by a long vertical edge and a short horizontal edge.  Goal Detection Video
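The geometric filters above (rectangle fill, angle, edge ratio) can be sketched as a single predicate over candidate rectangles. All thresholds here are illustrative placeholders, not the team's actual values:

```python
def is_post_candidate(rect_w, rect_h, angle_deg, white_area,
                      min_fill=0.6, max_angle=10.0, min_aspect=2.0):
    """Return True if a minimum-area rectangle looks like a goal post.

    rect_w, rect_h : rectangle edge lengths in pixels (rect_h is vertical)
    angle_deg      : rectangle tilt from vertical, in degrees
    white_area     : number of white pixels inside the rectangle
    """
    rect_area = rect_w * rect_h
    if rect_area == 0:
        return False
    # Rectangle-shaped: the white blob must fill most of the rectangle.
    if white_area / rect_area < min_fill:
        return False
    # Straight-angled: posts stand orthogonal to the field plane.
    if abs(angle_deg) > max_angle:
        return False
    # Inter-edge ratio: long vertical edge, short horizontal edge.
    if rect_h / rect_w < min_aspect:
        return False
    return True

# A tall, upright, well-filled rectangle passes; a tilted one does not.
tall_upright = is_post_candidate(20, 120, 2.0, 2000)   # fill 0.83, aspect 6
tilted = is_post_candidate(20, 120, 25.0, 2000)        # angle too large
```

In practice each threshold would be tuned on real camera images, since lens distortion and head tilt make the posts deviate slightly from vertical.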

Targets  Integrate our code into the system and maintain it.  Develop a robust algorithm for calculating the distance to all detected objects, such as the ball, the goal and the white lines on the field.  Develop a goalie system, including a brain (i.e. an FSM), a ball-movement algorithm, etc.  Create an interface between the vision core and the localization core.
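One common starting point for the planned distance calculation is the pinhole-camera similar-triangles model: an object of known real height appears with a pixel height inversely proportional to its distance. This is only a sketch of that standard approach; the focal length and object sizes below are illustrative, not measured values from the robot.

```python
def distance_from_height(real_height_m, pixel_height, focal_length_px):
    """Pinhole-camera estimate: distance = real_height * focal / pixel_height."""
    if pixel_height <= 0:
        raise ValueError("object must have positive pixel height")
    return real_height_m * focal_length_px / pixel_height

# Example: an object 0.8 m tall spanning 100 px, with a 600 px focal length,
# is estimated at 0.8 * 600 / 100 = 4.8 m away.
d = distance_from_height(0.8, 100, 600)
```

A robust version would also correct for camera tilt and lens distortion, and could fuse this estimate with the known field geometry (line and goal positions) used by the localization core.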