Gaze Controlled Robotic Camera System
Anuj Awasthi, Anand Sivadasan, Veeral Patel

Outline
- Background
- Significance
- Problem Statement
- Concept
- Methodology
- Specific Aims
- Budget
- Project Participation
- Time Frame

Background
- Laparoscopic robotic surgery
- Eye tracker application: visual mouse
- Human factors: computer vision based control, face mouse, voice control

Requirements in Laparoscopic Surgery
- Maintain the surgical point of interest in the centre of the image.
- Provide the required magnification of the area.
- Produce and maintain a horizontal image of the point of interest.
- Perform the preceding actions automatically, although the surgeon can modulate them.

Visual Mouse Application
- Obtain the horizontal and vertical gaze coordinates with the eye tracker.
- Stream the horizontal and vertical coordinates live.
- Interface the eye tracker with the computer.
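The coordinate hand-off above can be sketched in a few lines. This is a hypothetical illustration, not the project's own code; the function name and the 1024 x 768 tracker resolution are assumptions:

```python
# Hypothetical sketch of the eye-tracker hand-off: raw screen-pixel gaze
# coordinates are normalized before being streamed to the camera controller.
# The 1024x768 tracker resolution is an assumption.

def normalize_gaze(x_px, y_px, screen_w=1024, screen_h=768):
    """Map raw tracker pixel coordinates into the unit square [0, 1] x [0, 1]."""
    return x_px / screen_w, y_px / screen_h

u, v = normalize_gaze(512, 384)   # centre of the tracker screen -> (0.5, 0.5)
```

Normalizing first keeps the downstream camera control independent of the tracker's native resolution.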

Eye Tracker System

Human Factors: Computer Vision Based Control of the Robotic Camera
- Camera control is based on computer vision tracking of the surgical tools.
- Image processing differentiates the surgical tool of interest from its surroundings.
- No input is required from the surgeon.
Disadvantages
- The surgeon's area of interest is not taken into consideration.
- Assumes the surgical area is always the surgeon's area of interest.
- The surgeon often ends up looking at the corners of the screen.

Human Factors: Face Mouse Control of the Robotic Camera
- Image-based system.
- Tracks the surgeon's facial features in real time.
- Controls the camera based on the pitch, yaw, and roll of the surgeon's face.
Disadvantages
- Constant face movement causes strain.
- Difficult to keep pace with the movement of the tools.

Human Factors: Voice Control of the Robotic Camera
- Uses voice and pedal controls.
- Uses voice recognition techniques.
- A set of voice commands drives the camera.
Disadvantages
- Places a considerable burden on the surgeon.
- Difficult to perform dual inputs (speaking while operating).

Significance
- Reduced workload on the surgeon
- Improved accuracy of surgical tasks
- Impact on surgical time
- Hands-free control

Problem Statement: "To develop a camera control system which reduces the workload on the surgeon without compromising the quality of the surgeon's video display."

Concept: Gaze-Based Robotic Camera
- Acquire the surgeon's gaze with the eye tracker.
- Manipulate the camera by interfacing the eye-tracker data with the robot controls.

Robotic Hardware
- A small wireless 320 x 240 resolution camera with an inbuilt transmitter
- A receiver set
- Two servomotors (HS-422)
- Links
- Usbor servo controller
- Pivot post
- Gripper
- Washers, clamps, bolts, nuts
- Eye tracker system

Methodology (diagram: operation site and surgeon site)

Surgical Site
- Server system (host computer)
- Usbor servo controller: Visual C coding
- Servo motors
- Robot arm end effector: inverse kinematics to be followed
- Wireless camera: supplied by AAA batteries
- Receiver
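For a two-servomotor pan/tilt head, the inverse kinematics mentioned above reduces to mapping a normalized gaze point onto pan and tilt angles. The project itself used Visual C; this Python version and its servo angle ranges are illustrative assumptions, not the project's calibration:

```python
# Illustrative inverse kinematics for a 2-DOF pan/tilt camera head.
# The servo angle ranges below are assumptions, not measured limits.

def gaze_to_servo_angles(u, v, pan_range=(45.0, 135.0), tilt_range=(60.0, 120.0)):
    """Linearly map a normalized gaze point (u, v) in [0,1]^2 to
    pan/tilt servo angles in degrees."""
    pan = pan_range[0] + u * (pan_range[1] - pan_range[0])
    tilt = tilt_range[0] + v * (tilt_range[1] - tilt_range[0])
    return pan, tilt

pan, tilt = gaze_to_servo_angles(0.5, 0.5)   # gaze at screen centre
```

In practice the angle commands would be sent to the servo controller over its serial protocol; the linear map is the only "kinematics" a pan/tilt pair needs.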

Surgeon's Site
- Dedicated system (client)
- Image acquisition through the Internet: streaming video, live Motion JPEG system
- Image processing: Intel's OpenCV library, improve brightness and contrast
- Eye tracker system
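The brightness and contrast step above is the usual linear transform out = alpha * in + beta, which OpenCV provides directly via `cv2.convertScaleAbs`. The NumPy-only sketch below reproduces that behaviour; the alpha and beta values are arbitrary assumptions:

```python
import numpy as np

def enhance(frame, alpha=1.3, beta=20):
    """Linear brightness/contrast adjustment: out = alpha * frame + beta,
    rounded and saturated to [0, 255] like OpenCV's convertScaleAbs."""
    out = alpha * frame.astype(np.float32) + beta
    return np.clip(np.rint(out), 0, 255).astype(np.uint8)

# A uniform grey frame of value 100 becomes 1.3 * 100 + 20 = 150.
```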

Fuzzy Based Control (diagram: the pupil's point of gaze relative to Clusters 1-6 on the screen)

Fuzzy C-Means Algorithm
- The point of gaze keeps fluctuating.
- The entire eye-tracker screen is divided into clusters.
- The fuzzy C-means algorithm is used.
- The degree of belonging of the point of gaze to a cluster is taken as the degree of membership of the fuzzy function.
- The point-of-gaze coordinates are assumed to coincide with the coordinates of the cluster centers.
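The degree-of-membership computation above follows the standard fuzzy C-means formula. A minimal sketch, assuming fixed cluster centers and the usual fuzzifier m = 2 (both assumptions; the slide does not give the project's parameters):

```python
import numpy as np

def fcm_memberships(point, centers, m=2.0):
    """Fuzzy C-means membership of `point` in each cluster:
    u_i is proportional to d_i^(-2/(m-1)), normalized to sum to 1."""
    d = np.linalg.norm(centers - point, axis=1)
    if np.any(d == 0):                 # gaze exactly on a cluster center
        u = (d == 0).astype(float)
        return u / u.sum()
    inv = d ** (-2.0 / (m - 1.0))
    return inv / inv.sum()

# A gaze point midway between two centers belongs equally to both.
```

Smoothing fluctuating gaze this way lets the controller weight nearby clusters instead of snapping hard to the nearest one.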

Specific Aims
- Cover the surgical area with the camera.
- Obtain the surgeon's point of gaze with the eye tracker.
- Control the robotic camera based on the point-of-gaze coordinates.
- Facilitate the surgeon's view.

Budget

Project Participation
- Robot assembly: Veeral, Anuj & Anand
- Inverse kinematics: Anuj & Anand
- Software for kinematics control: Anuj & Veeral
- Interfacing eye tracker and robot: Anuj, Veeral & Anand
- Eye tracker output: Anand & Veeral

Time Frame
- Conceptualization: Jan 10th to Jan 25th
- Literature review: Jan 26th to Feb 10th
- Ordering hardware: Feb 15th
- Proposal writing: Feb 15th to Feb 27th
- Robot assembly: Feb 28th to March 5th
- Software development: March 5th to March 25th
- Final report writing: March 25th to April 10th
- Testing: April 10th to April 20th

References
- M. Farid, F. Murtagh, J.L. Starck. "Computer Display Control and Interaction Using Eye-Gaze". School of Computer Science, Belfast, UK.
- Atsushi Nishikawa et al. "Face Mouse: A Novel Human-Machine Interface for Controlling the Position of a Laparoscope". IEEE Transactions on Robotics and Automation, Vol. 19, No. 5, October.
- F. Murtagh. "Eye Gaze Tracking System: Visual Mouse Application Development". 3rd Year Training Report, E.N.P.S. Engineering Degree, March to August 2001.

References (Contd.)
- M.E. Allaf et al. "Laparoscopic Visual Field: Voice vs. Foot Pedal Interfaces for Control of the AESOP Robot". Surgical Endoscopy, Feb.
- A. Casals, J. Amat, E. Laporte. "Automatic Guidance of an Assistant Robot in Laparoscopic Surgery". International Conference on Robotics and Automation, IEEE.
- R. Hurteau, S. DeSantis. "Laparoscopic Surgery Assisted by a Robotic Cameraman: Concept and Experimental Results". IEEE, 1994.

References (Contd.)
- George P. Mylonas, Danail Stoyanov. "Gaze Contingent Soft Tissue Deformation Tracking for Minimally Invasive Robotic Surgery". MICCAI 2005, LNCS 3749, pp. 843-850.
- Shamsi T. Iqbal, Brian P. Bailey. "Using Eye-Gaze Patterns to Identify User Tasks". GHC, 2004.

Thank You! Questions?