Vrobotics I. DeSouza, I. Jookhun, R. Mete, J. Timbreza, Z. Hossain Group 3 “Helping people reach further”

Background
Vrobotics is dedicated to developing robotic technologies that help people interact with their environment. Our first robotic manipulator, V, responds to voice and visual cues to perform basic functions requested by the user. Ultimately, V will be able to take on larger tasks such as cleaning your kitchen and even cooking your food!

V Systems Overview
Voice Recognition – Processing audio input into strings that can be used as input for the robotic arm and its subsystems.
Control Systems – The kinematic models and algorithms that allow the arm to perform its required motion efficiently and effectively.
Image Processing – Processing video input into tangible object characteristics and spatial information.
Sensor Integration – Transforming raw data streams into real-world information about V's actions and its surroundings.
Electronics and Mechanical Systems – Developing efficient and effective hardware capable of performing the tasks requested by the user in a timely manner.
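As an illustration of how the voice-recognition output could feed the other subsystems, the recognized string might be split into a command word and arguments and routed by a simple lookup. This is a minimal sketch; the command names and handler functions are hypothetical, not the actual V interface.

```python
# Hypothetical dispatch of a recognized voice string to subsystem handlers.
# Command names and handlers are illustrative, not the actual V codebase.

def move_arm(args):
    return f"moving to {args}"

def grab(args):
    return f"grabbing {args}"

HANDLERS = {
    "move": move_arm,
    "grab": grab,
}

def dispatch(recognized: str):
    """Split a recognized utterance into a command word and its arguments,
    then route it to the matching subsystem handler."""
    word, _, rest = recognized.strip().partition(" ")
    handler = HANDLERS.get(word.lower())
    if handler is None:
        return "unknown command"
    return handler(rest)
```

A dictionary lookup like this keeps the voice module decoupled from the arm: adding a command means adding one entry, with no change to the recognizer itself.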

Current Progress
Voice Recognition: Recognition of a user-defined dictionary, with output as a string usable by the other modules, is complete.
Control Systems: Inverse-kinematic modelling and speed-control algorithms for motion have been developed and tested successfully.
Image Processing: Real-time recognition of objects based on color is complete.
Sensor Integration: Transformation of laser-scan data from the robotic system into a local inertial frame is complete.
Electronics and Mechanical Systems: A servo-based prototype of the final joint design is complete. The prototype serves primarily as a platform for testing the control-system algorithms.
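The inverse-kinematic modelling above can be illustrated with the standard closed-form solution for a planar 2-link arm. This is a sketch under assumptions (a planar arm, one elbow branch, illustrative link lengths), not the actual V solver; the forward-kinematics function is included only to verify the result.

```python
import math

def ik_2link(x, y, l1, l2):
    """Closed-form inverse kinematics for a planar 2-link arm.
    Returns (theta1, theta2) for one elbow branch, or None if the
    target (x, y) is outside the reachable workspace."""
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle.
    c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        return None  # target out of reach
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

def fk_2link(theta1, theta2, l1, l2):
    """Forward kinematics, used here only to check the IK solution."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y
```

Running the forward kinematics on the angles returned by `ik_2link` should reproduce the requested target, which is the usual sanity check for an IK routine.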

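The laser-scan transformation described in Current Progress amounts to a 2-D rigid-body transform: each (range, bearing) sample is converted to Cartesian coordinates in the sensor frame, then rotated and translated by the robot pose into the local inertial frame. A minimal sketch follows; the pose convention (x, y, heading) and the sensor mounting at the robot origin are assumptions for illustration.

```python
import math

def scan_to_inertial(ranges, bearings, pose):
    """Transform laser samples (range, bearing) from the sensor frame into
    the local inertial frame, given the robot pose (x, y, heading).
    Assumes the laser is mounted at the robot origin."""
    px, py, heading = pose
    points = []
    for r, b in zip(ranges, bearings):
        # Cartesian point in the sensor frame.
        sx = r * math.cos(b)
        sy = r * math.sin(b)
        # Rotate by the heading, then translate by the robot position.
        ix = px + sx * math.cos(heading) - sy * math.sin(heading)
        iy = py + sx * math.sin(heading) + sy * math.cos(heading)
        points.append((ix, iy))
    return points
```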
Demo

The Next Steps
Voice Recognition:
Control Systems: Development of Jacobians for more precise motor speed control, as well as the inverse kinematics for the final arm design with multiple L-joints.
Image Processing: Adding object shape as a tracking criterion.
Sensor Integration: Pressure sensing, absolute encoders, temperature, and power sensing.
Electronics and Mechanical Systems: Development of high-load, high-precision joints with simultaneous multi-motor control.
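The Jacobian planned for motor speed control maps joint velocities to end-effector velocity, xdot = J(theta) * theta_dot. For the same planar 2-link arm geometry as a worked example (link lengths are illustrative, not the final multi-joint V design), it can be written analytically:

```python
import math

def jacobian_2link(theta1, theta2, l1, l2):
    """Analytic Jacobian of a planar 2-link arm: d(x, y)/d(theta1, theta2).
    Derived by differentiating the forward kinematics
    x = l1*cos(t1) + l2*cos(t1+t2), y = l1*sin(t1) + l2*sin(t1+t2)."""
    s1, c1 = math.sin(theta1), math.cos(theta1)
    s12, c12 = math.sin(theta1 + theta2), math.cos(theta1 + theta2)
    return [[-l1 * s1 - l2 * s12, -l2 * s12],
            [ l1 * c1 + l2 * c12,  l2 * c12]]

def ee_velocity(theta, theta_dot, l1, l2):
    """End-effector velocity: xdot = J(theta) * theta_dot."""
    J = jacobian_2link(theta[0], theta[1], l1, l2)
    return (J[0][0] * theta_dot[0] + J[0][1] * theta_dot[1],
            J[1][0] * theta_dot[0] + J[1][1] * theta_dot[1])
```

Inverting (or pseudo-inverting) J at each control step is one common way to turn a desired end-effector speed into per-motor speed commands, which is presumably the role it would play here.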