
Examples of Nonlinear Systems: ROBOTS
New Robotic Treatment Systems for Childhood Autism and Cerebral Palsy
Joint work with N. Bugnariu [A], D. Hanson [B], F. Makedon [C]
[A] Department of Physical Therapy, University of North Texas Health Sciences Center (UNT HSC)
[B] Hanson Robotics Inc., Plano, TX, USA
[C] Department of Computer Science & Engineering, University of Texas at Arlington, USA
NGS Focus: Human-Robot Interaction
Ph.D. Students: Isura Ranatunga, Nahum Torres
This work was supported by US National Science Foundation grants #CPS 1035913 and #CNS 0923494, and a TxMed consortium grant: "Human-Robot Interaction System for Early Diagnosis and Treatment of Childhood Autism Spectrum Disorders (RoDiCA)".

NGS Robots with HRI and pHRI
Two assistive robotic systems aimed at the treatment of children with certain motor and cognitive impairments.
The Neptune project [1]: a mobile manipulator for children suffering from cerebral palsy.
- Mobile robot base and a 6-DOF robotic arm, interfaced via Wii Remote, iPad, neuroheadset, Kinect, and force-sensing robotic skin.
- Therapeutic outcomes: hand and head gesture recognition and reward; hand motion exercises using iPad games (CPlay, CPMaze, ProlloquoToGo) held by the robot.
The RoDiCA project [2]: focuses on treating cognitive impairments in children suffering from ASD.
- Zeno is a robotic platform developed by Hanson Robotics, based on a patented realistic skin.
- Real-time subject tracking/joint attention, advanced head-eye and hand coordination, facial gesture recognition and synthesis, and data logging and analysis.
[Photos: Neptune mobile manipulator with iPad attached; Zeno (by Hanson RoboKind Inc.) generating facial expressions and maintaining eye contact.]
9/18/2018 Multiscale Robots and Systems Lab, University of Texas Arlington

Advanced Control for Human Robot Interaction
Realistic & intuitive human-robot interaction:
- Physical HRI: recognize & synthesize poses and gestures
- Adaptive interfaces
- Visual HRI
- Robot touch HRI
[Videos: Neptune control through a neural headband; Zeno.]

Neptune: Assistive Robotic System for CP

Adaptive Interfaces
The supervisory control of multi-DOF robots is a demanding application: if a single operator is tasked with direct control, performing coordinated tasks becomes non-intuitive. We use a reinforcement learning TD(λ) scheme to adaptively change the mapping of DOFs from the operator's user interface to the robot.
[Block diagram: an actor-critic loop in which the interface input (state) passes through the interface mapping to produce robot actions; system metrics evaluation supplies the reward, and the critic's TD error updates the value function and the policy that adapts the mapping.]
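The TD(λ) update behind such an adaptive mapping can be sketched in a few lines. This is a minimal tabular illustration, not the lab's implementation: the discrete states stand for candidate interface mappings, the reward is a negated workload metric, and all names and constants are hypothetical.

```python
def td_lambda_update(V, trace, s, s_next, reward, alpha=0.1, gamma=0.9, lam=0.8):
    """One TD(lambda) step with accumulating eligibility traces.
    Hypothetical sketch: V maps discrete states (candidate interface
    mappings) to values; reward is a negated workload metric."""
    td_error = reward + gamma * V[s_next] - V[s]
    trace[s] += 1.0                       # bump the trace of the visited state
    for state in V:                       # spread the TD error along the traces
        V[state] += alpha * td_error * trace[state]
        trace[state] *= gamma * lam       # decay all traces
    return td_error

# Toy usage: mapping 0 incurs a workload penalty, mapping 1 does not,
# so mapping 1 ends up with the higher learned value.
V = {0: 0.0, 1: 0.0}
trace = {0: 0.0, 1: 0.0}
for _ in range(50):
    td_lambda_update(V, trace, 0, 1, reward=-1.0)
    td_lambda_update(V, trace, 1, 0, reward=0.0)
```

The eligibility trace lets one observed workload penalty update not only the current mapping's value but also recently visited ones, which speeds up adaptation when episodes are short.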

Interface Mapping of Neural Headband to Robot: Experiments
[Plots of the "mental workload" metric per episode. Without mapping updates, the 11th episode yields an "emotion energy" of EE = 1867.19. With mapping updates, EE drops to 1644.07 by the 11th episode and to 1467.91 by the 21st.]

Advanced Control for Human Robot Interaction
Realistic & intuitive human-robot interaction:
- Physical HRI: recognize and control poses and gestures
- Adaptive interfaces
- Visual HRI
- Robot touch HRI
[Videos: Neptune control through a neural headband; Zeno.]

Gesture Recognition and Synthesis with Zeno
Synthesis of realistic motion distribution in redundant mechanisms, for instance:
- coordination of motion between the neck and eyes during object tracking
- hand-body gesturing and facial gestures
We formulated online optimization algorithms, including reinforcement learning, combined with visual servoing for these problems [5, 6, 7].
[Figures: block diagram of the neck-eye motion control system for conversational interaction with Zeno; object pose tracking error, showing a good match with the human tracking response.]
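One common way to distribute tracking motion between a fast eye and a slow neck is to let the eye close the gaze error quickly while the neck gradually absorbs the eye offset. The sketch below illustrates that generic coordination rule with assumed gains; it is not the optimization-based controller of [5, 6, 7].

```python
def neck_eye_step(target, neck, eye, k_neck=0.1, k_eye=0.6):
    """One step of a generic neck-eye coordination rule (gaze = neck + eye).
    The fast eye closes most of the gaze error; the slow neck then absorbs
    part of the eye offset while the eye counter-rotates, keeping gaze fixed.
    Gains are illustrative assumptions."""
    gaze_err = target - (neck + eye)
    eye += k_eye * gaze_err        # quick, saccade-like correction
    shift = k_neck * eye           # neck slowly takes over the eye offset
    return neck + shift, eye - shift

# Toy usage: after settling, gaze reaches the target and the eye recenters,
# which is the human-like behavior the slide describes.
neck, eye = 0.0, 0.0
for _ in range(200):
    neck, eye = neck_eye_step(0.8, neck, eye)
```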

Control Diagram: Convergence of the Error Correction Algorithm
[Plot: object pose tracking error.] This error correction scheme yields exponentially stable tracking; the proof of exponential stability is given in the paper.
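The exponential stability claim can be illustrated with a generic proportional error-correction law: if the tracking error obeys ė = −k·e, a discretization contracts the error by a constant factor each step. A toy sketch with an assumed gain and step size, not the paper's actual controller:

```python
def track_error(e0, gain=2.0, dt=0.05, steps=100):
    """Discretized e_dot = -gain * e: the error shrinks geometrically
    (by the factor 1 - gain*dt per step), i.e. exponentially stable
    tracking. Gain and step size are illustrative assumptions."""
    e, history = e0, [abs(e0)]
    for _ in range(steps):
        e += -gain * e * dt        # proportional correction toward zero
        history.append(abs(e))
    return history

hist = track_error(1.0)            # each step multiplies the error by 0.9
```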

Interaction through Hand Gestures
Gestures performed by the user are recognized by the Kinect or Wii Remote in real time, then played back by the robot arm or used as rewards during rehabilitation exercises.
Match percentages for the Line gesture with 100 neurons:
Testing set | Neural net size | Match (X) % | Match (Y) % | MSE
Set 1       | 100             | 78.26       | 93.23       |
Set 2       | 100             | 69.34       | 95.06       |
Set 3       | 100             | 79.98       | 95.14       |
Set 4       | 100             | 83.01       | 97.30       |
Set 5       | 100             | 68.88       | 89.97       |
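A per-axis match percentage between a performed gesture and its template could be computed as below. The scoring rule (normalized mean absolute error) is an assumption for illustration only; the slide's actual matcher is a 100-neuron neural network, which is not reproduced here.

```python
def match_percentage(performed, template):
    """Per-axis agreement between two equal-length 1D trajectories:
    100 * (1 - mean absolute error / template range).
    Hypothetical scoring, not the slide's neural-network matcher."""
    assert len(performed) == len(template)
    span = (max(template) - min(template)) or 1.0   # guard flat templates
    mae = sum(abs(p - t) for p, t in zip(performed, template)) / len(template)
    return max(0.0, 100.0 * (1.0 - mae / span))

# Toy "Line" gesture along one axis: template vs a slightly offset performance.
template = [i / 9.0 for i in range(10)]
performed = [v + 0.02 for v in template]
score = match_percentage(performed, template)      # close to 98
```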

Zeno Mimicking the User
Zeno has both hand-gesture playback from the user and scripted sequences to encourage hand coordination.

Advanced Control for Human Robot Interaction
Realistic & intuitive human-robot interaction:
- Physical HRI (pHRI): recognize & synthesize poses and gestures
- Adaptive interfaces
- Visual HRI
- Robot touch HRI
[Videos: Neptune control through a neural headband; Zeno.]

Physical HRI using Robotic Skin
A CRS A465 robot arm and an artificial skin patch used as a one-dimensional force sensor; pressure sensors are mounted on Neptune's iPad.
[Figure: actual force reading from the PZT artificial skin.]

Physical HRI: Algorithm and Results
A Kalman filter, an impedance controller, and computed-torque control are combined. A force measurement in one dimension is sufficient for estimating interaction forces in all three directions: with a one-dimensional force measurement along x, the estimation results in y and z are good even in the presence of 2% RMS noise, although model uncertainty leads to poor estimation.
[Plot: push force.]
J. Rajruangrabin, D. O. Popa, "Enhancement of Manipulator Interactivity Through Compliant Skin and Extended Kalman Filtering," in Proc. of IEEE Conference on Automation Science and Engineering (CASE), Scottsdale, AZ, September 2007.
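The idea that a single force axis suffices can be illustrated with a scalar-measurement Kalman filter: as the contact direction varies slightly from reading to reading, repeated 1D skin measurements make all three force components observable. This is a minimal linear sketch with made-up numbers, not the paper's EKF with the full arm/skin model.

```python
import random

def kalman_force_step(x, P, h, z, r=0.01):
    """One scalar-measurement Kalman update for a constant 3D force state.
    x: estimate [fx, fy, fz]; P: 3x3 covariance; h: measurement direction
    of the 1D skin sensor; z: scalar reading; r: assumed noise variance.
    Minimal linear sketch, not the paper's EKF."""
    Ph = [sum(P[i][j] * h[j] for j in range(3)) for i in range(3)]  # P h^T
    s = sum(h[i] * Ph[i] for i in range(3)) + r    # innovation variance
    K = [Ph[i] / s for i in range(3)]              # Kalman gain
    y = z - sum(h[i] * x[i] for i in range(3))     # innovation
    x = [x[i] + K[i] * y for i in range(3)]
    P = [[P[i][j] - K[i] * Ph[j] for j in range(3)] for i in range(3)]
    return x, P

# Toy usage: recover a 3D contact force from noisy 1D readings whose
# direction wobbles as the contact point moves (hypothetical numbers).
true_f = [1.0, -0.5, 0.3]
x = [0.0, 0.0, 0.0]
P = [[1.0 if i == j else 0.0 for j in range(3)] for i in range(3)]
random.seed(0)
for _ in range(300):
    h = [1.0, 0.3 * random.uniform(-1, 1), 0.3 * random.uniform(-1, 1)]
    z = sum(h[i] * true_f[i] for i in range(3)) + random.gauss(0, 0.02)
    x, P = kalman_force_step(x, P, h, z)
```

The wobbling measurement direction is what lends observability to the y and z components; with a perfectly fixed direction, only the projection of the force onto that axis could be recovered.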