
Boston University--Harvard University--University of Illinois--University of Maryland
Distributed Sensor Fields and Uncertainty: Bio-mimetic Methods for Acoustic Source Localization. P. S. Krishnaprasad, University of Maryland, College Park; Department of Electrical and Computer Engineering & Institute for Systems Research; Center for Communicating Networked Control Systems. ARO-MURI01 Review, Boston University, October 20-21, 2003.

Outline: Sensor Field - Motivation; Dynamic Sound Source Localization; Problems and Models; Technical Approach; References; Demonstration. This is joint work with Amir Handzel, Sean Andersson, and Martha Gebremichael. Thanks also to Shihab Shamma for inspiration. Vinay Shah did recent measurements and demos.

Sensor Field. A sensor field is heterogeneous (acoustic, seismic, thermal, RF, magnetic, optical...) and often mobile, mounted on various platforms (e.g. UGVs, UAVs) that are networked and in contact with key gateway nodes. (Eicke/Lavery (1999); Srour (discussions 1998, 2000); NRC (2000) NMAB-495; Emmerman (discussions 2000, 2001); Scanlon/Young (discussions 2003))

Photo: courtesy of Michael Scanlon, ARL

Control over noisy, limited-bandwidth communication channels. Intelligent Servosystems Laboratory (ISL)

Dynamic Sound Source Localization - or why do we need to move our heads? Biologically inspired algorithms.

Barn Owl and Robot. Can we capture the barn owl's auditory acuity in a binaural robot?

Sound Localization in Nature. Localization is the spatial aspect of the auditory sense. Sensory organ arrangement: vision is spatially "topographic", whereas audition is tonotopic, transducing sound pressure in frequency bands. Localization therefore requires special computation, performed in dedicated brainstem circuits and cortex.

Acoustic Cues for Localization. Binaural/interaural: level/intensity difference (ILD); time/phase difference (IPD); onset difference (precedence effect). Monaural: spectral-directional filtering by the pinna, mostly for elevation.

Place Theory (L. Jeffress), J. Comp. Physiol. & Psychol. (1948) 41:35-39. Jeffress model and schematic of brainstem auditory circuits for detection of interaural time differences (ITDs); from Carr & Amagai (1996).
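Computationally, Jeffress-style coincidence detection amounts to a cross-correlation over a bank of internal delays: the delay line whose coincidence count peaks encodes the ITD. A minimal sketch in plain Python (the function name `best_itd` and the 0.7 ms search range are illustrative assumptions, not parameters from the model):

```python
import math

def best_itd(left, right, fs, max_itd=0.7e-3):
    """Jeffress-style coincidence detection as a discrete cross-correlation.
    Returns the ITD in seconds; positive means the sound reached the left
    ear first.  max_itd bounds the physiologically plausible delay range."""
    max_lag = int(max_itd * fs)

    def coincidence(lag):
        # Correlate the left signal advanced by `lag` samples against right.
        if lag >= 0:
            pairs = zip(left[lag:], right)
        else:
            pairs = zip(left, right[-lag:])
        return sum(a * b for a, b in pairs)

    # The internal delay that best re-aligns the two ear signals is the
    # negative of the interaural lead of the left ear.
    best_lag = max(range(-max_lag, max_lag + 1), key=coincidence)
    return -best_lag / fs
```

The same computation underlies the "traditional ITD algorithm" used as a baseline in the performance comparisons later in the talk.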

Stereausis (S. Shamma et al.), J. Acoust. Soc. Am. (1989) 86. [Schematic: sound drives the ipsi-lateral and contra-lateral cochleae, each with its characteristic-frequency axis; AVCN outputs X_i and Y_j feed a matrix of correlation units C_ij centered on the main diagonal.]

Boston University--Harvard University--University of Illinois--University of Maryland -45 deg (left) Stereausis shifts from the main diagonal according to the source location. 45 deg (right)0 deg center Incoming sound: a pure tone Stereausis scheme (courtesy Shihab Shamma, UMd)

Lord Rayleigh and Binaural Perception. ILD and ITD are both needed for azimuth (the concept of the HRTF). What about elevation? See section 385 of The Theory of Sound, 1945 edition.

Initial Motivation. All of the above are static, but real life is usually dynamic, and psychophysical experiments show that active horizontal head rotations improve localization: they break interaural symmetry and thus provide information on elevation (Perrett & Noble 1997, Wightman & Kistler 1999). These effects can be explained theoretically, and understanding them matters for guiding robots toward an acoustic source.

Coordinate Systems: azimuthal angle φ, polar angle θ; elevation and azimuth. Microphones at the poles, on the horizontal plane.

Static Solution. For a plane wave incident on a rigid sphere, the surface pressure field is proportional to the series
\[ \frac{1}{\mu^2}\sum_{m=0}^{\infty}(2m+1)\, i^{m+1}\,\frac{P_m(\cos\theta)}{h'_m(\mu)}, \qquad \mu = ka, \]
and does not depend on the azimuthal angle (φ). This is the Head Related Transfer Function (HRTF) of the spherical model. Numerical (e.g. FMP) and empirical methods are used for non-spherical heads.
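The series can be evaluated numerically. The sketch below (pure Python; the function name and truncation rule are our choices, not part of the talk) computes the rigid-sphere surface pressure via the spherical-Hankel recurrences and the Wronskian identity; the overall phase depends on the assumed time convention, so only magnitudes should be compared:

```python
import cmath
import math

def sphere_hrtf(mu, gamma, m_max=None):
    """Surface pressure on a rigid sphere relative to free field, for an
    incident plane wave.  mu = k*a (wavenumber times radius); gamma is the
    angle between the surface point (the 'ear') and the direction the wave
    arrives from."""
    if m_max is None:
        m_max = int(mu) + 20  # truncation heuristic

    # Spherical Hankel functions h_m^(1)(mu) by upward recurrence
    # (stable, since h is dominated by its growing y_m part).
    h = [0j] * (m_max + 1)
    h[0] = -1j * cmath.exp(1j * mu) / mu
    h[1] = -cmath.exp(1j * mu) / mu * (1 + 1j / mu)
    for m in range(1, m_max):
        h[m + 1] = (2 * m + 1) / mu * h[m] - h[m - 1]
    # Derivatives: h_0' = -h_1 and h_m' = h_{m-1} - (m+1)/mu * h_m.
    dh = [-h[1]] + [h[m - 1] - (m + 1) / mu * h[m]
                    for m in range(1, m_max + 1)]

    # Legendre polynomials P_m(cos gamma) by the three-term recurrence.
    t = math.cos(gamma)
    P = [1.0, t]
    for m in range(1, m_max):
        P.append(((2 * m + 1) * t * P[m] - m * P[m - 1]) / (m + 1))

    total = 0j
    for m in range(m_max + 1):
        total += (2 * m + 1) * 1j * (-1j) ** m * P[m] / dh[m]
    return total / mu ** 2
```

At low frequency the sphere is acoustically transparent (|H| → 1), while at high frequency the ear facing the source is boosted relative to the shadowed side, which is exactly the ILD exploited below.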


Feature Plane (Cylinder) and Signatures. ILD and IPD constitute an intermediate computational space for localization. At each frequency, a source gives rise to a point in the ILD-IPD plane. A broadband point source imprints a signature curve on this feature plane (a cylinder, since IPD is an angle) according to its location.


Symmetry of Static Localization. Sound pressure, and the resulting interaural functions, depend only on the polar angle; they are azimuth-invariant (an SO(2) symmetry). Sources on the same circle of directions have identical signatures; hence the localization confusion.


Symmetry and Rotations: azimuthal angle φ, polar angle θ; elevation and azimuth (coordinates as before).

Breaking the Symmetry. The localization functions are azimuth-invariant, but polar rotations do change them. Key mathematical step: infinitesimal rotations act as derivative operators, generating vector fields on the signatures. The derivatives are modulated by cos(elevation) -- thus elevation is extracted from horizontal rotation!
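The geometric mechanism can be checked in a few lines. With microphones at the poles of a horizontal interaural axis, cos(θ) = cos(elevation)·sin(azimuth), and its derivative under a horizontal rotation is -cos(elevation)·cos(azimuth); the cue and its rotational derivative together therefore determine |cos(elevation)|. A toy verification (function names are hypothetical, and this illustrates only the geometry, not the full signature-field computation):

```python
import math

def cos_polar(azim, elev):
    """Cosine of the polar angle theta between the source direction and the
    interaural axis (microphones at the poles, axis in the horizontal plane):
    cos(theta) = cos(elev) * sin(azim)."""
    return math.cos(elev) * math.sin(azim)

def abs_cos_elevation(azim, elev, delta=1e-4):
    """Recover |cos(elevation)| from one interaural cue plus its finite-
    difference derivative under a small horizontal head rotation, using
    cos^2(theta) + (d cos(theta)/d rotation)^2 = cos^2(elev)."""
    c = cos_polar(azim, elev)
    # A horizontal head rotation by delta shifts the relative azimuth.
    dc = (cos_polar(azim - delta, elev) - c) / delta
    return math.sqrt(c * c + dc * dc)
```

Note that at elevation 90 deg the derivative vanishes along with the cue itself: rotation changes nothing for a source overhead, consistent with the symmetry argument.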


Experimental Results. Broadband source: a sum of pure tones from 43 Hz to 11 kHz in steps of 43 Hz, passed through an anti-aliasing filter and sampled at 22 kHz. Knowles FG-3329 microphones were mounted on a head of 22.6 cm maximum diameter. To determine ILD and IPD, each 512-point segment (23 ms) of data was passed through an FFT. Measured IPD and ILD were smoothed by a nine-point moving average, yielding empirically determined (discrete) signature curves in ILD-IPD space. Localization computations are based on minimizing distance functions; on the mobile robot this step is implemented as a table lookup.
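The processing chain described above can be sketched as follows (NumPy; `ild_ipd`, `smooth`, and `localize` are illustrative names, and the window choice and quadratic distance function are assumptions rather than the exact implementation):

```python
import numpy as np

def ild_ipd(left, right, nfft=512):
    """ILD (dB) and IPD (rad) per frequency bin from one 512-sample frame."""
    w = np.hanning(nfft)
    L = np.fft.rfft(left[:nfft] * w)
    R = np.fft.rfft(right[:nfft] * w)
    eps = 1e-12  # guard against log of zero in silent bins
    ild = 20 * np.log10((np.abs(L) + eps) / (np.abs(R) + eps))
    ipd = np.angle(L * np.conj(R))
    return ild, ipd

def smooth(x, w=9):
    """Nine-point moving average, as in the experiments."""
    return np.convolve(x, np.ones(w) / w, mode='same')

def localize(ild, ipd, table):
    """Table lookup: return the direction whose stored signature minimizes a
    distance function on the ILD-IPD feature plane.  IPD differences are
    wrapped on the circle, reflecting the cylinder geometry of the space."""
    best, best_d = None, np.inf
    for direction, (ild0, ipd0) in table.items():
        dphi = np.angle(np.exp(1j * (ipd - ipd0)))  # circular distance
        d = np.sum((ild - ild0) ** 2) + np.sum(dphi ** 2)
        if d < best_d:
            best, best_d = direction, d
    return best
```

In the actual system the table holds one signature curve per candidate direction, measured as described above; the dynamic algorithm additionally compares signatures before and after a head rotation.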

Pumpkin head, side view (left) and top view (right). Minimum diameter 19 cm, maximum diameter 22.6 cm.

The plot on the left displays smoothed ILD against theoretical ILD for a source at 17.5 degrees in the horizontal plane; the plot on the right shows smoothed IPD against theoretical IPD for the same source.

The plot on the left shows distance functions for sources at 15 deg and 17.5 deg; the plot on the right shows distance functions for sources at 72.5 deg and 75 deg.

Performance plots for the IPD-ILD algorithm (left) and the traditional ITD algorithm (right).

New experiments in summer 2003 yielded raw data for further investigation of HRTF dependence on elevation. Front-back ambiguity resolution via the dynamic IPD-ILD algorithm has been implemented on the robot (see demo). There are plans to use a soldier helmet from ARL. Photo: courtesy of Michael Scanlon, ARL.

Accomplishments. First theoretical analysis and derivation of localization under rotation (no pinnae). Showed analytically that information on elevation (in particular, front-back) can be extracted binaurally from active horizontal rotation, with omni-directional sensors. Demonstration in an acoustically cluttered environment.

Implications and Applications. Psychophysics: auditory displays, the auditory component of virtual environments, and hearing aids. Bio-mimetic active robot head. References: A. A. Handzel and P. S. Krishnaprasad, "Bio-mimetic Sound Source Localization", IEEE Sensors Journal, 2(6). A. A. Handzel, S. B. Andersson, M. Gebremichael, and P. S. Krishnaprasad, "A Bio-mimetic Apparatus for Sound Source Localization", Proc. 42nd IEEE Conf. on Decision and Control, Dec. 2003 (in press).

Sound following

Front-Back Demo: without front-back distinction (left); with front-back distinction (right).