Activity Recognition
Taiwoo Park, May 7, 2013

Bao, Ling, and Stephen S. Intille. "Activity recognition from user-annotated acceleration data." Pervasive Computing. Springer Berlin Heidelberg, 2004.
Park, Taiwoo, et al. "E-Gesture: a collaborative architecture for energy-efficient gesture recognition with hand-worn sensor and mobile devices." Proceedings of the 9th ACM Conference on Embedded Networked Sensor Systems. ACM, 2011.
Some slides are from CSCI 546 course materials by James Reinebold, USC
Activity?
Higher-level activities: giving a lecture, having breakfast, playing soccer, …
Lower-level activities: lying on a bed, standing still, running, walking, …
An easy example
Assumptions:
- Only sensors on a smartphone are available: accelerometer, compass, gyroscope, light, …
- You can attach the smartphone anywhere on your body
- Only three target activities to recognize: running, standing still, lying on a bed
How can we recognize these activities?
An easy example (cont’d)
Features:
- Is the phone being shaken? Variance of the accelerometer signal over the last 3 seconds
- Phone orientation: average accelerometer y-axis value over the last 3 seconds

Shaken? | Orientation | Activity
No | Upright | Standing still
Yes | Upright | Running
No | Lying down | Lying on a bed
Yes | Lying down | Nothing
Activity recognition pipeline 5 Bader, Sebastian, and Thomas Kirste. "A Tutorial Introduction to Automated Activity and Intention Recognition." (2011).
An easy example (revisited)
Pipeline: Windowing → Feature extraction → Classification
Features: variance of the accelerometer signal over the last 3 seconds (is the phone being shaken?); average accelerometer y-axis value over the last 3 seconds (phone orientation)

Shaken? | Orientation | Activity
No | Upright | Standing still
Yes | Upright | Running
No | Lying down | Lying on a bed
Yes | Lying down | …?
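As a minimal sketch of this windowing → feature extraction → classification flow, the decision table above could be applied to a 3-second window of 3-axis accelerometer samples. The thresholds (0.5 for the shake variance, 5.0 m/s² for the upright test) and the function name are illustrative choices, not values from the slides:

```python
import numpy as np

def classify_window(accel_xyz, shake_var_threshold=0.5, upright_y_threshold=5.0):
    """Classify one window of 3-axis accelerometer samples (m/s^2).

    Features: variance of the signal magnitude (is the phone being
    shaken?) and mean of the y-axis (phone orientation, assuming
    gravity points along +y when the phone is upright).
    """
    accel_xyz = np.asarray(accel_xyz, dtype=float)
    magnitude = np.linalg.norm(accel_xyz, axis=1)
    shaken = magnitude.var() > shake_var_threshold
    upright = accel_xyz[:, 1].mean() > upright_y_threshold

    # Decision table from the slide.
    if not shaken and upright:
        return "standing still"
    if shaken and upright:
        return "running"
    if not shaken and not upright:
        return "lying on a bed"
    return "unknown"  # shaken while lying down: no label on the slide
```

For example, a constant window of (0, 9.8, 0) classifies as standing still, while the same window with large added noise classifies as running.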
Data collection
Semi-naturalistic, user-driven data collection:
- Obstacle course / worksheet (e.g., "12:30 – 12:50 Walking …")
- No researcher supervision while subjects performed the tasks
Timer synchronization; discard data within 10 seconds of the start and finish time of each activity.
Bao, Ling, and Stephen S. Intille. "Activity recognition from user-annotated acceleration data." Pervasive Computing. Springer Berlin Heidelberg, 2004.
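The "discard 10 seconds around the self-reported start/finish times" step can be sketched as a small helper that trims a labeled segment; the function and parameter names here are mine, not from the paper:

```python
def trim_activity(samples, fs, margin_s=10.0):
    """Drop the first and last `margin_s` seconds of a labeled activity
    segment, so transition noise around the self-reported start/stop
    times is excluded from training data.

    samples: sequence of sensor readings at sampling rate `fs` (Hz).
    Returns an empty list if the segment is shorter than 2 * margin_s.
    """
    margin = int(margin_s * fs)
    if len(samples) <= 2 * margin:
        return []
    return samples[margin:len(samples) - margin]
```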
Activities Walking Sitting and Relaxing Standing Still Watching TV Running Stretching Scrubbing Folding Laundry Brushing Teeth Riding Elevator Walking Carrying Items Working on Computer Eating or Drinking Reading Bicycling Strength-training Vacuuming Lying down & relaxing Climbing stairs Riding escalator
Data collection Source: Bao 2004
Sensors Used
Five ADXL210E accelerometers (manufactured by Analog Devices):
- Range of ±10 g
- 5 mm x 5 mm x 2 mm
- Low power, low cost
- Measures both static and dynamic acceleration
Sensor data was stored on a memory card using a "Hoarder Board".
Source:
Example Signals Source: Bao 2004
Activity recognition pipeline 12 Bader, Sebastian, and Thomas Kirste. "A Tutorial Introduction to Automated Activity and Intention Recognition." (2011).
Classification
Given a new data sample and data samples collected in advance (‘Running’, ‘Standing still’, ‘Lying on a bed’, ‘Walking’, …):
Question: which of the collected samples are most similar to the current one?
Methods: Naïve Bayes, nearest neighbor, decision table/tree, HMMs (Hidden Markov Models), …
Decision Table

Shaken? | Orientation | Activity
No | Upright | Standing still
Yes | Upright | Running
No | Lying down | Lying on a bed
Yes | Lying down | Nothing
Decision Trees
Make a tree where the non-leaf nodes are features and each leaf node is a classification. Each edge of the tree represents a value range of the feature. Move through the tree until you arrive at a leaf node. Generally, the smaller the tree, the better; finding the smallest tree is NP-hard.
Source:
Decision Tree Example
Phone orientation?
- Lying down → Lying on a bed
- Not lying down (upright) → Is the phone being shaken?
  - No → Standing still
  - Yes → Running
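A learned tree is just nested threshold tests, so traversal is a short loop. Below is a sketch where a node is a `(feature_index, threshold, low_subtree, high_subtree)` tuple and a leaf is a label; the tree mirrors the slide's example over the two features (mean accel y, variance), with illustrative thresholds:

```python
def predict(tree, x):
    """Walk a decision tree until a leaf (a string label) is reached."""
    while isinstance(tree, tuple):
        feature, threshold, low, high = tree
        tree = low if x[feature] <= threshold else high
    return tree

# Tree from the slide: split on orientation first, then on shaking.
# x = (mean_accel_y, variance); thresholds 5.0 and 0.5 are illustrative.
slide_tree = (0, 5.0,                 # mean accel y <= 5.0 -> lying down
              "lying on a bed",
              (1, 0.5,                # variance <= 0.5 -> not shaken
               "standing still",
               "running"))
```

Usage: `predict(slide_tree, (9.8, 0.1))` returns `"standing still"`.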
Nearest Neighbor Split up the domain into various dimensions, with each dimension corresponding to a feature. Classify an unknown point by having its K nearest neighbors “vote” on who it belongs to. Simple, easy to implement algorithm. Does not work well when there are no clusters. Source:
Nearest Neighbor Example
(Scatter plot of variance of accel vs. average value of accel y, with clusters for Lying on a bed, Standing still, and Running)
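A minimal k-nearest-neighbor classifier over such a feature space fits in a few lines of pure Python; the function name and the toy training points are illustrative:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote of its k nearest training samples.

    train: list of (feature_vector, label) pairs.
    query: feature vector (same dimensionality as the training vectors).
    """
    neighbors = sorted(train, key=lambda sample: math.dist(sample[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]
```

With training points clustered as in the slide's scatter plot, a query near the "running" cluster is voted into that class by its three nearest neighbors.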
Naïve Bayes Classifier
Computes the probability of a class given an observed data point from the prior probabilities estimated on the training set:
- P(B|A) = P(A|B) * P(B) / P(A)
Assumes that the features are independent. Relatively fast.
Source: cis.poly.edu/~mleung/FRE7851/f07/naiveBayesianClassifier.pdf
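For continuous sensor features, one common variant is Gaussian naïve Bayes: fit a per-class prior and a per-feature mean/variance, then pick the class maximizing the (log) posterior under the independence assumption. This is a generic sketch, not the classifier configuration from the paper; all names are mine:

```python
import math
from collections import defaultdict

def fit_gaussian_nb(samples):
    """samples: list of (feature_vector, label). Returns, per class,
    (prior, per-feature means, per-feature variances)."""
    by_class = defaultdict(list)
    for x, y in samples:
        by_class[y].append(x)
    model, total = {}, len(samples)
    for y, xs in by_class.items():
        n = len(xs)
        means = [sum(col) / n for col in zip(*xs)]
        varis = [sum((v - m) ** 2 for v in col) / n + 1e-9  # avoid zero variance
                 for col, m in zip(zip(*xs), means)]
        model[y] = (n / total, means, varis)
    return model

def predict_nb(model, x):
    """Return the class with the highest log posterior (features independent)."""
    def log_gauss(v, m, var):
        return -0.5 * (math.log(2 * math.pi * var) + (v - m) ** 2 / var)
    def score(y):
        prior, means, varis = model[y]
        return math.log(prior) + sum(log_gauss(v, m, s)
                                     for v, m, s in zip(x, means, varis))
    return max(model, key=score)
```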
Activity recognition pipeline 20 Bader, Sebastian, and Thomas Kirste. "A Tutorial Introduction to Automated Activity and Intention Recognition." (2011).
Feature Extraction
Time-domain features [Maurer 2006]
- Mean (average), root mean square, variance, …
FFT-based feature computation [Bao 2004]
- Sample at 76.25 Hz
- 512-sample windows (about 6.71 sec)
- Extract mean, energy, entropy, and correlation features
Maurer, Uwe, et al. "Activity recognition and monitoring using multiple sensors on different body positions." International Workshop on Wearable and Implantable Body Sensor Networks (BSN 2006). IEEE, 2006.
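These per-window features can be sketched as follows; the de-meaning before the FFT (to drop the DC component) and the exact normalizations are my choices, since the slide does not specify them:

```python
import numpy as np

def window_features(signal, fs=76.25):
    """Time- and frequency-domain features for one window of one axis.

    Mean, RMS, and variance are the time-domain features; FFT energy and
    spectral entropy are the frequency-domain ones (512 samples at
    76.25 Hz is about 6.71 s, as in the slide).
    """
    signal = np.asarray(signal, dtype=float)
    mean = signal.mean()
    rms = float(np.sqrt(np.mean(signal ** 2)))
    variance = float(signal.var())

    spectrum = np.abs(np.fft.rfft(signal - mean)) ** 2  # de-mean to drop DC
    total = spectrum.sum()
    if total == 0:
        energy, entropy = 0.0, 0.0
    else:
        energy = float(total / len(signal))
        p = spectrum / total          # normalized spectral distribution
        p = p[p > 0]
        entropy = float(-(p * np.log2(p)).sum())
    return {"mean": float(mean), "rms": rms, "var": variance,
            "energy": energy, "entropy": entropy}
```

For a pure 2 Hz sine sampled in a 512-sample window, the variance comes out near 0.5 and the spectral entropy is low, since the energy concentrates in a few FFT bins.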
Source: Bao 2004
Results
Decision tree was the best performer, but…

Classifier | Classification accuracy (%, leave-one-subject-out training)
Decision Table | …
Nearest Neighbor | …
Decision Tree | …
Naïve Bayes | …
Per-activity accuracy breakdown
Trying With Fewer Sensors

Accelerometer(s) left in | Difference in recognition accuracy
Hip | …
Wrist | …
Arm | …
Ankle | …
Thigh | …
Thigh and Wrist | …
Hip and Wrist | …

With only two accelerometers we can get good performance.
Lessons
Accelerometers can be used to effectively distinguish between everyday activities. Decision trees and nearest neighbor algorithms are good choices for activity recognition. Some sensor locations are more important than others. Selecting a good feature set is important to increase recognition accuracy.
E-Gesture: A Collaborative Architecture for Energy-efficient Gesture Recognition with Hand-worn Sensor and Mobile Devices 28
Motivation 29 Smartphone Mobile Gesture Interaction Framework … Mobile applications using hand gestures Wristwatch-type motion sensor (Accelerometer, Gyroscope) Mobility!!!!!
Challenge: Energy and Accuracy 30 Conventional gesture processing pipeline (for gesture recognition in stationary setting) Data Sensing Gesture Segmentation (Button, Algorithms) Classification (HMM, DTW) Gesture Samples (Candidate) Result Sensor Mobile Device Accel Gyro Continuous Raw Data Gesture ‘A’ or ‘B’ or non-gesture
Challenge: Energy and Accuracy
Conventional pipeline: Data Sensing → Gesture Segmentation (button, algorithms) → Classification (HMM, DTW)
Energy:
- Continuous raw data transmission: sensor lifetime 20 hrs (250 mAh); smartphone 24 hrs → 17 hrs
- Energy-hungry gyroscope (56%)
Accuracy:
- Mobility noises: over 90% false segmentation
- Classification accuracy only 70% under mobility
E-Gesture Architecture
Wristwatch sensor device: collaborative gesture sensing and segmentation (Accel + Gyro, with trigger and adaptation between them). Mobile device: classification (adaptive and multi-situation HMM) of gesture samples (candidates) into a result (e.g., laydown).
1. Device-wise collaboration: detection on the wristwatch, classification on the smartphone
2. Sensor-wise collaboration: the accel turns on the gyro for energy efficiency; the gyro adapts the accel's sensitivity for mobility changes
Accelerometer: (+) energy-efficient, (-) mobility-vulnerable. Gyroscope: (-) energy-hungry, (+) mobility-robust.
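The sensor-wise collaboration amounts to a small duty-cycling state machine: the cheap accelerometer runs continuously and powers the gyroscope only while motion is detected. This is a hypothetical sketch of that idea, not E-Gesture's actual controller; the threshold and idle-limit parameters are illustrative:

```python
def sensor_controller(accel_active_threshold, gyro_idle_limit):
    """Return a step function modeling accel-triggered gyro power control.

    Each call to step() feeds one accelerometer magnitude sample and
    returns whether the gyroscope should currently be powered. The gyro
    turns on as soon as motion exceeds the threshold and turns off after
    `gyro_idle_limit` consecutive idle samples.
    """
    gyro_on = False
    idle = 0

    def step(accel_magnitude):
        nonlocal gyro_on, idle
        moving = accel_magnitude > accel_active_threshold
        if moving:
            gyro_on, idle = True, 0
        elif gyro_on:
            idle += 1
            if idle >= gyro_idle_limit:
                gyro_on = False
        return gyro_on

    return step
```

The feedback in the other direction (the gyro adapting the accelerometer's sensitivity to the user's mobility level) would adjust `accel_active_threshold` at runtime, which is omitted here.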
Sensor-side Energy Savings
250 mAh Li-ion battery:
- Continuous sensing + transmission: 46 mW, 20 hrs
- Device-wise collaboration (reduced transmission): 39 mW (↓15%), 23.7 hrs (1.2x)
- Device-wise + sensor-wise collaboration (gyroscope power control, reduced transmission): 19 mW (↓59%), 48.7 hrs (2.4x)
59% less energy consumption, 2.4x longer lifetime.
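The lifetime figures follow from the measured power draws, assuming a nominal 3.7 V cell voltage (the slide states only the 250 mAh capacity; the voltage is my assumption):

```python
# 250 mAh at a nominal 3.7 V stores 250 * 3.7 = 925 mWh.
# Dividing stored energy by average power reproduces the slide's lifetimes.
capacity_mwh = 250 * 3.7

baseline   = capacity_mwh / 46  # continuous sensing + transmission, ~20.1 h
devicewise = capacity_mwh / 39  # reduced transmission, ~23.7 h (1.2x)
full       = capacity_mwh / 19  # plus gyro power control, ~48.7 h (2.4x)

print(round(baseline, 1), round(devicewise, 1), round(full, 1))
# prints: 20.1 23.7 48.7
```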
Mobile-side Energy Savings
Nexus One, 1400 mAh Li-ion battery, 3G/WiFi on:
- All processing on mobile: 122 mW, 42.1 hrs
- Device-wise collaboration (reduced transmission): 70 mW (↓43%), 74 hrs (1.8x)
Implementation Sensor node – Atmega128L MCU – Bluetooth, ZigBee – Sensors 3-Axis Accelerometer (ADXL335) 3-Axis Gyroscope (3 XV-3500CB) 40Hz Sensing – Vib motor Smartphones – Nokia N96, Google Nexus One – Bluetooth Radio 35
Sample Applications Swan Boat [Ubicomp09][MM09][ACE09] – Collaborative boat-racing exertion game – Utilizes hand gestures as additional game input Punching together, flapping together Mobile Music Player, Phone Call Manager – Featuring eye-free, touch-free controls – User can control the application by hand gestures 36
Conclusion
Mobile gestural interaction platform
- Collaborative gesture processing: 1.8x longer battery lifetime for the smartphone and 2.4x for the hand-worn sensor, while preserving the gyroscope's detection performance
- Mobility-robust gesture classification using HMMs: up to 94.6% classification accuracy during mobile usage, via a mobility-aware classification architecture
- Will greatly facilitate gesture-based mobile applications
Provided a novel sensor fusion scheme (serial fusion + feedback control) that saves energy while preserving detection accuracy.