
1 Amin Rasekh, Chien-An Chen, Yan Lu CSCE 666 Project Presentation

2  Introduction
    ◦ Human Activity Recognition
    ◦ Active Learning
   Goals
   Literature Review
   Methods
    ◦ Data Collection and Feature Extraction
    ◦ Classification Techniques
    ◦ Query Strategies of Active Learning
   Results
   Conclusions

3 Human activity recognition: using sensors to identify human activities such as walking, jogging, and limping.
   Motivation
    ◦ Human surveys (studying daily activities)
    ◦ Medical care (diabetes, elderly care, rehabilitation)
   Sensor types
    ◦ Inertial sensors (accelerometer, gyroscope)
    ◦ Camera
    ◦ GPS
   A smartphone is small and convenient to carry around, and its computational resources are powerful enough for our purpose.

4 Passive learning: what we have studied in class.
   Active learning: we can achieve greater accuracy with fewer training labels if we choose the data from which we learn.
   Motivation: to minimize the time and labor required for labeling abundant data.

5  Design a simple, lightweight, and accurate system that can learn human activities with minimal user interaction.
    ◦ Compare models and find the one that best fits our system in terms of accuracy and efficiency.
    ◦ Reduce labeling time and labor using active learning.

6  Use one or multiple cameras for vision-based recognition [5, 6].
   Install multiple inertial sensors on the body [1, 2, 3, 4].
   Combine vision-based and inertial sensor systems [7].
   Classifiers such as Bayesian decision making, KNN, SVM, and ANN have been studied [10, 11].
   Features from the time domain, the frequency domain, and wavelet analysis have been studied [8, 9].

7  Data Collection
    ◦ Smartphone: HTC EVO 4G
    ◦ Sensor: 3D accelerometer, 50 Hz
    ◦ Cellphone carried in pockets around the waist
    ◦ 3 people, 5 activities: walking, walking upstairs, walking downstairs, jogging, limping
   Feature Generation (31 features in total)
    ◦ Sampling window: 256 samples (5.12 seconds)
    ◦ Time domain: variance, mean, 25th percentile, 75th percentile, correlation, average resultant acceleration
    ◦ Frequency domain: energy, entropy, centroid frequency, peak frequency
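The slides do not show how each feature is computed; the following is a minimal sketch of how the listed time- and frequency-domain features could be derived from one 256-sample window of 3-axis accelerometer data. The function name, the per-axis grouping, and the use of NumPy are illustrative assumptions, not the authors' code.

```python
# Illustrative feature extraction for one 256-sample, 3-axis accelerometer
# window sampled at 50 Hz (about 5.12 s). Hypothetical sketch, not the
# authors' implementation; the exact per-axis grouping is assumed.
import numpy as np

def extract_features(window, fs=50.0):
    """window: (256, 3) array of x, y, z acceleration."""
    feats = []
    for axis in range(3):
        a = window[:, axis]
        # Time-domain features
        feats += [a.var(), a.mean(),
                  np.percentile(a, 25), np.percentile(a, 75)]
        # Frequency-domain features from the power spectrum
        spectrum = np.abs(np.fft.rfft(a)) ** 2
        freqs = np.fft.rfftfreq(len(a), d=1.0 / fs)
        p = spectrum / spectrum.sum()                  # normalized power
        feats += [spectrum.sum(),                      # energy
                  -(p * np.log2(p + 1e-12)).sum(),     # spectral entropy
                  (freqs * p).sum(),                   # centroid frequency
                  freqs[spectrum[1:].argmax() + 1]]    # peak frequency (skip DC)
    # Pairwise correlations between the three axes
    feats += [np.corrcoef(window[:, i], window[:, j])[0, 1]
              for i, j in [(0, 1), (0, 2), (1, 2)]]
    # Average resultant acceleration over the window
    feats.append(np.sqrt((window ** 2).sum(axis=1)).mean())
    return np.array(feats)
```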

8  Classification Techniques
    ◦ Quadratic
    ◦ K-Nearest Neighbors
    ◦ Support Vector Machines
    ◦ Artificial Neural Networks
   Query Strategies Based on Uncertainty
    ◦ Quadratic: distance from the discriminant curve
    ◦ KNN: entropy
    ◦ SVM: distance from the boundary
    ◦ ANN: discriminant function values

9 Query is performed for the unlabeled instance that is nearest to the discriminant curve or SVM boundary.
   [Figure: random query vs. active query]
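A minimal pool-based loop for the SVM case is sketched below: the model is retrained after each query, and the next query is the unlabeled instance closest to the decision boundary. The data arrays and function name are hypothetical and scikit-learn is assumed; this is a sketch, not the authors' implementation.

```python
# Pool-based active learning with an SVM, querying the unlabeled instance
# closest to the decision boundary (illustrative sketch, hypothetical data).
import numpy as np
from sklearn.svm import SVC

def svm_active_learning(X_pool, y_pool, labeled, n_queries=50):
    """labeled: indices whose labels are known (should cover >= 2 classes)."""
    labeled = list(labeled)
    unlabeled = [i for i in range(len(X_pool)) if i not in set(labeled)]
    model = SVC(kernel="rbf")
    for _ in range(n_queries):
        model.fit(X_pool[labeled], y_pool[labeled])
        margins = np.abs(model.decision_function(X_pool[unlabeled]))
        if margins.ndim > 1:                  # multi-class: use the smallest margin
            margins = margins.min(axis=1)
        query = unlabeled[int(np.argmin(margins))]   # nearest to the boundary
        labeled.append(query)                 # the oracle supplies y_pool[query]
        unlabeled.remove(query)
    return model
```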

10 Query is performed for the unlabeled instance that has the maximum entropy:
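The equation on this slide is an image that does not survive in the transcript; a standard way to write the maximum-entropy query criterion (a textbook reconstruction, not copied from the slide) is

```latex
x^{*} \;=\; \underset{x \in \mathcal{U}}{\arg\max}\;\; -\sum_{c=1}^{C} P(y = c \mid x)\,\log P(y = c \mid x)
```

where U is the unlabeled pool and, for KNN, P(y = c | x) can be estimated as the fraction of the k nearest neighbors of x that belong to class c.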

11

12

13 ◦ Sequential Forward Selection (wrapper)
    ◦ Algorithm: SVM
    ◦ 10-fold cross-validation for each feature subset
    ◦ Best features: variance, 25th percentile, frequency-domain entropy, peak frequency
    ◦ Classification rate of SVM + LDA: 78%
    ◦ Classification rate of SVM + SFS: 84%
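A minimal sketch of this wrapper-style sequential forward selection, assuming scikit-learn, an RBF-kernel SVM as the wrapped classifier, and a hypothetical function name (not the authors' code):

```python
# Sequential forward selection wrapped around an SVM, scoring each candidate
# feature subset with 10-fold cross-validation (illustrative sketch).
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def sequential_forward_selection(X, y, n_features=4):
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < n_features and remaining:
        scores = []
        for f in remaining:
            cols = selected + [f]
            acc = cross_val_score(SVC(kernel="rbf"), X[:, cols], y, cv=10).mean()
            scores.append((acc, f))
        best_acc, best_f = max(scores)        # feature that helps most this round
        selected.append(best_f)
        remaining.remove(best_f)
    return selected
```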

14 [Figure: data projected onto the first and second LDA components]

15 [Figure: results for the KNN, SVM, and Quadratic classifiers]

16 [Figure: active learning with SVM vs. random sampling with SVM; panels for the Quadratic, KNN, and SVM classifiers]

17  Improve the performance of active learning for the activity recognition problem
    ◦ Clustering
    ◦ Hybrid query strategies
   Add more activities, such as biking

18  We achieved a classification rate of over 80% on 5 human activities using a smartphone.
   The result is robust to common positions and orientations of the cellphone.
   SVM + SFS gives the best performance and is promising for running on mobile devices.
   The performance of active learning is highly sensitive to the type of problem.

19 Thank you! Questions?

20 1) L. Bao and S. S. Intille, "Activity recognition from user-annotated acceleration data," Pervasive Computing, Lecture Notes in Computer Science, vol. 3001, pp. 1–17, 2004.
2) U. Maurer, A. Rowe, A. Smailagic, and D. Siewiorek, "Location and activity recognition using eWatch: A wearable sensor platform," Ambient Intelligence in Everyday Life, Lecture Notes in Computer Science, vol. 3864, pp. 86–102, 2006.
3) J. Parkka, M. Ermes, P. Korpipaa, J. Mantyjarvi, J. Peltola, and I. Korhonen, "Activity classification using realistic data from wearable sensors," IEEE Trans. Inf. Technol. Biomed., vol. 10, no. 1, pp. 119–128, Jan. 2006.
4) N. Wang, E. Ambikairajah, N. H. Lovell, and B. G. Celler, "Accelerometry based classification of walking patterns using time-frequency analysis," in Proc. 29th Annu. Conf. IEEE Eng. Med. Biol. Soc., Lyon, France, 2007, pp. 4899–4902.
5) T. B. Moeslund, A. Hilton, and V. Krüger, "A survey of advances in vision-based human motion capture and analysis," Comput. Vision Image Understanding, vol. 104, no. 2–3, pp. 90–126, 2006.
6) T. B. Moeslund and E. Granum, "A survey of computer vision-based human motion capture," Comput. Vision Image Understanding, vol. 81, no. 3, pp. 231–268, 2001.
7) Y. Tao, H. Hu, and H. Zhou, "Integration of vision and inertial sensors for 3D arm motion tracking in home-based rehabilitation," Int. J. Robotics Res., vol. 26, no. 6, pp. 607–624, 2007.
8) S. J. Preece, J. Y. Goulermas, L. P. J. Kenney, and D. Howard, "A comparison of feature extraction methods for the classification of dynamic activities from accelerometer data," IEEE Trans. Biomed. Eng., 2008 (in press).
9) N. Ravi, N. Dandekar, P. Mysore, and M. L. Littman, "Activity recognition from accelerometer data," in Proc. AAAI, 2005, pp. 1541–1546.
10) S. J. Preece, J. Y. Goulermas, L. P. J. Kenney, D. Howard, K. Meijer, and R. Crompton, "Activity identification using body-mounted sensors – a review of classification techniques," Physiol. Meas., vol. 30, pp. R1–R33, 2009.
11) K. Altun, B. Barshan, and O. Tunçel, "Comparative study on classifying human activities with miniature inertial and magnetic sensors," Pattern Recogn., vol. 43, no. 10, pp. 3605–3620, 2010, doi:10.1016/j.patcog.2010.04.019.

21

22  Support Vector Machine
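The SVM equations on this appendix slide are images that do not survive in the transcript; the standard soft-margin formulation (a textbook reconstruction, not copied from the slide) is

```latex
\min_{\mathbf{w},\, b,\, \boldsymbol{\xi}} \;\; \frac{1}{2}\lVert \mathbf{w} \rVert^{2} + C \sum_{i=1}^{N} \xi_{i}
\quad \text{subject to} \quad y_{i}\bigl(\mathbf{w}^{\top}\phi(\mathbf{x}_{i}) + b\bigr) \ge 1 - \xi_{i}, \;\; \xi_{i} \ge 0,
```

where φ is the feature map induced by the kernel and C trades off margin width against training errors.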

23

