
1 PULSAR: Perception, Understanding, Learning Systems for Activity Recognition
Theme: Cognitive Systems (Cog C). Multimedia data: interpretation and man-machine interaction.
Multidisciplinary team: computer vision, artificial intelligence, software engineering.

2 Team presentation (September 2007)
5 Research Scientists: François Bremond (CR1 Inria, HDR), Guillaume Charpiat (CR2 Inria, from 15 December 2007), Sabine Moisan (CR1 Inria, HDR), Annie Ressouche (CR1 Inria), Monique Thonnat (DR1 Inria, HDR, team leader)
1 External Collaborator: Jean-Paul Rigault (Prof. UNSA)
1 Post-doc: Sundaram Suresh (PhD Bangalore, ERCIM)
5 Temporary Engineers: B. Boulay (PhD), E. Corvee (PhD), R. Ma (PhD), L. Patino (PhD), V. Valentin
8 PhD Students: B. Binh, N. Kayati, L. Le Thi, M.B. Kaaniche, V. Martin, A.T. Nghiem, N. Zouba, M. Zuniga
1 External Visitor: Tomi Raty (VTT Finland)

3 PULSAR objective: Cognitive Systems for Activity Recognition
Activity recognition: real-time semantic interpretation of dynamic scenes.
Dynamic scenes: several interacting human beings, animals, or vehicles; long-term activities (hours or days); large-scale activities in the physical world (located in large spaces); observed by a network of video cameras and sensors.
Real-time semantic interpretation: real-time analysis of sensor output; semantic interpretation with a priori knowledge of interesting behaviors.

4 PULSAR objective: Cognitive Systems for Activity Recognition
Cognitive systems: perception, understanding, and learning systems.
Scientific objectives: physical object recognition; activity understanding and learning; system design and evaluation.
Two complementary research directions: Scene Understanding for Activity Recognition, and Activity Recognition Systems.

5 PULSAR target applications
Two application domains: safety/security (e.g. airport monitoring) and healthcare (e.g. assistance to the elderly).

6 Cognitive Systems for Activity Recognition: Airport Apron Monitoring
Outdoor scenes with complex interactions between humans, ground vehicles, and aircraft.
Aircraft preparation: optional tasks, independent tasks, temporal constraints.

7 Cognitive Systems for Activity Recognition: Monitoring Daily Living Activities of the Elderly
Goal: increase independence and quality of life: enable people to live at home; delay entrance into a nursing home; relieve family members and caregivers.
Approach: detect changes in behavior (missing activities, disorder, interruptions, repetitions, inactivity); calculate the degree of frailty of elderly people.
Example of normal activity: meal preparation (in kitchen, 11h-12h); eating (in dining room, 12h-12h30); resting, TV watching (in living room, 13h-16h); …
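The behavior-change detection sketched on this slide can be illustrated with a minimal check of a day's detected activities against an expected routine. This is a hypothetical sketch, not the PULSAR implementation: the activity names and the `behaviour_changes` helper are illustrative, and real frailty assessment would weigh many more cues.

```python
# Hypothetical sketch: flag missing, repeated, or out-of-order activities
# relative to the expected daily routine given on the slide.
EXPECTED = ["meal_preparation", "eating", "resting"]  # expected daily order

def behaviour_changes(observed):
    """Compare a day's observed activity labels against EXPECTED.

    Returns (missing, repetitions, out_of_order).
    """
    missing = [a for a in EXPECTED if a not in observed]
    # keep only the expected activities, in observation order
    seen = [a for a in observed if a in EXPECTED]
    repetitions = sorted({a for a in seen if seen.count(a) > 1})
    # first occurrence of each activity, order-preserving
    first_seen = list(dict.fromkeys(seen))
    out_of_order = first_seen != [a for a in EXPECTED if a in first_seen]
    return missing, repetitions, out_of_order

changes = behaviour_changes(["meal_preparation", "resting"])  # "eating" is missing
```

A real system would of course attach time intervals and locations to each activity (as the slide's 11h-16h schedule suggests) rather than comparing bare labels.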

8 Gerhome laboratory (CSTB, PULSAR), http://gerhome.cstb.fr
Sensors: water sensor; contact sensors to detect "open/close"; presence sensor.

9 From ORION to PULSAR: Orion contributions
4D semantic approach to video understanding
Program supervision approach to software reuse
VSIP platform for real-time video understanding → Keeneo start-up
LAMA platform for knowledge-based system design

10 From ORION to PULSAR
1) New research axis: software architecture for activity recognition
2) New application domain: healthcare (e.g. assistance to the elderly)
3) New research axis: machine learning for cognitive systems (mixing perception, understanding, and learning)
4) New data types: video enriched with other sensors (e.g. contact sensors, …)

11 PULSAR research directions: Perception for Activity Recognition (F. Bremond, G. Charpiat, M. Thonnat)
Goal: extract rich physical object descriptions.
Difficulty: obtaining real-time performance and robust detection in dynamic and complex situations.
Approach: perception methods for shape, gesture, and trajectory description of multiple objects; multimodal data fusion from large sensor networks sharing the same 3D reference frame; formalization of the conditions of use of the perception methods.

12 PULSAR research directions: Understanding for Activity Recognition (M. Thonnat, F. Bremond, S. Moisan)
Goal: physical-object activity recognition based on a priori models.
Difficulty: vague end-user specifications and numerous observation conditions.
Approach: a perceptual event ontology interfacing the perception and human-operator levels; user-friendly activity-model formalisms based on this ontology; real-time activity recognition algorithms handling perceptual-feature uncertainty and activity-model complexity.

13 PULSAR research directions: Learning for Activity Recognition (F. Bremond, G. Charpiat, M. Thonnat)
Goal: learning to decrease the effort needed to build activity models.
Difficulty: obtaining meaningful positive and negative samples.
Approach: automatic perception-method selection through performance evaluation against ground truth; dynamic parameter setting based on context clustering and parameter-value optimization; learning perceptual event-concept detectors; learning the mapping between basic event concepts and activity models; learning complex activity models from frequent event patterns.

14 PULSAR research directions: Activity Recognition Systems (S. Moisan, A. Ressouche, J.-P. Rigault)
Goal: provide new techniques for the easy design of effective and efficient activity recognition systems.
Difficulty: reusability vs. efficiency (from the VSIP library and LAMA platform to an AR platform).
Approach: activity models (models, languages, and tools for all AR tasks); platform architecture (design a platform with real-time response and parallel and distributed capabilities); system safeness (adapt state-of-the-art verification & validation techniques to AR system design).

15 Objectives for the next period
PULSAR: Scene Understanding for Activity Recognition
Perception: multi-sensor fusion, interest points and mobile regions, shape statistics
Understanding: uncertainty, 4D coherence, ontology for activity recognition
Learning: parameter setting, event detectors, video mining
PULSAR: Activity Recognition Systems (from the LAMA platform to an AR platform)
Model extensions: modeling time and scenarios
Architecture: real-time response, parallelization, distribution
User-friendliness and safeness of use: theory and tools for a component framework, scalability of verification methods

16 Multimodal Fusion for Monitoring Daily Living Activities of the Elderly
[Illustrations: person recognition, 3D posture recognition, and multimodal recognition for the "meal preparation" and "resting in living room" activities]

17 Multimodal Fusion for Monitoring Daily Living Activities of the Elderly: resting in living room activity
[Illustrations: person recognition and 3D posture recognition]

18 Multimodal Fusion for Monitoring Daily Living Activities of the Elderly: meal preparation activity
[Illustrations: person recognition and multimodal recognition]

19 Understanding and Learning for Airport Apron Monitoring
European project AVITRACK (2004-2006): predefined activities.
European project COFRIEND (2008-2010): activity learning, dynamic configurations.

20 Activity Recognition Platform Architecture
Application level: airport monitoring, vandalism detection, elderly monitoring
Task level: object recognition and tracking, scenario recognition, program supervision
Component level: perception components, understanding components, learning components
Cross-cutting facilities: configuration and deployment tools; communication and interaction facilities; ontology management; parser generation; usage support tools; component assembly; verification; simulation & testing

21 PULSAR Project-team. Any questions?

22 Video Data Mining
Objective: knowledge extraction for video activity monitoring with unsupervised learning techniques.
Methods: trajectory characterization through clustering with Self-Organizing Maps (SOM), and behaviour analysis of objects with relational analysis [1], i.e. analysis of the similarity between two individuals i, i' given a variable.
[Illustrations: SOM lattice with prototypes m_1, m_2, …, m_k, …, m_K; relational analysis]
[1] Benhadda H., Marcotorchino F., "Introduction à la similarité régularisée en analyse relationnelle", Revue de Statistique, Vol. 46, N°1, pp. 45-69, 1998.
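The SOM-based trajectory clustering mentioned on this slide can be sketched in a few lines. This is a minimal, self-contained 1-D SOM written for illustration, not the PULSAR code: trajectories are reduced to hypothetical (start_x, start_y, end_x, end_y) feature vectors, and the two toy clusters stand in for flows such as "north doors to vending machines".

```python
import math
import random

def som_train(data, n_units=4, n_iters=2000, seed=0):
    """Train a 1-D Self-Organizing Map; returns the unit prototypes m_1..m_K."""
    rng = random.Random(seed)
    dim = len(data[0])
    # initialize prototypes from random samples
    units = [list(rng.choice(data)) for _ in range(n_units)]
    for t in range(n_iters):
        x = rng.choice(data)
        # learning rate and neighborhood radius decay over time
        lr = 0.5 * (1.0 - t / n_iters)
        radius = max(1.0, (n_units / 2.0) * (1.0 - t / n_iters))
        # best-matching unit = prototype closest to the sample
        bmu = min(range(n_units),
                  key=lambda k: sum((units[k][d] - x[d]) ** 2 for d in range(dim)))
        for k in range(n_units):
            # Gaussian neighborhood on the 1-D map lattice
            h = math.exp(-((k - bmu) ** 2) / (2.0 * radius ** 2))
            for d in range(dim):
                units[k][d] += lr * h * (x[d] - units[k][d])
    return units

def som_assign(units, x):
    """Cluster label = index of the nearest prototype."""
    dim = len(x)
    return min(range(len(units)),
               key=lambda k: sum((units[k][d] - x[d]) ** 2 for d in range(dim)))

# Two toy trajectory flows, encoded as (start_x, start_y, end_x, end_y),
# with a small per-trajectory jitter:
base = [(0.0, 0.0, 1.0, 9.0)] * 10 + [(0.1, 0.2, 9.0, 0.5)] * 10
trajs = [[v + random.Random(i).uniform(-0.1, 0.1) for v in t]
         for i, t in enumerate(base)]
units = som_train(trajs, n_units=2)
labels = [som_assign(units, t) for t in trajs]
```

The relational-analysis step would then compare the resulting per-object cluster memberships pairwise, as in the cited Benhadda-Marcotorchino similarity measure.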

23 Video Data Mining Results (2052 trajectories)
Step 1: trajectory clustering (SOM). Trajectory cluster 1: walk from north doors to vending machines. Trajectory cluster 9: walk from north gates to south exit.
Step 2: behaviour relational analysis. Behaviour cluster 19: individuals (not groups) buy a ticket at the entrance.

24 Multimodal Fusion for Monitoring Daily Living Activities of the Elderly
Scenario for meal preparation:

Composite Event (Use_microwave,
  Physical Objects ((p: Person), (Microwave: Equipment), (Kitchen: Zone))
  Components ((p_inz: PrimitiveState inside_zone (p, Kitchen))
              (open_mw: PrimitiveEvent Open_Microwave (Microwave))
              (close_mw: PrimitiveEvent Close_Microwave (Microwave)))
  Constraints ((open_mw during p_inz)
               (open_mw->StartTime + 10s < close_mw->StartTime)))

The inside_zone state is detected by the video camera; the microwave open/close events are detected by a contact sensor.
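The recognition semantics of this composite event can be paraphrased in ordinary code. A minimal sketch, with assumptions made explicit: the Interval type, the during helper, and the reading of the 10-second constraint as "the microwave is closed at least 10 s after being opened" are illustrative, not PULSAR's actual scenario engine.

```python
from dataclasses import dataclass

@dataclass
class Interval:
    start: float  # seconds since some reference time
    end: float

def during(a: Interval, b: Interval) -> bool:
    # Allen-style "during": a occurs entirely within b
    return b.start <= a.start and a.end <= b.end

def use_microwave(p_inz: Interval, open_mw: Interval, close_mw: Interval) -> bool:
    # Composite event: the microwave is opened while the person is in the
    # kitchen, and closed more than 10 s after being opened.
    return during(open_mw, p_inz) and open_mw.start + 10 < close_mw.start

# Person in the kitchen for 30 minutes; microwave opened at t=300 s,
# closed at t=360 s: the composite event is recognized.
kitchen = Interval(0, 1800)
recognized = use_microwave(kitchen, Interval(300, 300), Interval(360, 360))
```

Instantaneous events are modeled here as zero-length intervals; a real engine would also carry the uncertainty of each detection, as slide 12 notes.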

