ETISEO Benoît GEORIS, François BREMOND and Monique THONNAT ORION Team, INRIA Sophia Antipolis, France Nice, May 10-11th 2005.

Presentation transcript:

ETISEO
Benoît GEORIS, François BREMOND and Monique THONNAT
ORION Team, INRIA Sophia Antipolis, France
Nice, May 10-11th 2005

2 Fair Evaluation (1/4)
- Unbiased and transparent evaluation protocol
- Large participation
- Meaningful evaluation

3 Fair Evaluation (2/4)
Unbiased and transparent evaluation protocol
- Each participant should have the same effective time for testing and evaluation
- Main decisions should be collegial: common agreement on videos, ground truth and metrics
- Each participant keeps control over the dissemination of its results

4 Fair Evaluation (3/4)
Large participation: what people want
- Meet partner expectations: a large variety of videos
- Require minimal effort for tuning and testing: small video sets
Large participation: what we can propose
- Possibility to adapt algorithm experimentation (providing a mask, choosing options such as shadows, low contrast, wind…)
- A graduation of difficulty from easy to hard
- Minimal overhead for result creation (format, data exchange,…)

5 Fair Evaluation (4/4)
Meaningful evaluation
- Clear visualization (straightforward metrics and graphical presentation of results)
- For each partner, presentation of results, problem by problem, for the problems studied by ETISEO, with variations from easy to hard
- For a specific problem, global comparison of partner performances (e.g., sensitivity of event recognition with respect to image resolution)

6 Video Selection (1/4)
- New or already published videos?
- Contextual information associated with videos
- Video characterization

7 Video Selection (2/4)
New or already published videos?
- Advantages of new videos:
  - Fair, since the available time is the same for everyone
  - Dedicated to specific problems, with graduations of difficulty
- Advantages of old videos:
  - Easy, because they are already available and many people have already tested on them
  - Allow comparison with algorithms outside the ETISEO project
- Mixed solution:
  - Use both old and new videos
  - Share videos with other ongoing evaluation programs

8 Video Selection (3/4)
Contextual information associated with videos
- 3D empty scene model
  - Minimum information: a few 3D distances drawn on the image
  - Maximum information: the full 3D scene model made available
- Camera calibration
  - Set of 2D and 3D points
  - Calibration matrix, taking lens distortion into account or not
- 3D models of the objects of interest
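As an illustration of the contextual data listed above, here is a minimal, hypothetical sketch of how 2D/3D point correspondences and a calibration matrix could be packaged per camera; the element names (sceneContext, correspondence, projectionMatrix) are illustrative assumptions, not the actual ETISEO distribution format:

    <!-- Hypothetical contextual-information file; element names are illustrative only -->
    <sceneContext camera="C1">
      <calibration distortionCorrected="false">
        <!-- one 2D image point / 3D world point pair per landmark -->
        <correspondence image="312 540" world="12.4 3.1 0.0"/>
        <correspondence image="610 498" world="18.0 3.1 0.0"/>
        <!-- 3x4 projection matrix, row by row -->
        <projectionMatrix rows="3" cols="4">
          870.2   0.0 640.0 -1200.5
            0.0 871.8 360.0  -310.2
            0.0   0.0   1.0    -4.7
        </projectionMatrix>
      </calibration>
      <emptySceneModel file="scene_model.wrl"/> <!-- full 3D model, when available -->
    </sceneContext>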

9 Video Selection (4/4)
Video characterization
- Partners should specify which of the 30 possible problems they want to address and which they do not
- From what partners want, we can select x (5?) problems (e.g., dynamic occlusion) to be studied and generate a series of y (10?) videos
- For what partners do not want, we should provide tools to prevent disturbance from other, simultaneous problems (e.g., wind occurring during an occlusion)

10 Data terminology (1/2)
- Video sequences, video clips and scenes
- Blobs, moving regions, physical objects of interest and contextual objects
- Ground truth, annotation and reference data
- Criteria and metrics

11 Data terminology (2/2)
Definition of video analysis tasks
- 1) Detection of physical objects of interest
- 2) Classification of physical objects of interest
- 3) Tracking of physical objects of interest
- 4) Event recognition
Delimitation of the tasks to be evaluated
- Different types of combination (task 1 alone vs. combined tasks 1 & 2)
- Evaluation of each task whatever the combination

12 Ground Truth and Metrics
- For each task, definition of the ground truth and metrics
- Annotation tool (ViPER, Reading tool,…)
- Format for the ground truth (XML, MPEG-7,…)

13 MPEG-7 Example
[MPEG-7 XML listing; the markup was lost in transcription. The surviving values are media time points and durations such as T12:55:00:25F1000 and PT0H1M5S725N1000F.]
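A hedged reconstruction of the listing is sketched below: a video segment described with MPEG-7 MediaTimePoint and MediaDuration elements, using the time values that survived on the slide. The surrounding nesting follows the standard MPEG-7 schema, but the exact descriptors shown on the original slide are not recoverable:

    <!-- Illustrative reconstruction; only the time values come from the original slide -->
    <Mpeg7 xmlns="urn:mpeg:mpeg7:schema:2001"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
      <Description xsi:type="ContentEntityType">
        <MultimediaContent xsi:type="VideoType">
          <Video>
            <TemporalDecomposition>
              <VideoSegment id="segment-1">
                <MediaTime>
                  <MediaTimePoint>T12:55:00:25F1000</MediaTimePoint>
                  <MediaDuration>PT0H1M5S725N1000F</MediaDuration>
                </MediaTime>
              </VideoSegment>
            </TemporalDecomposition>
          </Video>
        </MultimediaContent>
      </Description>
    </Mpeg7>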

14 XML Example
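The XML listing itself was lost in transcription. Purely as an illustration of what a plain-XML ground-truth file could look like (per-frame physical objects of interest with a type and bounding box), here is a hypothetical sketch; the tag names (groundTruth, frame, object, bbox) are assumptions, not the ETISEO format:

    <!-- Hypothetical sketch; tag and attribute names are illustrative only -->
    <groundTruth sequence="sequence-01" camera="C1">
      <frame number="1245">
        <object id="3" type="PERSON">
          <bbox x="212" y="148" width="36" height="92"/>
        </object>
        <object id="7" type="VEHICLE">
          <bbox x="410" y="200" width="120" height="64"/>
        </object>
      </frame>
    </groundTruth>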

15 Ground Truth Definition: example
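The example on this slide did not survive transcription. As a purely hypothetical illustration of what an event-level ground-truth entry could contain (event name, involved objects and time bounds, matching task 4 above), a short sketch follows; none of the names below come from the original slide:

    <!-- Hypothetical event-level ground truth; all names are illustrative only -->
    <event name="enters_forbidden_zone">
      <involvedObject id="3" type="PERSON"/>
      <zone name="forbidden_zone_1"/>
      <startFrame>1245</startFrame>
      <endFrame>1310</endFrame>
    </event>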