ETISEO. Benoît GEORIS and François BREMOND, ORION Team, INRIA Sophia Antipolis, France. Lille, December 15th-16th 2005.



2 Fair Evaluation
- Unbiased and transparent evaluation protocol
- Large participation
- Meaningful evaluation

3 Fair Evaluation: Meaningful evaluation
- ETISEO video sequences are dedicated to specific problems, each with several levels of difficulty
- Definition of specific ground truth, reference data and metrics, taking advantage of contextual information
- For each partner, presentation of results for the problems studied by ETISEO, one after the other, with variations from easy to hard
- For a specific problem, global comparison of partner performances (e.g., sensitivity of tracking with respect to occlusion)
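To make the idea of comparing detector output against ground truth concrete, here is a minimal sketch of frame-level evaluation: detections are greedily matched one-to-one to ground-truth boxes by intersection-over-union, and precision/recall are reported. This is an illustrative example only, not the official ETISEO metric; the box format and the 0.5 threshold are assumptions.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def evaluate_frame(detections, ground_truth, threshold=0.5):
    """Greedy one-to-one matching of detections to ground truth.

    Returns (precision, recall) for a single frame. A detection counts as a
    true positive if its best unmatched ground-truth box has IoU >= threshold.
    """
    unmatched_gt = list(ground_truth)
    tp = 0
    for det in detections:
        best = max(unmatched_gt, key=lambda g: iou(det, g), default=None)
        if best is not None and iou(det, best) >= threshold:
            tp += 1
            unmatched_gt.remove(best)
    precision = tp / len(detections) if detections else 1.0
    recall = tp / len(ground_truth) if ground_truth else 1.0
    return precision, recall
```

Running the same matcher over sequences of increasing difficulty (occlusion, low contrast, light changes) is one way to expose how performance degrades from easy to hard conditions.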

4 First Evaluation Cycle Analysis
Evaluation of the participation:
- Questionnaire & video clips
- Number of participants and focus

From simple (common) conditions: no occlusion, middle view, well contrasted, no artifact, no light change, color, mono-camera
To complex conditions:
- Detection
- Classification
- Tracking
- Multiple cameras
- Events

What can be modified for the 2nd evaluation cycle?
- Video clips & ground truth (now)
- Reference data, metrics & comparison (later)
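Comparing partners across the simple-to-complex conditions above amounts to bucketing per-sequence scores by condition and aggregating. The sketch below shows one hypothetical way to do this; the record fields ("partner", "condition", "score") are illustrative and not ETISEO's actual data schema.

```python
from collections import defaultdict

def summarize(results):
    """Aggregate per-sequence scores into mean scores per (partner, condition).

    results: iterable of dicts with keys 'partner', 'condition', 'score'.
    Returns a dict mapping (partner, condition) -> mean score, so each
    partner's performance can be read off from easy to hard conditions.
    """
    buckets = defaultdict(list)
    for r in results:
        buckets[(r["partner"], r["condition"])].append(r["score"])
    return {key: sum(scores) / len(scores) for key, scores in buckets.items()}
```

A table of such means, one row per partner and one column per condition, is a compact way to present the "global comparison of partner performances" that the evaluation aims for.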