INRIA, Nice, December 7th-8th 2006: Data Set and Ground Truth



Data set and Ground truth
- Data set contents
- Data set resources
- Ground truth contents
- Ground truth resources

Data set contents
Goal: capture realistic video sequences with graduated difficulties (lighting variations, occlusions, …), covering predefined scenarios.

Video providers and main characteristics:
- INRETS: Building Entrance, Indoor/Outdoor, Multi-view
- CEA: Corridor + Road, Indoor/Outdoor, Visible/IR
- RATP: Metro, Indoor, Crowded environment
- i-LIDS: Road, Outdoor, Real street context
- Silogic: Aircraft parking zone, Multi-view

5 topics: Apron (AP), Building Corridor (BC), Building Entrance (BE), Metro (MO), Road (RD)

APRON: 2 points of view, different weather conditions

BUILDING CORRIDOR: visible and IR clips

BUILDING ENTRANCE: entrance and car park, 4 points of view

METRO: real subway scene

ROAD: visible and IR clips, real scene


Data set resources

Number of sequences:
           Set 1   Set 2   Total
  AP         5       5      10
  BC         3       3       6
  BE         3       3       6
  MO         4       5       9
  RD         4       5       9
  Total     19      21      40

Number of clips (x number of cameras):
           Set 1   Set 2   Total
  AP        14      10      24
  BC         6       6      12
  BE         9       9      18
  MO         4       5       9
  RD         8       7      15
  Total     41      37      78
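As a quick arithmetic check, the per-topic figures above do sum to the stated totals; a minimal sketch using the numbers from the tables:

```python
# Per-topic counts from the ETISEO data set tables above.
# AP = Apron, BC = Building Corridor, BE = Building Entrance,
# MO = Metro, RD = Road.

sequences = {
    "set1": {"AP": 5, "BC": 3, "BE": 3, "MO": 4, "RD": 4},
    "set2": {"AP": 5, "BC": 3, "BE": 3, "MO": 5, "RD": 5},
}

clips = {
    "set1": {"AP": 14, "BC": 6, "BE": 9, "MO": 4, "RD": 8},
    "set2": {"AP": 10, "BC": 6, "BE": 9, "MO": 5, "RD": 7},
}

def totals(table):
    """Sum each set's per-topic counts, then add a grand total."""
    per_set = {name: sum(counts.values()) for name, counts in table.items()}
    per_set["total"] = sum(per_set.values())
    return per_set

print(totals(sequences))  # {'set1': 19, 'set2': 21, 'total': 40}
print(totals(clips))      # {'set1': 41, 'set2': 37, 'total': 78}
```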

Sequence durations:
           Set 1         Set 2         Total
  AP       12 min 04 s   06 min 33 s   18 min 37 s
  BC       06 min 40 s   07 min 24 s   14 min 04 s
  BE       02 min 24 s   02 min 46 s   05 min 10 s
  MO       11 min 52 s   10 min 16 s   22 min 08 s
  RD       05 min 03 s   09 min 59 s   15 min 02 s
  Total    38 min 03 s   36 min 58 s   1 h 15 min 01 s

Total duration (all clips): 1 h 23 min 19 s (set 1), 1 h 00 min 02 s (set 2), 2 h 23 min 21 s overall

Mean duration per sequence:
           Set 1         Set 2
  AP       02 min 25 s   01 min 19 s
  BC       02 min 13 s   02 min 28 s
  BE       00 min 48 s   00 min 55 s
  MO       02 min 58 s   02 min 03 s
  RD       01 min 16 s   02 min 00 s
  Overall  1 min 56 s    1 min 45 s   (1 min 50 s across both sets)
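The duration totals above can be verified with standard-library timedeltas:

```python
from datetime import timedelta

# Per-set sequence durations from the table above.
set1 = timedelta(minutes=38, seconds=3)
set2 = timedelta(minutes=36, seconds=58)
assert set1 + set2 == timedelta(hours=1, minutes=15, seconds=1)

# Total clip durations over all cameras, also from the table.
total1 = timedelta(hours=1, minutes=23, seconds=19)
total2 = timedelta(hours=1, minutes=0, seconds=2)
print(total1 + total2)  # 2:23:21
```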

Participation (data set 1: 13 participants; data set 2: 16 participants):
  Priority sequences, set 1: 1 AP, 0 BC, 1 BE, 1 MO, 1 RD (92 %, …, 100 %)
  Priority sequences, set 2: 1 AP, 0 BC, 1 BE, 1 MO, 2 RD (94 %, …, 75 %, 94 %)
  Wished sequences, set 1: 1 AP, 1 BC, 1 BE, 2 MO, 0 RD (46 %, 15 %, 8 %, 0 %)
  Wished sequences, set 2: AP, 1 BC, 1 BE, 2 MO, 1 RD (25 %, 19 %, 22 %, 19 %)
  Others, set 1: AP, BC, BE, MO, RD (25 %, 0 %, 7 %, 0 %, 5 %)
  Others, set 2: AP, BC, BE, MO, RD (23 %, 15 %, 25 %, 22 %, 19 %)

Distributed material:
- 2 video data sets (test + evaluation)
- 2 x 2 DVDs provided
- Documentation (metrics, annotation rules, structures and formats)
- Context and calibration information
- Database (xls file)

Events of interest per topic:
- AP, vehicle movements: empty_area, inside_zone, enters_zone, stopped
- BC, person and object interactions: opens/closes (door), picks_up/puts_down, abandoned_baggage, changes_zone, running
- BE, person, vehicle and object interactions (complex scene): opens/closes (door), gets_in/gets_out, go_down/up_stairs, door_control, inside_zone, enters/exits_zone, changes_zone
- MO, abandoned baggage in a crowded environment: waiting, picks_up/puts_down, abandoned_baggage, people counting, exchange_objects, (ticket machine)
- RD, person, vehicle and object interactions (complex scene): opens/closes (door), gets_(in/on)/gets_(out/of), biking, crossing, walking_with, overtaking
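For illustration, an event of the kind listed above can be thought of as a typed record spanning a frame interval; the field names below are assumptions for the sketch, not the actual ETISEO annotation format:

```python
from dataclasses import dataclass, field

# Hypothetical event record; not the real ETISEO schema.
@dataclass
class EventAnnotation:
    topic: str            # "AP", "BC", "BE", "MO" or "RD"
    event_type: str       # e.g. "abandoned_baggage", "enters_zone"
    start_frame: int      # first frame where the event holds
    end_frame: int        # last frame where the event holds
    actors: list = field(default_factory=list)  # involved physical objects

ev = EventAnnotation("MO", "abandoned_baggage", 120, 480,
                     ["person_3", "bag_1"])
print(f"{ev.event_type}: frames {ev.start_frame}-{ev.end_frame}")
# abandoned_baggage: frames 120-480
```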


Ground truth contents
- Context information
- Annotation of physical objects of interest and events
- Annotation rules provided to participants in a document and updated during the project
- Examples…
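Annotations of physical objects in video ground truth are typically per-frame bounding boxes; a minimal sketch, with hypothetical field names rather than the ETISEO structure defined in the project's annotation documents:

```python
from dataclasses import dataclass

# Hypothetical per-frame object annotation (illustration only).
@dataclass(frozen=True)
class ObjectAnnotation:
    frame: int
    object_id: int
    object_class: str    # e.g. "person", "vehicle", "baggage"
    bbox: tuple          # (x, y, width, height) in pixels

ground_truth = [
    ObjectAnnotation(0, 1, "person", (102, 44, 38, 110)),
    ObjectAnnotation(1, 1, "person", (104, 45, 38, 110)),
]

# Collect the trajectory (box centres) of object 1 across frames.
centres = [(a.bbox[0] + a.bbox[2] / 2, a.bbox[1] + a.bbox[3] / 2)
           for a in ground_truth if a.object_id == 1]
print(centres)  # [(121.0, 99.0), (123.0, 100.0)]
```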


Ground truth resources

Number of ground truth files:
           Set 1   Set 2   Total
  AP        14      10      24
  BC         3       3       6
  BE         9       9      18
  MO         4       5       9
  RD         4       5       9
  Total     34      32      66

Sequence time: 1 h 11 min 36 s (… frames) for set 1, 48 min 31 s (… frames) for set 2, 2 h 00 min 07 s (… frames) in total

Conclusion & Feedback

Difficulties encountered:
- Definition of events
- Annotation subjectivity

Propositions for other evaluation projects:
- Produce the ground truth before distributing the data set: this makes it easier to identify difficulties that participants may encounter, and to add explanations and specific annotation rules.
- Release the ground truth between the submission deadline and the processing of results, so that it can be corrected before evaluation -> time consuming.
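One common way to reduce the impact of annotation subjectivity on event evaluation is to match detected events to annotated ones by temporal overlap rather than by exact frame boundaries; a minimal sketch (the greedy matching and the 0.5 threshold are assumptions for illustration, not an ETISEO metric):

```python
def temporal_iou(a, b):
    """Intersection-over-union of two (start, end) frame intervals."""
    inter = max(0, min(a[1], b[1]) - max(a[0], b[0]))
    union = (a[1] - a[0]) + (b[1] - b[0]) - inter
    return inter / union if union else 0.0

def match_events(detected, ground_truth, threshold=0.5):
    """Greedy one-to-one matching; returns (true positives, precision, recall)."""
    unmatched = list(ground_truth)
    tp = 0
    for d in detected:
        best = max(unmatched, key=lambda g: temporal_iou(d, g), default=None)
        if best is not None and temporal_iou(d, best) >= threshold:
            tp += 1
            unmatched.remove(best)
    precision = tp / len(detected) if detected else 0.0
    recall = tp / len(ground_truth) if ground_truth else 0.0
    return tp, precision, recall

gt = [(100, 200), (300, 400)]    # annotated event intervals
det = [(110, 210), (500, 600)]   # detector output
print(match_events(det, gt))     # (1, 0.5, 0.5)
```

With a tolerant overlap threshold, slightly shifted annotation boundaries no longer change whether an event counts as detected.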