ETISEO Evaluation Nice, May 10-11th 2005 Evaluation Cycles



Basic evaluation process
- Participant registration,
- Request for corpus data through the ETISEO web-site form,
- Acceptance of the limited-usage terms for the videos,
- Reception of the video corpus,
- Participants run the tests on their own,
- Results are sent back to the evaluator,
- The evaluator performs the comparison with the ground truth,
- Metric values are sent back to the participant.
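The evaluator's comparison step above can be sketched in code. This is a minimal illustration only: it matches participant detections against ground-truth boxes frame by frame and returns precision/recall values. The data layout, the IoU matching rule and the 0.5 threshold are assumptions for illustration, not the actual ETISEO metric definitions.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def compare(results, ground_truth, threshold=0.5):
    """results / ground_truth: {frame_number: [box, ...]}.
    Greedily match each detection to an unmatched ground-truth box,
    then derive precision and recall from the match counts."""
    tp = fp = fn = 0
    for frame, gt_boxes in ground_truth.items():
        unmatched = list(gt_boxes)
        for det in results.get(frame, []):
            match = next((g for g in unmatched if iou(det, g) >= threshold), None)
            if match is not None:
                unmatched.remove(match)
                tp += 1
            else:
                fp += 1
        fn += len(unmatched)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return {"precision": precision, "recall": recall}
```

In the real cycle only the evaluator runs this comparison; participants see the resulting metric values, not the ground truth itself.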

3 Phases The ETISEO project consists of 3 main phases:
- Definition & Preparation (2005): metrics definition, tools creation, video sequence recording,
- Test cycle for validation (1st semester 2006): realisation of an entire evaluation cycle with participants on the test data set,
- ETISEO evaluation cycle (2nd semester 2006): final ETISEO evaluation cycle.

3 Seminars
- 1st seminar: Nice, May 2005 - community participation, definition process, start of corpus creation,
- 2nd seminar: INRETS, end of 2005 - resources distribution, start of the test cycle with participants,
- 3rd seminar: INRIA, end of 2006 - communication of evaluation results, analysis of the new knowledge, closing of the ETISEO project.
… Launch evaluation process …

Data set Three data sets are planned. 1- Work data set:
- Representative of the various sequences (but non-exhaustive),
- Distributed at the beginning of the collecting phase,
- Intended to give participants the maximum amount of time to run their algorithms on the ETISEO corpus,
- Not used in the evaluation process.

Data set 2- Test data set:
- Representative of the next evaluation data set,
- Sequences illustrating several cases predefined in the evaluation process,
- Distributed to the participants for the test cycle: validation of the evaluation cycle,
- Associated ground truth,
- Validation of the metrics and of the automatic comparison tool,
- Complete « test evaluation cycle »,
- Comparison of evaluation results.

Data set 3- Evaluation data set:
- Exhaustive representation of the selected topics,
- Significant for statistical evaluation,
- Contents not used in the previous test cycle,
- Created to assess the performance of participant algorithms: the ETISEO evaluation.

Evaluation Cycles 3 Seminars 3 data sets 2 Evaluation Cycles

Video processing tasks The following tasks are considered along the whole video processing chain:
- Task 1: Detection of physical objects of interest,
- Task 2: Classification of physical objects of interest,
- Task 3: Tracking of physical objects of interest,
- Task 4: Event recognition.
The video data set, ground truth and metrics should make it possible to assess each processing task separately and to extract a detailed analysis of the algorithms. Participants will evaluate the tasks corresponding to their usual research topics.
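To make the per-task idea concrete, here is a hypothetical scoring sketch for Task 4 (event recognition): a recognized event counts as correct when it carries the same label as a ground-truth event and their frame intervals overlap. The event tuple layout (label, start_frame, end_frame) and the overlap criterion are illustrative assumptions, not the ETISEO result format.

```python
def overlaps(a, b):
    """True if two (start, end) frame intervals share at least one frame."""
    return a[0] <= b[1] and b[0] <= a[1]

def score_events(recognized, ground_truth):
    """Each event is (label, start_frame, end_frame). Returns
    (hits, total): ground-truth events matched at most once each
    by a recognized event with the same label and overlapping frames."""
    remaining = list(recognized)
    hits = 0
    for label, start, end in ground_truth:
        match = next(
            (e for e in remaining
             if e[0] == label and overlaps((start, end), (e[1], e[2]))),
            None,
        )
        if match is not None:
            remaining.remove(match)
            hits += 1
    return hits, len(ground_truth)

# Example: one of two ground-truth events is recognized.
gt = [("enter_zone", 10, 40), ("leave_bag", 100, 130)]
rec = [("enter_zone", 15, 35)]
```

Detection, classification and tracking (Tasks 1-3) would each need their own matching rules, which is exactly why separate metrics per task are defined.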

To Be Defined Several elements have to be defined, during the 1st Seminar and afterwards:
- video data set contents & format,
- suitable and realistic ground truth,
- metrics offering optimal added value,
- classification of participants' algorithms,
- results and ground-truth formats,
- questionnaire associated with the algorithm results,
- results communication process.
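Since the ground-truth and result formats are among the items still to be defined, the XML layout below is purely a hypothetical illustration of the kind of per-frame annotation such a format might carry, together with a loader for it; every tag and attribute name here is an assumption.

```python
import xml.etree.ElementTree as ET

# Hypothetical ground-truth snippet: one annotated object in one frame.
SAMPLE = """
<groundtruth video="seq01">
  <frame number="12">
    <object id="3" type="person" x1="40" y1="60" x2="90" y2="180"/>
  </frame>
</groundtruth>
"""

def load_annotations(xml_text):
    """Parse the hypothetical format into {frame: [(id, type, box), ...]},
    where box is an (x1, y1, x2, y2) tuple of pixel coordinates."""
    root = ET.fromstring(xml_text)
    frames = {}
    for frame in root.iter("frame"):
        n = int(frame.get("number"))
        objs = []
        for obj in frame.iter("object"):
            box = tuple(int(obj.get(k)) for k in ("x1", "y1", "x2", "y2"))
            objs.append((int(obj.get("id")), obj.get("type"), box))
        frames[n] = objs
    return frames
```

Agreeing on such a shared format is what makes the evaluator's automatic comparison with participant results possible across different teams' tools.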

Web Site