PETS’05, Beijing, October 16th 2005
ETISEO Project: Ground Truth & Video Annotation
ETISEO - GT
For each video sequence, the evaluator generates the corresponding GT using the ViPER-GT annotation tool (U. Maryland).
Video sequences are characterized by:
- Day time: day, sunrise, nightfall, night
- Weather conditions: sun, cloud, rain, fog, snow
- Illumination variations: none, slow or fast
- Shadows: no shadows, weak or contrasted shadow
- …
enabling a graduation of difficulty from easy to hard.
Ground Truth
Physical object annotations include:
- Bounding box
- Type of the object: Person, Vehicle, Group, …
- Sub-type: e.g. Car, Truck or Loader for a vehicle
- States: static, occluded, …
Event annotations include:
- Event type (ontology)
- Starting time (frame)
- Ending time (frame)
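The annotation categories above can be sketched as simple data structures. This is only an illustrative model: the real ViPER-GT ground truth is stored as XML with its own descriptor names, and the field and type names below are assumptions, not the ETISEO schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical field names; the ViPER-GT XML schema defines its own
# descriptors. This sketch only mirrors the categories listed above.

@dataclass
class PhysicalObject:
    object_id: int
    object_type: str          # e.g. "Person", "Vehicle", "Group"
    sub_type: Optional[str]   # e.g. "Car", "Truck", "Loader"
    bbox: tuple               # (x, y, width, height) in pixels
    states: List[str] = field(default_factory=list)  # e.g. ["static", "occluded"]

@dataclass
class Event:
    event_type: str           # taken from the event ontology
    start_frame: int
    end_frame: int
    objects: List[int] = field(default_factory=list)  # IDs of involved objects

car = PhysicalObject(1, "Vehicle", "Car", (120, 80, 60, 40), ["static"])
stop = Event("vehicle_stops", 150, 300, objects=[car.object_id])
print(stop.end_frame - stop.start_frame)  # event duration in frames → 150
```

Annotating both per-frame object attributes and frame-interval events in one file is what lets the same GT serve detection, tracking and event-recognition tasks.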
Contextual Information
Camera calibration:
- Set of 2D and 3D points
- Calibration matrix (with or without distortion)
Contextual information associated with the videos:
- 3D empty scene model
Minimum information: a few 3D distances drawn on the image
Maximum information: a 3D scene model made available
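A calibration matrix of the kind distributed with the videos maps 3D scene points onto the image plane. A minimal sketch of that projection, ignoring distortion; the matrix values are made up for illustration, not taken from any ETISEO camera:

```python
# A 3x4 projection matrix P maps a homogeneous 3D point (x, y, z, 1)
# to homogeneous image coordinates (u*w, v*w, w).

def project(P, point3d):
    """Project a 3D point to 2D pixel coordinates with matrix P."""
    x, y, z = point3d
    X = (x, y, z, 1.0)
    u, v, w = (sum(P[r][c] * X[c] for c in range(4)) for r in range(3))
    return u / w, v / w

# Illustrative matrix: focal length 500, principal point (320, 240):
P = [
    [500.0,   0.0, 320.0, 0.0],
    [  0.0, 500.0, 240.0, 0.0],
    [  0.0,   0.0,   1.0, 0.0],
]
print(project(P, (1.0, 0.5, 5.0)))  # → (420.0, 290.0)
```

The 2D/3D point pairs in the contextual data are what allow participants to estimate such a matrix when only correspondences, not the matrix itself, are supplied.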
ETISEO Project: Tasks & Metrics
Tasks Evaluated
GT and metrics are designed to evaluate tasks all along the video processing chain:
- Task 1: Detection of physical objects
- Task 2: Localisation of physical objects
- Task 3: Classification of physical objects
- Task 4: Tracking of physical objects
- Task 5: Event recognition
ETISEO Metrics
A variety of metrics is proposed for a detailed analysis of the algorithms:
- Based on quantitative evaluations
- Applied on a large diversity of sequences
Working documents are available on www.etiseo.net:
- Metrics definition
- Video annotation rules
ETISEO Criteria
The following criteria are applied to qualify algorithms:
- Number of correctly detected objects in each frame
- Precision of 2D localisation (centroid, bounding box)
- Object fragmentation (splitting & merging)
- Tracking persistence under partial occlusion (static or dynamic)
- Persistence of object IDs across the video
- Object classification & object recognition
- Number of recognised events in the video sequences
- Correct scenario recognition in time (starting & ending time)
- Precision of 3D trajectories
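The first criterion, counting correctly detected objects per frame, needs a matching rule between detected and GT boxes. A hedged sketch of one common choice, overlap (intersection over union) above a threshold; ETISEO's exact rule is defined in the metrics document on www.etiseo.net, so this is only illustrative:

```python
def iou(a, b):
    """Intersection-over-union of two (x, y, w, h) boxes."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))
    iy = max(0, min(ay + ah, by + bh) - max(ay, by))
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union else 0.0

def count_correct(detections, ground_truth, threshold=0.5):
    """Number of GT objects matched by at least one detection."""
    return sum(
        any(iou(d, g) >= threshold for d in detections)
        for g in ground_truth
    )

gt = [(10, 10, 20, 20)]       # one GT object in this frame
det = [(12, 12, 20, 20)]      # one detection, slightly offset
print(count_correct(det, gt))  # → 1
```

Counting matched GT boxes per frame gives the raw numbers from which precision- and recall-style scores for the detection task can be derived.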
Partners’ Results
Partners must submit their results:
- in due time,
- in an XML-compatible format.
Participants answer a questionnaire associated with the algorithm results (processing time, learning phase, parameter data set, mask, calibration…).
The evaluator provides the necessary utilities to allow participants to exploit existing software compatible with the GT.
Basic Evaluation Process
- Participant registration
- Request corpus data through the ETISEO web-site form
- Accept the limited-usage terms for the videos
- Receive the video corpus
- Run the tests on their own
- Send the results back to the evaluator
- The evaluator computes the comparison with the GT
- Metric values are sent back to the participant
3 Phases
- Definition & Preparation, 2005:
  - Metrics definition, tool creation
  - Video sequence recording
- Test cycle for validation, 1st semester 2006: realisation of an entire evaluation cycle with participants on the test data set.
- ETISEO evaluation cycle, 2nd semester 2006: final ETISEO evaluation cycle.
3 Seminars
- 1st seminar: 10-11 May 2005, Nice, France:
  - Definition process & resource generation
- 2nd seminar: INRETS, 15-16 December, Villeneuve d’Ascq, France:
  - Resource distribution
  - Start of the test cycle with participants
- 3rd seminar: INRIA, end of 2006, Sophia-Antipolis, France:
  - Communication of the evaluation results
  - Analysis of the new knowledge
  - End of the ETISEO project
ETISEO Project: Results & Dissemination
Participants’ Results
- Results are transmitted by the evaluator:
  - to each participant,
  - to INRIA, the scientific leader.
- During the project, results are not disclosed by the evaluator.
- Results will be published only with the participants’ agreement.
ETISEO Dissemination
- The new knowledge on the evaluation process, and the pertinence of the video dataset, the annotation and the metrics, will be discussed.
- The assembled results will be presented during the last ETISEO seminar (end of 2006).
- ETISEO resources & tools will be released, contributing to the spread of good evaluation practices.
Project coordinator: Mr. David CHER
www.silogic.fr
www.etiseo.net
david.cher@silogic.fr
Phone: +33 (0)5 34 61 93 57

Scientific leader: Mr. François BREMOND
www-sop.inria.fr/orion
francois.bremond@sophia.inria.fr
Phone: +33 (0)4 92 38 76 59