Towards real-time camera-based logo detection
Mathieu Delalandre, Laboratory of Computer Science, RFAI group, Tours, France
Osaka Prefecture partnership meeting, Tours, France, Friday 9th of September

Towards real-time camera-based logo detection
1. Introduction
2. Device synchronization for 3D frame tagging
3. Frame partitioning and selection

Towards real-time camera-based logo detection "Introduction" (1)
Logo detection from video capture, using handheld interaction, to display context-based information (tourist check points, bus stops, meals, etc.). This constitutes a hard computer vision application, due to the complexity of the recognition task and the real-time constraints. To support real time, two basic paths can be considered:
1. Reduce the complexity of the algorithms
2. Reduce the amount of data to process
[Figure: processing pipeline Camera → Frames → Selection → Pattern Recognition]

Towards real-time camera-based logo detection "Introduction" (2)
Static object: without motion, and so without appearance modification. Dynamic object: with motion, and so with appearance modification.
"With static objects, one capture (in time and space) can be enough for recognition, if recognition is perspective, scale and rotation invariant and if no occlusions appear."
"The capture instant can be detected if the embedded system can track its own positioning, and if objects are static."
"A self-tracking embedded system can therefore be set up for a single capture of static objects. It can support real-time recognition by reducing the amount of data to process, without a miss case (i.e. at least one capture is obtained)."
[Figure: a static object and the camera positions at times t0, t1, t2]

Towards real-time camera-based logo detection
1. Introduction
2. Device synchronization for 3D frame tagging
3. Frame partitioning and selection

Towards real-time camera-based logo detection "Device synchronization for 3D frame tagging" (1)
Camera device, to capture images. Accelerometer device, which measures proper acceleration. Gyroscope device, for measuring or maintaining orientation. The combination of these devices allows frames to be tagged in 3D space.
[Figure: a frame with its coordinates x, y, z, the embedded system orientation, the embedded system positioning (from the root) and the distance d to the frame]
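As an illustration only, a minimal sketch of what such a 3D-tagged frame could look like as a data structure; the field names, the pose representation and the corner projection are assumptions for this sketch, not taken from the slides.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class TaggedFrame:
    """A camera frame tagged with the pose of the embedded system.

    Fields are illustrative: the slides only state that frames are tagged
    with coordinates, orientation and positioning in 3D space.
    """
    image: np.ndarray        # H x W x 3 pixel array from the camera
    timestamp: float         # capture time, in seconds
    position: np.ndarray     # (x, y, z) of the system, integrated from the accelerometer
    orientation: np.ndarray  # 3x3 rotation matrix from the gyroscope

    def corners_3d(self, width_m: float, height_m: float, distance_m: float) -> np.ndarray:
        """Project the four frame corners into world coordinates, assuming a
        planar frame of physical size width_m x height_m lying at distance_m
        along the optical axis (an assumption for illustration)."""
        w, h = width_m / 2.0, height_m / 2.0
        local = np.array([[-w, -h, distance_m],
                          [ w, -h, distance_m],
                          [ w,  h, distance_m],
                          [-w,  h, distance_m]])
        return (self.orientation @ local.T).T + self.position
```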

Towards real-time camera-based logo detection "Device synchronization for 3D frame tagging" (2)
Most commercial wearable systems (e.g. smartphones) can support frame tagging, but the multimodality is designed in a separate way, not as a combination of these modalities. The device synchronization is not done at the hardware level, and must be achieved at the operating system level. How to do it?
[Figure: polling exchange with a device (accelerometer, gyroscope): device controller, CPU and memory linked by data and control paths; DMA exchange with a device (camera): device controller, DMA and memory linked by data and control paths, with CPU control and interrupt]
The offset Δt between the real-life event (t_E) and the memory writing (t_w) depends on the device, considering:
- the acquisition delay of the device
- the data transfer time on the bus
- the execution time of the control instructions
- the interrupt execution time
- etc.
Its value is an estimation; it depends on:
- the mean bus access rate
- operating system scheduling and interrupt queuing
- etc.
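As a rough illustration of the Δt discussion above, a sketch of compensating a memory-write timestamp to estimate the real-life event time; the per-device latency figures are invented placeholders, not measurements from the slides.

```python
# Invented per-device offsets Δt (in seconds) between the real-life event t_E
# and the memory writing t_w; placeholder values only.
DEVICE_LATENCY_S = {
    "accelerometer": 0.0008,  # polling: acquisition delay + bus transfer + control instructions
    "gyroscope":     0.0008,
    "camera":        0.0150,  # DMA: readout + bus access rate + interrupt queuing (estimated)
}

def estimate_event_time(t_w: float, device: str) -> float:
    """Estimate the real-life event time t_E from the memory-writing time t_w,
    using an estimated per-device offset: t_E ≈ t_w - Δt."""
    return t_w - DEVICE_LATENCY_S[device]
```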

Towards real-time camera-based logo detection "Device synchronization for 3D frame tagging" (3)
Notations:
- D_0: the "root", interrupt-based device; every other device will synchronize itself with it
- D_1: the device to be synchronized with the root device
- T_i0: the "coarse" timer, in charge of the "root" device at level 0
- T_0: period of timer 0
- T_i1: the "finer" timer, in charge of the device to synchronize at level 1
- T_1: period of timer 1, defined from L_1 the frame length for T_i1, N the wished synchronization precision and δ the bounded parameter
- I_0: the first interrupt time
e.g. at I_0, run T_i0; every T_0, run T_i1.
Synchronization will be done using a two-timer framework:
- The "coarse" timer will be scheduled on the root device.
- The "finer" timer will be used within an "upstream" frame, opened before the next "coarse" timer period. It allows catching the events of the device to be synchronized.
[Figure: timeline with the coarse-timer interrupts at I_0, I_0 + T_0, I_0 + 2T_0, the finer timer T_i1 running over a frame of length L_1, and the device events t_E0, t_E1]
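The two-timer framework can be pictured with a small scheduling sketch. All numeric values (camera rate, precision N, guard δ) and the derivation T_1 = L_1 / N are assumptions for illustration, not taken from the slides.

```python
# Assumed placeholder values; only the structure (coarse timer on the root
# device, fine timer inside an "upstream" frame opened before the next coarse
# period) comes from the slides.
T0 = 1.0 / 30        # coarse period: one camera frame at an assumed 30 fps
N = 100              # wished synchronization precision (assumed)
delta = 0.001        # bounded guard parameter (assumed)
L1 = 0.2 * T0        # length of the "upstream" fine-timer frame (assumed)
T1 = L1 / N          # fine-timer period; one assumed way to derive it

def coarse_schedule(I0: float, n_periods: int):
    """Opening time of the fine-timer frame for each coarse period.

    The coarse timer Ti0 fires at I0 + k*T0; the fine timer Ti1 runs inside an
    upstream frame of length L1 opened before the next coarse interrupt,
    i.e. over [I0 + (k+1)*T0 - L1 - delta, I0 + (k+1)*T0 - delta].
    """
    return [I0 + (k + 1) * T0 - L1 - delta for k in range(n_periods)]

def fine_ticks(frame_start: float):
    """Tick times of the fine timer Ti1 inside one upstream frame."""
    return [frame_start + k * T1 for k in range(N)]
```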

Towards real-time camera-based logo detection "Device synchronization for 3D frame tagging" (4)
[Figure: timeline of the finer timer T_i1 starting at s = I_0 + T_0, with ticks at s + kT_1 (k = 1, 2, 3, …) and the device interrupt I_1 falling between two ticks]
General synchronization algorithm of the T_i1 timer:
- k = 0
- every T_1 period, k = k + 1
- when the device interrupt I_1 occurs, timestamp it at the current tick time s + kT_1
(Notations as in the previous slide.)
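A minimal sketch of this T_i1 algorithm: the tick counter stands in for the real timer interrupt handler, and the caught interrupt is simply tagged with the latest tick time s + k·T_1. The numeric values in the usage are placeholders.

```python
class FineTimer:
    def __init__(self, s: float, T1: float):
        self.s = s    # start of the fine-timer frame, s = I0 + T0
        self.T1 = T1  # fine-timer period
        self.k = 0    # tick counter, incremented every T1 period

    def tick(self):
        """Called every T1 period (the 'every T1 period, k = k + 1' step)."""
        self.k += 1

    def tag_interrupt(self) -> float:
        """Called when the device interrupt I1 occurs: return the estimated
        event time s + k*T1 based on the current tick count."""
        return self.s + self.k * self.T1

# Usage sketch (placeholder values, not from the slides):
timer = FineTimer(s=0.010, T1=0.0001)
for _ in range(37):              # 37 fine ticks have elapsed when I1 arrives
    timer.tick()
print(timer.tag_interrupt())     # -> 0.010 + 37 * 0.0001 = 0.0137
```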

Towards real-time camera-based logo detection
1. Introduction
2. Device synchronization for 3D frame tagging
3. Frame partitioning and selection

Towards real-time camera-based logo detection "Frame partitioning and selection" (1)
Device synchronization can support 3D image tagging. The open problems now are how to detect the overlapping between frames, how to achieve the frame selection in case of overlapping, and how to access the obtained partition.
Notations: the set of frames {F_1, …, F_n}; the intersection polygons and set of regions {P_1, …, P_m} obtained from the overlapping; and the selection method.
[Figure: frames F1 to F4 overlapping, producing the partition regions P1 to P7; each frame carries its coordinates x, y, z, orientation, positioning and distance d]

Towards real-time camera-based logo detection "Frame partitioning and selection" (2)
To detect the overlapping, frames can be projected onto a plane P; the overlaps can then be computed with line-intersection and closed-polygon detection algorithms, at complexity k·O(n·log(n)). To do it, it is necessary to fix the position of P in 3D space and to define an updating protocol. P can be obtained by averaging the positioning of the frames. Updating this positioning is not necessary at every frame capture, only when important differences start to appear between the current plane and recent frame captures.
[Figure/example: at t1, plane D1 is computed from the current frames; at t2, the difference between D1 and D2 (corresponding to recent frame captures) is too important, so D1 is shifted to D3]
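A sketch of how the plane could be obtained by averaging frame positioning, and of one possible updating rule; both the plane fit and the angular threshold are assumptions for illustration, not the authors' exact protocol.

```python
import numpy as np

def mean_plane(centers: np.ndarray, normals: np.ndarray):
    """Fit a plane from frame poses: point = mean of the frame centers,
    normal = normalized mean of the frame (optical-axis) normals."""
    point = centers.mean(axis=0)
    n = normals.mean(axis=0)
    return point, n / np.linalg.norm(n)

def needs_update(plane_normal: np.ndarray, recent_normals: np.ndarray,
                 max_angle_deg: float = 15.0) -> bool:
    """Decide whether the plane should be recomputed: here, when the mean
    normal of recent frame captures deviates from the current plane normal
    by more than an (assumed) angular threshold."""
    recent = recent_normals.mean(axis=0)
    recent = recent / np.linalg.norm(recent)
    cos_angle = float(np.clip(np.dot(plane_normal, recent), -1.0, 1.0))
    return np.degrees(np.arccos(cos_angle)) > max_angle_deg
```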

Towards real-time camera-based logo detection "Frame partitioning and selection" (3)
Once overlaps are detected, at every overlap a region (coming from the overlapping frames) must be selected using a selection method. This selection can be done, for example, using a spatial criterion based on the distances d1, d2 of the region to c1, c2, the projected gravity centers of the frames.
[Figure: frames F1, F2 with overlap regions P1, P2, P3, and the distances d1, d2 to the projected gravity centers c1, c2]
Video frame processing is a producer/consumer synchronization problem, where the producer (i.e. the frame capture) is blocked on the memory constraint, and the consumer (i.e. the image processing) is blocked when the frame stack is empty. Here, we are working "up" to the frame level with the partition object. Intelligent access must be driven with a RAG (Region Adjacency Graph) structure and graph coloring techniques.
[Figure: RAG over regions R1, R2, R3 of frame F1 and R4, R5 of frame F2; regions adjacent across the F1/F2 side are to be processed together]
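One possible reading of the spatial criterion, sketched below: assign each overlap region to the frame whose projected gravity center c_i is closest (smallest distance d_i). This nearest-center rule is an assumption; the slides only name the criterion.

```python
import numpy as np

def select_frame_for_region(region_center: np.ndarray, frame_centers: dict) -> str:
    """Given the 2D center of an overlap region and the projected gravity
    centers c_i of the overlapping frames (all in the projection plane),
    return the id of the frame with the smallest distance d_i."""
    distances = {fid: float(np.linalg.norm(region_center - c))
                 for fid, c in frame_centers.items()}
    return min(distances, key=distances.get)

# Usage sketch with placeholder coordinates:
frame_centers = {"F1": np.array([0.0, 0.0]), "F2": np.array([4.0, 0.0])}
print(select_frame_for_region(np.array([1.0, 0.5]), frame_centers))  # -> "F1"
```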

Towards real-time camera-based logo detection
1. Introduction
2. Device synchronization for 3D frame tagging
3. Frame partitioning and selection