Presentation on theme: "Big Data and IOT Laboratory by Dr. Dan Feldman. Asaf Slilat & Omri Cohen Fly With Me Zone 1 Zone 2 Target Anchor Point – Odroid U3 – PS Eye Camera –"— Presentation transcript:

1 Big Data and IOT Laboratory by Dr. Dan Feldman

2 Fly With Me, by Asaf Slilat & Omri Cohen [Title diagram labeling Zone 1, Zone 2, the target, an anchor point, the Odroid U3 board, and the PS Eye camera.]

3 Main [System diagram: People Tracking (OpenCV) in Zone 1 and in Zone 2 each report X, Y, Z coordinates to the Server; the Server forwards the mode and coordinates to the helicopter.]

4 Server Code
Zone: holds the information about each camera (Sony PS Eye camera).
World: a linked list of Zones, owned by the Control class.
The server runs in one of four modes:
1) Free mode: capturing only, no tracking.
2) Tracking mode: people tracking is active; returns the person's coordinates, converted to the quad-copter's world units.
3) Learning mode: runs before the other modes and detects noise and misdetections in the scene.
4) FlyWithMe mode: combines learning and tracking.
Output: the person's coordinates (x, y, z), sent as a TCP message to the quad-copter's system.
Functionality: the server decides which zone is active and sends the right coordinates to the quad-copter.
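
A minimal C++ sketch of the zone-selection loop and TCP output described on this slide. The mode names, the Zone::track() stub, and the plain "x y z" message format are assumptions for illustration, not the authors' actual server code.

```cpp
// Sketch only: first zone that sees a person becomes the active zone,
// and its (already converted) coordinates are sent to the quad-copter.
#include <list>
#include <optional>
#include <sstream>
#include <string>
#include <sys/socket.h>

struct Point3D { double x, y, z; };

enum class Mode { Free, Tracking, Learning, FlyWithMe };

struct Zone {
    int id;
    // Placeholder: the real implementation runs the Tracking class on
    // this zone's PS Eye frame and converts to quad-copter world units.
    std::optional<Point3D> track() { return std::nullopt; }
};

class Control {
public:
    std::list<Zone> world;   // "World": linked list of Zones
    Mode mode = Mode::Free;

    // One iteration: pick the active zone and forward its coordinates
    // over an already-open TCP socket to the quad-copter's system.
    void step(int quadSocket) {
        if (mode == Mode::Free || mode == Mode::Learning) return;
        for (Zone& z : world) {
            if (auto p = z.track()) {            // first detecting zone wins
                std::ostringstream msg;
                msg << p->x << ' ' << p->y << ' ' << p->z << '\n';
                std::string s = msg.str();
                send(quadSocket, s.c_str(), s.size(), 0);
                break;
            }
        }
    }
};
```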

5 Zone Class
Each camera is defined as an autonomous zone. The class holds the properties of each zone:
1) Zone limits: used to discard unwanted detections and noise (concentrate on a given area).
2) Bad points: a data structure holding all bad points detected in the specific zone.
3) Anchors: a list of key points in the zone, used as mark points for the quad-copter (generally 3-4 points).
Output: the closest anchor to the person.
Functionality: calculates the closest anchor to the person, 1) after de-noising (according to the bad-points database) and 2) after averaging the person's last coordinates (for better results).
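
A rough C++ sketch of the closest-anchor computation this slide describes. The noise-radius threshold, the averaging window size, and the container choices are assumptions.

```cpp
// Sketch: reject detections near known bad points, smooth the last few
// person positions, then return the index of the closest anchor.
#include <cmath>
#include <deque>
#include <vector>

struct Point3D { double x, y, z; };

static double dist(const Point3D& a, const Point3D& b) {
    return std::sqrt((a.x - b.x) * (a.x - b.x) +
                     (a.y - b.y) * (a.y - b.y) +
                     (a.z - b.z) * (a.z - b.z));
}

class Zone {
public:
    std::vector<Point3D> badPoints;  // misdetections collected in Learning mode
    std::vector<Point3D> anchors;    // generally 3-4 key points per zone

    // Returns the closest anchor to the (smoothed) person position,
    // or -1 if the detection is rejected as noise.
    int closestAnchor(const Point3D& detection) {
        for (const auto& bad : badPoints)
            if (dist(detection, bad) < kNoiseRadius) return -1;   // de-noise

        history.push_back(detection);            // average the last N samples
        if (history.size() > kWindow) history.pop_front();
        Point3D avg{0, 0, 0};
        for (const auto& p : history) { avg.x += p.x; avg.y += p.y; avg.z += p.z; }
        avg.x /= history.size(); avg.y /= history.size(); avg.z /= history.size();

        int best = -1;
        double bestD = 1e18;
        for (size_t i = 0; i < anchors.size(); ++i) {
            double d = dist(avg, anchors[i]);
            if (d < bestD) { bestD = d; best = static_cast<int>(i); }
        }
        return best;
    }

private:
    std::deque<Point3D> history;
    static constexpr double kNoiseRadius = 0.3;  // assumed threshold (world units)
    static constexpr size_t kWindow = 5;         // assumed averaging window
};
```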

6 Tracking Class
Input: a single frame (an OpenCV 'Mat' structure).
Output: the coordinates of the detected person (lower-left edge); a square is also drawn around the detected person on the frame.
Based on OpenCV library sample code: http://goo.gl/gbeKAT
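
The linked sample is not reproduced here; assuming it is OpenCV's standard HOG pedestrian detector, the tracking step might look roughly as follows. The function name and the choice of the first detection are illustrative.

```cpp
// Sketch: detect a person in the frame with the default HOG people
// detector, draw a rectangle on the frame, return the lower-left corner.
#include <opencv2/imgproc.hpp>
#include <opencv2/objdetect.hpp>
#include <vector>

cv::Point trackPerson(cv::Mat& frame) {
    static cv::HOGDescriptor hog;
    static bool initialized = false;
    if (!initialized) {
        hog.setSVMDetector(cv::HOGDescriptor::getDefaultPeopleDetector());
        initialized = true;
    }

    std::vector<cv::Rect> people;
    hog.detectMultiScale(frame, people);
    if (people.empty()) return cv::Point(-1, -1);        // nothing detected

    cv::Rect r = people.front();                          // take the first detection
    cv::rectangle(frame, r, cv::Scalar(0, 255, 0), 2);    // draw on the frame
    return cv::Point(r.x, r.y + r.height);                // lower-left edge
}
```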

7 Other Classes
point3D: manages the structure and functionality of a 3D point.
Anchor: holds a pre-defined key point of a zone in two sets of coordinates: 1) our system's units and 2) the quad-copter's system units (after conversion). Each anchor also holds a field naming the next camera, which tells the server in real time which camera becomes active.
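
One possible shape of these two helpers, shown as plain C++ structs; the field names and the next-camera representation are assumptions based on the slide text.

```cpp
// Illustrative data layout for the point3D and Anchor classes.
struct Point3D {
    double x = 0, y = 0, z = 0;
};

struct Anchor {
    Point3D local;      // coordinates in our (camera) system's units
    Point3D quadWorld;  // the same point converted to the quad-copter's units
    int nextCamera;     // tells the server, in real time, which camera
                        // (zone) becomes active when this anchor is reached
};
```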

8 First Stage Demonstration
Video 1: the quad-copter follows the person.
Video 2: the quad-copter leads the person to the board at the far side of the room.
Pictures: laboratory, accessories and more…

9 While Working… [Photos of the working setup: Zone 1, Zone 2 and the overall setup.]

