Contact / website: https://www.uantwerpen.be/op3mech/
TETRA project: Smart Data Clouds (2014 - 2016), Flemish Agency for Innovation by Science and Technology.
TETRA: Smart Data Clouds (2014 - 2016): industrial cases
a. Security, track and trace [Port of Antwerp]
b. Traffic control and classification [Macq, LMS, Melexis]
c. (Smart) navigation [wheelchairs: PIAM, mo-Vis; automotive: LMS]
d. Dynamic 3D body scan [RSscan, ICRealisations, ODOS]
Data fusion: industrial vision linked with CAE. Data from RGB, IR, ToF and HS cameras.
TETRA: Smart Data Clouds (2014 - 2016): healthcare applications
a. Wheelchair control and navigation [mo-Vis, PIAM, Absolid]
b. Gesture and body analysis [RSscan, ICRealisations, Phaer, ODOS]
'Data fusion' usable for future healthcare.
Time-of-Flight camera types:
- IFM Electronics O3D2xx cameras: 64 x 50 pix.
- PMD[Vision] CamCube 3.0; recent sensor: pmd PhotonICs® 19k-S3 (352 x 288 pix!)
- Fotonic RGB_C-Series: 160 x 120 pixel (previous models: P - E series)
- Swiss Ranger SR4000 (MESA): 176 x 144 pix.
- Optrima OptriCam: 160 x 120 pix.
- Swiss Ranger SR4500 (MESA): 176 x 144 pix.
- DepthSense 325: 320 x 240 pix.
- Melexis: EVK, x60 pix. Near future: MLX75023 automotive QVGA ToF sensor.
- ODOS Real.iZ-1K vision system: 1024 x 1248 pix.
- BV-ToF: 128 x 120 pix.
ToF vision: world-to-image and image-to-world conversion. This is the basis of our TETRA project 'Smart Data Clouds'.
For a pixel (i, j) on an I x J sensor with principal point (i0, j0), pixel scale factors k_u, k_v and focal length f:
u = j - j0;  u_k = k_u·u
v = i - i0;  v_k = k_v·v
tan(φ) = u_k/f;  tan(ψ) = v_k/f
r = √(u_k² + f²);  d = √(u_k² + v_k² + f²)
With D the measured distance to the world point (x, y, z):
D/d = x/u_k = y/v_k = z/f
Point representations: P = [ tan(φ) tan(ψ) 1 d/D ],  P' = [ tan(φ) tan(ψ) 1 ].  (f can be chosen to be the unit.)
Every world point is unique with respect to a lot of important coordinates: x, y, z, N_x, N_y, N_z, k_r, k_c, R, G, B, N_R, N_G, N_B, t°, t, ...
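The conversion above can be sketched in a few lines of Python. This is a minimal illustration of the slide's relations, with all parameter names (i0, j0, k_u, k_v, f, D) taken from the slide; the function names are ours.

```python
import math

def pixel_to_world(i, j, D, i0, j0, k_u, k_v, f):
    """Image-to-world: pixel (i, j) plus measured ToF distance D -> (x, y, z),
    using u_k = k_u*(j - j0), v_k = k_v*(i - i0),
    d = sqrt(u_k^2 + v_k^2 + f^2) and D/d = x/u_k = y/v_k = z/f."""
    u_k = k_u * (j - j0)
    v_k = k_v * (i - i0)
    d = math.sqrt(u_k**2 + v_k**2 + f**2)
    s = D / d                      # the common scale factor D/d
    return s * u_k, s * v_k, s * f

def world_to_pixel(x, y, z, i0, j0, k_u, k_v, f):
    """World-to-image: project a world point back to pixel coordinates (i, j)."""
    u_k = f * x / z
    v_k = f * y / z
    return i0 + v_k / k_v, j0 + u_k / k_u
```

A round trip (world to pixel and back, given the measured distance D = √(x² + y² + z²)) reproduces the original point, which is a quick sanity check for a calibration.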
D/d-related calculations (1): finding the free floor area.
The camera is mounted parallel to the floor at height E, with inclination a_0; ψ_i is the vertical view angle belonging to image row v_i. For navigation purposes, the free floor area can easily be found from:
d_i/D_i = e/E = [ f·sin(a_0) + v_i·cos(a_0) ] / E = [ tan(a_0) + tan(ψ_i) ]·f·cos(a_0)/E.
Since (d/D)_i = f/z_i, this is equivalent to:
z_i·[ tan(a_0) + tan(ψ_i) ] = E/cos(a_0).
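A pixel belongs to the free floor exactly when the last relation holds. A minimal Python sketch of that test, with a hypothetical relative tolerance `tol` added for sensor noise:

```python
import math

def is_floor(z_i, psi_i, a0, E, tol=0.02):
    """Free-floor test from the slide: z_i*(tan(a0) + tan(psi_i)) = E/cos(a0).
    a0: camera inclination [rad], E: camera height above the floor,
    z_i: measured depth of the pixel, psi_i: vertical view angle of its row.
    `tol` is a hypothetical relative tolerance, not part of the slide."""
    expected = E / math.cos(a0)
    actual = z_i * (math.tan(a0) + math.tan(psi_i))
    return abs(actual - expected) <= tol * expected
```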
D/d-related calculations (2): the world normal vector n at a random image position (v, u), with d² = u² + v² + f²:
n_x ~ f·(D_4/d_4 - D_3/d_3) / (D_4/d_4 + D_3/d_3) = f·(z_4 - z_3)/(z_4 + z_3)
n_y ~ f·(D_2/d_2 - D_1/d_1) / (D_2/d_2 + D_1/d_1) = f·(z_2 - z_1)/(z_2 + z_1)
n_z ~ -( u·n_x + v·n_y + 1 )
Fast calculations!
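A direct transcription of these formulas, assuming z_3/z_4 are the depths of the horizontal neighbour pair and z_1/z_2 of the vertical pair (the pairing is our reading of the slide); the result is normalized to unit length:

```python
import math

def normal_at(z1, z2, z3, z4, u, v, f):
    """Fast surface normal at image position (v, u) from four neighbouring
    depths, per the slide (up to scale); returned as a unit vector."""
    n_x = f * (z4 - z3) / (z4 + z3)
    n_y = f * (z2 - z1) / (z2 + z1)
    n_z = -(u * n_x + v * n_y + 1.0)
    norm = math.sqrt(n_x**2 + n_y**2 + n_z**2)
    return n_x / norm, n_y / norm, n_z / norm
```

For a fronto-parallel patch (all four depths equal) this yields the expected normal (0, 0, -1), pointing back at the camera.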
Coordinate transformations (the work plane is the reference). For a camera tilted by angle a and mounted at height z_0, with the world and robot frames offset by t_y:
1. Camera x // world x // robot x.
2. World (y_w, z_w) = robot (y_r, z_r) + t_y.
3. y_Pw = -(z_0 - z_Pc)·sin(a) + y_Pc·cos(a) = y_Pr + t_y
4. z_Pw = (z_0 - z_Pc)·cos(a) + y_Pc·sin(a) = z_Pr
A camera re-calibration for ToF cameras is easy and straightforward!
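Equations 3 and 4 translate into a one-screen helper. A sketch, assuming the slide's symbols (camera-frame point (y_Pc, z_Pc), tilt a, mounting height z_0, frame offset t_y); the function name is ours:

```python
import math

def camera_to_world(y_Pc, z_Pc, z0, a, t_y):
    """Transform a camera-frame point into world and robot coordinates,
    per slide equations 3-4:
        y_Pw = -(z0 - z_Pc)*sin(a) + y_Pc*cos(a) = y_Pr + t_y
        z_Pw =  (z0 - z_Pc)*cos(a) + y_Pc*sin(a) = z_Pr
    Returns ((y_Pw, z_Pw), (y_Pr, z_Pr))."""
    y_Pw = -(z0 - z_Pc) * math.sin(a) + y_Pc * math.cos(a)
    z_Pw = (z0 - z_Pc) * math.cos(a) + y_Pc * math.sin(a)
    return (y_Pw, z_Pw), (y_Pw - t_y, z_Pw)
```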
Pepper handling. The first image = the empty world plane; the next images = random pepper collections. Connected peppers can be distinguished by means of local gradients, and the gradients can easily be derived from D/d ratios. Thickness is expressed in millimetres. The calculations are 'distance' driven: x, y and z are not necessary. Fast calculations!
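The distance-driven idea can be sketched as follows: subtract the scene distances from the empty-plane distances to get a thickness map, then take local gradients on that map. Both helper names and the central-difference choice are ours, not the slide's:

```python
def thickness_map(D_empty, D_scene):
    """Per-pixel thickness of produce on the tray: ToF distance to the
    empty world plane minus distance to the loaded scene. Purely distance
    driven, as on the slide; no x, y, z reconstruction needed."""
    return [[de - ds for de, ds in zip(row_e, row_s)]
            for row_e, row_s in zip(D_empty, D_scene)]

def local_gradient(thick, i, j):
    """Central-difference gradient of the thickness map at (i, j);
    ridges and valleys in this gradient separate touching peppers."""
    gx = (thick[i][j + 1] - thick[i][j - 1]) / 2.0
    gy = (thick[i + 1][j] - thick[i - 1][j]) / 2.0
    return gx, gy
```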
Bin picking & 3D OCR. Analyse the 'blobs' one by one: find the centre of gravity (X, Y, Z), the normal direction components N_x, N_y, N_z, the so-called 'Tool Centre Point' and the WPS coordinates. YouTube: KdGiVL
ToF - RGB correspondence (application: beer barrel inspection), between an IDS uEye UI-1240SE-C colour camera (focal length F) and a ToF camera (IFM O3D2xx or MESA SR4000, focal length f), offset by t_x:
u_c/F = k_u·u_t/f
v_c,P/F - k_v·v_t/f = t_x·√(z_P² + y_P²)
DepthSense (RGB-D). Packed with a plastic wrap:
1. Find the z-discontinuity.
2. Look for vertical and forward-oriented regions.
3. Check the collinearity.
4. Use geometrical laws in order to find x, y, z and b.
ToF IFM O3D2xx (also: CamBoard Nano). Packed with a foil:
1. Remove weakly defined pixels.
2. Find the z-discontinuity.
3. Look for vertical and forward-oriented regions.
4. Check the collinearity.
5. Use geometrical laws in order to find x, y, z and b.
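The two pipelines above share their first steps. A minimal sketch of invalid-pixel removal and z-discontinuity detection along one scan line; the `jump` and `min_valid` thresholds are hypothetical tuning values, not taken from the slides:

```python
def z_discontinuities(z_row, jump=0.05, min_valid=0.1):
    """Scan one row of ToF depths: skip weakly defined pixels (here modelled
    as depths below `min_valid`, a hypothetical criterion) and return the
    indices where consecutive valid depths jump by more than `jump`."""
    indices = []
    prev = None                      # last valid (index, depth) pair
    for k, z in enumerate(z_row):
        if z < min_valid:            # weakly defined pixel: remove/skip
            continue
        if prev is not None and abs(z - prev[1]) > jump:
            indices.append(k)        # z-discontinuity found at pixel k
        prev = (k, z)
    return indices
```

The flagged indices are the candidate package edges that the later vertical/forward-region and collinearity checks then validate.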
Basic tasks of ToF cameras in support of healthcare applications:
- Guide an autonomous wheelchair along the wall of a corridor.
- Avoid collisions between an AGV and unexpected objects.
- Give warnings about obstacles ('mind the step', kids, stairs, ...).
- Take position in front of a table, a desk or a TV screen.
- Drive over a ramp at a front door or the door leading to the garden.
- Drive backwards based on a ToF camera.
- Park a wheelchair automatically (e.g. for battery charging).
- Command a wheelchair to reach a given position.
- Guide a patient in a semi-automatic bathroom.
- Support the supervision of elderly people.
- Fall detection of (elderly) people.
Imagine a wheelchair user who wants to reach 8 typical locations at home:
1. Sitting at the table.
2. Watching TV.
3. Looking through the window.
4. Working on a PC.
5. Reaching the corridor.
6. Commanding the wheelchair to park itself.
7. Finding some books on a shelf.
8. Reaching the kitchen and the garden.
ToF-guided navigation for AGVs. Instant eigen-motion: translation. It works with random points P! (In contrast, stereo vision must find edges, so texture is pre-assumed.) Check: D_2 = D_P?
ToF-guided navigation of AGVs. Instant eigen-motion: planar rotation over angle α with turning radius R, between the previous and the next sensor position.
Image data: tan(β_1) = u_1/f; tan(β_2) = u_2/f.
Task: find the correspondence β_1 → β_2.
Projection rules for a random point P, with 0 < |α| < α_0 and |x| < x_0 (here: x < 0, R > 0):
D_2² = x_P2² + z_P2².
Check: D_2 = D_P? Parallel processing is possible!
ToF-guided navigation of AGVs. Instant eigen-motion: planar rotation. Make use of, e.g., 100 random points; the result is a radius and an angle. Check: D_2,i = D_P,i?
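One way to turn the matched random points into the advertised result (radius and angle) is a least-squares rigid-motion fit. The sketch below is a generic 2-D Kabsch-style estimator under our own assumptions (points given as (x, z) pairs in the sensor plane), not necessarily the slide's exact per-point procedure:

```python
import math

def planar_motion(pts_prev, pts_next):
    """Estimate the planar eigen-motion from matched points seen before and
    after the motion: least-squares rotation angle alpha, plus the turning
    radius R of a circular arc whose chord is the fitted translation."""
    n = len(pts_prev)
    cx1 = sum(p[0] for p in pts_prev) / n
    cz1 = sum(p[1] for p in pts_prev) / n
    cx2 = sum(p[0] for p in pts_next) / n
    cz2 = sum(p[1] for p in pts_next) / n
    s_sin = s_cos = 0.0
    for (x1, z1), (x2, z2) in zip(pts_prev, pts_next):
        a1, b1 = x1 - cx1, z1 - cz1          # centred coordinates
        a2, b2 = x2 - cx2, z2 - cz2
        s_cos += a1 * a2 + b1 * b2
        s_sin += a1 * b2 - b1 * a2
    alpha = math.atan2(s_sin, s_cos)         # least-squares rotation angle
    # fitted translation: c2 = R(alpha)*c1 + t
    tx = cx2 - (cx1 * math.cos(alpha) - cz1 * math.sin(alpha))
    tz = cz2 - (cx1 * math.sin(alpha) + cz1 * math.cos(alpha))
    chord = math.hypot(tx, tz)
    R = chord / (2.0 * math.sin(abs(alpha) / 2.0)) if alpha else float("inf")
    return alpha, R
```

With noise-free correspondences the fit recovers the rotation angle exactly; with 100 noisy ToF points it averages the measurement errors out.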
Research, FTI Master Electromechanics. Info: ; research topics:
- ToF-driven quadrocopters, combined with IR/RGB.
- Security flights over industrial areas.
TETRA project 'Smart Data Clouds'. Hardware: DJI Phantom RTF quadrocopter (Conrad).
Quadrocopter navigation based on ToF cameras (BV-ToF, 128 x 120 pix.). The target is to hover above the end point of a black line. If yaw is present, it should be compensated by an overall torque moment; meanwhile the translation t can be evaluated. The global effect of the roll and pitch angles represents itself by means of the points P and Q: the actual copter speed v is in the direction PQ. In the end, P and Q need to come together without oscillations, while |t| becomes oriented in the y-direction. An efficient path can be followed by means of fuzzy-logic principles.
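The slide proposes a fuzzy-logic controller. As a minimal stand-in for illustration only, a proportional-derivative rule that drives the separation between P and Q to zero while damping the current speed v; the gains and the whole interface are hypothetical:

```python
def hover_command(p, q, v, kp=0.8, kd=0.3):
    """Bring P and Q together without oscillation: the separation Q - P is
    the drift error caused by roll/pitch, v is the current copter speed
    (directed from P to Q). Command = -kp*error - kd*speed, per axis.
    A PD stand-in for the fuzzy controller named on the slide."""
    return (-kp * (q[0] - p[0]) - kd * v[0],
            -kp * (q[1] - p[1]) - kd * v[1])
```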