
1 Introduction of the Mobility Laboratory & Collaboration with CALTECH
Noriko Shimomura, Nissan Mobility Laboratory

2 Contents
1. Mobility Laboratory & our aims
2. Examples of our research
3. Collaboration with CALTECH by Sep. 2008
Objective of this presentation: introduce Nissan's research and needs, and build a good collaboration by Sep. 2008.

3 Mobility Laboratory & our aims
Research areas (system diagram: sensor, controller, alarm):
- Vehicle control
- Human-machine interface
- Object detection, road environment recognition
Our aims:
- Reducing traffic accidents
- Providing new driving assistance systems
- Improving autonomous vehicle technology

4 Examples of our research
1. Forward environment recognition using laser radar and camera
2. Nighttime driving support system using an infrared camera

5 Forward environment recognition: sensor configuration
(Diagram: camera and scanning laser radar; camera lens axis and radar scan area shown in the X-Y-Z vehicle coordinate system)

6 Example of Observed Sensor Data

7 Flowchart
(1) Camera: lane marker recognition.
(2) Scanning laser radar (SLR): object detection & distinction (grouping, stationary/moving object distinction).
Outputs: preceding vehicle, other vehicles, structures on the road (signs, delineators).

8 Outline of Lane Marker Recognition
(Diagram: camera at height Dy above the road surface, lens axis along Z with focal length f, image coordinates (X_I, Y_I), road point P(x, y, z), lane width W, lateral offset Dx; image example with edge positions on the left line (i = 0) and right line (i = 1))
Road model:
  X = (ρ/2) Z² + φ Z - Dx + i W   (i = 0, 1)
  Y = ψ Z + Dy
Camera position: Dx, Dy, θ, φ, ψ (θ = 0)
ρ, φ, Dx, Dy, ψ are calculated from the edge positions by regression analysis.
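The road model is linear in ρ, φ, Dx, and W, so those parameters can be fitted by ordinary least squares. A minimal sketch in Python (not the original implementation): it assumes the edge points have already been converted to road coordinates, with Z the distance ahead, X the lateral position, and i a left/right label.

# Minimal sketch (assumptions noted above): fit X = (rho/2) Z^2 + phi Z - Dx + i W
# to lane-marker edge points by linear regression.
import numpy as np

def fit_road_model(Z, X, i):
    """Z, X: edge-point coordinates; i: 0/1 labels (0 = left line, 1 = right line).
    Returns (rho, phi, Dx, W)."""
    Z, X, i = map(np.asarray, (Z, X, i))
    # Each row of A multiplies the unknown vector [rho, phi, Dx, W].
    A = np.column_stack([0.5 * Z**2, Z, -np.ones_like(Z, dtype=float), i.astype(float)])
    params, *_ = np.linalg.lstsq(A, X, rcond=None)
    rho, phi, Dx, W = params
    return rho, phi, Dx, W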

9 Flowchart & edge point detection
1. Image input
2. Detection region determination (using the parameters estimated on the previous image)
3. Edge point detection: edge image by Sobel operator; edge points on the lane markers
4. Lane marker detection
5. Parameter estimation: curvature, pitch angle, yaw angle, lateral position, bounce
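A minimal sketch of the edge-point detection step, assuming OpenCV and a rectangular detection region predicted from the previous frame; the threshold and kernel size are illustrative, not from the slides.

# Minimal sketch: Sobel edge detection restricted to a detection region.
import cv2
import numpy as np

def lane_edge_points(gray, roi, threshold=80):
    """gray: 8-bit grayscale frame; roi: (x, y, w, h) detection region."""
    x, y, w, h = roi
    patch = gray[y:y + h, x:x + w]
    # Lane markers produce strong near-vertical edges, so differentiate along x.
    gx = cv2.Sobel(patch, cv2.CV_32F, 1, 0, ksize=3)
    ys, xs = np.nonzero(np.abs(gx) > threshold)
    # Return edge-point coordinates in full-image pixel coordinates.
    return np.column_stack([xs + x, ys + y])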

10 Recognition result

11 Recognition result (rainy day)

12 Flowchart (revisited: object detection & distinction by SLR)
(1) Camera: lane marker recognition.
(2) Scanning laser radar (SLR): object detection & distinction (grouping, stationary/moving object distinction).
Outputs: preceding vehicle, other vehicles, structures on the road (signs, delineators).

13 Object Detection by SLR
Grouping method: detected points are grouped when they are
- located closely,
- at the same distance,
- in the same direction.
(Diagram: detected points in the X-Z plane grouped into delineators, vehicles, and an overhead sign; grouping examples for a delineator and a vehicle)
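A minimal sketch of one grouping pass over the detected points; the thresholds and the exact combination of the three criteria are assumptions, since the slide only names them.

# Minimal sketch: group laser-radar points that are close together, at a
# similar range, and in a similar direction (all values in metres/radians).
import numpy as np

def group_points(points, max_gap=1.5, max_range_diff=2.0, max_angle_diff=0.05):
    """points: (N, 2) array of (x, z) positions. Returns a list of index groups."""
    order = np.argsort(points[:, 0])            # sweep in scan (lateral) order
    groups, current = [], [int(order[0])]
    for prev, cur in zip(order[:-1], order[1:]):
        p, q = points[prev], points[cur]
        gap = np.linalg.norm(q - p)                                   # located closely
        range_diff = abs(np.hypot(*q) - np.hypot(*p))                 # same distance
        angle_diff = abs(np.arctan2(q[0], q[1]) - np.arctan2(p[0], p[1]))  # same direction
        if gap < max_gap and range_diff < max_range_diff and angle_diff < max_angle_diff:
            current.append(int(cur))
        else:
            groups.append(current)
            current = [int(cur)]
    groups.append(current)
    return groups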

14 Solution to the Difficulty → Delineator Distinction
Difficulty: the relative speed of the detected objects is not estimated correctly.
Solution: tagging → tag check. A detected object is tagged, and on the following scan the tag is checked within small offsets (±Δx, ±Δz) in the X-Z plane; tagged objects that are detected repeatedly along the lane are distinguished as delineators.
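A minimal sketch of the tag check, under the assumption that a tag from the previous scan is matched to a detection in the current scan within ±Δx and ±Δz; the offsets and bookkeeping are illustrative only.

# Minimal sketch: match current detections to tags from the previous scan.
import numpy as np

def check_tags(tags, detections, dx=0.5, dz=2.0):
    """tags, detections: (N, 2) arrays of (x, z) positions from consecutive scans.
    Returns the indices of detections that fall within +/- dx, +/- dz of a tag."""
    matched = []
    for idx, (x, z) in enumerate(detections):
        diff = np.abs(tags - np.array([x, z]))
        if np.any((diff[:, 0] < dx) & (diff[:, 1] < dz)):
            matched.append(idx)
    return matched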

15 Object Distinction
Detected objects are distinguished into preceding vehicle, other vehicles, and road structures, based on:
- stationary/moving distinction
- delineator recognition
- width of objects
- relative position to the lanes
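A minimal rule-based sketch of the distinction step using the four cues listed above; the thresholds, rule order, and field names are assumptions.

# Minimal sketch: rule-based distinction of a grouped object.
def classify_object(obj, ego_lane):
    """obj: dict with 'moving' (bool), 'is_delineator' (bool), 'width' (m),
    'lateral_offset' (m, object centre relative to ego-lane centre).
    ego_lane: dict with 'half_width' (m)."""
    if obj["is_delineator"]:
        return "road structure"
    if not obj["moving"] and obj["width"] > 3.5:   # wide stationary object, e.g. overhead sign
        return "road structure"
    in_ego_lane = abs(obj["lateral_offset"]) < ego_lane["half_width"]
    if obj["moving"] and in_ego_lane:
        return "preceding vehicle"
    return "other vehicle"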

16 Detection and distinction with relative speed and grouping (before applying the proposed method) -- preceding vehicle, other vehicles, road structures --

17 Detection and distinction result with the proposed method -- Preceding vehicle, other vehicles, road structures --

18 Detection and distinction result with the proposed method -- Preceding vehicle, other vehicles, road structures --

19 Examples of our research
1. Forward environment recognition using laser radar and camera
2. Nighttime driving support system using an infrared camera

20 Nighttime driving support system ~ Adaptive Front-lighting System with infrared camera (IR-AFS) ~
Pedestrians, including objects that may be pedestrians, are detected in the IR image (temperature), and the Adaptive Front-lighting System illuminates them, so the driver can find the pedestrian easily at night.

21 Effect of IR-AFS

22 Difficulty in IR-based pedestrian detection
Ordinary approach with an IR camera: binarize the IR image by keeping pixels in the human temperature range (25-37 °C).
On a summer night (27 °C), a large area has the same temperature as a human, so the binary image no longer isolates the pedestrian.
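A minimal sketch of that ordinary approach, assuming a radiometric IR image that gives a per-pixel temperature in °C; on a warm night most of the scene passes this test, which is exactly the difficulty shown on the slide.

# Minimal sketch: binarize an IR image by the human temperature range.
import numpy as np

def human_temperature_mask(ir_temperature, low=25.0, high=37.0):
    """ir_temperature: 2-D array of per-pixel temperatures in deg C."""
    return (ir_temperature >= low) & (ir_temperature <= high)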

23 Our Aim
A nighttime driving support system that is:
- Season-independent: a pedestrian detection algorithm that makes use of information other than temperature and is available in any season.
- Effective: nighttime driving support that does not disturb the driver even if there are some false detections.

24 Features in detection
- There is no texture in an IR image.
- Pedestrians: many wrinkles on the clothes, few straight lines.
- Artificial objects (cars, buildings): few wrinkles.
→ Wrinkles and rough surfaces activate corner filters: strong response on pedestrians, weak response on artificial objects.
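A minimal sketch of scoring a candidate region by corner-response density; the Harris detector stands in for the unspecified corner filter, and the thresholds are assumptions.

# Minimal sketch: fraction of pixels with a strong corner response.
# Pedestrians (wrinkled clothing) should score higher than smooth artificial objects.
import cv2
import numpy as np

def corner_score(gray_patch, response_thresh=0.01):
    """gray_patch: 8-bit grayscale region of the IR image."""
    harris = cv2.cornerHarris(np.float32(gray_patch), blockSize=2, ksize=3, k=0.04)
    strong = harris > response_thresh * harris.max()
    return strong.mean()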

25 Explanation of our Algorithm (video)
(Legend: feature point, detected pedestrian, illumination target)

26 Collaboration with Caltech in 2007
1. CALTECH's technologies
2. Nissan's needs: recognition methods that we have to improve
Collaboration with the Vision Lab (including the extension term): we want to make the collaboration better.

27 CALTECH's technologies
Methods Nissan is interested in and focuses on:
- Probabilistic models (constellation model, etc.)
- Learning methods
- Feature detection (SIFT, Harris, etc.)

28 Nissan's needs and requirements
- Pedestrian detection
- Road region recognition (without lane markers)
- Improved lane marker recognition (available for many types of lane markers)

29 Pedestrian detection

30 Improved lane marker recognition (available for many types of lane markers, e.g. Botts' dots)

31 Road region recognition (without lane markers)

32 Idea for collaboration with no-cost extension
- Caltech: pedestrian detection
- Nissan: road region detection
Requirements for pedestrian detection:
- Accuracy: more than 75%
- False alarms: less than 5%
- Minimum target size: 10 x 20
- Processing time: up to 500 ms (e.g. 100 ms)
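A minimal sketch of checking detector results against these targets; the exact metric definitions (per-frame false alarm rate, pixel units for target size) are assumptions, since the slide only lists the numbers.

# Minimal sketch: compare evaluation results with the collaboration targets.
REQUIREMENTS = {
    "accuracy": 0.75,        # detection rate: more than 75%
    "false_alarm": 0.05,     # false alarm rate: less than 5% (assumed per frame)
    "min_target_size": (10, 20),   # assumed to be pixels
    "max_time_ms": 500,
}

def meets_targets(n_true, n_detected_true, n_false_alarms, n_frames, time_ms):
    detection_rate = n_detected_true / max(n_true, 1)
    false_alarm_rate = n_false_alarms / max(n_frames, 1)
    return (detection_rate > REQUIREMENTS["accuracy"]
            and false_alarm_rate < REQUIREMENTS["false_alarm"]
            and time_ms <= REQUIREMENTS["max_time_ms"])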

33 Schedule and Target for Sep. 2008
Dataset (provided by Nissan; AVI, VGA):
- First dataset: by the end of Aug. 2007
- Second dataset: Jan. 2008, for validation
Mid-term report & information exchange (Jan. 2008):
- Mid-term report (minimum target size, processing time, etc.)
- Additional dataset provided for validation
Deliverables in Sep. 2008:
- Documents of the proposed method
- Experimental results: detection ratio (target 75%), minimum target size, ROC
Timeline: Sep. 07 brainstorming and start developing the new method; Jan. 08 mid-term report; develop & improve the method; Sep. 08 validation using the dataset.

34 Deliverables
- End of Sep. 2007: report written by Seigo Watanabe, with the signature of Dr. Perona on the first page.
- Jan. 2008: mid-term report written by a postdoc at Caltech; more concrete targets (minimum target size, etc.).
- End of Sep. 2008: final report written by a postdoc at Caltech; documents of the proposed method and validation results.

35 Road Model
  X = (ρ/2) Z² + φ Z - Dx + i W   (i = 0, 1)
  Y = ψ Z + Dy
ρ: road curvature, φ: yaw angle, Dx: lateral position, W: lane width, ψ: pitch angle, Dy: camera height (bounce)
ρ, φ, Dx, Dy, ψ are calculated from the edge positions by regression analysis.
