
Vision-based parking assistance system for leaving perpendicular and angle parking lots



  1. Vision-based parking assistance system for leaving perpendicular and angle parking lots 2013/12/17 Advisor: Prof. 張元翔 Student: 林柏維, first-year M.S. in Communications, 10279026

  2. Outline • Introduction • Related work • Methods A. STHOL feature selection B. Bayesian decision • Results • Conclusion and Future Work

  3. Outline • <Introduction> • Related work • Methods A. STHOL feature selection B. Bayesian decision • Results • Conclusion and Future Work

  4. Introduction (1) • In recent years, a considerable number of research works and industrial developments on Intelligent Parking Assist Systems (IPAS) have been proposed, including both assistance and automatic parking approaches.

  5. Introduction (2) • The main goal of an IPAS that assists drivers when parking is to ease the maneuver, avoiding minor collisions, reducing car damage, and preventing personal injuries. • In this paper, a new vision-based Advanced Driver Assistance System (ADAS) is proposed to deal with scenarios like the ones depicted in Figs. 1(a) and 1(b).

  6. Introduction (3) (a) Back-out perpendicular parking (b) Back-out angle parking Fig. 1. Driver and camera Field of View (FOV).

  7. Outline • Introduction • <Related work> • Methods A. STHOL feature selection B. Bayesian decision • Results • Conclusion and Future Work

  8. Related work (1) • The proposed architecture of the system is composed of three main parts: camera, processor, and CAN-Bus communications. • From the CAN-Bus we obtain the following variables: steering angle, car speed, and current gear.

  9. Related work (2) • These variables are used to trigger the detection module on or off according to the Finite State Machine (FSM). • The system waits until the car has been put into reverse gear, at which point the detection module is triggered on.
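As a rough illustration, the gating described in this slide can be sketched as a tiny state machine. The variable names (gear, speed, steering_angle) and the single reverse-gear condition are assumptions; the actual FSM of Fig. 2 may have more states and conditions.

```python
# Toy sketch of the detection-module trigger, assuming hypothetical
# CAN-Bus variable names; the real FSM in Fig. 2 may be richer.
class DetectionFSM:
    """Detection module is triggered on while the car is in reverse gear."""

    def __init__(self):
        self.detection_on = False

    def update(self, gear, speed, steering_angle):
        # gear, speed and steering_angle come from the CAN-Bus; only the
        # gear gates the module in this simplified sketch
        self.detection_on = (gear == "R")
        return self.detection_on

fsm = DetectionFSM()
fsm.update("D", 10.0, 0.0)   # driving forward: module off
fsm.update("R", 2.0, 15.0)   # reverse engaged: module on
```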

  10. Related work(3) Fig. 2. FSM for detection module.

  11. Related work(4) Fig. 3. Camera located at the back-right side of the vehicle.

  12. Related work(5) (a) Driver’s point of view (b) Image captured by the camera Fig. 4. Driver and camera point of view.

  13. Outline • Introduction • Related work • <Methods> A. STHOL feature selection B. Bayesian decision • Results • Conclusion and Future Work

  14. Methods • The spatio-temporal domain is analyzed by counting the number of lines and accumulating their lengths with respect to their orientations in a histogram of orientations, which we call Spatio-Temporal Histograms of Oriented Lines (STHOL).
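A minimal sketch of such a descriptor follows: each line segment votes into an orientation histogram with a weight equal to its length, so both the number of lines and their lengths are accounted for. The bin count and the normalization are assumptions, not the paper's exact choices.

```python
import numpy as np

def sthol_histogram(lines, n_bins=18):
    """Hypothetical STHOL-style descriptor: line segments detected in the
    spatio-temporal image vote into an orientation histogram, weighted by
    segment length. `lines` is a list of (x0, y0, x1, y1) endpoints."""
    hist = np.zeros(n_bins)
    for x0, y0, x1, y1 in lines:
        dx, dy = x1 - x0, y1 - y0
        length = np.hypot(dx, dy)
        theta = np.arctan2(dy, dx) % np.pi        # orientation in [0, pi)
        hist[int(theta / np.pi * n_bins) % n_bins] += length
    total = hist.sum()
    return hist / total if total > 0 else hist    # normalized histogram

# one horizontal and one vertical segment
feature = sthol_histogram([(0, 0, 10, 0), (0, 0, 0, 5)])
```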

  15. Flow chart Fig. 5. Overview of the spatio-temporal detection module.

  16. Spatio-temporal images Fig. 6. Two examples of the scan-lines and spatio-temporal images.

  17. Outline • Introduction • Related work • Methods <A. STHOL feature selection> B. Bayesian decision • Results • Conclusion and Future Work

  18. STHOL feature selection (1) • The first step of the line detection is the computation of the image derivatives using the Sobel edge detector. • The edge pixels sharing the same label are then grouped together using a connected-components algorithm.
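These first steps can be sketched as follows, assuming a grayscale input image; the gradient-magnitude threshold is illustrative and not taken from the paper.

```python
import numpy as np
from scipy import ndimage

def edge_regions(img, grad_thresh=50.0):
    """Sketch of the first line-detection steps: Sobel derivatives,
    gradient-magnitude thresholding, then grouping of edge pixels via
    connected-components labeling."""
    gx = ndimage.sobel(img, axis=1)              # horizontal derivative
    gy = ndimage.sobel(img, axis=0)              # vertical derivative
    mag = np.hypot(gx, gy)                       # gradient magnitude
    labels, n_regions = ndimage.label(mag > grad_thresh)
    return labels, n_regions

# a vertical step edge yields a single connected edge region
img = np.zeros((10, 10)); img[:, 5:] = 255.0
labels, n_regions = edge_regions(img)            # n_regions == 1
```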

  19. STHOL feature selection (2) • The line segment candidates are obtained by fitting a line parameterized by an angle θ and a distance from the origin ρ, using the following expression: ρ = x cos(θ) + y sin(θ) (1)

  20. STHOL feature selection (3) • The line parameters are then determined from the eigenvalues λ1 and λ2 and eigenvectors v1 and v2 of the matrix D associated with the line support region, which is given by:

  21. STHOL feature selection (4) • If the eigenvector v1 is associated with the largest eigenvalue, the line parameters (ρ, θ) are determined using:
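The eigenvector-based fit on these slides can be sketched as below, under the assumption that D is the 2x2 scatter (covariance) matrix of the edge-pixel coordinates of a line support region: the eigenvector v1 of the largest eigenvalue gives the line direction, its normal gives θ, and ρ follows from Eq. (1) evaluated at the region centroid.

```python
import numpy as np

def fit_line(xs, ys):
    """Fit (rho, theta) to a line support region via the eigenvectors of
    its coordinate scatter matrix (assumed form of the matrix D)."""
    x, y = np.asarray(xs, float), np.asarray(ys, float)
    xc, yc = x.mean(), y.mean()
    D = np.cov(np.vstack([x - xc, y - yc]))           # 2x2 scatter matrix
    _, eigvecs = np.linalg.eigh(D)                    # eigenvalues ascending
    v1 = eigvecs[:, -1]                               # line direction
    theta = np.arctan2(v1[0], -v1[1])                 # angle of the normal
    rho = xc * np.cos(theta) + yc * np.sin(theta)     # Eq. (1) at centroid
    return rho, theta

# a horizontal run of edge pixels at y = 3
rho, theta = fit_line(range(10), [3.0] * 10)
```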

  22. STHOL feature selection(5) Fig. 7. Overview of the STHOL feature selection architecture.

  23. Outline • Introduction • Related work • Methods A. STHOL feature selection <B. Bayesian decision> • Results • Conclusion and Future Work

  24. Bayesian decision (1) • We represent the image I in terms of STHOL features and follow a Bayesian approach, considering the free-traffic class:

  25. Bayesian decision (2) • We use minimum-error-rate classification with the discriminant function:

  26. Bayesian decision (3) • Instead of using two discriminant functions g1(x) and g2(x), and assigning x to class 1 if g1(x) > g2(x), we define a single discriminant function g(x) = g1(x) - g2(x), and we finally trigger the warning signal if g(x) > 0.
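A minimal sketch of this minimum-error-rate decision over a STHOL feature vector x follows. The Gaussian class-conditional densities and the parameter names are assumptions; the paper's actual likelihood model may differ.

```python
import numpy as np
from scipy.stats import multivariate_normal

def warning_signal(x, mu_traffic, cov_traffic, mu_free, cov_free,
                   prior_traffic=0.5):
    """Single discriminant g(x) = g_traffic(x) - g_free(x), with each
    g the log class-conditional density plus the log prior; the warning
    is triggered when g(x) > 0."""
    g_traffic = (multivariate_normal.logpdf(x, mu_traffic, cov_traffic)
                 + np.log(prior_traffic))
    g_free = (multivariate_normal.logpdf(x, mu_free, cov_free)
              + np.log(1.0 - prior_traffic))
    return (g_traffic - g_free) > 0

mu_t, mu_f, cov = np.array([1.0, 0.0]), np.array([-1.0, 0.0]), np.eye(2)
warning_signal([0.9, 0.1], mu_t, cov, mu_f, cov)   # close to the traffic mean
```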

  27. Outline • Introduction • Related work • Methods A. STHOL feature selection B. Bayesian decision • <Results> • Conclusion and Future Work

  28. Results Fig. 8. Three sample sequences with the triggered warning signal.

  29. Outline • Introduction • Related work • Methods A. STHOL feature selection B. Bayesian decision • Results • <Conclusion and Future Work>

  30. Conclusion and Future Work • This paper presented a novel solution for a new type of ADAS that automatically warns the driver when backing out of perpendicular or angle parking lots. • A novel spatio-temporal motion descriptor (STHOL features) is presented to robustly represent oncoming-traffic and free-traffic states. • A Bayesian framework is finally used to trigger the warning signal.

  31. Conclusion and Future Work (2) • Future work will address the evaluation of the method in night-time conditions and comparisons between our generative approach and discriminative approaches such as SVM-based classifiers.

  32. Thank you for listening
