
Multimedia Platform for Autonomous Driving (A Case Study of "The KITTI Vision Benchmark Suite")


Presentation Transcript


  1. Multimedia Platform for Autonomous Driving (A Case Study of "The KITTI Vision Benchmark Suite")

  2. Introduction • Developing autonomous systems capable of assisting humans in everyday tasks is one of the grand challenges of modern computer science. One example is an autonomous driving system that can help reduce the fatalities caused by traffic accidents. A variety of new sensors have been adopted in recent years for tasks such as recognition, navigation and manipulation of objects. Autonomous driving systems rely largely on GPS, laser range finders, radar and maps.

  3. Sensor Setup • 1 Inertial Navigation System (GPS/IMU): OXTS RT 3003 • 1 Laserscanner: Velodyne HDL-64E • 2 Grayscale cameras, 1.4 Megapixels: Point Grey Flea 2 (FL2-14S3M-C) • 2 Color cameras, 1.4 Megapixels: Point Grey Flea 2 (FL2-14S3C-C) • 4 Varifocal lenses, 4-8 mm: Edmund Optics NT59-917
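The sensors in the setup above record as independent streams, so frames from the cameras and the laser scanner have to be associated by timestamp before they can be used together. A minimal sketch of nearest-timestamp matching (the 10 Hz rates come from the slides; the jittered capture times and the matching policy are assumptions for illustration):

```python
def nearest_index(query, timestamps):
    """Index of the timestamp in `timestamps` closest to `query` (linear scan)."""
    return min(range(len(timestamps)), key=lambda i: abs(timestamps[i] - query))

# Cameras and Velodyne both run at 10 Hz, but capture instants are never
# perfectly aligned; assume slightly jittered times (in seconds).
camera_ts = [0.00, 0.10, 0.20, 0.30]
velodyne_ts = [0.01, 0.11, 0.19, 0.31]

# Pair each camera frame with the closest Velodyne sweep.
pairs = [(t, velodyne_ts[nearest_index(t, velodyne_ts)]) for t in camera_ts]
```

In a real recording platform the association is usually done against hardware trigger timestamps rather than arrival times, but the matching logic is the same.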

  4. Conclusion • The autonomous driving platform is a standard station wagon equipped with two color and two grayscale Point Grey Flea 2 video cameras (10 Hz, resolution: 1392×512 pixels, opening angle: 90° × 35°), a Velodyne HDL-64E 3D laser scanner (10 Hz, 64 laser beams, range: 100 m), GPS/IMU localization with RTK correction signals (open-sky localization errors < 5 cm) and a powerful computer running a real-time database. • Sensor calibration comprises camera-to-camera calibration, Velodyne-to-camera calibration and GPS/IMU-to-Velodyne calibration. • Planned future development includes visual SLAM with loop-closure capabilities, object tracking, segmentation, structure-from-motion and 3D scene understanding.
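The calibration chain above lets a 3D Velodyne point be projected into a camera image: apply the Velodyne-to-camera rigid transform, then the rectifying rotation, then the camera projection matrix. A sketch of that chain in NumPy; the matrix values below are placeholders for illustration (the real ones are estimated during calibration), and only the axis convention (Velodyne: x forward, y left, z up; camera: x right, y down, z forward) is assumed:

```python
import numpy as np

# Placeholder calibration matrices (real values come from the platform's
# calibration procedure, not these illustrative numbers).
Tr_velo_to_cam = np.array([[0., -1.,  0., 0.],   # cam x = -velo y
                           [0.,  0., -1., 0.],   # cam y = -velo z
                           [1.,  0.,  0., 0.],   # cam z =  velo x
                           [0.,  0.,  0., 1.]])  # 4x4 rigid transform
R_rect = np.eye(4)                               # rectifying rotation (identity here)
P_rect = np.array([[721.5,   0.0, 609.6, 0.0],   # 3x4 projection: focal lengths
                   [  0.0, 721.5, 172.9, 0.0],   # and principal point (placeholders)
                   [  0.0,   0.0,   1.0, 0.0]])

def project_velo_to_image(pts_velo):
    """Project an Nx3 array of Velodyne points to Nx2 pixel coordinates."""
    pts_h = np.hstack([pts_velo, np.ones((len(pts_velo), 1))])  # homogeneous
    cam = (P_rect @ R_rect @ Tr_velo_to_cam @ pts_h.T).T        # Nx3
    return cam[:, :2] / cam[:, 2:3]                             # perspective divide

# A point 10 m straight ahead of the scanner lands at the principal point.
uv = project_velo_to_image(np.array([[10.0, 0.0, 0.0]]))
```

Points with non-positive depth (behind the camera) must be discarded before the perspective divide in practice.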
