
albtam/sdc-nd-didi-competition


MKZ Model

This repository holds the resources needed to get started with the Udacity/Didi self-driving car challenge. To generate tracklets (annotation data) from the released datasets, check out the Docker code in the /tracklet folder. For sensor transform information, check out /mkz-description.
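Once the /tracklet code has produced an annotation file, it can be inspected with the standard library. The sketch below assumes the output follows the KITTI-style tracklet XML layout (boost-serialization wrapper with `objectType`, `h`/`w`/`l`, `first_frame`, and per-frame `poses`); the field names here are assumptions based on that format, so verify them against a file you actually generate.

```python
# Minimal sketch of reading a KITTI-style tracklet XML file.
# Field names (objectType, h/w/l, first_frame, tx/ty/tz/rz) are assumed
# from the KITTI tracklet format; check them against your generated output.
import xml.etree.ElementTree as ET

def parse_tracklets(xml_text):
    """Return a list of dicts: object type, size (h, w, l), first frame, poses."""
    root = ET.fromstring(xml_text)
    tracklets = []
    for item in root.find("tracklets").findall("item"):
        poses = [
            {axis: float(p.findtext(axis)) for axis in ("tx", "ty", "tz", "rz")}
            for p in item.find("poses").findall("item")
        ]
        tracklets.append({
            "type": item.findtext("objectType"),
            "size": tuple(float(item.findtext(k)) for k in ("h", "w", "l")),
            "first_frame": int(item.findtext("first_frame")),
            "poses": poses,
        })
    return tracklets

# Tiny inline example in the assumed format:
example = """
<boost_serialization>
  <tracklets>
    <count>1</count>
    <item>
      <objectType>Car</objectType>
      <h>1.5</h><w>1.8</w><l>4.2</l>
      <first_frame>0</first_frame>
      <poses>
        <count>1</count>
        <item><tx>10.0</tx><ty>-2.0</ty><tz>0.0</tz><rz>1.57</rz></item>
      </poses>
    </item>
  </tracklets>
</boost_serialization>
"""
```

Each pose holds the object's translation and yaw for one frame, so iterating over `poses` from `first_frame` onward recovers the per-frame bounding boxes.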

Please note that tracklets cannot be generated for Dataset 1 without modifying this code, as the additional RTK GPS receiver we added to the capture vehicle to determine orientation was not present for that dataset. The orientation-determination code that enables world-to-capture-vehicle transforms on Dataset 2 is currently being written, with a release target of EOD 4/4/2017.

Datasets

Here are links to the datasets we've released specifically for this challenge:

  • Dataset 2 – Three different vehicles performing a variety of maneuvers, plus the Round 1 test sequence. Larger image sizes, and two RTK GPS units on the capture vehicle for orientation determination. Velodyne points have been removed to reduce size, so a Velodyne LIDAR driver must be run during bag playback.
  • Dataset 1 – NOT SUITABLE FOR TRACKLET GENERATION. A dataset intended to help participants become familiar with the sensor data format and ROS in general. The tracklet code must be modified to work with this dataset, and no capture-vehicle orientation is available unless Course-Over-Ground techniques are used.
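The Course-Over-Ground approach mentioned for Dataset 1 derives heading from the direction of travel between successive GPS fixes. A minimal sketch of that idea (the function name and the spherical-Earth bearing formula are my choices, not code from this repository):

```python
import math

def course_over_ground(lat1, lon1, lat2, lon2):
    """Initial bearing (degrees clockwise from true north) from fix 1 to fix 2.

    Standard great-circle bearing formula on a spherical Earth. With closely
    spaced fixes, GPS noise dominates, so in practice you would smooth the
    result over several fixes (and it is only valid while the vehicle is
    actually moving forward).
    """
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) \
        - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

# Moving due east along the equator gives a 90-degree course:
# course_over_ground(0.0, 0.0, 0.0, 0.001)  -> 90.0
```

This is also why Dataset 1 orientation degrades at low speed: the displacement between fixes shrinks toward the RTK noise floor, which is what the second GPS unit on the Dataset 2 capture vehicle avoids.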

Resources

Starting Guides:

  • Udacity Intro – Basic ROS install and displaying data in RVIZ

Here's a list of the projects we've open sourced already that may be helpful:

About

Resources for the Udacity/Didi $100k competition
