A Large-Scale Comprehensive Perception Dataset with High-Density Long-Range Point Clouds


News

June 3, 2021
The trajectory forecasting challenge has been announced at the CVPR Precognition workshop. Details here!
May 25, 2021
The trajectory forecasting evaluation server is open. Everyone is welcome to submit results for benchmarking!
April 6, 2021
The dataset has been released and is available for download here.

Get In Touch

Subscribe to our mailing list


About

  • Full sensor suite (3x LiDAR, 1x SPAD-LiDAR, 4x Radar, 5x RGB, 5x depth camera, IMU, GPS)
  • High-density and long-range LiDAR point cloud
  • Multi-echo point cloud from SPAD-LiDAR
  • 100 sequences with 1,000 frames (100 seconds) each
  • Out-of-distribution data, including car crashes and traffic rule violations
  • 500,000 annotated images from 5 camera viewpoints
  • 100,000 annotated frames for each LiDAR/Radar sensor
  • 26M 2D/3D bounding boxes precisely annotated for 4 object classes (car, cyclist, motorcycle, pedestrian)
  • Object identity annotated across time to form trajectories (a loading sketch follows this list)
  • Object attributes such as percentage of truncation/occlusion, angular and linear velocity, acceleration, brake, steer, throttle
  • Sequential point cloud panoptic segmentation: all points annotated with 23 semantic classes in all sequences; points belonging to foreground objects are also annotated with a unique instance ID.
  • Video panoptic segmentation: all pixels annotated with 23 semantic classes in all videos; pixels belonging to foreground objects are also annotated with a unique instance ID.
  • Free for both non-commercial and commercial use
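
To make the annotation structure above concrete, here is a minimal Python sketch of reading per-frame 3D boxes and following one object's identity over time to form a trajectory. The directory layout, file names, and field names (frame_*.json, boxes_3d, center_xyz, track_id, and so on) are illustrative assumptions, not the dataset's actual schema; consult the released devkit and format description for the real layout.

    # Minimal sketch of consuming per-frame annotations (assumed JSON layout).
    import json
    from pathlib import Path

    def load_sequence_annotations(seq_dir):
        """Yield (frame_index, boxes) for every annotated frame in one sequence."""
        for frame_path in sorted(Path(seq_dir).glob("frame_*.json")):
            with open(frame_path) as f:
                frame = json.load(f)
            boxes = [
                {
                    "label": b["label"],              # car / cyclist / motorcycle / pedestrian
                    "center": b["center_xyz"],        # [x, y, z] in meters
                    "size": b["size_lwh"],            # [length, width, height]
                    "yaw": b["yaw"],                  # heading angle in radians
                    "track_id": b["track_id"],        # identity across time -> trajectory
                    "occlusion": b.get("occlusion"),  # fraction of the object occluded
                }
                for b in frame.get("boxes_3d", [])
            ]
            yield frame["frame_index"], boxes

    def track_trajectory(seq_dir, target_track_id):
        """Collect (frame_index, center) pairs for one tracked object."""
        trajectory = []
        for frame_index, boxes in load_sequence_annotations(seq_dir):
            for b in boxes:
                if b["track_id"] == target_track_id:
                    trajectory.append((frame_index, b["center"]))
        return trajectory

A trajectory gathered this way is exactly the kind of input/target pair used by the forecasting benchmark: past box centers as history, future centers as the prediction target.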

Features

All-Inclusiveness

  • All sensors with 360° coverage, including RGB, stereo, depth, LiDAR, SPAD-LiDAR, Radar, IMU, GPS
  • Annotation for 2D/3D detection, tracking, forecasting, panoptic segmentation
  • Varied adverse weather/lighting, crowded scenes, people running, high-speed driving, traffic rule violations, and car accidents (vehicle-to-vehicle/pedestrian/cyclist)

High-Density and Long-Range LiDAR

  • LiDAR data reaches a range of 1,000 meters, ~8x the range of Velodyne-64
  • LiDAR data contains up to 1M points per frame, ~10x the density of Velodyne-64
  • Our experiments show that high-density, long-range LiDAR enables more robust and earlier detection of small objects at large distances, which is essential for planning, especially during high-speed driving (a quick range-inspection sketch follows this list)
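
The following short sketch shows one way to verify the density and range claims on a single sweep. It assumes a hypothetical packed float32 .bin file storing (x, y, z, intensity) per point, a layout common to LiDAR dumps but not necessarily the one used here; the file path is illustrative.

    # Inspect per-frame point count and radial range of a LiDAR sweep (assumed .bin layout).
    import numpy as np

    def load_points(bin_path):
        """Load a packed float32 point cloud assumed to store (x, y, z, intensity)."""
        pts = np.fromfile(bin_path, dtype=np.float32).reshape(-1, 4)
        return pts[:, :3]  # keep xyz in meters

    def range_histogram(xyz, bin_edges=(0, 50, 100, 200, 500, 1000)):
        """Count points per radial-distance bin to see how far the sweep reaches."""
        dist = np.linalg.norm(xyz, axis=1)
        counts, _ = np.histogram(dist, bins=bin_edges)
        return {f"{lo}-{hi} m": int(n) for lo, hi, n in zip(bin_edges[:-1], bin_edges[1:], counts)}

    xyz = load_points("sequence_000/lidar/frame_000000.bin")  # illustrative path
    print("points in frame:", len(xyz))                       # up to ~1M points per frame
    print("points per range bin:", range_histogram(xyz))

The far bins (200-1,000 m) are where conventional 64-beam sweeps return almost nothing, which is what makes earlier detection of distant objects possible here.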

SPAD-LiDAR with Multiple Echoes

  • Measure every single photon and generate a 3D tensor of photon counts of the scene
  • Provide a point cloud with multiple returns (echoes) when the laser is partially reflected by multiple objects (see the data-layout sketch after this list)
  • Provide multi-echo reflectance, each value measuring the intensity of a single laser pulse return (echo)
  • Provide an ambient image to simulate sunlight reflected by objects
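
As a rough mental model of how multi-echo data might be organized, here is a small Python sketch of a per-frame container with one row per individual return plus the ambient image. The field names and shapes are assumptions made for illustration, not the released format.

    # Illustrative in-memory layout for multi-echo SPAD-LiDAR returns (assumed fields).
    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class SpadLidarFrame:
        points: np.ndarray       # (N, 3) xyz in meters, one row per individual return
        echo_index: np.ndarray   # (N,) 0 = first echo of a pulse, 1 = second, ...
        reflectance: np.ndarray  # (N,) intensity of each individual echo
        ambient: np.ndarray      # (H, W) ambient image from sunlight reflected by objects

        def first_returns(self):
            """Keep only the first echo of every pulse (typically the nearest surface)."""
            mask = self.echo_index == 0
            return self.points[mask], self.reflectance[mask]

Keeping the echo index explicit lets downstream code choose between first-return-only processing (comparable to a conventional LiDAR sweep) and using all echoes, e.g. to see through partially transparent or partially occluding surfaces.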

Out-of-Distribution Data (Car Crash)

  • Vehicle driving off the road
  • Car sent flying by a crash
  • Ego-vehicle tilted after being hit
  • Vehicle-to-pedestrian crash
  • Rear-end collision
  • Car flipped over after a crash
  • Vehicle-to-cyclist crash
  • Multi-vehicle collision