I would like to propose a new (and hopefully universal) radar message that overcomes the shortcomings and limitations of the current OSS (ROS) one.
To do so, I will first present an analysis of the current ROS message, then list some of the radars with OSS drivers or definitions out there, and finally propose a new set of message definitions that supports all known use cases and is expected to be compatible with future devices.
That being said, I am not an expert in this area, and would welcome feedback from the community about this proposal, so we can properly leverage radar technology to its fullest (internally at TIER IV we have already gone through a series of reviews regarding this proposal).
*Note: in this document the following terms are used interchangeably, since every vendor has its own definitions:
ROS radar message
https://github.com/ros-perception/radar_msgs
Here, there are two relevant message types: RadarReturn.msg and RadarTrack.msg. Their limitations are discussed in the section further below.
Known radar definitions
Ainstein AI
https://ainstein.ai/
https://github.com/AinsteinAI/ainstein_radar/tree/master
Detections: ID, SNR, range, range rate, azimuth, elevation
Objects: UIS, pose, twist, bounding box, value (likelihood?), label
Texas Instruments mmWave radars
https://www.ti.com/sensors/mmwave-radar/overview.html
https://github.com/radar-lab/ti_mmwave_rospkg
Detections: position, range, range rate, azimuth, intensity (?), doppler bin
Sadly, not much information is available.
OmniPreSense
https://omnipresense.com/
https://github.com/SCU-RSL-ROS/radar_omnipresense/tree/master
Detections: range, range rate, FFT data, direction (string?!), object number
Delphi
https://www.amtechs.co.jp/product/rf/system-optical/radar/post-108.html
https://bitbucket.org/unizg-fer-lamor/radar_interface/src/master/msg/
Detections: range, range rate, azimuth, RCS, status
Objects: 2D position, 2D velocity, 2D acceleration, ID, status, width, movable, amplitude
SmartMicro
https://www.smartmicro.com/automotive-radar
https://github.com/smartmicro/smartmicro_ros2_radars/tree/master
Detections: range, range rate, power, RCS, noise, SNR, azimuth, elevation
Objects: 3D position, absolute speed, yaw, length, mileage (?), quality (?), acceleration (scalar), idle cycles, spline idx, status, class
Altos
https://www.altosradar.com/product
https://github.com/Altos-Radar/altosRadarParse/blob/main/pointCloud.h
Detections: range + variance, range rate + variance, elevation, azimuth, SNR
Continental
https://www.continental-automotive.com/en/components/radars.html
https://github.com/tier4/nebula/tree/main/nebula_messages/continental_msgs/msg
ARS548 detections: range, range rate, azimuth, elevation (all with std + status flag), RCS, classification ID, object ID, ambiguity flag, multi-target probability, positive predictive value
SRR520 detections: range, range rate, azimuth, SNR, RCS, false detection flags
ARS548 objects: ID, age (cycles), measurement status, movement status, position reference (which part of the object is being tracked), 3D position + cov, orientation + std, 3D velocity & acceleration (absolute & relative!) + std + cov, orientation + cov, existence probability, multi-class classification, shape edge (length + width)
SRR520 objects: ID, 2D position + std, 2D velocity + std, 2D acceleration + std, shape edge (length + width), orientation, RCS, score, valid flag for the shape, measurement status
Towards a universal radar interface
As presented before, the ROS radar messages are insufficient since:
Objects do not contain orientation.
Although there is support for 4D radars, from the interface alone we cannot know whether the elevation is zero or simply not available.
Different vendors and models provide different sets of features, but there is no way to accommodate them all (mainly in the detection interface).
We cannot reliably know when a covariance is invalid, high, or simply not provided.
Notes regarding reference systems and dynamics:
Depending on the radar, objects may be output either in the sensor frame or the base link (provided the extrinsics are given).
Dynamics (velocity / acceleration) can be either relative or absolute (depending on the availability of motion compensation). This needs to be clear to the user or integrator.
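To make the relative vs. absolute distinction concrete, the following is a minimal sketch of compensating relative dynamics with known ego motion. The function name and the planar rigid-body model are my own assumptions for illustration, not part of the proposal:

```python
def relative_to_absolute_velocity(rel_vx, rel_vy, ego_speed, ego_yaw_rate, obj_x, obj_y):
    """Convert an object's relative velocity (sensor frame) to an absolute
    velocity, given the ego vehicle's planar motion.

    The ego contribution at the object's location is the linear speed along x
    plus the rotational term (yaw_rate x position).
    """
    ego_vx_at_obj = ego_speed - ego_yaw_rate * obj_y
    ego_vy_at_obj = ego_yaw_rate * obj_x
    return rel_vx + ego_vx_at_obj, rel_vy + ego_vy_at_obj
```

For example, a stationary object 20 m ahead of an ego vehicle moving at 10 m/s is reported with a relative velocity of -10 m/s, which this compensation maps back to zero absolute velocity.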
Proposal
Radar Object Info
This message is published at a low frequency and is similar to how CameraInfo works for cameras. The fields in this message are radar dependent, but do NOT change over time.
Regarding the potential classes, the definition should be a superset of the classes used in object detection in Autoware. We have seen radars with classes that, while important to autonomous driving, do not map to the standard classes that we use.
Availability, resolution, and range are fields that may be needed to correctly process the information in the relevant stacks.
std_msgs/Header header
bool measurement_status_available # measured or tracked
bool position_z_available
bool velocity_z_available
bool acceleration_z_available
bool length_available
bool width_available
bool height_available
bool position_cov_available
bool velocity_cov_available
bool acceleration_cov_available
bool shape_cov_available
bool orientation_available
bool orientation_std_available
bool orientation_rate_available
bool orientation_rate_std_available
bool existence_probability_available
bool position_resolution_available
bool velocity_resolution_available
bool acceleration_resolution_available
bool orientation_resolution_available
bool orientation_rate_resolution_available
bool position_range_available
bool velocity_range_available
bool acceleration_range_available
bool orientation_rate_range_available
float32 position_resolution
float32 velocity_resolution
float32 acceleration_resolution
float32 orientation_resolution
float32 orientation_rate_resolution
float32 position_max_value
float32 velocity_max_value
float32 acceleration_max_value
float32 orientation_rate_max_value
# Class definitions - not part of a specific radar but of the interface
# ANIMAL=1 PEDESTRIAN=2 ......
# This field is required
uint32[] available_classes # The classes in this array are a subset of the defined classes
# This field is required
bool absolute_dynamics # absolute or relative dynamics
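To illustrate how a consumer would pair the low-frequency info message with per-object data, here is a small sketch (using plain Python values instead of generated ROS messages; the function name is hypothetical) of mapping a per-object probability array back to the interface-wide class IDs declared in available_classes:

```python
def decode_classification(available_classes, class_probability):
    """Pair each entry of an object's class_probability array with the
    corresponding class ID from the info message's available_classes.

    class_probability[i] corresponds to available_classes[i], so consumers
    must combine the two topics before interpreting the probabilities.
    """
    if len(available_classes) != len(class_probability):
        raise ValueError("class_probability must match available_classes in length")
    return dict(zip(available_classes, class_probability))
```

This also shows why available_classes is required: without it, the per-object array has no meaning on its own.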
Radar Objects
RadarObjects:
std_msgs/Header header
RadarObject[] objects
RadarObject:
uint32 object_id # unique object identifier
uint16 age # number of iterations since the object was first detected
uint8 measurement_status # measured or tracked
geometry_msgs/Vector3 position
geometry_msgs/Vector3 velocity
geometry_msgs/Vector3 acceleration
geometry_msgs/Vector3 shape
# Can have a predefined threshold for invalid, defined in the message
float32[] position_cov # either 1x1 (scalar), 3x1 (diagonal), 6x1 (upper triangle of symmetric 3x3), or 9x1 (full 3x3)
float32[] velocity_cov # either 1x1 (scalar), 3x1 (diagonal), 6x1 (upper triangle of symmetric 3x3), or 9x1 (full 3x3)
float32[] acceleration_cov # either 1x1 (scalar), 3x1 (diagonal), 6x1 (upper triangle of symmetric 3x3), or 9x1 (full 3x3)
float32[] shape_cov # either 1x1 (scalar), 3x1 (diagonal), 6x1 (upper triangle of symmetric 3x3), or 9x1 (full 3x3)
float32 orientation
float32 orientation_std
float32 orientation_rate
float32 orientation_rate_std
float32 existence_probability
float32[] class_probability # has the dimensionality defined in the available_classes of the info message
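The variable-length covariance arrays trade fixed layout for compactness, so consumers need a small decoder. A sketch of that expansion follows; treating an empty array as "not provided" is my own assumption, not something the proposal fixes:

```python
def expand_covariance(cov):
    """Expand the variable-length covariance encodings into a full 3x3
    matrix (list of rows).

    1 element  -> scalar variance applied to the diagonal
    3 elements -> diagonal variances
    6 elements -> upper triangle of a symmetric 3x3, row-major
    9 elements -> full 3x3, row-major
    An empty array is treated here as "not provided" (an assumption).
    """
    n = len(cov)
    if n == 0:
        return None
    if n == 1:
        return [[cov[0] if i == j else 0.0 for j in range(3)] for i in range(3)]
    if n == 3:
        return [[cov[i] if i == j else 0.0 for j in range(3)] for i in range(3)]
    if n == 6:
        m = [[0.0] * 3 for _ in range(3)]
        k = 0
        for i in range(3):
            for j in range(i, 3):
                m[i][j] = m[j][i] = cov[k]  # mirror into the lower triangle
                k += 1
        return m
    if n == 9:
        return [list(cov[0:3]), list(cov[3:6]), list(cov[6:9])]
    raise ValueError(f"unsupported covariance length: {n}")
```

Keeping one decoder for all four covariance fields is the intended benefit of using the same encoding everywhere.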
Radar Detections
Radar detections are currently not used in Autoware. In the future, though, the data size is expected to increase, and they could be integrated into sensor fusion models, at which point an interface like PointCloud2 would be needed.
In addition, the radar detections are the ones with the most diversity (the least uniform) among different vendors. This means accommodating all vendors in a single static message is not feasible. For that reason, we propose an approach similar to PointCloud2:
std_msgs/Header header
# Same layout idea as sensor_msgs/PointCloud2
uint32 num_detections
sensor_msgs/PointField[] fields
bool is_bigendian
uint32 point_step
uint8[] data
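For reference, decoding such a buffer works just like iterating a PointCloud2. Below is a dependency-free sketch where fields are (name, offset, datatype) tuples mimicking sensor_msgs/PointField; only FLOAT32 (datatype code 7 in PointField) is handled, and the field names in the example are illustrative:

```python
import struct

FLOAT32 = 7  # sensor_msgs/PointField.FLOAT32
_FMT = {FLOAT32: "f"}

def unpack_detections(data, fields, point_step, num_detections, is_bigendian=False):
    """Decode a PointCloud2-style detection buffer into a list of dicts.

    Each detection occupies point_step bytes; each field is read from its
    per-point byte offset, honoring the declared endianness.
    """
    endian = ">" if is_bigendian else "<"
    detections = []
    for i in range(num_detections):
        base = i * point_step
        det = {}
        for name, offset, datatype in fields:
            (det[name],) = struct.unpack_from(endian + _FMT[datatype], data, base + offset)
        detections.append(det)
    return detections
```

Because the field list travels with the message, a SmartMicro radar can publish power/noise/SNR fields while a Continental radar publishes RCS and status flags, and the same decoder handles both.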