HD-maps and Motion Prediction #272
Hi @RobeSafe-UAH, have you tried out our map tutorial here? We recommend calling the API functions listed here instead of working with the raw map files. We provide a single map for each city (covering the portion of the city where we release sensor data + tracked trajectories).
Thanks @johnwlambert for your clear explanation. I have another question. If we analyze the .csv files provided in the motion-forecasting sets, the OBJECT_TYPE column can be AV, OTHERS or AGENT. What is the difference between AV and AGENT? (I assume OTHERS are the remaining vehicles, pedestrians, bicycles, etc.) Is the AGENT the ego-vehicle with which the sensor data is recorded? Am I wrong with this hypothesis?
We mean this: TIMESTAMP,TRACK_ID,OBJECT_TYPE,X,Y,CITY_NAME
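For anyone parsing these files, here is a minimal, self-contained sketch of that schema using pandas. The column names and the three OBJECT_TYPE labels are from the dataset; the rows themselves are invented for illustration:

```python
import io
import pandas as pd

# Miniature stand-in for one motion-forecasting CSV (values are made up;
# only the header and the AV/AGENT/OTHERS labels match the real files).
csv_text = """TIMESTAMP,TRACK_ID,OBJECT_TYPE,X,Y,CITY_NAME
315967320.0,av-0,AV,100.0,200.0,MIA
315967320.0,agent-1,AGENT,105.0,201.0,MIA
315967320.0,other-2,OTHERS,90.0,195.0,MIA
315967320.1,av-0,AV,100.5,200.2,MIA
315967320.1,agent-1,AGENT,105.6,201.3,MIA
"""

df = pd.read_csv(io.StringIO(csv_text))

# AV is the recording autonomous vehicle, AGENT is the single focal track
# to forecast, OTHERS are all remaining tracked objects.
agent_xy = df.loc[df["OBJECT_TYPE"] == "AGENT", ["X", "Y"]].to_numpy()
print(agent_xy.shape)  # (2, 2)
```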
Hi @RobeSafe-UAH, sorry for the confusion here. These issues clarify the object types: #79 and #58. Please let me know if you have additional questions.
Hi @johnwlambert, my apologies, but I don't understand it perfectly. The answer to issue #79 reads: "The 15 object classes you mention are provided in the 3D tracking dataset. To my knowledge, they haven't made any such statement regarding the forecasting dataset." Nevertheless, in other motion-prediction datasets (such as INTERACTION or nuScenes) you have the recorded vehicle/pedestrian/etc. track files together with the ego-vehicle position from the corresponding sensors. My questions are: Is AV in the Argoverse nomenclature the ego-vehicle from which the sensors were recorded? Why is there a "most interesting" trajectory or track (AGENT) if we actually have to predict the future position of the agents n seconds ahead? Thanks in advance :).
@johnwlambert Another naive question: is it required to predict the future positions of all agents n seconds ahead, or just the AGENT (the most interesting trajectory or track), as a single motion prediction conditioned on the traffic situation and the other agents?
For Argoverse 1, the task is only to predict the future for the identified AGENT. For Argoverse 2, multi-agent forecasting will be supported. |
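Concretely, Argoverse 1 forecasting sequences are 5 s long sampled at 10 Hz (50 points per track): the first 2 s (20 points) are observed, and only the AGENT's remaining 3 s (30 points) must be predicted. A small numpy sketch of that split (the track values are placeholders):

```python
import numpy as np

# Placeholder AGENT track: 50 (x, y) points = 5 s at 10 Hz.
agent_track = np.zeros((50, 2))

observed = agent_track[:20]  # first 2 s: model input
future = agent_track[20:]    # last 3 s: ground truth to predict

print(observed.shape, future.shape)  # (20, 2) (30, 2)
```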
@James-Hays totally understood now. I read the preliminary documentation (https://openreview.net/forum?id=vKQGe36av4k) of Argoverse 2.0 and its comparison with Argoverse 1.1. Thanks a lot for your suggestions.
Hey folks,
We are trying to incorporate the HD-map information (lane info, driveable area, etc.) into our stochastic model to conduct the motion-forecasting task. In the API, the plots seem quite interesting since they incorporate useful information to give particular attention to the corresponding vehicles. Nevertheless, after studying the maps provided in hd-maps.tar.gz, the .npy structures for Miami (similar for Pittsburgh), "driveable_area", "halluc_bbox", "npyimage_to_city_se2" and "ground_height", have the following dimensions (after applying astype(np.uint8) to avoid the lossy float64-to-uint8 conversion expected by imsave):
Image: MIA_10316_halluc_bbox_table, shape: (12574, 4)
Image: MIA_10316_driveable_area_mat_2019_05_28, shape: (3674, 1482)
Image: MIA_10316_ground_height_mat_2019_05_28, shape: (3674, 1482)
Image: MIA_10316_npyimage_to_city_se2_2019_05_28, shape: (3, 3)
As a result, the saved images are totally black, with the exception of ground_height (several points are white).
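A likely explanation for the all-black output, sketched below under the assumption that the driveable-area raster is a binary {0, 1} mask: after casting to uint8, every pixel is at most 1 out of 255, which renders as black. Rescaling before saving makes the mask visible (the array here is made up):

```python
import numpy as np

# Fake binary driveable-area raster standing in for e.g.
# MIA_10316_driveable_area_mat_2019_05_28.npy (values are invented).
mask = np.zeros((100, 100), dtype=np.uint8)
mask[20:80, 30:70] = 1  # pretend-driveable region

# Values {0, 1} look black when saved as an 8-bit image;
# stretch to the full 0-255 range for visualization.
viewable = mask * 255

print(mask.max(), viewable.max())  # 1 255
```

Alternatively, matplotlib's imshow with a default colormap normalizes the data range automatically, so it displays a {0, 1} mask correctly without rescaling.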
My questions are:
Are these numpy arrays only two-dimensional?
Do these maps cover the entire sequence of trajectories for each city? That is, is there a single BEV HD map per city, with the obstacles appearing and disappearing within that fixed region, rather than a sequence of HD maps along the ego-vehicle's motion?
Thanks in advance,
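Regarding the second question: as noted earlier in the thread, there is a single map per city. For relating a city-frame trajectory point to the raster, the sketch below assumes npyimage_to_city_se2 can be read as a 3x3 homogeneous SE(2) transform from image coordinates to city coordinates; the rotation and translation values here are made up for illustration:

```python
import numpy as np

# Assumed interpretation: a 3x3 homogeneous SE(2) matrix mapping
# image-frame coordinates to city-frame coordinates (values invented).
npyimage_to_city = np.array([
    [1.0, 0.0, -500.0],   # identity rotation, hypothetical translation
    [0.0, 1.0, -2000.0],
    [0.0, 0.0, 1.0],
])

city_xy = np.array([120.0, -1500.0])  # a query point in the city frame
city_h = np.append(city_xy, 1.0)      # homogeneous coordinates

# Invert to go city -> image, i.e. index into the raster.
img_coords = np.linalg.inv(npyimage_to_city) @ city_h
print(img_coords[:2])  # [620. 500.]
```

If the real array's convention differs (e.g. city-to-image rather than image-to-city), drop the inversion accordingly; the API functions mentioned above handle this internally and are the safer route.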