MCG-NJU / SparseBEV

[ICCV 2023] SparseBEV: High-Performance Sparse 3D Object Detection from Multi-Camera Videos

Home Page: https://arxiv.org/abs/2308.09244

Where in the code do you handle ego motion?

Slonna opened this issue · comments

commented

Dear author,
Wonderful work! Where in the code do you handle ego motion? I can't find it.
It seems that after moving the sampling points with the predicted velocity, the code directly uses the lidar2img matrix to project the points.

Ego motion is integrated into lidar2img during data loading:

import numpy as np
from numpy.linalg import inv

def compose_lidar2img(ego2global_translation_curr,
                      ego2global_rotation_curr,
                      lidar2ego_translation_curr,
                      lidar2ego_rotation_curr,
                      sensor2global_translation_past,
                      sensor2global_rotation_past,
                      cam_intrinsic_past):
    # Compose the rotation/translation taking points from the CURRENT lidar frame,
    # through the global frame, into the PAST camera frame. This is where the ego
    # motion between the two timestamps enters the projection.
    R = sensor2global_rotation_past @ (inv(ego2global_rotation_curr).T @ inv(lidar2ego_rotation_curr).T)
    T = sensor2global_translation_past @ (inv(ego2global_rotation_curr).T @ inv(lidar2ego_rotation_curr).T)
    T -= ego2global_translation_curr @ (inv(ego2global_rotation_curr).T @ inv(lidar2ego_rotation_curr).T) + lidar2ego_translation_curr @ inv(lidar2ego_rotation_curr).T

    # Assemble the 4x4 lidar-to-camera extrinsic matrix.
    lidar2cam_r = inv(R.T)
    lidar2cam_t = T @ lidar2cam_r.T
    lidar2cam_rt = np.eye(4)
    lidar2cam_rt[:3, :3] = lidar2cam_r.T
    lidar2cam_rt[3, :3] = -lidar2cam_t

    # Pad the past camera intrinsics to 4x4 and combine them with the extrinsics.
    viewpad = np.eye(4)
    viewpad[:cam_intrinsic_past.shape[0], :cam_intrinsic_past.shape[1]] = cam_intrinsic_past
    lidar2img = (viewpad @ lidar2cam_rt.T).astype(np.float32)
    return lidar2img
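
For reference, a minimal sketch (illustrative only, not code from this repo) of how such a composed lidar2img matrix is typically consumed, matching what you describe: the sampling points are shifted by the predicted velocity to the timestamp of the past frame, lifted to homogeneous coordinates, multiplied by lidar2img, and perspective-divided to get pixel coordinates. The function and argument names here (project_points, dt) are assumptions for illustration.

import numpy as np

# Illustrative sketch (not from this repo): project velocity-warped sampling
# points with a composed 4x4 lidar2img matrix.
def project_points(points_lidar, velocity, dt, lidar2img):
    # points_lidar: (N, 3) sampling points in the current lidar frame
    # velocity:     (N, 3) predicted object velocity
    # dt:           signed time offset from the current frame to the past frame
    warped = points_lidar + velocity * dt                                # compensate object motion
    homo = np.concatenate([warped, np.ones((len(warped), 1))], axis=-1)  # (N, 4) homogeneous points
    pts_img = homo @ lidar2img.T                                         # (N, 4) image-space coords
    uv = pts_img[:, :2] / np.clip(pts_img[:, 2:3], 1e-5, None)           # perspective divide -> pixels
    return uv

In practice, points falling behind the camera or outside the image also need to be masked out before sampling features.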

commented

Thanks for your quick reply! I got it.