Where Will the Oncoming Vehicle be the Next Second?
Alexander Barth and Uwe Franke
A new image-based approach for fast and robust tracking of vehicles from a moving platform is presented. Position, orientation, and the full motion state, including velocity, acceleration, and yaw rate, of a detected vehicle are estimated from a tracked 3D point cloud. This point cloud is computed by analyzing image sequences in both space and time, i.e., by fusing stereo vision with tracked optical flow vectors. Starting from an automatically generated initial vehicle hypothesis, tracking is performed by means of an Extended Kalman Filter. The filter combines the knowledge of where points in the rigid point cloud have moved within a given time interval with a dynamic vehicle model. The proposed system is applied to predict the driving paths of other traffic participants and currently runs at 25 Hz (VGA images) on our demonstrator vehicle UTA.
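The abstract describes an Extended Kalman Filter whose state contains position, orientation, velocity, acceleration, and yaw rate. As a hedged illustration of that idea (not the authors' actual implementation), the sketch below implements a generic EKF step for such a state vector: the motion model is a simple Euler-integrated turn-rate-and-acceleration model, the Jacobian is computed numerically, and the measurement is assumed to be a 2D position observation derived from the tracked point cloud. All function names, the state ordering, and the measurement model are illustrative assumptions.

```python
import numpy as np

def motion_model(x, dt):
    # Illustrative vehicle model (assumption, not the paper's exact model):
    # state = [px, py, psi, v, a, psidot] = position, heading, speed,
    # acceleration, yaw rate; simple Euler integration over dt seconds.
    px, py, psi, v, a, psidot = x
    return np.array([
        px + v * np.cos(psi) * dt,
        py + v * np.sin(psi) * dt,
        psi + psidot * dt,
        v + a * dt,
        a,
        psidot,
    ])

def numerical_jacobian(f, x, dt, eps=1e-6):
    # Forward-difference Jacobian of the motion model at state x.
    n = x.size
    J = np.zeros((n, n))
    fx = f(x, dt)
    for i in range(n):
        xp = x.copy()
        xp[i] += eps
        J[:, i] = (f(xp, dt) - fx) / eps
    return J

def ekf_step(x, P, z, R, Q, dt):
    # Predict: propagate state and covariance through the motion model.
    F = numerical_jacobian(motion_model, x, dt)
    x_pred = motion_model(x, dt)
    P_pred = F @ P @ F.T + Q
    # Update: assume a 2D position measurement z (e.g. from the tracked
    # point cloud); H picks px, py out of the state vector.
    H = np.zeros((2, 6))
    H[0, 0] = H[1, 1] = 1.0
    y = z - H @ x_pred                      # innovation
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(6) - K @ H) @ P_pred
    return x_new, P_new

# One cycle at 25 Hz (dt = 0.04 s) for a vehicle driving straight at 10 m/s.
x0 = np.array([0.0, 0.0, 0.0, 10.0, 0.0, 0.0])
P0 = np.eye(6)
z = np.array([0.4, 0.0])                    # consistent position measurement
x1, P1 = ekf_step(x0, P0, z, 0.01 * np.eye(2), 0.01 * np.eye(6), dt=0.04)
```

Running the filter forward with the predicted state (no measurement update) over the next 25 cycles would yield the one-second path prediction referred to in the title.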