Visual Perception for Self-Driving Cars, Part 6: Camera to Bird’s Eye View
<p>One of the most essential tasks for autonomous driving is <strong><em>precise environment perception</em></strong>. Measuring distances in the environment with only a monocular camera is difficult, because a monocular camera observes points solely on the image plane. One way to recover a metric layout of the scene is to transform the image into a <strong><em>bird’s eye view (BEV)</em></strong>. Bird’s eye view, or top view, shows the 3D scene from above. More precisely, it is a warped version of the same image as seen from a viewpoint where the image plane coincides with the ground plane in front of the camera. The process of converting camera images to BEV is known as <a href="https://csyhhu.github.io/2015/07/09/IPM/" rel="noopener ugc nofollow" target="_blank"><strong><em>Inverse Perspective Mapping (IPM)</em></strong></a>. Because IPM assumes a flat ground plane, the transformation distorts three-dimensional objects such as cars and vulnerable road users, making it harder to assess their position relative to the sensor.</p>
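<p>To make the idea concrete, below is a minimal sketch of IPM as a planar homography using OpenCV. The four source pixels on the road surface and the target BEV resolution are hypothetical placeholders; in practice they would come from camera calibration (intrinsics and the camera-to-ground extrinsics) rather than hand-picked coordinates.</p>

```python
# Minimal IPM sketch: warp the road plane from camera view to a top view
# using a homography. Point coordinates below are illustrative placeholders.
import cv2
import numpy as np


def camera_to_bev(image, bev_size=(400, 600)):
    h, w = image.shape[:2]

    # Four pixels lying on the flat road surface in the camera image
    # (e.g. corners of a straight lane segment), ordered to match `dst`.
    src = np.float32([
        [w * 0.45, h * 0.60],   # far-left of the road patch
        [w * 0.55, h * 0.60],   # far-right
        [w * 0.90, h * 0.95],   # near-right
        [w * 0.10, h * 0.95],   # near-left
    ])

    # Where those same ground points should land in the bird's eye view.
    bev_w, bev_h = bev_size
    dst = np.float32([
        [bev_w * 0.25, 0],
        [bev_w * 0.75, 0],
        [bev_w * 0.75, bev_h],
        [bev_w * 0.25, bev_h],
    ])

    # Homography mapping the road plane into the top-view image.
    H = cv2.getPerspectiveTransform(src, dst)
    bev = cv2.warpPerspective(image, H, (bev_w, bev_h))
    return bev


if __name__ == "__main__":
    frame = cv2.imread("road.png")     # hypothetical input frame
    bev = camera_to_bev(frame)
    cv2.imwrite("road_bev.png", bev)
```

<p>Note that the homography is only valid for points on the ground plane: anything with height (a car, a pedestrian) violates the flat-world assumption and appears stretched away from the camera, which is exactly the distortion mentioned above.</p>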
<p>Previous post: <a href="https://medium.com/@shahrullo/visual-perception-for-self-driving-cars-part-5-multi-task-learning-cb4300208a4e"><strong>Part 5: Multi-Task Learning</strong></a></p>