What is it about?

Autonomous vehicles are limited in their perception to the field of view of their onboard sensors. Moreover, the environment may not be fully perceivable because of occlusions and blind spots. To overcome this limitation, wireless vehicle-to-vehicle communication can be employed to exchange sensory information about the surroundings among vehicles in the vicinity. This form of cooperative perception (CP) turns every vehicle into a moving sensor platform, extending each vehicle's field of view and line of sight. This study proposes one such technique for CP over a short range. The system uses visual and inertial sensors, augmented by a positioning system, to perform cooperative relative localisation between two vehicles that share a common field of view, allowing one vehicle to locate the other in its own frame of reference. Information about objects in the field of view of one vehicle, localised using a monocular camera, is then relayed to the other vehicle through communication. A mobile multi-robot testbed was developed to emulate autonomous vehicles and to experimentally evaluate the proposed method through a series of driving-scenario test cases in which CP could be effective and crucial to the safety and comfort of driving.
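The core geometric step described above, expressing an object detected by one vehicle in the frame of reference of another, can be sketched as a 2D rigid-body transform. This is a minimal illustration, not the paper's actual algorithm; the function name and the assumption that the relative-localisation step outputs a planar pose `(x, y, heading)` are hypothetical.

```python
import math

def relay_detection(rel_pose_ab, obj_in_a):
    """Express an object detected in vehicle A's frame in vehicle B's frame.

    rel_pose_ab: (x, y, theta) -- pose of vehicle A as seen from vehicle B
        (a hypothetical output of the cooperative relative-localisation step).
    obj_in_a: (x, y) -- position of the object in A's frame, e.g. as
        localised by A's monocular camera.
    Returns the object's (x, y) position in B's frame: rotate the point by
    A's heading, then translate by A's position.
    """
    ax, ay, theta = rel_pose_ab
    ox, oy = obj_in_a
    return (ax + ox * math.cos(theta) - oy * math.sin(theta),
            ay + ox * math.sin(theta) + oy * math.cos(theta))
```

For example, if vehicle A sits 2 m ahead of B and faces 90 degrees to B's left, an object 1 m directly in front of A appears at (2, 1) in B's frame.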

Why is it important?

In traffic environments with autonomous driving, there are many situations in which the field of view of one vehicle is obscured by other vehicles or road features, such as at an intersection or merging area, or when driving behind a large truck. Autonomous vehicles rely on their sensors' field of view to detect objects and potential hazards and to avoid them accordingly. When their field of view is blocked, however, they may drive into a hazardous situation in which it is too late to stop by braking (just as a human driver would). If (ego) vehicles could share their field of view with vehicles that are behind them or in a potentially obstructed position, the second vehicle could anticipate the hazardous situation ahead and avoid it altogether. For example, the second vehicle would not pass to overtake a car, enter an intersection, or merge onto a highway if there were a potential for collision. Passing sensory information from the first vehicle to the second extends the sensory field of view of the second vehicle, and the resulting cooperative perception can significantly improve safety.


Passing sensory data between vehicles is not a trivial task and poses many challenges. First, the two vehicles need to know each other's distance and orientation using common features in the road, which requires real-time processing of video data so that the vehicles can localise their positions relative to each other and to the road. Second, they need to communicate this information to each other, which requires robust vehicle-to-vehicle communication; the algorithms that provide the relative pose estimate must also account for communication delays. The volume of data (such as full visual scenes) communicated between the two vehicles poses computational challenges that cannot necessarily be met in real time, so the method should allow a cutoff point in the detection of hazardous scenes and use trajectory data and other features to pass safety messages from the ego vehicle to the second vehicle. Moreover, as the information is being passed, the positions of the two vehicles, and their distance and orientation relative to the potentially hazardous target, change as well. This dynamically changing scene and the changing relative positions create further algorithmic challenges that must be overcome. The significance of the contribution lies in handling these issues mathematically and demonstrating the effectiveness of the method through real-time robot testing.
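One standard way to handle the communication-delay issue mentioned above is to propagate the sender's reported pose forward over the measured latency before using it. The constant-velocity extrapolation below is a simple sketch of that idea under stated assumptions (planar motion, known speed and yaw rate); it is not the paper's method, which would typically fold such a prediction into a filter rather than extrapolate in the open.

```python
import math

def compensate_delay(pose, speed, yaw_rate, latency):
    """Predict where a vehicle is now, given the pose it transmitted
    `latency` seconds ago.

    pose: (x, y, theta) -- the stale pose received over the link.
    speed: forward speed in m/s; yaw_rate: heading rate in rad/s
        (both assumed constant over the short delay interval).
    Returns the dead-reckoned (x, y, theta) at the current time.
    """
    x, y, theta = pose
    x += speed * math.cos(theta) * latency
    y += speed * math.sin(theta) * latency
    theta += yaw_rate * latency
    return (x, y, theta)
```

With a 100 ms delay and a sender moving straight ahead at 10 m/s, the received pose is shifted 1 m along the sender's heading before any relative-position computation is done.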

Azim Eskandarian
Virginia Polytechnic Institute and State University

Read the Original

This page is a summary of: Cooperative Perception in Autonomous Ground Vehicles using a Mobile Robot Testbed, IET Intelligent Transport Systems, June 2019, the Institution of Engineering and Technology (the IET), DOI: 10.1049/iet-its.2018.5607.