Adaptive Message Prioritization for Vehicular Cooperative Perception at Target Intervals

Masashi Kunibe, Rei Yamazaki, Taichi Murakawa, Hiroshi Shigeno

Research output: Contribution to journal › Article › peer-review

Abstract

In this paper, we propose adaptive message prioritization for vehicular cooperative perception at target intervals. Connected and Autonomous Vehicles (CAVs) perceive the surrounding traffic environment by exchanging wireless messages such as Cooperative Awareness Messages (CAMs) and Cooperative Perception Messages (CPMs). However, under challenging conditions such as wireless congestion control or mixed traffic of CAVs and human-driven vehicles, the number of objects perceptible through these messages decreases: congestion control lowers the message transmission rate, and a low CAV penetration rate reduces the number of transmitted messages. To maintain perception of surrounding objects under these conditions, the proposed method assigns high transmission priority to CPMs that carry information on multiple objects perceived by in-vehicle sensors. Specifically, the method controls the transmission frequency of CAMs and CPMs by assigning transmission priorities to them so that the maximum perception interval approaches the target interval. Simulation results show that 75% of CAVs perceive surrounding vehicles within the target interval even under both wireless congestion control and a 20% CAV penetration rate.
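The prioritization idea in the abstract can be illustrated with a minimal sketch. This is not the authors' actual algorithm; the function name, the tuple-based message representation, and the staleness-based scoring rule are all illustrative assumptions. The sketch only captures the stated intuition: messages whose perceived objects are approaching (or past) the target perception interval should be sent with higher priority, which naturally favors CPMs carrying many sensed objects.

```python
# Hedged sketch (illustrative, not the paper's method): score each pending
# CAM/CPM by how close its carried objects are to the target perception
# interval. Higher score = transmit sooner.

def message_priority(perceived_objects, now, target_interval):
    """Return a priority score for a candidate message.

    perceived_objects: list of (object_id, last_perceived_time) tuples
    assumed to be carried by the message (a CAM carries only the sender's
    own state; a CPM carries multiple sensed objects).
    """
    urgency = 0.0
    for _obj_id, last_seen in perceived_objects:
        staleness = now - last_seen
        # Objects near or past the target interval contribute more urgency.
        urgency += max(0.0, staleness / target_interval)
    return urgency

# Example: a CPM carrying two sensed vehicles, one of them nearly stale,
# outranks a CAM carrying only the sender's own fresh state.
cpm = [("veh_a", 9.0), ("veh_b", 9.8)]  # last perceived at t=9.0s, t=9.8s
cam = [("ego", 9.9)]
now, target = 10.0, 0.5
print(message_priority(cpm, now, target) > message_priority(cam, now, target))
```

Under this assumed scoring, a congestion-controlled transmitter that can only send a few messages per second would drain its queue in descending score order, so multi-object CPMs are sent first when perception intervals risk exceeding the target.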

Original language: English
Pages (from-to): 57-65
Number of pages: 9
Journal: Journal of Information Processing
Volume: 31
Publication status: Published - 2023

Keywords

  • age of information
  • cooperative perception
  • transmission priorities
  • vehicular communication
  • wireless congestion control

ASJC Scopus subject areas

  • Computer Science (all)
