
SCOPE: A Synthetic Multi-Modal Dataset for Collective Perception Including Physical-Correct Weather Conditions

by Jörg Gamerdinger, Sven Teufel, Patrick Schulz, Stephan Amann, Jan-Patrick Kirchner, and Oliver Bringmann
In 2024 IEEE Intelligent Transportation Systems Conference (IEEE ITSC 2024).

Abstract

Collective perception has received considerable attention as a promising approach to overcoming the occlusions and limited sensing ranges of vehicle-local perception in autonomous driving. Developing and testing novel collective perception technologies requires appropriate datasets. These datasets must cover not only different environmental conditions, since these strongly influence perception capabilities, but also a wide range of scenarios with different road users, and they must provide realistic sensor models. We therefore propose the Synthetic COllective PErception (SCOPE) dataset. SCOPE is the first synthetic multi-modal dataset that incorporates realistic camera and LiDAR models as well as parameterized, physically accurate weather simulations for both sensor types. The dataset contains 17,600 frames from over 40 diverse scenarios with up to 24 collaborative agents, infrastructure sensors, and passive traffic, including cyclists and pedestrians. In addition, recordings from two novel digital-twin maps of Karlsruhe and Tübingen are included. The dataset is available at https://ekut-es.github.io/scope.
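
Since the dataset is organized as recorded frames per scenario, with multiple collaborative agents each carrying camera and LiDAR sensors, a brief loading sketch may help orient readers. The directory layout, file names, and per-point format below are assumptions made purely for illustration, not the dataset's documented interface; consult the project page at https://ekut-es.github.io/scope for the actual structure.

    # Hypothetical sketch of iterating a SCOPE-style multi-agent recording.
    # The directory layout, file names, and point format are ASSUMPTIONS
    # for illustration only; see https://ekut-es.github.io/scope for the
    # published format.

    from pathlib import Path
    import numpy as np

    def load_lidar_frame(bin_path: Path) -> np.ndarray:
        """Load one LiDAR sweep stored as flat float32 (x, y, z, intensity) rows."""
        points = np.fromfile(bin_path, dtype=np.float32)
        return points.reshape(-1, 4)

    def iter_agent_frames(scenario_dir: Path, agent_id: str):
        """Yield (frame index, point cloud) pairs for one collaborative agent."""
        lidar_dir = scenario_dir / agent_id / "lidar"  # assumed layout
        for bin_path in sorted(lidar_dir.glob("*.bin")):
            yield int(bin_path.stem), load_lidar_frame(bin_path)

    if __name__ == "__main__":
        scenario = Path("scope/scenarios/tuebingen_rain_01")  # hypothetical path
        for frame_idx, cloud in iter_agent_frames(scenario, "agent_00"):
            print(f"frame {frame_idx}: {cloud.shape[0]} LiDAR points")

Under these assumptions, each agent's sweeps are flat binaries of (x, y, z, intensity) rows, a common convention for synthetic driving datasets; adapting the reader to the published format should only require changing the paths and the per-point field list.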