Fusion of Non-Visual Modalities Into the Probabilistic Occupancy Map Framework for Person Localization

IEEE International Conference on Distributed Smart Cameras, 2011
In recent years, the problem of person detection and localization has received much attention, with surveillance/security and tracking of players in sports being two strong areas of application. Solutions based on different sensor modalities have been proposed, and sensor fusion has recently gained prominence as a paradigm for overcoming the limitations of individual sensor modalities. We investigate the possibilities for fusing additional, non-visual sensor modalities into a state-of-the-art vision-based framework for person detection and localization, the Probabilistic Occupancy Map (POM), with the aim of improving localization results in a realistic, cluttered, indoor environment. We point out the aspects that need to be considered when fusing additional sensor information into POM and provide a possible mathematical model for it. Finally, we experimentally demonstrate the proposed fusion on the example of person localization in a cluttered environment. The performance of a system comprising visual cameras with POM and a radio-based localization system is evaluated experimentally, showing their respective strengths and weaknesses. We then improve the localization results by fusing the information from the radio-based system into POM using the proposed model.

Index Terms—sensor fusion, Probabilistic Occupancy Map, multi-camera, radio, person localization.
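As a rough illustration of how a non-visual modality can enter a grid-based occupancy framework (this is a hedged sketch under assumed models, not the paper's actual formulation), one can treat the camera-derived POM probabilities and a radio-derived likelihood as independent evidence and combine them with a per-cell binary Bayes update. The Gaussian radio model, the `sigma` parameter, and the function names below are illustrative assumptions:

```python
import numpy as np

def radio_likelihood(grid_xy, radio_pos, sigma=0.5):
    """Hypothetical radio evidence model: a Gaussian bump around the
    radio position fix, scaled to (0.5, 1.0] so that cells far from
    the fix remain uninformative (likelihood ratio of 1)."""
    d2 = np.sum((grid_xy - radio_pos) ** 2, axis=-1)
    return 0.5 + 0.5 * np.exp(-d2 / (2.0 * sigma ** 2))

def fuse(p_pom, l_radio):
    """Per-cell binary Bayes fusion of POM occupancy probabilities
    p_pom with a radio-derived occupancy likelihood l_radio."""
    num = p_pom * l_radio
    return num / (num + (1.0 - p_pom) * (1.0 - l_radio))

# Toy example: a 5x5 m ground grid at 1 m resolution, a uniform
# camera-based occupancy prior of 0.3, and a radio fix at (2, 2).
grid = np.stack(np.meshgrid(np.arange(5.0), np.arange(5.0)), axis=-1)
p_pom = np.full((5, 5), 0.3)
p_fused = fuse(p_pom, radio_likelihood(grid, np.array([2.0, 2.0])))
```

With this model, cells near the radio fix have their occupancy probability raised, while distant cells (where the radio likelihood approaches 0.5) keep their camera-only value, which matches the intuition that the extra modality should only sharpen the map where it actually carries information.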
