Fusion of non-visual modalities into the probabilistic occupancy map framework for person localization
Proceedings of the 5th ACM/IEEE International Conference on Distributed Smart Cameras (ICDSC2011), 2011
In this paper we investigate the possibilities for
fusing non-visual sensor modalities into a state-of-the-art vision-based
framework for person detection and localization, the
Probabilistic Occupancy Map (POM), with the aim of improving
frame-by-frame localization results in a realistic (cluttered)
indoor environment. We point out the aspects that need to be
considered when fusing non-visual sensor information into POM
and provide a mathematical model for doing so. We demonstrate the
proposed fusion method on a combined multi-camera and
radio-based person localization setup. The performance of both
systems is evaluated, showing their strengths and weaknesses.
We show that localization results can be significantly improved
by fusing information from the radio-based system into the
camera-based POM framework using the proposed model.
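The fusion idea described above can be illustrated with a minimal sketch: a per-cell Bayesian update that combines camera-derived occupancy probabilities with a radio-based likelihood. This is not the paper's actual model; the function name, inputs, and the conditional-independence and uniform-background assumptions are all hypothetical, chosen only to show the general shape of such a fusion.

```python
import numpy as np

def fuse_occupancy(camera_probs, radio_likelihood, eps=1e-9):
    """Fuse per-cell camera occupancy probabilities with a
    radio-based likelihood via an odds-form Bayes update.

    camera_probs: P(occupied | camera) per grid cell.
    radio_likelihood: p(radio observation | occupied) per cell.
    (Hypothetical interface; the paper's model differs.)
    """
    camera_probs = np.asarray(camera_probs, dtype=float)
    radio_likelihood = np.asarray(radio_likelihood, dtype=float)
    # Prior odds from the camera-based occupancy estimate.
    prior_odds = camera_probs / (1.0 - camera_probs + eps)
    # Likelihood ratio of the radio evidence, assuming the two
    # modalities are conditionally independent given occupancy and
    # that the "empty" likelihood is uniform over the grid.
    lr = radio_likelihood / (radio_likelihood.mean() + eps)
    post_odds = prior_odds * lr
    return post_odds / (1.0 + post_odds)
```

Cells where the radio evidence concentrates have their occupancy probability raised relative to the camera-only estimate, while the rest are suppressed, which mirrors how a secondary modality can disambiguate cluttered scenes.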