Vision sensor design and evaluation for autonomous navigation
Institution: Rouen
Disciplines:
Directors:
Abstract EN:
The main objective of this thesis is to provide a robot navigation system based on visual sensor measurements. To achieve this goal, we investigate the design of an optimal visual sensor that allows egomotion estimation to be formulated as a linear optimization problem. A multiple-camera system is built that mimics the functioning of insects’ compound eyes; it captures visual information in a more complete form called the plenoptic function, which encodes the spatial and temporal light radiance of the scene. The contributions of this thesis are presented along three axes. First, we present the mathematical formulation of the plenoptic function and the relationship between motion estimation and the ray-based plenoptic model. A multi-scale approach is also introduced to increase the accuracy of the system while reducing computational costs. The second axis is dedicated to optimizing the plenoptic sensor for real-time indoor navigation. We show that a low-resolution plenoptic sensor can outperform a state-of-the-art high-resolution monocular camera. We also give a complete design scheme by establishing the link between velocity, resolution, field of view and motion estimation accuracy. Finally, exploiting the sparsity of the plenoptic data, we use a random sampling scheme that measures only the useful part of the visual information. By processing the sparse measurements directly, the computational time is reduced with minimal loss of accuracy. Since the amount of required data is greatly reduced at the acquisition stage, computational resources can be reallocated to other tasks. The performance of the built plenoptic sensor is evaluated systematically on synthetic and experimental data.
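To illustrate the kind of linear egomotion formulation and random sampling referred to above, the following minimal sketch (not taken from the thesis) assumes that each plenoptic ray i contributes a generic linear constraint J_i x = y_i on the 6-DoF velocity x = [v; ω], and solves the stacked system by least squares, both from the full ray set and from a random subset of rays; the names estimate_egomotion, J_list and y_list are illustrative only.

```python
import numpy as np

def estimate_egomotion(J_list, y_list):
    """Solve the stacked per-ray linear constraints J_i @ x = y_i
    for the 6-DoF velocity x = [vx, vy, vz, wx, wy, wz] by least squares."""
    A = np.vstack(J_list)          # (N, 6): one row per ray constraint
    b = np.concatenate(y_list)     # (N,): measured temporal ray derivatives
    x, _, _, _ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Toy usage: random constraints stand in for real plenoptic measurements.
rng = np.random.default_rng(0)
x_true = np.array([0.1, 0.0, 0.3, 0.01, -0.02, 0.005])
J = [rng.standard_normal((1, 6)) for _ in range(2000)]
y = [Ji @ x_true + 1e-3 * rng.standard_normal(1) for Ji in J]

# Full measurement set.
x_full = estimate_egomotion(J, y)

# Random sampling: keep only a fraction of the rays at acquisition time,
# mimicking the idea of measuring just part of the visual information.
idx = rng.choice(len(J), size=200, replace=False)
x_sub = estimate_egomotion([J[i] for i in idx], [y[i] for i in idx])

print(np.round(x_full, 3), np.round(x_sub, 3))
```

In this toy setting the subsampled estimate stays close to the full one, which is the intuition behind trading a small loss of accuracy for a large reduction in acquired data.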
Abstract FR:
No abstract available.