We consider the problem of estimating the relative orientation of a number of individual photocells, or pixels, that hold fixed relative positions. The photocells measure the intensity of light traveling along a pencil of lines. We assume that the light field thus sampled is changing, e.g., as a result of sensor motion, and use the obtained measurements to estimate the orientations of the photocells. Our approach is based on correlation-based and information-theoretic dissimilarity measures. Experiments with real-world data show that these dissimilarity measures are strongly related to the angular separation between the photocells, and that the relation can be modeled quantitatively. In particular, we show that this model allows us to estimate the angular separation from the dissimilarity. Although the resulting estimators are not very accurate, they maintain their performance across different visual environments, suggesting that the model encodes a very general property of our visual world. Finally, leveraging this method for estimating angles from signal pairs, we show how distance geometry techniques allow us to recover the complete sensor geometry. © 2009 Elsevier Inc. All rights reserved.
ASJC Scopus subject areas
- Signal Processing
- Computer Vision and Pattern Recognition
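The final recovery step described in the abstract can be illustrated with a standard distance-geometry computation: given the pairwise angular separations between unit viewing directions, the Gram matrix of those directions is simply the matrix of cosines, and a rank-3 factorization recovers the directions up to a global rotation and reflection. The sketch below is an illustration of that general technique, not the paper's specific implementation; the synthetic directions and the assumption that angles are already estimated (in the paper, from signal dissimilarities) are ours.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: n photocells with unknown unit viewing directions in 3-D.
n = 8
true_dirs = rng.normal(size=(n, 3))
true_dirs /= np.linalg.norm(true_dirs, axis=1, keepdims=True)

# Pairwise angular separations. In the paper these would be estimated from
# the dissimilarity between photocell signals via the quantitative model.
angles = np.arccos(np.clip(true_dirs @ true_dirs.T, -1.0, 1.0))

# Distance-geometry step: for unit directions, the Gram matrix is
# G[i, j] = cos(theta_ij). It is positive semidefinite of rank 3, so a
# top-3 eigendecomposition factors it back into direction vectors,
# determined up to a global rotation/reflection.
G = np.cos(angles)
eigvals, eigvecs = np.linalg.eigh(G)          # eigenvalues in ascending order
X = eigvecs[:, -3:] * np.sqrt(np.maximum(eigvals[-3:], 0.0))
recovered = X / np.linalg.norm(X, axis=1, keepdims=True)

# Sanity check: the recovered configuration reproduces the input angles.
rec_angles = np.arccos(np.clip(recovered @ recovered.T, -1.0, 1.0))
print(np.allclose(rec_angles, angles, atol=1e-6))  # True
```

With noisy angle estimates, G is no longer exactly rank 3, and the same eigendecomposition acts as a least-squares projection onto the nearest rank-3 Gram matrix, which is why this classical multidimensional-scaling style recovery degrades gracefully with estimation error.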