Thanks for the info. Still very impressed with the speed and accuracy, assuming it works as well as they have presented. To be able to compute visual odometry in an arbitrary scene on a mobile device seems like a really big accomplishment, no?
Visual odometry is getting quite accurate. For example, it is used on the Mars Exploration Rovers (Spirit and Opportunity) to improve the position estimate obtained from wheel odometry [1]. As you mentioned, the big accomplishment is doing this in real time on a mobile device. On a typical mobile ARM processor it would probably be very slow and drain the battery in no time. That's why they are using a custom coprocessor designed specifically for image processing and computer vision. The next challenge is porting the existing visual odometry algorithms to this new processor. It looks like hiDOF did the implementation [2].
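To make the wheel-odometry correction concrete, here is a minimal sketch of the idea: when the camera sees much less motion than the wheel encoders report, the wheels are probably slipping (e.g. in sand), so the visual estimate is trusted instead. The fusion rule, threshold, and numbers below are illustrative assumptions, not the actual rover software.

```python
# Hedged sketch: using visual odometry (VO) to correct wheel odometry,
# in the spirit of the MER rovers. The slip_ratio threshold and the
# simple "trust one or the other" rule are assumptions for illustration.

def fuse_step(wheel_delta, vo_delta, slip_ratio=0.5):
    """Return the per-step displacement to integrate.

    wheel_delta: distance reported by wheel encoders (metres)
    vo_delta:    distance estimated by visual odometry (metres)
    slip_ratio:  if VO sees less than this fraction of the wheel
                 distance, assume wheel slip and trust VO instead.
    """
    if wheel_delta > 0 and vo_delta / wheel_delta < slip_ratio:
        return vo_delta          # wheels spinning in place: trust VO
    return wheel_delta           # normal driving: encoders are fine

# Drive 4 steps; on step 3 the wheels slip (encoders report 1.0 m,
# but the camera only observes 0.2 m of actual motion).
wheel = [1.0, 1.0, 1.0, 1.0]
vo    = [0.98, 1.01, 0.2, 0.97]

position = sum(fuse_step(w, v) for w, v in zip(wheel, vo))
# Wheel odometry alone would say 4.0 m; the fused estimate is 3.2 m.
```

A real system does this in full 6-DOF pose space with feature tracking and outlier rejection rather than scalar distances, but the principle is the same: the camera provides an independent measurement of actual motion that catches encoder errors.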