They are already the brain food of the smartphone, and now they are allowing an increasing array of previously inanimate objects to understand your place, motion, touch, gestures and activity.
The idea is that Gaia's two telescopes will focus the stars onto the end of the array, and as these celestial objects then scan across the CCDs their positions and individual properties will be logged.