Could someone with better knowledge of astronomy/aerospace explain (with less hyperbole) which part is the "impossible" part here? I don't really get it. Reading the article, I got that they're sending a telescope into space with a very large camera sensor and will record information about many stars.
> Could someone with better knowledge of astronomy/aerospace explain (with less hyperbole) which part is the "impossible" part here?
Hyperbole is the operative term. When first proposed it would have been impossible with then-current hardware, but since such missions take time to germinate, sometimes people use the term "impossible" to add a sense of drama and daring, but with the expectation that the goal will become possible in time.
Also, according to Arthur C. Clarke, "When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong."
It is the end result of this program that was considered "impossible": a celestial map of billions of stars with detailed information about each of them, including position and motion. The task was considered so difficult in the nineties that, when it was first requested, nobody expected to end up with a solution that would completely cover all the requirements.
Probably just the progressive improvements: the accuracy and the number of stars. They're saving 1000 terabytes of data from the gigapixel sensor. These missions always take a very long time to design and build, so those numbers were probably thought impossible ten years ago.
Gaia will produce quite a lot of data, approximately 200 TB over 5 years according to this Wikipedia article (http://en.wikipedia.org/wiki/Gaia_(spacecraft)), though that figure feels a bit hand-wavy. Even if we assume an order-of-magnitude error and it's more like 2 PB, that is still far, far less than the 15 PB (images plus metadata) that the Large Synoptic Survey Telescope will produce over 10 years. And yes, the LSST is driving a large number of distributed-computing research projects because of its unique processing requirements.
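Just as a back-of-the-envelope sketch using the figures quoted above (200 TB / 5 years for Gaia, 15 PB / 10 years for the LSST; downlink duty cycles and compression are ignored), the mission-averaged data rates work out to:

```python
# Rough comparison of survey data volumes, using the numbers cited above.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

gaia_bytes, gaia_years = 200e12, 5    # ~200 TB over the 5-year Gaia mission
lsst_bytes, lsst_years = 15e15, 10    # ~15 PB over 10 years for the LSST

gaia_rate = gaia_bytes / (gaia_years * SECONDS_PER_YEAR)  # bytes per second
lsst_rate = lsst_bytes / (lsst_years * SECONDS_PER_YEAR)

print(f"Gaia: ~{gaia_rate / 1e6:.2f} MB/s averaged over the mission")
print(f"LSST: ~{lsst_rate / 1e6:.2f} MB/s averaged over the survey")
print(f"LSST average rate is ~{lsst_rate / gaia_rate:.1f}x Gaia's")
```

So even taking the quoted numbers at face value, the two projects differ by well over an order of magnitude in sustained data rate, which is why the LSST's processing requirements stand out.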
I’d be interested to know the ratio of the L0 data size to the L2/3/4 data[0] size – in other words, how many downlinked bits of sensor readout are processed per bit of useful science data? Of course that won’t be a hard number; I’m just curious to get a sense of the general scale on which the pipeline reduces pixels to physical parameters.
"There are also no moving parts anywhere on Gaia. Even the antenna to communicate with Earth has been designed to point electronically, not mechanically."
Good, but on the next line...
"And if the satellite does need to make fine adjustments, it has been equipped with thrusters that can squirt just 1.5 micrograms of nitrogen gas a second."
Are there really no moving parts on Gaia?
Hmmm, they could try to use the Earth's magnetic field for orientation; I guess the micro-thrusters were the better option...
P.S. And saying "accuracy of about seven micro-arcseconds for the nearest stars" is a little bit incorrect.
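For a sense of how small the quoted figure is (a hedged aside; the seven micro-arcseconds comes from the article, while the Moon-distance comparison is my own illustration):

```python
import math

ARCSEC_IN_RAD = math.pi / (180 * 3600)   # one arcsecond in radians

accuracy_uas = 7.0                        # micro-arcseconds, as quoted in the article
accuracy_rad = accuracy_uas * 1e-6 * ARCSEC_IN_RAD

# What physical size does that angle subtend at the Moon's distance?
moon_distance_m = 3.844e8                 # mean Earth-Moon distance in metres
size_at_moon_m = accuracy_rad * moon_distance_m

print(f"7 micro-arcseconds = {accuracy_rad:.3e} rad")
print(f"...the apparent size of a ~{size_at_moon_m * 1000:.0f} mm object on the Moon")
```

That is, an angle of seven micro-arcseconds corresponds to resolving something roughly coin-sized at the distance of the Moon.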
Those micro-thrusters are probably piezoelectric, like the nozzles on ink jet printers.
They move a bit, I guess, but there is no net motion - they transmit no force or vibration outside of the device. They flex, but there is no relative (sliding) motion.
There are other parts on the satellite that can bend - would you call those moving parts as well?
A piezoelectric actuator still relies on mechanical motion. It has to: the idea of a nitrogen nozzle is to store nitrogen under pressure in a closed volume and be able to open that volume to let some gas out. That opening has to be mechanical - unless they use some exotic force field to keep the nitrogen from escaping, in which case it's hardly a (regular) nitrogen thruster anymore.
Flexing is still motion. That, however, can be compensated for with an opposite motion in another gas channel.
If other parts of the satellite can bend on demand, they'd need an actuator; alternatively, one could use the Earth's magnetic field to reorient the spacecraft and let external forces affect it unevenly, thus providing bending. That, however, is outside the scope of the article.
So, to summarize: either the secondary subsystems on Gaia are quite unconventional, or the article isn't quite precise.
A single device that observes and catalogs a billion stars sounds like science fiction. That would be a feat even in a software simulation. Doing it in orbit is crazy.