How Webb sends its 100-megapixel images a million miles back to Earth

NASA has just shown the first set of images taken by the James Webb Space Telescope, from an awe-inspiring deep field of galaxies to the finest features of a distant exoplanet's atmosphere. But how does a spacecraft a million miles away get tens of gigabytes of data back to Earth?

First, let’s talk about the images themselves. These are not just ordinary .jpg files, and Webb is not just an ordinary camera.

Like any scientific instrument, Webb collects and sends home a lot of raw data from its instruments: two highly sensitive near- and mid-infrared sensors, plus a host of accessories that adapt them for spectroscopy, coronagraphy and other tasks as needed.

As an example, let's take one of the newly released first images, one for which we have a direct comparison.

The megapixel race

Hubble took this image of the Carina Nebula back in 2008:

It's an incredible image, to be sure. But Hubble is more comparable to a traditional visible-light telescope, and more importantly, it was launched back in 1990. Technology has changed a lot since then!

Here is Webb’s version of the same region:

Image credits: NASA, ESA, CSA, STScI

It's clear to any viewer that Webb's version captures far more detail, even at these small sizes. The opaque texture of the nebula resolves into intricate cloud formations and wisps, and many more stars and, presumably, galaxies become clearly visible. (Though we should note the Hubble image has its own charm.)

Let's zoom in on one area, up and to the left of center, to emphasize the level of detail in the capture:

Image credits: NASA, ESA, CSA, STScI

Remarkable, right? But there's a price to pay for all this detail: data!

The Hubble image is about 23.5 megapixels and weighs 32 megabytes uncompressed. The Webb image (available after processing) is about 123 megapixels and roughly 137 megabytes. That's more than five times the pixels, but even that doesn't tell the whole story: according to Webb's specifications, it sends data at 25 times the bandwidth of Hubble. Not just bigger images, but bigger images sent from 3,000 times the distance.
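For a rough sense of scale, here's that arithmetic spelled out, a back-of-the-envelope sketch in Python using only the approximate figures quoted above:

```python
# Back-of-the-envelope comparison of the two images, using the
# approximate figures quoted in the text above.
hubble_mp, hubble_mb = 23.5, 32   # Hubble: megapixels, uncompressed MB
webb_mp, webb_mb = 123, 137       # Webb: megapixels, uncompressed MB

print(f"Pixel ratio: {webb_mp / hubble_mp:.1f}x")  # ~5.2x the pixels
print(f"Data ratio:  {webb_mb / hubble_mb:.1f}x")  # ~4.3x the bytes
```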

A long-distance phone call

Hubble is in low Earth orbit, about 340 miles above the surface. That makes communicating with it relatively simple: your phone reliably picks up signals from GPS satellites that are much farther away, and for NASA, transmitting data back and forth to a satellite in such a close orbit is child's play.

JWST, on the other hand, sits at the second Lagrange point, or L2, about a million miles from Earth, directly away from the Sun. That's roughly four times farther out than the Moon, and in some ways it makes communication a much harder task. Here's an animation from NASA showing what this orbit looks like:

Fortunately, this kind of connection is far from unprecedented; we have been sending and receiving large volumes of data from far more distant craft for years. And we know exactly where Webb and Earth will be at any given time, so while it isn't trivial, it mostly comes down to choosing the right tools for the job and planning very carefully.

From the beginning, Webb was designed to transmit over Ka-band radio at 25.9 GHz, alongside the bands used for other satellite communications. (Starlink, for example, also uses Ka band, as do others in the industry.)

This main radio antenna is capable of transmitting about 28 megabits per second, comparable to a decent home broadband connection, if the signal from your router took about five seconds to cross a million miles of vacuum on its way to your laptop.

Purely illustrative for a sense of distance – the objects are clearly not to scale.
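That five-second figure is simply the speed of light at work; here's a quick sanity check, assuming a round one-million-mile distance:

```python
# One-way light-travel time from L2 to Earth (figures approximate).
SPEED_OF_LIGHT_MPS = 186_282   # miles per second, in vacuum
distance_miles = 1_000_000     # Earth to L2, roughly

delay_s = distance_miles / SPEED_OF_LIGHT_MPS
print(f"One-way signal delay: ~{delay_s:.1f} seconds")  # ~5.4 s
```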

That gives it about 57 gigabytes of downlink capacity per day. A second antenna operating in the lower S band (surprisingly, the same band used for Bluetooth, Wi-Fi and garage door openers) is reserved for low-rate traffic like software updates, telemetry and health checks. If you're interested in the details, IEEE Spectrum has a great article that goes deeper into it.

This isn't a constant flow, however, since the Earth rotates and other events can interfere. But because the variables are mostly known, Webb's team schedules its contacts four or five months in advance, relaying data over the Deep Space Network. Webb might capture data and send it home the same day, but both the capture and the transmission were planned well in advance.

Interestingly, Webb only has about 68 gigabytes of internal storage, which you'd think would make people nervous given that it can send 57 per day, but there are more than enough opportunities to offload the data, so it never hits that dreaded "disk full" message.
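Putting those published figures together shows why the schedule works out; here's a rough sketch of the daily downlink budget, using only the rates and capacities mentioned above:

```python
# Rough daily downlink budget for Webb, based on the figures above:
# 28 Mbps Ka-band link, 57 GB scheduled per day, 68 GB onboard storage.
downlink_mbps = 28
daily_gb = 57
storage_gb = 68

# Contact time needed each day to move 57 GB at 28 megabits per second.
seconds_needed = daily_gb * 8e9 / (downlink_mbps * 1e6)
print(f"Contact time to move {daily_gb} GB: ~{seconds_needed / 3600:.1f} h/day")  # ~4.5 h

# How long the onboard recorder could buffer data at that daily pace.
print(f"Storage headroom at that pace: ~{storage_gb / daily_gb:.1f} days")  # ~1.2 days
```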

But what you see in the end, even that big uncompressed 137MB TIFF, is not what the telescope sees. In fact, it doesn't even perceive color as we understand it.

“Let the data show through in color”

The data coming off the sensors is in the infrared range, outside the narrow band of wavelengths humans can see. Of course, we have plenty of ways to see beyond that range: X-rays, for example, are invisible to us, but we can see where they fall on film or on a digital sensor calibrated to detect them. The same goes for Webb.

“The telescope is not really a point-and-shoot camera. I mean, we can't just take a picture and there it is, right? This is a scientific instrument. Therefore, it was developed primarily for scientific results,” explained Joe DePasquale of the Space Telescope Science Institute on a NASA podcast.

This side-by-side comparison shows near-infrared (left) and mid-infrared (right) observations of the Southern Ring Nebula from NASA's Webb telescope.

What it captures isn't really data that people can analyze, let alone perceive directly. For one thing, the dynamic range is off the charts; that is, the difference in magnitude between the darkest and brightest points. There is practically nothing darker than the endless emptiness of space, and not much brighter than an exploding star. But if you have an image that includes both, taken over several hours, you end up with enormous deltas between dark and light in the data.

Our eyes and brain have a pretty good dynamic range, but data like this overwhelms them, and more importantly, there's no practical way to display it as-is.

“It basically looks like a black image with some white flecks because it has such a huge dynamic range,” DePasquale said. “We have to do something called data stretching, which is to take the pixel values and kind of redistribute them, basically, so that you can see all the detail that's there.”
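STScI's actual pipeline isn't described here, but the idea behind a stretch is easy to illustrate. Below is a minimal sketch of one common approach, an asinh stretch with percentile clipping, applied to synthetic high-dynamic-range data; the function and its parameters are illustrative, not Webb's real processing:

```python
import numpy as np

def stretch(data, lo_pct=0.5, hi_pct=99.5):
    """Illustrative 'stretch': squeeze a huge dynamic range into 0-255.

    Not Webb's actual pipeline, just the general idea: clip extreme
    pixel values, then remap the rest non-linearly so faint detail
    becomes visible alongside bright stars.
    """
    lo, hi = np.percentile(data, [lo_pct, hi_pct])
    clipped = np.clip(data, lo, hi)
    # asinh is roughly linear near zero and logarithmic for large
    # values, so faint structure and bright cores share one image.
    scaled = np.arcsinh(clipped - lo) / np.arcsinh(hi - lo)
    return (scaled * 255).astype(np.uint8)

# Synthetic "raw" frame: faint nebulosity plus a few very bright stars.
rng = np.random.default_rng(0)
frame = rng.exponential(scale=10.0, size=(512, 512))
frame[rng.integers(0, 512, 20), rng.integers(0, 512, 20)] = 1e6

display = stretch(frame)
print(display.min(), display.max())  # now fits an ordinary 8-bit display
```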

Before you object, keep in mind that essentially all images are created this way: a portion of the spectrum is captured and adapted for viewing by our very capable, but also limited, visual system. Because we can't see in infrared, and there is no direct equivalent of red, green and blue at those wavelengths, image analysts have to do the careful work of combining objective use of the data with a subjective understanding of perception and beauty. The colors may be assigned to wavelengths in the same order, or separated out to more logically highlight regions that "look" the same but emit completely different radiation.

“In the imaging community, we like to call this process ‘representational color’ instead of what it used to be called by many, ‘false color’ imaging. I don't like the term ‘false color’ because it implies that we're faking it, or, you know, that it's not real; the data is the data. We're not going in and painting color onto an image wherever we feel like it. We respect the data from start to finish. And we let the data show through in color.”

In the image above, the two views of the nebula were captured from the same angle at more or less the same time, but with different instruments covering different segments of the IR spectrum. While both must ultimately be shown in RGB, the distinct objects and features found at longer wavelengths can be made visible through this creative yet scientifically rigorous method of color assignment.
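As a toy illustration of that chromatic ordering, here's a sketch that maps three infrared filter exposures onto the red, green and blue channels, longest wavelength to red. The filter names follow NIRCam's naming scheme, but the pixel data here is random noise standing in for calibrated frames:

```python
import numpy as np

def to_unit(band):
    """Normalize one filter image to the 0-1 range."""
    band = band.astype(float)
    return (band - band.min()) / (band.max() - band.min() + 1e-12)

# Three stand-in narrowband infrared exposures of the same field.
# In a real composite these would be calibrated instrument frames.
rng = np.random.default_rng(1)
f090w = rng.random((256, 256))  # shortest wavelength -> blue
f200w = rng.random((256, 256))  # middle wavelength   -> green
f444w = rng.random((256, 256))  # longest wavelength  -> red

# Chromatic ordering: longer IR wavelengths get "warmer" channels.
rgb = np.dstack([to_unit(f444w), to_unit(f200w), to_unit(f090w)])
print(rgb.shape)  # (256, 256, 3): an ordinary RGB image, built from IR data
```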

And, of course, when data is more useful as data than as a visual representation, there are even more abstract ways of looking at it.

Image credits: NASA, ESA, CSA and STScI

An image of a distant exoplanet may show only a dot, but a spectrum reveals details of its atmosphere, as you can see in the example directly above.

Collecting, transmitting, retrieving, parsing and presenting data from something like Webb is a complex task, but now that the observatory is up and running, hundreds of researchers are enthusiastically dedicating themselves to it. Expect even more creative and fascinating ways of displaying this information; as JWST is just getting started on its mission a million miles from home, we have a lot to look forward to.

