This Is How NASA Photoshops Stunning Pictures of Faraway Galaxies

What’s less known is that the raw data behind NASA photographs undergoes extensive processing before it’s released to the world.
Jonathan Zhou
9/29/2015
Updated: 10/19/2015

For decades, the public has consumed sublime photos of distant stars and galaxies produced by NASA, a few of which, like the Pillars of Creation, have reached iconic status. Everyone knows where the images come from: spacecraft like the Hubble Space Telescope or the Mars rover. But that’s only half of the story. What’s less known is that the raw data undergoes extensive processing before it’s released to the world.

In a blog post on Sept. 28, Adobe laid out the postproduction editing involved in creating NASA’s visual masterpieces. To create a panoramic landscape of Mars, editors at NASA first have to stitch together dozens of photos, most of which aren’t aligned with the horizon, and crop out jagged edges to give the photo a geometric frame.

An original composite of 'Marathon Valley' taken by the Mars Exploration Rover Opportunity on March 3-4, 2015. (NASA/JPL-Caltech)

The Mars rover also takes pictures in 3-D using a pair of cameras, which produces a series of images that needs to be sewn together with camera models before it is visually coherent. Finally, the pictures are corrected for brightness to balance out the lighting problems created by Martian dust.
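The brightness-balancing step can be sketched in a few lines. This is a hypothetical illustration, not NASA's pipeline (which relies on calibrated camera models): each grayscale tile is linearly rescaled so its mean and contrast match a shared target, making seams between adjacent frames less visible.

```python
import numpy as np

def match_brightness(tile, target_mean, target_std):
    """Linearly rescale a grayscale tile toward a target mean/std.

    Illustrative sketch only: real mosaics are balanced with calibrated
    camera response models, not a simple linear stretch.
    """
    tile = tile.astype(float)
    std = tile.std()
    if std == 0:
        # Flat tile: no contrast to rescale, just shift to the target level.
        return np.full_like(tile, float(target_mean))
    scaled = (tile - tile.mean()) / std * target_std + target_mean
    return np.clip(scaled, 0, 255)

# Two tiles of the same scene shot under different dust/lighting conditions.
dim = np.array([[40.0, 50.0], [60.0, 70.0]])
bright = np.array([[140.0, 150.0], [160.0, 170.0]])

# Rescaling both toward a shared target evens out the exposure difference.
a = match_brightness(dim, 100, 10)
b = match_brightness(bright, 100, 10)
```

After rescaling, both tiles share the same average brightness, so the stitched panorama reads as one continuous scene rather than a patchwork of exposures.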

NASA’s adjustments strive to keep the photos as scientifically accurate as possible, with changes made both for aesthetic appeal and for information richness. When NASA adds colors to photos of faraway cosmic bodies, the palette reflects what the eyes would see if they were sensitive to color across a wider spectrum.

The Orion Nebula, M42, as imaged by NASA's Wide-field Infrared Survey Explorer, or WISE, in January 2013. (NASA/JPL-Caltech/UCLA)

“I basically take raw grayscale data from different parts of the infrared spectrum, and then remap them into visible colors—typically with red, green, and blue Photoshop layers—to create images that are accurately representative of the infrared colors that human eyes cannot see,” Robert Hurt, an astronomer and visualization scientist at Caltech’s Infrared Processing and Analysis Center, told Adobe. “I think of it as a visual translation process.”
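The "visual translation" Hurt describes can be sketched with a minimal example. This is an assumption-laden illustration, not his actual workflow: three grayscale infrared bands are each normalized and then assigned to the red, green, and blue channels, with the longest wavelength mapped to red and the shortest to blue, mirroring the ordering of visible light.

```python
import numpy as np

def infrared_to_rgb(band_long, band_mid, band_short):
    """Map three grayscale infrared bands onto R, G, B channels.

    Hypothetical sketch of remapping invisible infrared data into visible
    colors; the band names and min-max normalization are assumptions.
    """
    def normalize(band):
        band = band.astype(float)
        lo, hi = band.min(), band.max()
        # Stretch each band to the full 0..1 range before compositing.
        return (band - lo) / (hi - lo) if hi > lo else np.zeros_like(band)

    # Longest wavelength -> red, shortest -> blue, as in visible light.
    return np.dstack([normalize(band_long),
                      normalize(band_mid),
                      normalize(band_short)])

# Simulated raw 12-bit detector readouts standing in for real telescope data.
rng = np.random.default_rng(0)
bands = [rng.integers(0, 4096, size=(4, 4)) for _ in range(3)]
rgb = infrared_to_rgb(*bands)
```

In practice each normalized band would become a colored layer in Photoshop, where nonlinear stretches and layer blending replace the simple min-max normalization used here.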

Hurt’s job is not limited to highlighting details invisible to the human eye; he also removes camera artifacts that could mislead viewers into thinking there are stars or planets that aren’t really there.