Beyond the visible: How space photos get their color

Space telescopes and probes capture infrared and ultraviolet light, not just the visible kind, to show us planets in our solar system and distant galaxies. That means the photos we see have to be artificially colored to give a sense of what those objects might look like to human eyes.

An image taken by the Hubble Space Telescope shows a "stellar nursery" called Westerlund 2, located in the constellation Carina about 20,000 light-years from Earth. Hubble images are colored after they're taken, to approximate what the scene might look like to human eyes.

NASA/Reuters

May 10, 2015

Space photos are breathtaking. They show us beautiful, high-resolution images of other worlds, offering glimpses of the mountains and oceans of planets and moons, the spirals of distant galaxies, and even the towering clouds of gas from which new stars are forged. In a flyby of Saturn’s moon Titan in November 2014, NASA’s Cassini spacecraft captured a rare image: sunlight glinting off the Kraken Mare sea, on a moon usually shrouded in an orange haze of chemicals.

These photos, however, aren’t taken with cameras like the ones commonly found on Earth, which means the colors shown in those cosmic shots are the result of a little bit of artistic license.

A spacecraft image is really a composite of many frames taken over a period of time, drawing on infrared and ultraviolet sensors as well as visible light. So when Cassini flies past Titan or Tethys in its orbit around Saturn, the resulting images must be color-corrected to show what those moons might look like to human eyes.
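To picture how such a composite comes together, here is a minimal sketch in Python (the frames, filter choices, and array sizes are invented stand-ins, not mission data): repeated grayscale exposures from each sensor are averaged to cut noise, then each filter's result is assigned to one display channel.

```python
import numpy as np

def stack_exposures(frames):
    """Average repeated exposures from one filter to reduce noise."""
    return np.mean(np.stack(frames), axis=0)

# Hypothetical exposures from three sensors: infrared, visible, ultraviolet.
ir_frames  = [np.random.rand(512, 512) for _ in range(4)]
vis_frames = [np.random.rand(512, 512) for _ in range(4)]
uv_frames  = [np.random.rand(512, 512) for _ in range(4)]

# One common convention: the longest wavelength goes to the red channel.
rgb = np.dstack([
    stack_exposures(ir_frames),   # red
    stack_exposures(vis_frames),  # green
    stack_exposures(uv_frames),   # blue
])
print(rgb.shape)  # (512, 512, 3): a single displayable color image
```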


The same is true of the Hubble Space Telescope, which uses electronic sensors to probe the universe. Those sensors produce images in varying shades of black and white, which are then combined to form a final picture. The colors we see in famous Hubble images such as the Sombrero Galaxy don’t always appear as they would if we were able to see those objects with our own eyes.
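One way to see the principle at work is astropy's make_lupton_rgb, which folds three grayscale frames into a color image with an asinh stretch so both faint and bright structure survive. This is not Hubble's actual production pipeline, just an illustration with synthetic stand-in frames:

```python
import numpy as np
from astropy.visualization import make_lupton_rgb

# Synthetic stand-ins for three black-and-white filter frames.
r = np.random.rand(256, 256) * 100  # longest-wavelength filter
g = np.random.rand(256, 256) * 100  # mid-wavelength filter
b = np.random.rand(256, 256) * 100  # shortest-wavelength filter

# stretch and Q control how the asinh curve compresses bright pixels.
rgb = make_lupton_rgb(r, g, b, stretch=50, Q=10)  # a uint8 color image
```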

“We often use color as a tool,” explains NASA’s Hubble website, “whether it is to enhance an object's detail or to visualize what ordinarily could never be seen by the human eye.”

Hubble can detect portions of the electromagnetic spectrum that are invisible to our eyes, such as infrared and ultraviolet light. So when a galactic image contains data from outside the visible range, scientists use Hubble’s various filters and their own judgment to create a composite image that gives a good sense of what the object might look like to human eyes, or that brings out details of the object. Hubble’s successor, the James Webb Space Telescope, scheduled to launch in 2018, will produce images of 32 million pixels each, twice the resolution of Hubble’s, but it too will rely on infrared data that must be translated into the visible spectrum.
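The usual rule of thumb is to preserve the order of wavelengths: whatever the telescope observed longest goes to red, shortest to blue, even if none of it was visible light. A sketch of that mapping (the filter names echo Hubble's naming scheme, but the data are synthetic):

```python
import numpy as np

# Hypothetical filter frames keyed by central wavelength in nanometers.
filters = {
    "F160W (infrared)":    (1600, np.random.rand(256, 256)),
    "F814W (far red)":     (814,  np.random.rand(256, 256)),
    "F275W (ultraviolet)": (275,  np.random.rand(256, 256)),
}

# Sort longest wavelength first, then map onto red, green, blue.
ordered = sorted(filters.values(), key=lambda wf: wf[0], reverse=True)
rgb = np.dstack([frame for _, frame in ordered])
```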

Interestingly, even images taken by the two currently active Mars rovers, Opportunity and Curiosity, are recolored to give us the familiar vistas of the Red Planet. The rovers move very slowly – about 100 feet per hour, on average – and can take days to record the data that goes into a panoramic image. That data is combined with telescope pictures, images from past Mars missions, and other information gathered by Opportunity and Curiosity to create a final picture that accurately captures what a person would see standing in that exact spot on Mars, NASA says.
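One concrete step in that kind of correction is white balancing against a target of known color; Curiosity carries a camera calibration target for exactly this purpose. A sketch with hypothetical numbers:

```python
import numpy as np

def white_balance(image, measured_target, true_target):
    """Scale each color channel so the calibration target looks correct."""
    gains = np.asarray(true_target) / np.asarray(measured_target)
    return np.clip(image * gains, 0.0, 1.0)

raw = np.random.rand(480, 640, 3)   # stand-in for a raw rover frame
measured = [0.35, 0.42, 0.55]       # target patch as imaged (hypothetical)
true     = [0.45, 0.45, 0.45]       # the patch's known neutral gray
corrected = white_balance(raw, measured, true)
```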

Not only do space photos reveal celestial sights that are much too far away for humans to visit with today’s technology, they also show features and details that exist outside what our eyes can perceive.