We know our planet is blue because we live here and can see its water for ourselves. But for places that are light-years away, a different kind of measurement has to be taken. From Mercury to Pluto and now beyond, probes have been sent out to take photographs and analyze the planets from afar.
On Google right now there is a way to view illustrated images of what these planets look like. It can look spooky at first, but it’s worthwhile to view the Earth this way every once in a while. Go to Google Maps (preferably on a desktop computer; it doesn’t work well on a phone) and select satellite view. Select the 3D marble in the bottom right corner to switch to the 3D Earth. Then zoom as far away from your location as possible. You will see the first 3D illustration of Earth, with a glowing blue halo around the blue marble. Keep zooming away from Earth until Google switches you to Google in Space.
This is where you get to see the other planets as 3D illustrations, in the order of the solar system. On the left is a menu of all the finished renderings of planets, dwarf planets, moons and other satellites. Clicking through the renderings, you’ll find different colours, different patterns and some freaky textures that make them look very alien. You can rotate each one to see what it looks like and where it sits in the solar system, and click the icons on its surface to see all the places scientists have already named.
As a shortcut, click here to view Google in Space, then select the marble in the bottom right corner to see the Earth, then back up one more step to get to Google 3D space.
Shouldn’t they all be various shades of grey?
There are more planets in space than we could ever count. The Hubble Space Telescope is similar to the cameras we have on Earth, but it is built to capture light from the deepest parts of space while in orbit around Earth. It takes photos of galaxies, planets and stars in every walk of life, from birth to death. When the Hubble Space Telescope takes a picture, it sends it back to Earth.
It records in black and white so that it can capture the most contrast possible. It uses a separate three-channel process: the photograph is made up of red, green and blue channels. The brightness in each greyscale image represents how much of that colour the scene contains; for instance, separate the colour channels from a photo of the sky and each one would look like a grey sky of a different shade.
The photo is reconstructed in the space program’s art studio for evaluation. There, they combine the channels while assigning colour hues to chemical elements. There’s a lot of artistic licence in creating these photos, which serve as a reference for the visible distances of stars within the galaxy.
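That recombination step can be sketched in a few lines of NumPy. This is a made-up toy example, not NASA’s actual pipeline: each channel is a tiny greyscale array of invented brightness values, and stacking the three produces one colour image.

```python
import numpy as np

# Each channel is a greyscale 2D array of brightness values (0.0-1.0).
# These tiny 2x2 frames are invented for illustration only.
red_frame = np.array([[0.9, 0.1],
                      [0.4, 0.0]])
green_frame = np.array([[0.2, 0.8],
                        [0.4, 0.0]])
blue_frame = np.array([[0.1, 0.1],
                       [0.9, 1.0]])

# Stacking the three greyscale frames along a new last axis produces
# an RGB image of shape (height, width, 3).
rgb = np.stack([red_frame, green_frame, blue_frame], axis=-1)
print(rgb.shape)  # (2, 2, 3)
print(rgb[1, 1])  # bottom-right pixel: bright only in the blue channel
```

A real pipeline works on full-size frames and adds calibration and stretching, but the core idea is the same: three greyscale exposures become the red, green and blue planes of one picture.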
What is true colour and false colour?
True colour is colour represented as clearly and accurately as possible. But the term “true” is not solid: the perceived colours change depending on atmospheric conditions when the photo is taken. That makes true colour less helpful to begin with, since it raises doubt about the information provided. When scientists take a photo, they try to pack in as much information about the area as possible, which can create false-colour data. Colours can change depending on weather conditions or the time of day, just like taking a picture on Earth.
The photos in NASA’s Mars Pathfinder project were map-projected and filtered through the VICAR image-processing software at the Jet Propulsion Laboratory. All the photos were sent back to the lab as JPEGs, with the map projection, pixel information, camera angle and photo number listed at the bottom of each photo of Mars.
“We actually try to avoid the term ‘true color’ because nobody really knows precisely what the ‘truth’ is on Mars. Colors change from moment to moment. It’s a dynamic thing. We try not to draw the line that hard by saying ‘this is the truth!’”
Jim Bell, on the Mars panoramic color images
False colour has two definitions. The first is a photo taken in a part of the electromagnetic spectrum that is invisible to the human eye, like infrared or ultraviolet light, with those channels mapped onto red, green and blue. The other type of false colour is a photo taken within visible light and then mapped onto red, green and blue channels. This is called narrowband imaging. Through narrowband imagery, photos can show what objects are made of by analyzing the light emitted by different elements. The different frequencies of light reveal star formations and black holes.
Narrowband filters only pass a thin slice of light through them. There are three narrowband filters in astrophotography that, when combined, create a full-colour image: Ha (Hydrogen Alpha), OIII (Oxygen III) and SII (Sulfur II). In the standard mapping, Sulfur II is assigned to red, Hydrogen Alpha to green and Oxygen III to blue. The result isn’t true colour, but a vivid composite. This combination is known as the Hubble Colour Palette.
In the photo of the Pillars of Creation, the Hubble Space Telescope used wavelengths of 673 nm, 657 nm and 502 nm to fill the red, green and blue channels of the picture. The 673 nm filter, which captures ionized sulphur, went into the red channel; the 657 nm filter, which captures hydrogen alpha and ionized nitrogen, went into the green channel; and the 502 nm filter, which captures ionized oxygen, went into the blue channel.