How do we colourise photos of space?
Hubble and
Webb have been blessing our feeds with stunning images of the
Universe. No one who looks at them can deny the awe and wonder they inspire.
But do they actually look like this? More precisely, if we were to
observe the Universe with the naked eye, would we actually see these bodies as
their images show them? Are they really as colourful as pictured, or are we somehow
manipulating them to see what we understand?
All the
colourful pictures we see on our devices are built from three primary
colours: red, green, and blue. Mixed in various proportions, these
produce every other colour. For example, in the hexadecimal representation of
a colour, we have six digits: the first two encode the red value, the
next two the green value, and the last two the blue value.
| Some common hex codes and their corresponding colour |
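As a quick illustration of this encoding, here is a minimal Python sketch (the helper name `hex_to_rgb` is my own, not a standard function) that splits a six-digit hex code into its three channel values:

```python
def hex_to_rgb(hex_code: str) -> tuple:
    """Split a six-digit hex colour into its red, green, and blue values.

    Each pair of hex digits encodes one channel's intensity from 0 to 255.
    """
    hex_code = hex_code.lstrip("#")
    return tuple(int(hex_code[i:i + 2], 16) for i in range(0, 6, 2))

print(hex_to_rgb("#FF0000"))  # pure red -> (255, 0, 0)
print(hex_to_rgb("#4B0082"))  # indigo   -> (75, 0, 130)
```

Running it confirms, for instance, that `#FF0000` is maximum red with no green or blue at all.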
The human retina
contains about six million photoreceptor cells called cones. There
are three types of cones, or receptors: the short-S (blue), medium-M (green),
and long-L (red) wavelength-sensitive cones, present in
different proportions. The short-wavelength cones make up about 10% of the
total and respond most strongly to blue light, peaking at 420 nm. The
medium-wavelength cones respond most to yellow-green light, peaking at
about 530 nm. The long-wavelength, red-sensing cones make up about 60% of
the cones and respond most to longer wavelengths, peaking at about 560 nm. An
important thing to note is that these peaks are not the same for every
individual: they may lie anywhere in the ranges 420-440 nm, 534-545 nm, and 564-580
nm respectively.
| Responsivity vs wavelength (nm) |
This is the
guiding principle behind colourising black-and-white images. A process
called broadband filtering takes pictures of an object through
different filters and then recombines them into the desired image. In the picture below, for example, a black-and-white photograph of flowers is taken three more times, through red, green, and blue filters. The relative presence or absence of each colour across these photographs determines the actual colour at each point. The calculation is done pixel by pixel, and we obtain a full-colour image.
| Broadband filtering at play |
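The recombination step above can be sketched in a few lines of Python with NumPy. This is a toy illustration, not a real imaging pipeline: the function name and the tiny stand-in "exposures" are my own assumptions.

```python
import numpy as np

def recombine(red_filtered, green_filtered, blue_filtered):
    """Stack three filtered greyscale exposures into one RGB image.

    Each input is a 2-D array of pixel brightnesses (0-255) recorded
    through the corresponding colour filter; stacking them along a
    third axis rebuilds the colour image pixel by pixel.
    """
    return np.stack([red_filtered, green_filtered, blue_filtered], axis=-1)

# Toy 2x2 "exposures": light got through only the red filter,
# so every pixel of the recombined image should be pure red.
r = np.full((2, 2), 255, dtype=np.uint8)
g = np.zeros((2, 2), dtype=np.uint8)
b = np.zeros((2, 2), dtype=np.uint8)
image = recombine(r, g, b)
print(image.shape)  # (2, 2, 3): height, width, colour channel
```

A pixel that was bright in the red exposure but dark in the green and blue ones comes out red, exactly as with the flowers above.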
The images
taken by Hubble and Webb are in black and white. The main purpose of these telescopes is to measure the brightness of the light coming from objects in space, which is
captured most cleanly in black and white. These images are then digitally coloured: we take
exposures through filters covering various wavelength ranges and then recombine them to get
the final picture.
Scientists
also use colours to map out how different gases interact to form galaxies and
nebulae. A process called narrowband filtering is used to capture specific
wavelengths of light. Hubble can record very narrow bands of light coming from
individual elements like oxygen, carbon, and hydrogen. We can then use colours
to track their position in an image.
| Narrowband filtering of hydrogen, oxygen, and sulphur |
The most
common application of narrowband filtering is studying the formation of stars
and galaxies. The filters isolate light from hydrogen, sulphur, and oxygen, three
key building blocks of stars. The result is not a true-colour image; it is more
of a colourised map. The characteristic emission wavelengths of hydrogen, sulphur, and
oxygen are 656.3 nm, 672 nm, and 495.9 nm
respectively. Hydrogen and sulphur naturally emit in red light, and oxygen
in blue; these correspond to the colours red, red, and cyan. To get a clearer image, the three wavelengths are instead assigned to
red, green, and blue according to their chromatic order: sulphur, with the longest wavelength, is shown in
red, hydrogen takes green, and oxygen is shown in blue.
| Pillars of Creation in true and false colour |
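That channel assignment, often called the "Hubble palette", can be sketched the same way. Again a hedged toy example: the function and array names are assumptions, and real narrowband maps would of course be full-sized exposures, not single pixels.

```python
import numpy as np

def hubble_palette(h_alpha, s_ii, o_iii):
    """Map narrowband exposures to RGB in chromatic order:
    sulphur (longest wavelength) -> red, hydrogen -> green,
    oxygen (shortest) -> blue.
    """
    return np.stack([s_ii, h_alpha, o_iii], axis=-1)

# Toy 1x1 exposures: only the hydrogen filter saw light,
# so the mapped pixel should come out green.
h = np.array([[200]], dtype=np.uint8)
s = np.array([[0]], dtype=np.uint8)
o = np.array([[0]], dtype=np.uint8)
pixel = hubble_palette(h, s, o)[0, 0]
print(pixel)  # red = 0, green = 200, blue = 0
```

Note that hydrogen, which truly glows red, lands in the green channel here. That is exactly why these images are colourised maps rather than true-colour photographs.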
What about
infrared light? We know that the primary wavelengths Webb works in lie in the infrared region. How do we colourise invisible light?
In infrared
light, too, a similar process is followed: we assign different colours to
different elements, take images through broadband and narrowband filtering, and then
recombine them into the stunning pictures we are presented with.
| Infrared photography of the Helix Nebula |
Have we
been duped? Are colours even real? Are we artificially colouring the Universe
to make it look more beautiful than it actually is?
Well, yes
and no. It's true that the raw images of these galaxies
may appear bland and boring. But the biggest fact we fail to pay heed
to is how limited we humans are in our perception of light. We see only a small
slice of the electromagnetic spectrum and call it visible light. If we could
see the complete spectrum, imagine how colourful we would find the Universe to
be! We could see heat, the all-surrounding microwave background
radiation, and phones chattering away over radio waves. The very thought
of it is trippy. What about you? So yes, we have been duped, ever so
slightly, to understand the Universe in a colour language we know. In reality,
the Universe is far more colourful than we can ever comprehend.
I hope you
liked this article. Drop a comment and tell me what you thought about it. See
you soon.
Auf
Wiedersehen!