
Incredible photographs of deep space (20 photos): imaging space from spacecraft and from the surface of the Earth

NASA/JPL-Caltech/UCLA


Celestial bodies of different spectral classes compared to the Sun


The same objects in infrared photography


Sometimes it is enough just to stitch individual photographs into a panoramic image - this is how the pictures sent by the Mars rovers are produced. But more often, additional work is required. Robert Hurt, an astronomer at Caltech's infrared imaging center and a Photoshop expert, explained that one of the steps is removing unnecessary details from the image. In particular, imaging hardware can create artifacts that look like real objects in the Universe but are in fact not there.

In many cases, it is necessary to enhance the contrast of grayscale images taken in different parts of the infrared range. This pushes unwanted objects out of view and, conversely, emphasizes the most interesting ones. Data from multiple telescopes are often combined to produce more accurate images, and Photoshop's red, green and blue layers can be used to add visible color to infrared photographs.

The result is a file of several tens of gigabytes combining data from several telescopes. The specialist, who has produced many images for the National Aeronautics and Space Administration (NASA), adds that his work lets the general public become acquainted with discoveries in space science and get the most accurate picture of them.
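As a rough illustration of the stretch-and-composite step described above, here is a minimal sketch (not Hurt's actual pipeline; the toy band arrays and the percentile stretch are illustrative assumptions) of mapping three infrared bands onto red, green and blue channels with NumPy:

```python
import numpy as np

def stretch(band, lo=1, hi=99):
    """Contrast-stretch a single band into the 0..1 range
    using percentile clipping."""
    p_lo, p_hi = np.percentile(band, [lo, hi])
    return np.clip((band - p_lo) / (p_hi - p_lo), 0.0, 1.0)

def false_color(ir_long, ir_mid, ir_short):
    """Map the longest infrared wavelength to red and the
    shortest to blue, mimicking the RGB-layer trick."""
    return np.dstack([stretch(ir_long), stretch(ir_mid), stretch(ir_short)])

# three toy 2x2 "infrared bands"
bands = [np.array([[0.1, 0.5], [0.7, 0.9]]) * s for s in (1.0, 2.0, 3.0)]
rgb = false_color(*bands)   # shape (2, 2, 3), values in [0, 1]
```

Real pipelines use much more careful stretches per band, but the principle - one grayscale band per color channel - is the same.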

30 years ago, the whole world watched with great interest as a pair of space travelers flew past Saturn, transmitting fascinating images of the planet and its moons.

Ed Stone, the project scientist for Voyager, one of NASA's most ambitious missions, remembers the first time he saw loops in one of Saturn's narrow rings. This was the day the Voyager 1 spacecraft made its closest flyby of the giant planet, 30 years ago. Scientists gathered in front of television monitors in the work offices of NASA's Jet Propulsion Laboratory in Pasadena, California, and pored over the stunning images and other data each day during the heady period of the flyby.

NASA's Voyager 1 spacecraft took this image during its closest flyby of Saturn. It showed loops in one of Saturn's narrow rings (left). Images from the Cassini spacecraft (right) finally allowed scientists to understand how Saturn's moons Prometheus and Pandora give the ring its twisted shape.

Dr. Stone turned his attention to the jagged, braided ring known today as the F ring. The countless particles that make up the broad main rings travel in nearly circular orbits around Saturn, so the kinked F ring, discovered by NASA's Pioneer 11 spacecraft just a year before the flyby, came as one of the surprises.

"It was clear that Voyager was showing us a very different Saturn," said Stone, now at the California Institute of Technology in Pasadena. Time and again, the spacecraft revealed unexpected things that often took days, months and even years to comprehend.

The F ring was just one of many strange things discovered during the Voyagers' close approaches to Saturn, which occurred on November 12, 1980, for Voyager 1, and August 25, 1981, for Voyager 2. Six small moons were found during the flybys, and the mysterious Enceladus, whose surface hinted at some kind of geological activity, was studied.

An incredible hexagonal structure around Saturn's north pole was first discovered in Voyager 2 images (left). Cassini took higher resolution photographs of the hexagon. The images show the hexagon to be a remarkably stable wave in one of the jet streams of the planet's atmosphere.

Images from the two spacecraft also showed enormous storms engulfing the planet's atmosphere that were not visible to ground-based telescopes.

Atmosphere of Titan

Scientists used Voyager data to resolve a long-standing debate about whether Titan has a thick or thin atmosphere. Sensitive instruments revealed that Saturn's moon Titan has a nitrogen-rich atmosphere containing a thick haze of hydrocarbons. The discovery led scientists to believe that seas of liquid methane and ethane might exist on Titan's surface.

This image from Voyager 1 showed Saturn's moon Titan shrouded in a haze of hydrocarbons in a nitrogen atmosphere and led astronomers to speculate about seas of liquid methane and ethane on Titan's surface. Cassini successfully confirmed this theory, sending back a radar image of a lake named Ontario (right) and images of other lakes of liquid hydrocarbons on Titan.

"When I look back, I realize how little we really knew about the solar system before the Voyager missions," Stone added.


Animation from radar imagery showing lakes on Titan's surface.

In fact, the flights of these robotic scouts raised many new questions, and another NASA spacecraft, Cassini, was subsequently sent to solve these mysteries. Voyager 1 passed about 126,000 kilometers above Saturn's cloud tops and Voyager 2 just 100,800 kilometers, but Cassini descended even lower.

NASA's Voyager spacecraft were the first to capture close-up images of Saturn's moon Enceladus (left). In 2005 the Cassini spacecraft discovered plumes of water vapor emanating from the icy moon (right), settling the question of whether its surface is geologically active.

Thanks to Cassini's long period of operation around Saturn, scientists have discovered answers to many of the mysteries seen by Voyager.

Ice geysers of Enceladus

Cassini discovered the mechanism behind the constantly renewed landscape of Enceladus: the "tiger stripes," cracks from which jets of water vapor and organic particles shoot out. Cassini studies showed that Titan really does have stable lakes of liquid hydrocarbons on its surface and closely resembles the Earth in the early period of its development. Cassini data also revealed how the two small moons discovered by the Voyagers, Prometheus and Pandora, give the F ring its strange twisted shape.

Gallery of breathtaking images of the Cassini interplanetary probe

"Cassini owes many of its discoveries to Voyager," says Linda Spilker, JPL's Cassini project scientist, who began her career working on Voyager from 1977 to 1989. "We're still comparing Cassini data with the Voyager results and proudly building on that legacy."

Hexagon of Saturn

But the Voyagers still left many mysteries that Cassini has not yet solved. For example, scientists first noticed a hexagonal structure at Saturn's north pole in Voyager images.

Cassini took higher resolution photographs of the northern hexagon. The data tells scientists about a remarkably stable wave in the planet's atmosphere that has maintained Saturn's hexagon for 30 years.

Spokes in the rings

Scientists first saw these clouds of tiny particles, known as "spokes," in images from NASA's Voyager spacecraft. The spokes are thought to be caused by electrostatically charged tiny particles that rise above the plane of the ring, but scientists are still figuring out how the particles get this charge.

Even more perplexing were several wedge-shaped clouds of tiny particles that were discovered in the rings of Saturn. Scientists have dubbed them "spokes" because they look like bicycle spokes. The Cassini team has been searching for them since the spacecraft first arrived at Saturn. During the equinox on Saturn, sunlight illuminated the edge-on rings and spokes appeared on the outer portion of Saturn's B ring. Cassini scientists are still testing their theories about what could be causing these strange phenomena.

The future of Voyager

Today, the Voyager spacecraft are still pioneering the journey to the edge of our solar system. They have yet to explore true interstellar space, but they are successfully transmitting data about the heliopause. The power from their radioisotope generators is expected to last until about 2030, after which the lifeless craft will coast through interstellar space until they encounter another star.

The Voyager 1 image (left) shows convective clouds on Saturn, taken in 1980. The Cassini image (right), from 2004, shows a storm in the gas giant's atmosphere nicknamed the Dragon Storm, which was the source of powerful radio emission detected by Cassini. This emission closely resembles the bursts of radio emission generated by lightning on Earth. In 2009, Cassini sent back photographs of lightning flashing in Saturn's atmosphere.

Voyager 1 was launched on September 5, 1977, and is currently about 17 billion kilometers from the Sun, making it the most distant spacecraft. Voyager 2, launched on August 20, 1977, is currently about 14 billion kilometers from the Sun.

The video, made from images taken by the Cassini spacecraft, shows hurricanes and storms swirling around the planet's north pole.

The Voyagers were built at JPL, which is run by the California Institute of Technology. The Cassini-Huygens mission is a joint project between NASA, the European Space Agency and the Italian Space Agency. JPL also operates Cassini, and the orbiter and its two onboard cameras were designed, developed and assembled at JPL.


Video showing Cassini's discoveries made during 15 years of work

A whole month has already passed since the Parker Solar Probe spacecraft set off on its journey. Each of the four instrument suites in its payload has now seen "first light." These early observations are not yet significant scientific results, but they show that each of the craft's instruments is working well. The instruments work in tandem to measure the Sun's electric and magnetic fields, solar particles and the solar wind, and to image the environment around the spacecraft.

“All of the instruments produced data that not only served for calibration, but also recorded bursts of what we expect to measure close to the Sun to solve the mysteries of the solar atmosphere and corona,” said Nour Raouafi, Parker Solar Probe project scientist at the Johns Hopkins University Applied Physics Laboratory.

The mission's first close approach to the Sun will take place in November 2018, but even now the instruments can collect data on what is happening in the solar wind while the probe is still relatively close to Earth. Here is a quick overview of these results.

WISPR (Wide-field Imager for Solar Probe, Optical telescope for imaging the solar corona and heliosphere)

In fact, WISPR is the only instrument on the spacecraft that will produce the result most accessible to everyone: images in the visible range. It will make it possible to observe the solar wind directly, if only briefly, from inside the corona. The instrument consists of two telescopes and sits behind the heat shield between two antennas of the FIELDS instrument suite. To keep them safe, the telescopes were covered with a protective shield during launch.

WISPR was turned on in early September 2018 and has already transmitted test images to Earth for calibration, obtained with the protective shield closed. On September 9, 2018, its doors were opened, allowing the equipment to take the first images during its journey to the Sun.


The right side of this image - from the inner WISPR telescope - has a 40-degree field of view. The left side of the image is from the outer WISPR telescope, which has a 58-degree field of view. Source: NASA/Naval Research Laboratory/Parker Solar Probe

Russ Howard, principal investigator for the WISPR program at the Naval Research Laboratory, studied the images to determine what the instrument was seeing compared to what was expected, using different celestial landmarks as guides.

“There is a very characteristic cluster of stars in the overlap of the two images. The brightest star is Antares, which is in the constellation Scorpius about 90 degrees from the Sun,” Howard said.

The Sun, not visible in this image, is far beyond the right edge of the frame. The planet Jupiter is also visible: it was captured by WISPR's inner telescope as a bright object just right of center on the right side of the image.

"The left side of the photo shows a beautiful image of the Milky Way, looking towards the galactic center."

Exposure time - the length of time light falls on the open sensor - can be shortened or lengthened to make the image darker or brighter. During this shoot, the exposure time was kept to a minimum, and for good reason:

“We deliberately kept the exposure short because if there was something very bright there when we first turned on the camera, it would simply blow everything out.”

As the spacecraft approaches the Sun, its orientation will change, as will the WISPR images. With each new orbit around the Sun, WISPR will capture images of structures emerging from its corona. And, while other measurements have previously been made with instruments as close as one astronomical unit, WISPR will operate much closer to the Sun, reducing that distance by about 95 percent. This greatly increases the ability to see what's happening in this region on a much smaller scale than ever before, providing new images of the previously untouched solar corona.

ISʘIS (Integrated Science Investigation of the Sun, Investigation of electrons, protons and heavy ions)


Source: NASA/Princeton University/Parker Solar Probe

ISʘIS (pronounced “isis” - the acronym simply includes the symbol for the Sun) measures high-energy particles associated with solar activity, that is, flares and coronal mass ejections. (The mission's other particle suite, SWEAP, focuses on the low-energy particles that make up the solar wind.) ISʘIS consists of two instruments covering the energy range of these energetic particles: EPI-Lo focuses on the lower end of the energy spectrum, and EPI-Hi measures higher-energy particles. Both instruments collected their first data under low-voltage conditions, allowing scientists to verify that the detectors were working as expected. As Parker Solar Probe approaches the Sun, they will be brought to full operation to measure particles in the corona.

Data from EPI-Lo on the left shows background cosmic rays - charged particles that came into our solar system from other parts of the galaxy. As more voltage is applied to EPI-Lo and the probe turns toward the Sun, the instrument will begin to measure more of those particles that are already related to the solar wind.

On the right is data from EPI-Hi, which shows the concentrations of hydrogen and helium particles. Closer to the Sun, scientists expect to observe many more of these particles, along with heavier elements, as well as some particles with much higher energies, especially during ejection events.

“The ISʘIS team is delighted that the instrument is working well. There are still a few steps ahead, but so far everything looks great!” said David McComas, professor of astrophysical sciences at Princeton University and principal investigator of the ISʘIS program.

FIELDS (Measurement of electric and magnetic fields, radio waves, Poynting vector, plasma, and electron temperature)


Source: NASA/UC Berkeley/Parker Solar Probe

The FIELDS instrument suite aboard the Parker Solar Probe will study the scale and shape of electric and magnetic fields in the solar atmosphere. These are key measurements for understanding why the Sun's corona is hundreds of times hotter than its surface.

The FIELDS sensors consist of four two-meter electric field antennas. They are mounted at the front of the spacecraft, extending beyond the heat shield, so they are exposed to the full force of the solar environment. Also included are three magnetometers and a fifth, shorter electric field antenna mounted on a boom that extends from the rear of the spacecraft.

The data above, collected during boom deployment shortly after launch in August 2018, shows how the measured magnetic field changed as the boom extended away from the probe. The early data reflects the magnetic field of the spacecraft itself; the instruments registered a sharp drop in field strength as the boom moved away from the craft. Once deployed, the instruments will measure the magnetic field of the solar wind. The graph eloquently illustrates why such sensors need to be located far from the spacecraft.

In early September 2018, four electric field antennas were successfully deployed on the front of the spacecraft, and signatures of solar flares began to be observed almost immediately thereafter.


Illustration comparing data from Parker Solar Probe (center and bottom) and Wind (top).

As promised in the comments to my article "Why are rovers on Mars!", where readers asked about space photographs, images of space objects, photo stitching, and how rovers take "selfies," this material has been prepared.

So, "Let's go!"

Photos from space published on the websites of NASA and other space agencies often attract the attention of those who doubt their authenticity - critics find traces of editing, retouching or color manipulation in the images. This has been the case since the birth of the “moon conspiracy,” and now photographs taken not only by Americans, but also by Europeans, Japanese, and Indians have come under suspicion. Together with the N+1 portal, we are looking into why space images are processed at all and whether, despite this, they can be considered authentic.

In order to correctly assess the quality of space images that we see on the Internet, it is necessary to take into account two important factors. One of them is related to the nature of interaction between agencies and the general public, the other is dictated by physical laws.

Public relations

Space images are one of the most effective means of popularizing the work of research missions in near and deep space. However, not all footage is immediately available to the media.

Images received from space can be divided into three groups: "raw," scientific and public. Raw, or original, files from spacecraft are sometimes available to everyone and sometimes not. For example, images taken by the Mars rovers Curiosity and Opportunity or by the Cassini Saturn orbiter are released in near real time, so anyone can see them at the same time as the scientists studying Mars or Saturn. Raw photographs of the Earth from the ISS are uploaded to a separate NASA server. Astronauts upload them by the thousands, and no one has time to pre-process them. The only thing added to them on Earth is a geographic reference to make searching easier.

Usually it is the public images attached to press releases from NASA and other space agencies that get criticized for retouching, because they are the ones that catch the eye of Internet users first. And if you look, you can indeed find plenty there. Color manipulation:

Photo of the landing platform of the Spirit rover in visible light and capturing near-infrared light. (c) NASA/JPL/Cornell

And overlaying several images:

Earthrise over Compton Crater on the Moon. (c) NASA/Goddard/Arizona State University

And copy-paste:

Fragment of Blue Marble 2001. (c) NASA/Robert Simmon/MODIS/USGS EROS

And even direct retouching, with erasing some image fragments:

Highlighted image GPN-2000-001137 of the Apollo 17 expedition. (c) NASA

NASA’s motivation in the case of all these manipulations is so simple that not everyone is ready to believe it: it’s more beautiful.

But it’s true, the bottomless blackness of space looks more impressive when it’s not interfered with by debris on the lens and charged particles on the film. A color frame is indeed more attractive than a black and white one. A panorama from photographs is better than individual frames. It is important that in the case of NASA it is almost always possible to find the original footage and compare one with the other. For example, the original version (AS17-134-20384) and the “printable” version (GPN-2000-001137) of this image from Apollo 17, which is cited as almost the main evidence of retouching of lunar photographs:

Comparison of frames AS17-134-20384 and GPN-2000-001137 (c) NASA

Or find the rover's "selfie stick," which "disappeared" when it took its self-portrait:

Physics of Digital Photography

Typically, those who criticize space agencies for manipulating color, using filters, or publishing black-and-white photographs “in this digital age” fail to consider the physical processes involved in producing digital images. They believe that if a smartphone or camera immediately produces color images, then a spacecraft should be even more capable of doing this, and they have no idea what complex operations are needed to immediately get a color image onto the screen.

Let us explain the theory of digital photography. The sensor of a digital camera is, in essence, a solar battery: there is light - there is current; no light - no current. Only the sensor is not a single battery but a grid of many tiny ones - pixels - from each of which the current output is read separately. Optics focuses light onto the sensor, and the electronics reads the intensity of the signal produced by each pixel. From these data an image is constructed in shades of gray - from zero current in the dark to maximum in the light - so the output is black and white. To make it color, you need to apply color filters. It turns out, oddly enough, that color filters are present in every smartphone and in every digital camera from the nearest store. (For some, this information is trivial, but, in the author's experience, for many it will be news.) In conventional photographic equipment, alternating red, green and blue filters are applied to individual pixels of the sensor - the so-called Bayer filter.

The Bayer filter consists of half green pixels, and red and blue each occupy one quarter of the area. (c) Wikimedia
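How a Bayer mosaic becomes a color image can be shown with a deliberately crude sketch (real cameras use far smarter interpolation; the RGGB layout and sample values here are illustrative) that collapses each 2x2 RGGB block into a single RGB pixel:

```python
import numpy as np

def demosaic_rggb(mosaic):
    """Crude demosaic: each 2x2 RGGB block becomes one RGB pixel,
    so the output has half the mosaic's resolution in each axis."""
    r = mosaic[0::2, 0::2]                               # top-left of each block
    g = (mosaic[0::2, 1::2] + mosaic[1::2, 0::2]) / 2.0  # average the two greens
    b = mosaic[1::2, 1::2]                               # bottom-right
    return np.dstack([r, g, b])

mosaic = np.array([[10.0, 20.0],
                   [30.0, 40.0]])        # a single RGGB block
pixel = demosaic_rggb(mosaic)[0, 0]
# R = 10, G = 25 (average of 20 and 30), B = 40
```

Production demosaicing interpolates each color at full resolution instead of downsampling, but the principle - reconstructing three channels from one filtered sensor - is the same.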

To repeat: navigation cameras produce black-and-white images because such files are smaller and because color is simply not needed there. Scientific cameras extract more information about space than the human eye can perceive, and therefore use a wider set of color filters:

Sensor and filter drum of the OSIRIS instrument on Rosetta (c) MPS

Using a filter for near-infrared light, which is invisible to the eye, in place of red is what made Mars appear red in many of the images that reached the media. Not every reprint carried the explanation about the infrared range, which spawned a separate debate that we also covered in the article "What color is Mars."

However, the Curiosity rover has a Bayer filter, which allows it to shoot in colors familiar to our eyes, although a separate set of color filters is also included with the camera.

Filters on the mast camera of the Curiosity rover (c) NASA/JPL-Caltech/MSSS

Using individual filters is more convenient for selecting the light ranges in which you want to examine an object. But if that object moves quickly, its position changes between the pictures taken in different ranges. In Elektro-L footage this was noticeable in fast-moving clouds, which managed to shift in the few seconds it took the satellite to change filters. On Mars, the same thing happened when the Spirit and Opportunity rovers, which have no Bayer filters, photographed sunsets:

Sunset taken by Spirit on Sol 489. Overlay of images taken with 753, 535 and 432 nanometer filters. (c) NASA/JPL/Cornell

On Saturn, Cassini has similar difficulties:

Saturn's moons Titan (behind) and Rhea (front) in Cassini images (c) NASA/JPL-Caltech/Space Science Institute

At the Lagrange point, DSCOVR faces the same situation:

To get a beautiful photo from this shoot suitable for distribution in the media, you have to work in an image editor.
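The ghosting that sequential filters produce is easy to demonstrate with a toy sketch (not real mission data): a bright point drifts one pixel between three consecutive exposures, and the naive composite shows three offset colored ghosts instead of one object.

```python
import numpy as np

# three exposures taken one after another through red, green and
# blue filters; the bright feature drifts one pixel each time
red   = np.array([[1.0, 0.0, 0.0, 0.0, 0.0]])
green = np.array([[0.0, 1.0, 0.0, 0.0, 0.0]])
blue  = np.array([[0.0, 0.0, 1.0, 0.0, 0.0]])

# stacking the frames as R, G, B channels: the peaks land on
# different pixels, producing colored fringes around anything
# that moved between exposures
composite = np.dstack([red, green, blue])
```

Fixing this in an image editor means shifting the channels back into registration before compositing, which is exactly the kind of manual work the text describes.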

There is another physical factor that not everyone knows about: black-and-white photographs have higher resolution and clarity than color ones. These are so-called panchromatic images, which collect all the light reaching the camera without cutting any of it off with filters. That is why many "long-range" satellite cameras shoot only in panchromatic mode, which for us means black-and-white footage. The LORRI camera on New Horizons is one such camera, as is the NAC camera on the LRO lunar satellite. Indeed, virtually all telescopes shoot panchromatically unless special filters are used. (This is where "NASA is hiding the true color of the Moon" comes from.)

A multispectral “color” camera, equipped with filters and having a much lower resolution, can be attached to a panchromatic one. At the same time, its color photographs can be superimposed on panchromatic ones, as a result of which we obtain high-resolution color photographs.

While the rest of the world watches and waits for news about Starman (the SpaceX mannequin, dressed in the company's newly developed spacesuit and seated in the driver's seat of a Tesla Roadster heading toward Mars), NASA has published the most distant space photograph in the history of mankind, taken by the New Horizons spacecraft. At the time the photo was taken (December 5, 2017), the probe was 6.12 billion kilometers from Earth.

In addition to the distance record, the New Horizons photos have other amazing features. The station managed to image several objects in the Kuiper belt, located at a distance of 55 astronomical units from Earth, beyond the orbit of Neptune. The belt consists of small cosmic bodies and accumulations of various substances, such as ice, ammonia and methane.

Recall that one astronomical unit equals 149.6 million kilometers - the distance from the Earth to the Sun. Thus, the objects New Horizons managed to photograph lie more than eight billion kilometers from us. In particular, the station, moving toward its main target - the Kuiper belt object 2014 MU69 - managed to obtain false-color images of two Kuiper belt objects, 2012 HZ84 and 2012 HE85.

Kuiper belt objects 2012 HZ84 (left) and 2012 HE85 (right)
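The distance arithmetic above is easy to verify in a couple of lines:

```python
AU_KM = 149.6e6          # kilometers in one astronomical unit
distance_au = 55         # distance of the imaged Kuiper belt region
distance_km = distance_au * AU_KM
print(f"{distance_km / 1e9:.2f} billion km")   # → 8.23 billion km
```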

On the same day, but two hours earlier, the device took another photograph. This time the object for the image was a more distant target - the Wishing Well star cluster (NGC 3532).

Wishing Well Star Cluster (NGC 3532)

From 2015 to 2016, the spacecraft captured an entire photoset of detailed images of the dwarf planet Pluto, giving astronomers another opportunity to study and analyze the surface of this celestial body at an unprecedented new level of detail.

It should be noted that New Horizons is far from the first craft to travel so far from Earth. Before it came the Voyager 1 and 2 probes, as well as Pioneer 10 and 11. However, New Horizons is the only one of them whose camera is still operational. The probe is currently in hibernation mode, moving toward its main mission target. Scientists expect that in 2019 it will be able to image the planetoid 2014 MU69, which lies about 1.6 billion kilometers beyond Pluto.