
FPGA Saves the World (Sort of), Part 2

On September 26, 2022, the DART (Double Asteroid Redirection Test) spacecraft slammed into the moonlet/asteroid Dimorphos and vaporized itself into an expanding plume of atomized debris. Fifteen days prior, DART had released LICIAcube – the Light Italian CubeSat for Imaging Asteroids – which was hitchhiking along for the ride. However, LICIAcube wasn’t a joyrider. It had an important mission: to capture images of DART’s impact with Dimorphos and subsequent vaporization from a safe distance – about 50 kilometers from Dimorphos – and then to image the other, unseen side of Dimorphos after the impact. DART would never see that far side. LICIAcube would then continue to cruise the Solar System for the remainder of its estimated 6-month lifespan, or however long it lasts. The DART spacecraft also captured and transmitted images of Dimorphos, right up to the moment it transformed itself into a bright debris cloud.

LICIAcube is the first Italian spacecraft to reach deep space. It’s operated by the Italian Space Agency (ASI) with participation from the National Institute of Astrophysics and the Universities of Bologna and Milan. Argotec, an Italian aerospace company, designed, integrated, and tested the CubeSat prior to its launch. LICIAcube is a 6U CubeSat, where one “U” measures 10 by 10 by 11.35 cm, so LICIAcube measures roughly 10 by 20 by 30 cm (before it spreads its two photovoltaic solar arrays) and masses 14 kg (about 31 pounds). That means LICIAcube is small and runs on relatively little power.

The small CubeSat carries two on-board cameras, named LUKE (LICIACube Unit Key Explorer) and LEIA (LICIACube Explorer Imaging for Asteroid). LUKE is a more conventional color imaging camera with an RGB Bayer pattern, an infrared filter, and a wide 5° field of view. LEIA is a monochrome camera with a narrower 2.06° field of view. LUKE produces a 2048×1088-pixel color image, and LEIA produces either a 2048×2048- or a 1024×1024-pixel monochrome image, depending on a compression setting. (With the obvious references to the original 1977 “Star Wars” movie, “LICIAcube’s cameras must be able to discern between a moon[let] and a space station,” said spokesperson General Ben “Obi Wan” Kenobi.)

Both of LICIAcube’s cameras employ an AMS CMV4000 CMOS imaging sensor with a maximum 2048×2048-pixel resolution and a global electronic shutter. The LEIA camera is designed to observe details of the impact plume created when the DART spacecraft slams into Dimorphos. LEIA’s lack of a color filter allows the camera to use the CMOS imager’s full resolution to image the asteroid. LUKE’s larger field of view allows it to fully image DART’s impact plume during LICIAcube’s flyby, and the camera’s RGB Bayer filter will help scientists analyze the moonlet’s and impact plume’s physical properties. At least, that’s the hope.

Because LICIAcube is a small satellite with limited power, it has a low-bandwidth link to Earth. That means it will take some time for the images it captured during DART’s close encounter of the terminal kind with Dimorphos to reach Earth. LICIAcube sends data back from 38 light-seconds away, using its integrated patch antennas to aim a bitstream at NASA’s Deep Space Network on a distant blue marble named Earth. At best, LICIAcube’s X-band radio link can handle 256 kilobits/sec. NASA and ASI received LICIAcube’s first two images shortly after DART’s impact and subsequent vaporization. The rest of the captured images will stream back over the next several days and weeks.
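A quick back-of-the-envelope calculation shows why those images trickle back slowly. This sketch assumes uncompressed 12-bit frames and the best-case 256 kilobit/sec figure quoted above; the actual downlink budget and compression settings are assumptions here, not published specs.

```python
# Rough downlink-time estimate for LICIAcube images, assuming
# uncompressed frames at the best-case 256 kbit/s X-band rate.
# The 12-bit pixel depth is an illustrative assumption.

LINK_RATE_BPS = 256_000  # best-case X-band downlink, bits/second

def downlink_seconds(width, height, bits_per_pixel, rate_bps=LINK_RATE_BPS):
    """Seconds needed to transmit one raw image at the given link rate."""
    return width * height * bits_per_pixel / rate_bps

# One full-resolution LEIA frame (2048 x 2048, assumed 12 bits/pixel)
leia = downlink_seconds(2048, 2048, 12)
# One LUKE color frame (2048 x 1088, assumed 12 bits/pixel)
luke = downlink_seconds(2048, 1088, 12)

print(f"LEIA frame: ~{leia / 60:.1f} minutes")  # ~3.3 minutes
print(f"LUKE frame: ~{luke / 60:.1f} minutes")  # ~1.7 minutes
```

Even under these optimistic assumptions, a single full-resolution frame ties up the link for minutes, which is why the full image set takes days and weeks to arrive.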

This is a novel use for a CubeSat. Most of the CubeSats I am familiar with are designed to operate in low- or near-Earth orbits, not 6 million miles (10 million kilometers) away. LICIAcube‘s mission takes it 24 times as far away from Earth as Earth’s moon. Because of this extreme distance and the resulting 38 light-second delay, LICIAcube must navigate and operate autonomously. That’s a tall order for a small computer embedded in a very small spacecraft.

LICIAcube’s computer is an Argotec HAWK, which contains an Argotec Fermi OBC (on-board computer) that’s based on a CAES GR712RC dual-core, radiation-hardened LEON3FT SPARC V8 Processor with a fault-tolerant memory controller and 256 Mbytes of EDAC (error detection and correction) SDRAM.
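The EDAC SDRAM mentioned above detects and corrects the bit flips that radiation causes in memory. A minimal Hamming(7,4) code illustrates the principle; real EDAC memory controllers use wider SECDED codes, so this is a toy sketch of the idea, not the GR712RC's actual scheme.

```python
# Minimal Hamming(7,4) encode/correct, a toy stand-in for the wider
# SECDED codes that radiation-tolerant EDAC memory controllers use.
# Codeword positions are 1-indexed; parity bits sit at positions 1, 2, 4.

def encode(nibble):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit Hamming codeword."""
    d1, d2, d3, d4 = nibble
    p1 = d1 ^ d2 ^ d4          # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def correct(word):
    """Return (corrected codeword, flipped position or 0 if clean)."""
    w = list(word)
    s1 = w[0] ^ w[2] ^ w[4] ^ w[6]
    s2 = w[1] ^ w[2] ^ w[5] ^ w[6]
    s3 = w[3] ^ w[4] ^ w[5] ^ w[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # syndrome = 1-indexed error position
    if syndrome:
        w[syndrome - 1] ^= 1         # flip the corrupted bit back
    return w, syndrome

cw = encode([1, 0, 1, 1])
cw[5] ^= 1                           # simulate a single-event upset
fixed, pos = correct(cw)
print(fixed == encode([1, 0, 1, 1]), pos)  # True 6
```

A single upset anywhere in the codeword produces a nonzero syndrome that points directly at the flipped bit, so the controller can repair it on the fly without software intervention.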

Like the DART spacecraft, LICIAcube employs imaging systems built around a hardware image-processing pipeline. As with DART, that pipeline is implemented in an FPGA – the same radiation-tolerant Microchip RTG4 used for DART’s image processing. The FPGA resides on a companion board plugged into the Argotec HAWK computer.

LICIAcube’s imaging pipeline, instantiated in the Microchip RTG4 FPGA, implements a set of image-processing algorithms including:

  •  Pixel binning, or image downsampling from 1024×1024 to 512×512 pixels at 12 bits/pixel
  •  2D convolutional low-pass filtering
  •  Histogram generation based on pixel luminance, with thresholding
  •  Binarization, to create a 1-bit/pixel image
  •  Object detection, which involves labeling regions and identifying objects in the image
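The hardware stages above can be sketched in software. The real design is FPGA logic in the RTG4, so everything here – function names, kernel choices, the wrap-around edge handling – is an illustrative NumPy analogue, not the flight implementation.

```python
import numpy as np

# Software sketch of the listed FPGA pipeline stages. All names and
# parameter choices here are illustrative assumptions.

def bin2x2(img):
    """2x2 pixel binning: average each 2x2 block (1024x1024 -> 512x512)."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def lowpass3x3(img):
    """2D convolution with a 3x3 box kernel (a simple low-pass filter).
    np.roll wraps at the edges, which is fine for a sketch."""
    out = np.zeros_like(img, dtype=float)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return out / 9.0

def binarize(img, threshold):
    """Threshold luminance to a 1-bit/pixel image."""
    return (img >= threshold).astype(np.uint8)

frame = np.random.default_rng(0).integers(0, 4096, (1024, 1024))  # 12-bit
small = bin2x2(frame)                       # binning stage
smooth = lowpass3x3(small)                  # low-pass filter stage
hist, _ = np.histogram(smooth, bins=64)     # luminance histogram stage
mask = binarize(smooth, smooth.mean())      # binarization (crude threshold)
print(small.shape, mask.dtype, int(mask.max()) <= 1)  # (512, 512) uint8 True
```

In the FPGA, these stages run as streaming logic on pixels as they arrive from the sensor; the batch-processing style above is purely for readability.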

In addition, software running on the HAWK computer’s LEON3 processor performs feature extraction of the identified objects and finds the centroid of each labeled object.
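That software stage – labeling connected regions of a binary image and finding each object's centroid – can be sketched in a few dozen lines. This is a simplified toy version, not the flight code; a 4-connected flood fill and a mean-position centroid are assumptions chosen for clarity.

```python
from collections import deque

# Toy version of the software stage: label connected regions of a
# binary image, then compute each labeled object's centroid.

def label_regions(mask):
    """4-connected labeling via breadth-first flood fill.
    Returns (label map, number of labels)."""
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not labels[y][x]:
                count += 1
                queue = deque([(y, x)])
                labels[y][x] = count
                while queue:
                    cy, cx = queue.popleft()
                    for ny, nx in ((cy-1, cx), (cy+1, cx),
                                   (cy, cx-1), (cy, cx+1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny][nx] and not labels[ny][nx]):
                            labels[ny][nx] = count
                            queue.append((ny, nx))
    return labels, count

def centroids(labels, count):
    """Mean (row, col) position of each labeled object."""
    sums = {k: [0, 0, 0] for k in range(1, count + 1)}  # sy, sx, n
    for y, row in enumerate(labels):
        for x, k in enumerate(row):
            if k:
                sums[k][0] += y
                sums[k][1] += x
                sums[k][2] += 1
    return {k: (sy / n, sx / n) for k, (sy, sx, n) in sums.items()}

mask = [[1, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 1]]
lab, n = label_regions(mask)
print(n, centroids(lab, n))  # finds 2 objects and their centers
```

Splitting the work this way plays to each component's strength: the FPGA handles the regular, per-pixel operations at line rate, while the LEON3 handles the irregular, data-dependent bookkeeping of labels and centroids.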

The first images that LICIAcube sent to Earth shortly after the DART spacecraft collided with Dimorphos show the asteroid and the impact plume:

LICIAcube’s first images sent to Earth show the asteroid Dimorphos (center right) and the impact plume, which is all that remains of the DART spacecraft (center top). Image credit: ASI/NASA

LICIAcube’s images complement the pre-impact images taken by the DART spacecraft’s DRACO imager (See “FPGA Saves the World (Sort of)”) and images taken by the Hubble Space Telescope (HST) from Earth orbit (see “Losing Hubble – Saving Hubble”) and the James Webb Space Telescope (JWST) from its semi-stable orbit one million miles from Earth. The Didymos/Dimorphos binary asteroid system was chosen because it passes Earth orbit closely enough to be observed in detail by multiple telescopes including HST and JWST.

The combined testimony of these spacecraft and telescope images should significantly advance our understanding of asteroid composition, but that was not the main mission goal. As Part 1 of this article explained, the DART mission attempted to deflect the Dimorphos moonlet/asteroid’s trajectory as it orbits Didymos using kinetic impact alone, without explosives. DART’s impact did not produce the dramatic imagery of Hollywood’s “Armageddon,” which employed a nuclear weapon to destroy a rogue asteroid, nor did it involve Bruce Willis or a pounding, award-winning Aerosmith soundtrack. Instead, the DART mission took a far less theatrical approach, redirecting the asteroid rather than destroying it. Curiously, at about a third of a billion dollars, NASA’s actual, real-life DART mission cost less than three times the $140 million needed to make the movie some 25 years ago.

It will take a few weeks to collect and fully analyze all the images obtained from the DART spacecraft, LICIAcube, HST, JWST, and the other earthbound telescopes that observed the impact event. However, on October 11, NASA confirmed that the spacecraft’s impact altered Dimorphos’ orbit around Didymos by 32 minutes, shortening the 11-hour, 55-minute orbit to 11 hours and 23 minutes. Before the impact, NASA defined the minimum successful change in Dimorphos’ orbital period as at least 73 seconds, so DART appears to have surpassed that benchmark by more than 25x.
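The quoted numbers check out; a few lines of arithmetic confirm both the period change and the margin over NASA's 73-second goal:

```python
# Sanity check of the quoted figures: orbital period change,
# and how far it exceeds the 73-second success threshold.

before_s = 11 * 3600 + 55 * 60   # pre-impact period: 11 h 55 min, in seconds
after_s  = 11 * 3600 + 23 * 60   # post-impact period: 11 h 23 min
change_s = before_s - after_s    # 1920 seconds

print(change_s // 60, "minutes")           # 32 minutes
print(round(change_s / 73, 1), "x goal")   # 26.3 x goal
```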

 
