
Metaspectral Takes AI Processing of Hyperspectral Imaging by Storm

Do you know any jokes suitable for young kids? The reason I ask is that there are three girls aged 8, 7, and 3 living next door to me. Whenever they see me pottering around outside, the two older girls race over to tell me their latest and greatest jokes and puns (the youngest one toddles along behind to make sure I haven’t forgotten that she’s three fingers old).

Of course, this means I need to have some jokes of my own to tell in return. Furthermore, these jokes need to be a tad more subtle than my usual material (i.e., any jokes The Muppet Show refused). The sort of thing we’re looking at here is:

      Question: Can a kangaroo jump higher than your house?
      Answer: Of course it can because your house can’t jump!

      Question: What did the banana say to the dog?
      Answer: Nothing—bananas can’t talk!

      Question: What time is it if an elephant sits on mommy’s car?
      Answer: It’s time for mommy to get a new car!

Thank you, thank you. Tell your friends. I’ll be playing here all week. The problem is that I need at least two new jokes a day and tracking down a constant stream of suitable material (and memorizing it) is exhausting, so if you have any suitable suggestions, please post them in the comments below.

Speaking of hyperspectral imaging (we weren’t, but we are now), the idea here is to use a special camera to obtain the spectrum for each pixel in the image of a scene with the goal of finding objects, identifying materials, or detecting processes.
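To make the “spectrum for each pixel” idea concrete, here’s a minimal sketch of the resulting hyperspectral “data cube.” The dimensions and the 224-band count are illustrative choices of mine, not figures from any particular camera:

```python
import numpy as np

# A hyperspectral "data cube": height x width spatial pixels, each carrying
# a full spectrum sampled across many narrow wavelength bands (224 here is
# just an illustrative choice). A regular RGB camera would have only 3 bands.
height, width, bands = 128, 128, 224
cube = np.random.rand(height, width, bands)  # stand-in for real sensor data

# Every pixel holds an entire spectrum, not just three RGB values
spectrum = cube[64, 64, :]
print(spectrum.shape)  # one 224-sample spectrum for a single pixel
```

Where an RGB image gives you three numbers per pixel, the cube gives you hundreds, and it’s that per-pixel spectrum that makes material identification possible.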

The camera sensors we typically employ for most tasks, including machine vision applications, detect only red, green, and blue light. That’s because, as I discussed in my paper on the Evolution of Color Vision, most people are trichromats, which means our eyes boast three different types of color receptors that we use to perceive what we think of as the “visible light” portion of the electromagnetic spectrum.

These three types of receptors allow us to detect long wavelengths (perceived as red), medium wavelengths (perceived as green), and short wavelengths (perceived as blue). Interestingly enough, a very small percentage of female humans are classed as being tetrachromats because they have four different types of color receptor cells in their eyes.

I think my wife (Gina the Gorgeous) must be one of their number because—whenever we find ourselves decorating the house—she will brandish bunches of color swatches under my nose demanding to know which option I prefer, even though all the members of a swatch appear identical to me. If I find myself foolish enough to say this (rather than simply picking one at random and saying “this is by far the best of the bunch”), she will roll her eyes and proceed to tell me the names and differentiating qualities of each item. This explains why I can now inform guests that we have “gray with a sniff of lavender” in the dining room and “gray with a hint of seafoam in the study” (they all look like “gray with a hint of gray” to me).

As opposed to regular cameras, hyperspectral cameras expand the range of the spectrum being captured, typically into the infrared (IR), but sometimes into the ultraviolet (UV). According to the Wikipedia:

Engineers build hyperspectral sensors and processing systems for applications in astronomy, agriculture, molecular biology, biomedical imaging, geosciences, physics, and surveillance. Hyperspectral sensors look at objects using a vast portion of the electromagnetic spectrum. Certain objects leave unique ‘fingerprints’ in the electromagnetic spectrum. Known as spectral signatures, these ‘fingerprints’ enable identification of the materials that make up a scanned object. For example, a spectral signature for oil helps geologists find new oil fields.

One of the great things about being me (apart from being outrageously handsome, a trendsetter, and a leader of fashion, of course) is that I’ve been around a long time, as part of which I’ve been fortunate enough to meet a bunch of interesting people. In turn, of course, they’ve been lucky enough to meet me, but there’s no point in our dwelling on the obvious.

For example, one of my friends is Adam Taylor, who is the founder of Adiuvo Engineering and Training. In 2015, the interplanetary space probe called New Horizons became the first spacecraft to fly by Pluto (maybe we should say “first human spacecraft” just to cover all the bases). At that time, Adam was the Chief Engineer, Electrical Systems, at the UK imaging company e2v (now Teledyne e2v), whose imaging sensors were used to take the glorious photos of Pluto and its moon Charon.

Pluto (front) and Charon (rear) (Source: NASA/JHUAPL/SWRI)

In fact, e2v provided two of the imaging sensors featured in New Horizons. The Long-Range Reconnaissance Imager (LORRI) system employed a 1024 × 1024 pixel by 12-bits-per-pixel monochromatic CCD imager. Meanwhile, the Ralph telescope, which was the one that took the pictures that kept us riveted to our seats, featured a 5K × 32-line hyperspectral imager. This super sensor includes two panchromatic (black and white) lines that are sensitive to all wavelengths of visible light; two visible light color lines (red and blue, from which green can be inferred when used in conjunction with the panchromatic data); two lines that are sensitive to infrared; and one line that can be used to detect methane.
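As a back-of-the-envelope sketch of how that “green can be inferred” trick might work: if we assume the panchromatic line’s response is roughly the sum of the red, green, and blue contributions (a big simplification of the real Ralph calibration, which I have no inside knowledge of), then green is whatever panchromatic light the red and blue lines don’t account for:

```python
# Hedged sketch: assumes panchromatic ~ red + green + blue, which is a
# simplification for illustration, not the actual Ralph calibration model.
def infer_green(pan, red, blue):
    """Estimate green as the panchromatic light that red and blue miss."""
    return max(pan - red - blue, 0)  # clamp so noise can't go negative

print(infer_green(200, 60, 50))  # -> 90
```

The appeal of this arrangement is that you get an extra usable color channel without spending an extra physical sensor line on it.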

Adam introduced me to Paul Jerram, Engineering Manager, Sensors, Space Imaging at e2v, and Steve Bowring, Principal Engineer, Design, Space Imaging at e2v. Paul and Steve were the ones who designed the aforementioned sensors, and it was awesome to hear them tell how they felt when they finally got to see the images captured by their creations. The following year, in 2016, I got to meet Alice Bowman, who was the Mission Operations Manager (MOM) for the New Horizons space probe, but that’s a story for another day.

All of the above (well, apart from the elephant, kangaroo, and banana jokes) sets the scene for the fact that I was recently chatting with Migel Tissera, who is Co-founder and CTO at Metaspectral. My brain is still spinning on its gimbals as a result of all the things Migel told me, but I’ll try to condense things into an understandable form as follows.

First, Metaspectral is a small, feisty company that is at the forefront of real-time AI-powered hyperspectral data analysis. They’ve built a complete technology stack that starts with hardware in the form of an edge board that connects to any hyperspectral camera on the market (they are totally camera agnostic), and that makes the camera discoverable over any computer network. On the software side, in addition to state-of-the-art hyperspectral-tuned lossless compression algorithms, they have super-advanced deep-learning algorithms that can analyze hyperspectral data at a sub-pixel level, thereby allowing them to “go deeper into each pixel to figure out the material composition of each pixel.”

One example application is plastic recycling. Imagine a conveyor belt carrying lots of transparent plastic items (bottles, bags, “things”) that appear identical material-wise to the human eye or to a regular camera sensor. When fed by a hyperspectral camera, however, Metaspectral’s Fusion Platform can identify the composition—that is, the type(s) of plastic/polymer forming each object—on a pixel-by-pixel basis. The results can be used to control pick-and-sort robotic hands, for example.
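The pixel-to-robot pipeline above can be sketched as follows. To be clear, the nearest-signature “classifier” and the 4-band polymer signatures here are stand-ins I invented for illustration; Metaspectral’s actual deep-learning models are far more sophisticated:

```python
import numpy as np

# Made-up 4-band polymer signatures (illustrative values only)
signatures = {
    "PET":  np.array([0.8, 0.6, 0.3, 0.1]),
    "HDPE": np.array([0.2, 0.5, 0.7, 0.9]),
}

def classify_pixel(spectrum):
    """Label a pixel with the nearest known polymer signature."""
    return min(signatures, key=lambda k: np.linalg.norm(spectrum - signatures[k]))

def sort_decision(object_pixels):
    """Majority vote over an object's pixels picks the robot's output bin."""
    labels = [classify_pixel(p) for p in object_pixels]
    return max(set(labels), key=labels.count)

# Two pixels belonging to one item on the conveyor belt
item = [np.array([0.79, 0.61, 0.31, 0.12]),
        np.array([0.82, 0.58, 0.29, 0.09])]
print(sort_decision(item))  # -> PET
```

Classifying per pixel and then voting per object is one simple way to turn a noisy pixel map into a single, actionable pick-and-sort command.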

Some example applications for Metaspectral’s technology (Source: Metaspectral)

Other applications and markets include space in the form of hyperspectral Earth Observation (EO) analysis, precision agriculture and environmental monitoring, and security and defense with the ability to detect chemical, biological, radiological, and nuclear (CBRN) materials. 

There’s so much to wrap one’s brain around here that it can make your eyes water, like the “synthetic data generation processes used to train target detection neural networks,” for example. The best I can suggest is that—if you are interested in learning more—you bounce over to Metaspectral’s website and feast your orbs on everything they have to offer. In the meantime, I would love to hear your thoughts on all of this (and your kid-friendly jokes, of course).

