Metaspectral Takes AI Processing of Hyperspectral Imaging by Storm

Do you know any jokes suitable for young kids? The reason I ask is that there are three girls aged 8, 7, and 3 living next door to me. Whenever they see me pottering around outside, the two older girls race over to tell me their latest and greatest jokes and puns (the youngest one toddles along behind to make sure I haven’t forgotten that she’s three fingers old).

Of course, this means I need to have some jokes of my own to tell in return. Furthermore, these jokes need to be a tad more subtle than my usual material (i.e., any jokes The Muppet Show refused). The sort of thing we’re looking at here is:

      Question: Can a kangaroo jump higher than your house?
      Answer: Of course it can because your house can’t jump!

      Question: What did the banana say to the dog?
      Answer: Nothing—bananas can’t talk!

      Question: What time is it if an elephant sits on mommy’s car?
      Answer: It’s time for mommy to get a new car!

Thank you, thank you. Tell your friends. I’ll be playing here all week. The problem is that I need at least two new jokes a day and tracking down a constant stream of suitable material (and memorizing it) is exhausting, so if you have any suitable suggestions, please post them in the comments below.

Speaking of hyperspectral imaging (we weren’t, but we are now), the idea here is to use a special camera to obtain the spectrum for each pixel in the image of a scene with the goal of finding objects, identifying materials, or detecting processes.
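Just to make this “spectrum per pixel” idea a little more concrete, here’s a minimal Python sketch (all of the dimensions and values below are invented for illustration; real cubes come from real cameras, of course). Where a regular image stores three numbers per pixel, a hyperspectral cube stores a whole spectrum:

    import numpy as np

    # A hyperspectral "cube": height x width x spectral bands.
    # A regular RGB image would carry just 3 bands; here we assume
    # 224 narrow bands spanning the visible and near-infrared.
    H, W, BANDS = 512, 640, 224
    cube = np.random.rand(H, W, BANDS).astype(np.float32)  # stand-in for real data

    # Every pixel carries a full spectrum, not just three numbers.
    row, col = 100, 200
    spectrum = cube[row, col, :]  # shape: (224,)
    print(spectrum.shape)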

The camera sensors we typically employ for most tasks, including machine vision applications, detect only red, green, and blue light. This is due to the fact that, as I discussed in my paper on the Evolution of Color Vision, most people are trichromats, which means our eyes boast three different types of color receptors that we use to perceive what we think of as the “visible light” portion of the electromagnetic spectrum.

These three types of receptors allow us to detect long wavelengths (perceived as red), medium wavelengths (perceived as green), and short wavelengths (perceived as blue). Interestingly enough, a very small percentage of female humans are classed as being tetrachromats because they have four different types of color receptor cells in their eyes.

I think my wife (Gina the Gorgeous) must be one of their number because—whenever we find ourselves decorating the house—she will brandish bunches of color swatches under my nose demanding to know which option I prefer, even though all the members of a swatch appear identical to me. If I’m foolish enough to say this (rather than simply picking one at random and saying “this is by far the best of the bunch”), she will roll her eyes and proceed to tell me the names and differentiating qualities of each item. This explains why I can now inform guests that we have “gray with a sniff of lavender” in the dining room and “gray with a hint of seafoam” in the study (they all look like “gray with a hint of gray” to me).

As opposed to regular cameras, hyperspectral imagers expand the range of the spectrum being considered, typically into the infrared (IR), but sometimes into the ultraviolet (UV). According to the Wikipedia:

Engineers build hyperspectral sensors and processing systems for applications in astronomy, agriculture, molecular biology, biomedical imaging, geosciences, physics, and surveillance. Hyperspectral sensors look at objects using a vast portion of the electromagnetic spectrum. Certain objects leave unique ‘fingerprints’ in the electromagnetic spectrum. Known as spectral signatures, these ‘fingerprints’ enable identification of the materials that make up a scanned object. For example, a spectral signature for oil helps geologists find new oil fields.
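How does one actually match a pixel’s spectrum against one of these “fingerprints”? A classic, generic technique is the spectral angle mapper (SAM), which compares the angle between a pixel’s spectrum and each reference signature in a library. Here’s a minimal sketch; I hasten to add that the signature values below are invented for illustration and bear no relation to real materials:

    import numpy as np

    def spectral_angle(pixel, reference):
        """Angle (radians) between two spectra; smaller means a better match."""
        cos_theta = np.dot(pixel, reference) / (
            np.linalg.norm(pixel) * np.linalg.norm(reference))
        return np.arccos(np.clip(cos_theta, -1.0, 1.0))

    # Invented 5-band library signatures (illustration only).
    library = {
        "oil":        np.array([0.10, 0.12, 0.30, 0.55, 0.60]),
        "vegetation": np.array([0.05, 0.08, 0.45, 0.50, 0.20]),
        "water":      np.array([0.02, 0.03, 0.02, 0.01, 0.01]),
    }

    pixel = np.array([0.09, 0.11, 0.32, 0.52, 0.58])  # unknown pixel spectrum
    best = min(library, key=lambda name: spectral_angle(pixel, library[name]))
    print(best)  # -> oil

One nice property of comparing angles rather than distances is that the match is largely insensitive to overall brightness (a shadowed pixel points in the same spectral direction as a sunlit one).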

One of the great things about being me (apart from being outrageously handsome, a trendsetter, and a leader of fashion, of course) is that I’ve been around a long time, as part of which I’ve been fortunate enough to meet a bunch of interesting people. In turn, of course, they’ve been lucky enough to meet me, but there’s no point in our dwelling on the obvious.

For example, one of my friends is Adam Taylor, who is the founder of Adiuvo Engineering and Training. In 2015, the interplanetary space probe called New Horizons became the first spacecraft to fly by Pluto (maybe we should say “first human spacecraft” just to cover all the bases). At that time, Adam was the Chief Engineer, Electrical Systems, at the UK imaging company e2v (now Teledyne e2v), whose imaging sensors were used to take the glorious photos of Pluto and its moon Charon.

Pluto (front) and Charon (rear) (Source: NASA/JHUAPL/SWRI)

In fact, e2v provided two of the imaging sensors featured in New Horizons. The Long-Range Reconnaissance Imager (LORRI) system employed a 1024 × 1024 pixel by 12-bits-per-pixel monochromatic CCD imager. Meanwhile, the Ralph telescope, which was the one that took the pictures that kept us riveted to our seats, featured a 5K × 32-line hyperspectral imager. This super sensor includes two panchromatic (black and white) lines that are sensitive to all wavelengths of visible light; two visible light color lines (red and blue, from which green can be inferred when used in conjunction with the panchromatic data); two lines that are sensitive to infrared; and one line that can be used to detect methane.
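In case you’re wondering how the green can be “inferred,” the idea (in grossly simplified form) is that if the panchromatic line responds to the combined red, green, and blue energy, then subtracting the red and blue measurements leaves an estimate of the green. The following toy sketch assumes an idealized linear model of my own devising; the real instrument calibration involves measured spectral response curves and is far more involved:

    import numpy as np

    # Toy model: assume the panchromatic line responds to the sum of the
    # red, green, and blue energy at each pixel. The real Ralph calibration
    # uses measured spectral response curves; this is illustration only.
    red  = np.array([0.20, 0.35, 0.10])   # three pixels' worth of red
    blue = np.array([0.15, 0.10, 0.40])   # ...and of blue
    pan  = np.array([0.60, 0.70, 0.65])   # panchromatic = R + G + B (idealized)

    green_inferred = pan - red - blue
    print(green_inferred)  # -> [0.25 0.25 0.15]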

Adam introduced me to Paul Jerram, Engineering Manager, Sensors, Space Imaging at e2v, and Steve Bowring, Principal Engineer, Design, Space Imaging at e2v. Paul and Steve were the ones who designed the aforementioned sensors, and it was awesome to hear them tell how they felt when they finally got to see the images captured by their creations. The following year, in 2016, I got to meet Alice Bowman, who was the Mission Operations Manager (MOM) for the New Horizons space probe, but that’s a story for another day.

All of the above (well, apart from the elephant, kangaroo, and banana jokes) sets the scene for the fact that I was recently chatting with Migel Tissera, who is Co-founder and CTO at Metaspectral. My brain is still spinning on its gimbals as a result of all the things Migel told me, but I’ll try to condense things into an understandable form as follows.

First, Metaspectral is a small, feisty company at the forefront of real-time AI-powered hyperspectral data analysis. They’ve built a complete technology stack that starts with hardware in the form of an edge board that connects to any hyperspectral camera on the market (they are totally camera-agnostic) and makes the camera discoverable over any computer network. On the software side, in addition to state-of-the-art hyperspectral-tuned lossless compression algorithms, they have super-advanced deep-learning algorithms that can analyze hyperspectral data at a sub-pixel level, thereby allowing them to “go deeper into each pixel to figure out the material composition of each pixel.”
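Metaspectral hasn’t published the details of how their deep-learning models perform this sub-pixel analysis, but the classic baseline technique for estimating the material composition of a pixel is linear spectral unmixing: model each pixel’s spectrum as a weighted mixture of pure material spectra (known as endmembers) and solve for the weights. Here’s a minimal sketch using SciPy’s non-negative least squares; the endmember spectra are invented for illustration:

    import numpy as np
    from scipy.optimize import nnls

    # Invented 5-band "pure material" spectra (endmembers), one per column.
    endmembers = np.array([
        [0.80, 0.70, 0.20, 0.10, 0.05],   # material A
        [0.10, 0.20, 0.60, 0.70, 0.30],   # material B
        [0.05, 0.05, 0.10, 0.30, 0.90],   # material C
    ]).T  # shape: (bands, materials)

    # A mixed pixel: 60% A, 30% B, 10% C, plus a whiff of sensor noise.
    pixel = endmembers @ np.array([0.6, 0.3, 0.1])
    pixel = pixel + np.random.normal(0.0, 0.005, size=pixel.shape)

    # Solve pixel ~ endmembers @ abundances, subject to abundances >= 0.
    abundances, _ = nnls(endmembers, pixel)
    abundances /= abundances.sum()  # normalize to fractions
    print(abundances)  # approximately [0.6, 0.3, 0.1]

The recovered abundances come out close to the 60/30/10 mix we baked in, noise notwithstanding.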

One example application is plastic recycling. Imagine a conveyor belt carrying lots of transparent plastic items (bottles, bags, “things”) that appear identical material-wise to the human eye or to a regular camera sensor. When fed by a hyperspectral camera, however, Metaspectral’s Fusion Platform can identify the composition—that is, the type(s) of plastic/polymer forming each object—on a pixel-by-pixel basis. The results can be used to control pick-and-sort robotic hands, for example.
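To give a feel for how per-pixel material labels might drive a pick-and-sort robot, here’s a toy sketch. I must stress that the polymer signatures, the nearest-signature classifier, and the majority vote below are all stand-ins of my own invention, not Metaspectral’s actual Fusion Platform logic:

    import numpy as np

    # Invented near-infrared signatures for two common packaging polymers.
    POLYMERS = {
        "PET":  np.array([0.55, 0.40, 0.25, 0.60, 0.35]),
        "HDPE": np.array([0.30, 0.65, 0.50, 0.20, 0.45]),
    }

    def classify_pixel(spectrum):
        """Nearest-signature label by Euclidean distance (toy classifier)."""
        return min(POLYMERS, key=lambda p: np.linalg.norm(spectrum - POLYMERS[p]))

    def sort_decision(object_pixels):
        """Vote over an object's pixels, then route the object to a bin."""
        labels = [classify_pixel(px) for px in object_pixels]
        winner = max(set(labels), key=labels.count)
        return "route to " + winner + " bin"

    # A bottle whose pixels mostly look like slightly noisy PET:
    bottle = [POLYMERS["PET"] + np.random.normal(0, 0.02, 5) for _ in range(50)]
    print(sort_decision(bottle))  # -> route to PET bin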

Some example applications for Metaspectral’s technology (Source: Metaspectral)

Other applications and markets include space in the form of hyperspectral Earth Observation (EO) analysis, precision agriculture and environmental monitoring, and security and defense with the ability to detect chemical, biological, radiological, and nuclear (CBRN) materials. 

There’s so much to wrap one’s brain around here that it can make your eyes water, like the “synthetic data generation processes used to train target detection neural networks,” for example (I’ve sketched the generic flavor of such a process below). The best I can suggest is that—if you are interested in learning more—you bounce over to Metaspectral’s website and feast your orbs on everything they have to offer. In the meantime, I would love to hear your thoughts on all of this (and your kid-friendly jokes, of course).
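Regarding that synthetic data generation quote, one common generic approach is to synthesize labeled training spectra by randomly mixing known signatures and adding illumination jitter and sensor noise. The sketch below is my own hedged guess at the general shape of such a process; I have no idea what Metaspectral actually does under the hood, and the signatures are invented:

    import numpy as np

    def synth_spectra(signatures, n, noise=0.01, rng=None):
        """Generate n labeled spectra by randomly mixing known signatures."""
        if rng is None:
            rng = np.random.default_rng(0)
        names = list(signatures)
        bank = np.stack([signatures[k] for k in names])  # (materials, bands)
        X, y = [], []
        for _ in range(n):
            w = rng.dirichlet(np.ones(len(names)))       # random abundances
            spectrum = w @ bank                          # linear mixture
            spectrum = spectrum * rng.uniform(0.8, 1.2)  # illumination jitter
            spectrum = spectrum + rng.normal(0, noise, bank.shape[1])  # noise
            X.append(spectrum)
            y.append(names[int(np.argmax(w))])           # dominant material
        return np.array(X), y

    sigs = {"target":     np.array([0.7, 0.2, 0.6, 0.1]),
            "background": np.array([0.2, 0.5, 0.3, 0.4])}
    X, y = synth_spectra(sigs, 1000)
    print(X.shape, y[:5])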

 

