Simultaneous alternative views

Embedded vision systems are providing more opportunities for machines to see the world the same way our eyes see it (and our brains interpret it), but variants of these technologies are also enabling systems to see things in ways we can’t.

Imec has just announced a new “hyperspectral” camera system for use in medical and industrial inspection systems – or anywhere specific filters are needed to reveal particular characteristics of whatever is being viewed. In such situations, simply looking at one band of light may not be enough; a complement of filters may be needed either to provide a signature or to evaluate multiple characteristics at the same time.

One way this is done now is to take separate images, each through a different filter, time-domain multiplexed. That divides the effective frame rate by the number of filters. The new approach delivers full-frame-rate images through all filters simultaneously.
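As a rough back-of-the-envelope illustration of that frame-rate penalty (the numbers below are made up for the sketch, not imec’s specs):

```python
# Toy comparison of time-multiplexed vs. simultaneous filtering.
# Sensor frame rate and filter count are illustrative only.

sensor_fps = 120      # frames per second the bare sensor can deliver
num_filters = 16      # number of spectral bands wanted per scene

# Time-domain multiplexing: one filter per frame, cycled in sequence.
fps_multiplexed = sensor_fps / num_filters

# Tiled (snapshot) approach: every band is captured in every frame.
fps_snapshot = sensor_fps

print(f"per-band rate, multiplexed: {fps_multiplexed:.1f} fps")
print(f"per-band rate, snapshot:    {fps_snapshot:.1f} fps")
```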

There are actually two versions of this, which imec calls “snapshot” and “linescan.” We’ll look at snapshot first, as this is particularly new. It’s intended for image targets that are either stationary or moving in a random way (or, more specifically, not moving in a line as if on a conveyor belt). The imaging chip is overlaid with tiles, each of which is a different filter.
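A minimal sketch of how software might carve per-band sub-images out of such a tiled sensor frame; the sensor size and 4×4 tile grid are assumptions for illustration, not imec’s actual layout.

```python
import numpy as np

# Hypothetical sensor: 2048 x 2048 pixels overlaid with a 4 x 4 grid of
# filter tiles, so each tile sees the scene through one spectral band.
SENSOR_H, SENSOR_W = 2048, 2048
TILES_Y, TILES_X = 4, 4
TILE_H, TILE_W = SENSOR_H // TILES_Y, SENSOR_W // TILES_X

def split_into_bands(frame: np.ndarray) -> np.ndarray:
    """Return an array of shape (bands, TILE_H, TILE_W), one image per filter tile."""
    bands = []
    for ty in range(TILES_Y):
        for tx in range(TILES_X):
            tile = frame[ty * TILE_H:(ty + 1) * TILE_H,
                         tx * TILE_W:(tx + 1) * TILE_W]
            bands.append(tile)
    return np.stack(bands)

# Example with a synthetic frame.
frame = np.random.randint(0, 4096, size=(SENSOR_H, SENSOR_W), dtype=np.uint16)
cube = split_into_bands(frame)
print(cube.shape)   # (16, 512, 512): 16 bands, each at one tile's resolution
```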

The filters are the last masked layers of the chip – this is monolithic integration, not assembly after the fact. Each filter consists of two reflecting surfaces with a cavity between them; the size of the cavity determines the frequency that passes. This means that the final chip will actually have a non-planar surface because of the different cavity sizes – and therefore different heights – of the filter layer. Because these are the last layers to be processed, base wafers can conveniently be staged and then finished with custom filter patterns.
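The description matches a classic Fabry-Pérot interference filter, whose transmission peaks at normal incidence satisfy m·λ = 2·n·d for cavity thickness d. A small sketch of that relation, with illustrative cavity sizes rather than imec’s actual dimensions:

```python
def fabry_perot_peaks(cavity_nm: float, n_index: float = 1.0, max_order: int = 5):
    """Transmission-peak wavelengths (nm) of an idealized Fabry-Perot cavity
    at normal incidence: m * wavelength = 2 * n * d."""
    return [2 * n_index * cavity_nm / m for m in range(1, max_order + 1)]

# Illustrative cavity sizes (nm) only.
for d in (300, 350, 400):
    peaks = fabry_perot_peaks(d)
    print(f"cavity {d} nm -> first-order peak {peaks[0]:.0f} nm, "
          f"second-order {peaks[1]:.0f} nm")
```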

The camera lens then directs, or duplicates, the entire scene onto every tile. Perhaps it’s better to think of it as an array of lenses, much like a crude plenoptic lens. This gives every filter the full image at the same time; the tradeoff, compared with time-multiplexing filters over a single lens, is that each band gets the resolution of one tile of the imaging chip, not the entire imaging chip.
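To make that tradeoff concrete (again with made-up numbers): the spatial resolution available per band is the full sensor divided by the tile count, whereas time multiplexing keeps full resolution but pays in frame rate.

```python
# Illustrative numbers only.
sensor_pixels = (2048, 2048)
tile_grid = (4, 4)          # 16 filter tiles

per_band_snapshot = (sensor_pixels[0] // tile_grid[0],
                     sensor_pixels[1] // tile_grid[1])
per_band_multiplexed = sensor_pixels   # full sensor, but at 1/16 the frame rate

print("snapshot per-band resolution:   ", per_band_snapshot)     # (512, 512)
print("multiplexed per-band resolution:", per_band_multiplexed)  # (2048, 2048)
```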

A camera optimized for linescan applications doesn’t use the plenoptic approach; instead, the tiles are made very small and, as the image moves under the camera at a known rate, multiple low-resolution images are captured and then stitched back together using computational photography techniques.
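A toy sketch of the stitching idea: if the object advances by exactly one strip height per frame (a simplification of the real calibration problem, assumed here for illustration), each band’s strips can simply be re-assembled in capture order.

```python
import numpy as np

# Toy linescan reconstruction. Assumes the object moves one strip height
# per frame -- illustrative, not how the real system is calibrated.
STRIP_H, WIDTH = 8, 640      # height of one filter strip, image width
NUM_FRAMES = 50              # frames captured as the object passes

def stitch_band(strips: list[np.ndarray]) -> np.ndarray:
    """Concatenate successive strips of one spectral band into a full image."""
    return np.vstack(strips)

# Simulate capture: one strip of this band per frame.
strips = [np.random.randint(0, 4096, size=(STRIP_H, WIDTH), dtype=np.uint16)
          for _ in range(NUM_FRAMES)]
band_image = stitch_band(strips)
print(band_image.shape)      # (400, 640): 50 strips of 8 rows each
```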

The filter and lensing design together determine the bandwidth characteristics: the filters can be made with wider or narrower passbands, and with or without gaps between them, giving anything from continuous coverage to a set of discrete lines. A collimating lens directs light onto the filters in parallel rays, yielding a narrow passband; a lens that delivers a cone of light gives a wider passband, since rays arriving at different angles interact differently with the filter cavities. The aperture size then acts as a bandwidth knob.
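One way to see why the aperture acts as a bandwidth knob: a cavity filter’s transmission peak shifts with the angle of incidence, so a cone of rays smears the peak over a range of wavelengths. The sketch below treats the cavity as air-filled, so the peak scales roughly with cos(θ) – a simplification, with illustrative numbers.

```python
import math

def peak_shift_range(center_nm: float, cone_half_angle_deg: float) -> tuple[float, float]:
    """Approximate spread of an air-cavity filter's transmission peak when light
    arrives as a cone of rays: the peak at angle theta scales roughly as cos(theta)."""
    theta = math.radians(cone_half_angle_deg)
    return center_nm * math.cos(theta), center_nm

# Narrow cone (well-collimated light) vs. wide cone (large aperture).
for half_angle in (2, 10, 20):
    lo, hi = peak_shift_range(600.0, half_angle)
    print(f"+/-{half_angle:2d} deg cone: peak smeared over {hi - lo:.1f} nm "
          f"({lo:.1f}-{hi:.1f} nm)")
```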

Imec has put together a development kit that allows designers to figure out which filters to use for a given linescan application; they’ll be providing one for snapshot cameras as well. Each filter configuration is likely to be very specific to its application, making this something of a low-volume business for now. Because of that, and in order to grow the market, imec will actually be open for commercial production of these systems.

You can find more details in their release.
