
Smart Sensor Sees in 3D

Toposens Creates Ultrasound Sensor for Robots, Cars, and Robot Cars

“[Motion sensing hand dryers] never work, so I just end up looking like I’m waving hello to a wall robot.” – Jimmy Fallon

In school I was taught that a thermostat is a robot, inasmuch as it responds to its surroundings in real time. Years later, I was the head of engineering for a robot company that made big, scary machines with 500-pound arms. Those seemed like “real” robots to me. Now we can buy robots to vacuum under our furniture, look after our garden sprinklers, patrol grocery aisles, or just to play around with.

What all robots have in common is a need to perceive and respond to their environment. Without that sensory aspect, my teacher said, all you’ve got is a machine. Robots must be able to change their behavior based on external – and unexpected – inputs. That closed-loop feedback was the defining characteristic of a robot, in his view. 

“’Easy to use’ is easy to say,” as the GUI experts tell us. It’s one thing to throw sensors at a product and say it’s autonomous, but that understates the complexity of the problem. What are you sensing, exactly? How accurately? How quickly? How effectively? And, what do you do with the raw data the sensors produce? 

A German startup has tackled all those problems head-on and come up with a $100 device that should make robot-building a bit less daunting. 

Toposens is a 22-person company based in the leafy and gemütlich Bavarian capital of Munich. The founders didn’t start out to make sensors. Instead, they were trying to build a robotic fish. As you do. 

The problem with the fish (apart from waterproofing everything) was sensing its surroundings in three dimensions, underwater, in real time. Water can be a tough medium to work with because it distorts light and sound a lot more than air does. And fish can move in all directions. To make sense of the (ahem) streams of data flowing in from its sensors, the founders developed a unique algorithm to crunch the raw data. That took four years of effort but solved a tough problem of sensing in 3D. So much so that they abandoned the fish idea and instead developed a new 3D ultrasound sensor based around their sensor-fusion algorithm.

The result is TS3, a sensor module about the size of a pack of gum, with a UART port on one side. What makes TS3 different from other ultrasound sensors is that it works in three dimensions simultaneously. It has a single output transducer, but three receivers. More importantly, it also has built-in hardware to implement Toposens’s secret algorithm that combines the data from all three receivers into one unified image of the world outside. 

Typical ultrasound sensors are essentially one-dimensional, sending out a 40-kHz pulse and measuring the time it takes for the sound to reflect back to a receiver. This gives a “flat” view of the world, like looking through one eye. Such 1D sensors allow a home vacuum robot to avoid obstacles, yet they somehow still get stuck under chairs, eat power cords, and drive over discarded socks. 
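The arithmetic behind that 1D measurement is simple enough to sketch. This hypothetical snippet (the function name and the figure for the speed of sound in room-temperature air are mine, not from Toposens) converts a round-trip echo time into a distance:

```python
# 1-D ultrasound ranging: distance from the round-trip time of a pulse.
# Illustrative only; names and constants are assumptions, not TS3 API.

SPEED_OF_SOUND_M_S = 343.0  # dry air at roughly 20 degrees C

def distance_m(round_trip_s: float) -> float:
    """Distance to the reflecting object, given round-trip echo time."""
    # The pulse travels out to the object and back, so halve the path.
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

# A 29 ms round trip corresponds to roughly 5 m, the TS3's stated range.
print(f"{distance_m(0.029):.2f} m")
```

Note that the speed of sound drifts with temperature, which is one of the error sources any real ranging firmware has to calibrate out.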

The TS3, on the other hand, sends a single pulse but receives and integrates multiple reflections, forming a 3D image of the world ahead. It can deduce the sides, edges, and angles of objects, their height, and what empty space might lie between them. Rounded objects like table legs (or human legs) have shape and depth in the Toposens world. 

Why not just install three traditional ultrasound sensors? Won’t that accomplish the same thing? Not a chance, says Toposens Managing Director Tobias Bahnemann. Individual sensors can’t integrate reflected data the way their single 3D sensor can. It’s hard enough to correlate the reflections from a single source reflecting off multiple objects at arbitrary angles. There’s no way to combine three separate sources and three separate receivers. The time-of-flight delta is extremely small for objects that are only a few inches away. That’s what their custom processor and magic algorithm are designed to fix. 
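It's easy to see why that time-of-flight delta is so small. In this sketch (the receiver geometry and object position are invented for illustration; the actual TS3 layout is Toposens's business), two receivers about 2 cm apart listen to the same echo from an object roughly 11 cm away:

```python
# How small is the time-of-flight delta between closely spaced receivers?
# Geometry below is invented for illustration, not the real TS3 layout.
import math

C = 343.0  # speed of sound in air, m/s

def echo_time(emitter, obj, receiver):
    """Total flight time: emitter -> object -> receiver."""
    return (math.dist(emitter, obj) + math.dist(obj, receiver)) / C

emitter = (0.0, 0.0, 0.0)
rx_a = (-0.01, 0.0, 0.0)   # receivers ~2 cm apart, gum-pack scale
rx_b = ( 0.01, 0.0, 0.0)
obj  = (0.05, 0.0, 0.10)   # object ~11 cm away, slightly off-axis

delta = abs(echo_time(emitter, obj, rx_a) - echo_time(emitter, obj, rx_b))
print(f"time-of-flight delta: {delta * 1e6:.1f} microseconds")
```

The delta comes out in the tens of microseconds, and it shrinks further as the object moves closer to the sensor's axis, which is why resolving it takes dedicated hardware rather than three off-the-shelf modules glued together.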

Like other ultrasound and lidar sensors, the output from the TS3 is a serial data stream delivering a “point cloud,” a table of X, Y, Z, and “loudness” coordinates. This last datum measures the strength of the reflected ultrasound wave and correlates to the material density, or hardness, of the object. Shiny metal objects will reflect more strongly than squishy ones, and developers can use this data to distinguish between, say, table legs and human legs. Or between a stuffed animal and the real thing. 
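A consumer of that stream needs only to unpack each record into its four fields. Toposens documents its own wire format; the CSV-style framing below is invented purely to show the shape of the data:

```python
# Parse one point-cloud record into (X, Y, Z, loudness).
# The 'X123,Y-45,Z678,V17' framing is a made-up stand-in for the
# actual TS3 serial protocol, which is defined by Toposens.
from dataclasses import dataclass

@dataclass
class Point:
    x_mm: int
    y_mm: int
    z_mm: int
    loudness: int  # reflected-signal strength; correlates with hardness

def parse_point(line: str) -> Point:
    """Parse one record, e.g. 'X123,Y-45,Z678,V17'."""
    fields = {f[0]: int(f[1:]) for f in line.strip().split(",")}
    return Point(fields["X"], fields["Y"], fields["Z"], fields["V"])

p = parse_point("X123,Y-45,Z678,V17")
print(p)
```

In practice the records would arrive over the module's UART port, one frame of points per ultrasound pulse.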

TS3’s point cloud can’t identify what an object is, only where it is. That capability might come later, says Bahnemann. Current users in the automotive and robotics arena don’t need object recognition, but future users might want it to fine-tune their path planning or collision avoidance. Toposens might also start providing velocity information, rather than make customers derive that from successive point clouds. 
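Deriving velocity from successive point clouds is straightforward once an object's points have been segmented out (the hard part, which this sketch assumes away). A minimal version, with invented data, tracks the centroid of one object across two frames:

```python
# Velocity from successive point clouds: track one object's centroid.
# Assumes the points belonging to the object are already segmented out.
def centroid(points):
    n = len(points)
    return tuple(sum(axis) / n for axis in zip(*points))

def velocity(points_t0, points_t1, dt_s):
    """Per-axis velocity (m/s) from the centroid shift between frames."""
    c0, c1 = centroid(points_t0), centroid(points_t1)
    return tuple((b - a) / dt_s for a, b in zip(c0, c1))

# Object (coordinates in metres) drifting +0.05 m in x between frames
# captured 100 ms apart: about 0.5 m/s along x.
frame0 = [(1.00, 0.0, 0.5), (1.02, 0.1, 0.5)]
frame1 = [(1.05, 0.0, 0.5), (1.07, 0.1, 0.5)]
print(velocity(frame0, frame1, 0.1))
```

Doing this on the sensor itself, as Bahnemann suggests, would spare every customer from reimplementing the same frame-to-frame bookkeeping.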

Ultrasound sensors have both advantages and disadvantages compared to other technologies, such as lidar (lasers) and cameras. Ultrasound can see transparent objects, for one. This turns out to be a big deal in hotels, where cleaning robots with light-based sensors bump into glass doors and windows. Not a good look for a high-class establishment. Ultrasound also works in the rain, snow, and total darkness, which can stymie cameras. 

On the other hand, cameras generally produce a more detailed image (i.e., they have more pixels) than ultrasound, and they can detect colors, although that’s rarely important. Camera images are just as “flat” as 1D ultrasound images, requiring two or more cameras to produce a stereoscopic image. Lasers and cameras work at longer ranges than ultrasound; TS3 is effective to about 5 meters. 

Toposens currently sells its TS3 to developers for about $250 a pop but promises volume customers sub-$100 pricing. Tire-kickers include local favorites BMW, Porsche, and Daimler-Benz, as well as Huawei and other electronics firms. The need for sensors is undeniable, but the choices are many. If Toposens, like a robot, has responded appropriately to market demand, the market may respond favorably in kind. 
