
Realistic Edge-Device Testing

KnowThings Stresses the System – Intentionally

We have so many electronic devices in our lives these days. OK, file that under “D” for “Duh!!!,” but how many different kinds do we have? Well… phones, of course. Tablets? Check… Computers? You bet.

After those, things are less obvious. A smart watch? OK… mostly that’s a phone that looks like a watch (and maybe without the phone capabilities, but… who uses a phone as a phone anymore anyway??). But OK… next? (I’m talking regular folks, not buy-all-the-techy-things folks from inside the Silicon Valley bubble…)

Now, if you look into industrial applications, you start to get a lot more. Obviously, folks delivering packages have electronic gadgetry that lets you sign for receipt and lets them keep track of where all those packages go. You see city infrastructure folks with electronic devices that can measure where wires and cables and power are located underground. There are the machines that measure what’s happening within our cars. Etc. etc. etc.

So most of us use one of a dizzying number of available brands of a few kinds of items. Industry uses many more specialized, purpose-built devices. The question is, how do you test to be sure that all of these things communicate correctly and with the desired performance?

So Many Tests, So Little Time

For us consumers, it’s not so hard (tell that to the testing guy though…). You’ve got a very few protocols to worry about carrying generic data from things like browsers, apps, and phone calls. So you can test the bejeebus out of them. How do you model the traffic? Well, since there are only a few classes of platform, you can expend the effort to build a purpose-built data generator for use in testing – that is, if you can’t simply plug into a real data source coming out of the wall or the air.

What about those more specific industrial devices? Well, there you have a moderate number and a limited audience; you can probably also model that traffic pretty clearly.

So here we have either wide scope with a limited number of platforms or narrow scope and a greater number of platforms (although probably not a crazy number). What’s going to happen if the Internet of Things (IoT) explodes for consumers the way it’s been predicted? Or even for industry? Now we’ll have a wide scope and a huge number of purpose-built widgets, each of which is likely to have some level of proprietary behavior even if it uses industry-standard protocols for most network layers.

The challenge here is primarily scaling and stress: you can test a single unit and have it work just fine, but what if that single unit suddenly becomes a thousand or a million or more? Can the infrastructure handle the load with the required performance?
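
To make that scaling question concrete, here's a minimal sketch (with a hypothetical collector address, payload format, and device count; none of this is any vendor's actual tooling) of the brute-force approach: spin up a swarm of virtual devices in software and see what your infrastructure does.

```python
# Minimal sketch: simulate many edge devices instead of building them.
# The collector address, payload fields, and device count are all
# hypothetical; raise NUM_DEVICES until something breaks (mind your
# OS file-descriptor limits, since each device gets its own socket).
import asyncio
import json
import random
import time

COLLECTOR = ("127.0.0.1", 9999)  # hypothetical ingestion endpoint
NUM_DEVICES = 500


async def virtual_device(device_id: int) -> None:
    """Emit periodic sensor readings the way one edge device might."""
    loop = asyncio.get_running_loop()
    transport, _ = await loop.create_datagram_endpoint(
        asyncio.DatagramProtocol, remote_addr=COLLECTOR)
    for _ in range(10):  # ten readings per device, for the sketch
        payload = json.dumps({
            "id": device_id,
            "ts": time.time(),
            "temp_c": round(random.gauss(22.0, 1.5), 2),
        }).encode()
        transport.sendto(payload)
        await asyncio.sleep(random.uniform(0.5, 1.5))
    transport.close()


async def main() -> None:
    await asyncio.gather(*(virtual_device(i) for i in range(NUM_DEVICES)))

asyncio.run(main())
```

The hard part isn't the swarm; it's the payload in the middle, which has to faithfully mimic each device's specific behavior. That's the gap the rest of this article is about.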

For the network layers below the application layer, performance may be governed by well-known protocols. But at the app layer, the data patterns will be specific to each application. Yes, on a phone, an “app” is but one program on a generic computing platform. But with the IoT, an app is an entire device with a dedicated purpose. Does every device maker have to spend the time and effort to design and validate testing models for each unit they design?

What’s more likely is that you build a few prototype units and test them. All well and good, but how do you test the device’s ability to scale? You don’t want to have to build thousands of prototypes for stress testing.

Or perhaps you’re adapting existing industrial equipment for the IoT – a so-called brownfield installation. Now thousands of devices exist, but they’re already installed at sites around the world. Do you have to buy a couple thousand to run the stress tests?

Traditionally, you would now be consigned to the woodshed to spend valuable time developing bespoke data generators rather than working on the next product that you can sell. But, instead of doing that, might you be able to summon up the Thing That Will Save Everything? Machine learning?

Stressing Out

KnowThings thinks that you can. They’ve got a system that uses AI to generate those models automatically.

There are two pieces to what they can help you simulate:

  • The network itself, down to the physical layer. This is a one-time effort that they say is pretty easy.
  • What they call the data acquisition module – that is, the IoT edge device – or the portion of it that senses something and then sends the data up. This model isn’t so easy to build manually. (You can, of course, but it can be a lot of work.)

AI’s role is to listen to data from a device. The model generator will identify the patterns in the data through a variety of usage scenarios, categorizing the various packets with machine-generated tags. You can then go in and edit those tags to make them more human-digestible.

(Image courtesy KnowThings)
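
As a toy illustration of that tagging idea (this is not KnowThings’ algorithm, and the packet signatures below are invented), imagine grouping captured packets by a few surface features, handing each group a machine-generated tag, and then letting a human rename the tags into something digestible:

```python
# Toy illustration only: group captured packets by simple surface
# features, assign each group a machine-generated tag, then let a
# human rename the tags. Signatures and names are hypothetical.
from collections import defaultdict

captured = [  # hypothetical (length, dest_port, first_byte) captures
    (48, 1883, 0x30), (48, 1883, 0x30), (12, 1883, 0xC0),
    (48, 1883, 0x30), (12, 1883, 0xC0), (96, 8883, 0x10),
]

clusters: dict[tuple, list] = defaultdict(list)
for packet in captured:
    clusters[packet].append(packet)  # crude "clustering" on exact features

# Machine-generated tags: opaque, but consistent.
machine_tags = {sig: f"pattern-{i}" for i, sig in enumerate(clusters)}

# The human-editing step: replace opaque tags with readable names.
human_tags = dict(machine_tags)
human_tags[(48, 1883, 0x30)] = "mqtt-publish-temp"
human_tags[(12, 1883, 0xC0)] = "mqtt-keepalive"

for sig, pkts in clusters.items():
    print(f"{human_tags[sig]}: {len(pkts)} packets, signature={sig}")
```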

But what if you don’t have a device yet? They’re also working on a manual modeling capability using a simple language to describe the model. Initially, components included in the model will be generic – that is, they won’t reflect the detailed behavior of one brand of device vs. another. But their vision includes getting to that level of detail, meaning that you could do, for example, second-source evaluations.
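
KnowThings hasn’t published the syntax of that modeling language, so here’s a purely hypothetical stand-in for the idea: a declarative description of a generic sensor, plus a tiny generator that turns it into sample data.

```python
# Hypothetical stand-in for a simple model-description language (the
# real syntax isn't public). The model is deliberately generic, with
# no brand-specific behavior, matching the "generic first" approach.
import random

MODEL = {
    "name": "generic-temp-sensor",
    "report_interval_s": 5,
    "fields": {
        "temp_c": {"type": "gauss", "mean": 22.0, "stddev": 1.5},
        "battery_pct": {"type": "uniform", "low": 20, "high": 100},
    },
}


def sample(spec: dict) -> dict:
    """Generate one reading from the declarative field descriptions."""
    reading = {}
    for field, rule in spec["fields"].items():
        if rule["type"] == "gauss":
            reading[field] = round(random.gauss(rule["mean"], rule["stddev"]), 2)
        elif rule["type"] == "uniform":
            reading[field] = round(random.uniform(rule["low"], rule["high"]), 1)
    return reading


print(sample(MODEL))
```

A fuller interpreter would also honor report_interval_s and drive a virtual device like the ones sketched earlier; one sample is enough to show the shape of the idea.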

So now you have two capabilities: manual, for when you don’t have hardware built yet, and automatic, for when you do. The third capability manages the transition: letting the manual model inform and provide semantic clues to the AI process as you go from a virtual unit to a real unit.

What you get with these models is the ability to push your devices and system to see where (or if) they break. But let’s say that you have a model with human-understandable tags so that you can go in and inspect the results of a data capture. What happens if some unknown or, even worse, unauthorized traffic comes through?

Unexpected or novel data patterns will still be characterized; they just won’t have a nice, easy-to-read tag for what they represent. They’ll be mystery patterns. You can then go figure out whether they represent some mode or use case that you hadn’t run across yet. If not, then it may indicate a security issue – someone injecting malicious traffic into your stream.
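
In sketch form (again with invented signatures and tags), the triage logic is simple: anything whose signature isn’t in the learned tag table gets surfaced as a mystery rather than silently dropped.

```python
# Sketch of the "mystery pattern" triage: unknown signatures are
# flagged for human review, since they may be an untested use case
# or injected malicious traffic. All values here are hypothetical.
KNOWN_TAGS = {
    (48, 1883, 0x30): "mqtt-publish-temp",
    (12, 1883, 0xC0): "mqtt-keepalive",
}


def classify(signature: tuple) -> str:
    tag = KNOWN_TAGS.get(signature)
    if tag is not None:
        return tag
    # Unknown traffic: maybe a new mode, maybe a security issue.
    return f"MYSTERY {signature}: review - new use case or injected traffic?"


for sig in [(48, 1883, 0x30), (64, 31337, 0xFF)]:
    print(classify(sig))
```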

So if this pans out as promised, you may have an easier way to test your edge devices. Or, if the effort has had you skimping on the testing, you can skimp no more.

 

More info:

KnowThings
