
Modelling: not just for big boys?

Years ago, when I worked in PR, I used to visit a telecoms company every few weeks with a colleague. (PR agency people always seem to travel to clients in pairs – I have never understood why.) The visits were usually to be briefed on a new product or a new release, and the briefings were conducted by engineers. On the way, we used to make small bets on how long it would be before the briefing engineer got up and started using the whiteboard: it was rarely more than five minutes into the meeting. These days, when I am on the receiving end of the sort of PowerPoint presentations I used to help develop, if the presenter is, or once was, an engineer, a notebook frequently comes out and diagrams are sketched to make a point more clearly.

Engineers seem to think better when they do diagrams – for them a picture may be worth at least a thousand words. And yet modelling, model-driven development, UML, SysML – all the different names bandied around for defining a system through pictures rather than a screed of words – have not made significant headway outside areas like defence and aerospace. Indeed, within defence, the Unified Modeling Language (UML) is mandatory for projects in the US and UK.

There are tools from a range of sources. At the top end of the pile for embedded is Telelogic. Telelogic, having swallowed up a number of players in the modelling field, such as I-Logix, to create a broad range of tools, has itself been bought by IBM. IBM had previously bought Rational Software, a leader in using UML for developing enterprise IT applications. Telelogic’s tool range is large, but the two most important products for embedded applications are the DOORS requirements specification tool, which we are not going to look at, and Rhapsody, a suite of UML/SysML-based modules for modelling and related activities. (SysML – Systems Modeling Language – is an extended subset of UML, tuned for describing systems rather than software.)

The biggest independent player is Artisan, with Studio just about to enter release 7.0. This is aimed at embedded projects, but Artisan stresses the centrality of their tools to collaborative products, with “work as one” as a slogan. There are other commercial players, but not at this scale.

IAR has taken a slightly different approach, using the state machine subset of UML as the basis for its visualSTATE modelling and code generation suite. While not full system modelling, it is optimised to produce compact C/C++ code for constrained embedded systems.

All three of these companies are concerned with making their tools integrate with other parts of the tool chain. Telelogic was already moving towards Eclipse compatibility when it was bought by IBM, the company that created Eclipse and donated it to the software community. Artisan has just acquired High Integrity Solutions (HIS) to get full ownership of Vds (V-Design System), a framework like Eclipse but aimed at mission- and safety-critical systems and software development. And IAR stresses how well visualSTATE integrates with its Embedded Workbench suite of compiler and debug tools.

If you Google for shareware in this area, there are a lot of free/low-cost tools out there, many of which seem to generate Java as first choice. And Rhapsody’s modelling module is available as a free download, with the intention of drawing you into the process. Artisan provides a free, time-limited evaluation version of Studio.

So what is involved? In simple terms, you draw diagrams of your system, formalised versions of the whiteboard images you use when describing something to other engineers (or even PR people). Depending on the conventions you use, different-shaped blocks have different meanings, and the lines joining them use different symbols to indicate different interactions. Most modelling tools let you drag and drop these symbols and connections. You can start at the very highest level and then decompose the high-level blocks into lower-level ones. These diagrams can be used for communication, and, at the risk of repetition, they allow you to demonstrate to the end user or customer that you understand the problem much more easily than with a telephone-book-sized written description.

But then you can do more. You can check that your model is correct: most modelling tools have built-in verification that can find mistakes in the basic design which would appear perfectly legal to code inspection and normal debugging approaches.

The next step is neat – the model then generates code. It does it quickly and to defined standards. The mainstream modelling tools normally generate C/C++ and a range of other languages, while many of those from the open source community generate Java. At this stage, you are back in the normal development cycle, ready to compile, integrate, debug, etc. Most tools allow you to add your own hand-coded segments to the generated code, and with Rhapsody it is possible to go backwards from legacy code into the model. IAR claims that visualSTATE is outstanding in creating very compact code for memory-limited applications.

Why would you want to jump through these hoops when it is possible to create systems in other ways? Advocates of the modelling approach have an array of arguments. It makes more efficient use of human resources. From the management viewpoint, the objective is to get a system developed and into the market as efficiently as possible. That means quickly and cheaply. If the best brains in the organisation are not hacking code or figuring out why a particular bug keeps appearing, but instead applying their intelligence to the highest level of design, that has to be good, surely?

By using models, it is easier to get buy-in on the design from other parts of the company: marketing people can more easily grasp the functionality, and it is easier to incorporate changes and refine the design when working with a model. Many systems now provide an option for generating a human interface – buttons, output screens, etc. – to simulate the behaviour of the final design: another way to ensure that what is being designed is what the specifier required.

The code created by a code generator may not be a marvel of elegance, but it can be compact, is correct, and, importantly, is produced quickly. Almost as significantly, changing the initial model and generating new code is much easier when producing variants. It is easier, for example, to create a family with the same basic hardware and different features embodied in software.

A feature that all the proprietary tools seem to have is automatic documentation: at the press of a button, the current state of the design can be recorded and filed for future use in a variety of formats, including Word documents, spreadsheets, state diagrams, or even the model itself.

But if this approach is so great, why aren’t people falling over themselves to use these tools? Three main reasons suggest themselves: the match (or mismatch) between tool and application, cost, and cultural issues.

There is, for many developers, a mismatch between their needs and the mainstream modelling tools. The major suppliers’ products are expensive, often tens of thousands of dollars or more, and geared to group working – the tools are designed for defence/aerospace and safety-critical applications, where the end user accepts high prices in return for products that are bullet-proof. (Sometimes literally.) If you are working on a small project, on your own or as part of the typical small team, then they are clearly over-sophisticated. (An analogy: Microsoft, in gearing Office for enterprise use, group working and so on, has made it no longer such a useful tool for the single user.)

Cost is a significant barrier to the wider adoption of model-based design. There are free/shareware tools, but these often have limitations and minimal support. If you want tools from a proprietary source, then it costs. The big suppliers are expensive, as we have discussed. Even IAR’s visualSTATE is priced in the thousands of dollars. That is not expensive compared with the cost of a programmer’s time spent debugging a product, but it still runs into the barrier that most companies won’t spend more than $1K on a tool.

Then there is the cultural issue: if you have spent the last few years honing your skills as a C++ programmer, let alone as an ace assembly language programmer for really memory-restricted applications, you are not going to want to hear that a piece of software can replace you. I remember assembler programmers for IBM mainframes in the late 1960s complaining that FORTRAN was so inefficient that it would never be useful. (And as for COBOL….)

Another objection is that the tools take time to learn – time that is hard to find when project deadlines are tight. Well, none of the vendors claims their tools are completely intuitive, but the trade-off is increased productivity, not just on the current project but on all future ones.

So is modelling just for big boys? Well, I hate to say it, but unless companies change their views on an appropriate level of expenditure for tools, yes it is, today. However I have heard that one company is planning an announcement that could change this scenario quite markedly. And if the rumour is true, there may soon be a tool that will be very cost-effective for the single user. And then modelling will no longer be just for the big boys.
