
The Challenges of an Embedded Software Engineer

First, there are many “knowns” in the enterprise software space. Developers usually develop on the same hardware (host machine) and software platform (i.e., operating system) that their final product will run on. This is usually a well-tested and well-known development environment that hasn’t changed across the multiple products the developer has contributed to. As a result, the software developer typically focuses on the application rather than the environment. Additionally, the majority of enterprise applications have no concept of real-time performance, determinism, or interrupts (beyond the keyboard, mouse, and network).

Now let’s enter the world of the embedded engineer. While the hardware design is often known, it is frequently still under development, with a strong possibility that it will change, either subtly or significantly, during the design and development period. The hardware architecture is usually very different from that of the development host, and often different from the last design the software engineer worked on as well.

As an added complexity, it is often the embedded developer’s responsibility to define and implement the software platform for the hardware, and many of the low-level drivers and software stacks need to be developed, or at least ported, to the hardware platform. Each different software or hardware platform often brings a new, different, and proprietary cross-development system that is very specific to either the hardware or the software environment. Now add in the real-time, deterministic, and often concurrent nature of embedded systems that the software has to comply with, and one gets a good picture of how difficult an embedded software engineer’s job can be.

So why do embedded engineers do it, and how do they ever get software out on time?

One of the fundamental differences between embedded and enterprise development is that the hardware platform, and hence the embedded device, cannot operate without the embedded software. This gives the embedded software engineer the very definite reward of making a piece of hardware come to life as a real product. The challenges also make the job of an embedded engineer extremely interesting, yet high-pressure, particularly with ongoing time-to-market demands. As a result, embedded software engineers are now looking for technologies that can help them overcome these challenges and become more productive.

As embedded systems continue to become more complex, the amount of embedded software has grown substantially, and the industry supporting embedded software development has responded with new products and technologies to help developers deliver on time. Let’s look at some of these technologies in the context of the challenges above.

Challenge 1 – No hardware available.

Even if prototype systems are available, they are often buggy, in very short supply, or both. So if the embedded software engineer wants the best chance of completing software on time by starting development as early as possible, simulation or prototyping technology needs to be used.

Simulators have been around almost as long as processors themselves, but their use and complexity have changed to make them more usable for today’s high-level embedded software developers. There are essentially two types of hardware simulators: instruction set simulators (ISS) and hardware logic simulators.

An ISS simulates the instruction set of the processor architecture and varies in timing and peripheral accuracy from “cycle-accurate” to “instruction-accurate,” with different levels of cache and memory access timing accuracy. The more accurate the simulation, the more complex the ISS becomes, often reducing execution speed. For most software developers in the early phase of development, an “instruction-accurate” ISS addresses their needs, as it allows them to build their software with the same cross-development tools they will use for real-target development. It also allows them to use high-level debuggers to debug code flow, variable access, and memory issues, and to get a feel for the performance of their system. Additionally, an ISS can simulate peripheral devices by using stub functions in the high-level debugger to capture input and output to a peripheral device and simulate its function. Even real-time operating systems can be run in an ISS, allowing a good representation of the system’s concurrent operation to be simulated.
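The idea of an instruction-accurate ISS with stubbed peripherals can be sketched in a few lines. This is a toy model, not any vendor's product: the four-instruction ISA, the register file, and the memory-mapped “UART” at address 0xFF are all invented for illustration.

```python
# Toy instruction-accurate ISS sketch. The ISA, register file, and the
# memory-mapped "UART" address are hypothetical; a real ISS models a
# vendor's actual instruction set and peripheral map.

UART_TX = 0xFF  # hypothetical memory-mapped transmit register

class ToyISS:
    def __init__(self, program):
        self.regs = [0] * 4          # tiny register file
        self.mem = {}                # sparse data memory
        self.program = program       # list of (op, *operands) tuples
        self.pc = 0
        self.uart_log = []           # peripheral stub: capture writes

    def store(self, addr, value):
        # Peripheral stub: intercept accesses to the device address
        # instead of treating them as plain memory, much as a stub
        # function in a high-level debugger would.
        if addr == UART_TX:
            self.uart_log.append(chr(value))
        else:
            self.mem[addr] = value

    def step(self):
        op, *args = self.program[self.pc]
        self.pc += 1
        if op == "LDI":              # load immediate: LDI rd, imm
            self.regs[args[0]] = args[1]
        elif op == "ADD":            # ADD rd, rs
            self.regs[args[0]] += self.regs[args[1]]
        elif op == "ST":             # store: ST addr, rs
            self.store(args[0], self.regs[args[1]])
        elif op == "HLT":
            return False
        return True

    def run(self):
        while self.pc < len(self.program) and self.step():
            pass

# Write "Hi" to the stubbed UART.
prog = [("LDI", 0, ord("H")), ("ST", UART_TX, 0),
        ("LDI", 0, ord("i")), ("ST", UART_TX, 0), ("HLT",)]
iss = ToyISS(prog)
iss.run()
print("".join(iss.uart_log))  # prints "Hi"
```

The point is that the software under test never knows the UART is fake: it executes its normal store instructions, and the simulator's stub captures and interprets them.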

For hardware testing, logic simulators are used to simulate the actual operation of the complete hardware system. These simulators are very accurate and allow very detailed testing of hardware performance and interoperability, making sure the hardware design is complete and functioning before it is manufactured. It would therefore seem to make sense to use these simulated hardware environments to test software. However, the complexity of these tools does not lend itself to easy use by software developers, and the amount of simulation occurring makes software execution prohibitively slow. The hardware engineer’s goal is somewhat different: to test the hardware design using as much as possible of the software that will eventually run on the system, providing real-world stimuli to the simulated design. But while the hardware engineer may be familiar with the tool, the speed of simulation is still prohibitive for running system software.

Enter co-verification. This relatively new technology brings together the two worlds of simulation, and with them the hardware and software teams. Co-verification acts as an intelligent bridge between the ISS, the high-level debugger, and the logic simulator handling the hardware design. The processor instructions are handled by the ISS at relatively high speed. Since the processor architecture is not being designed by the hardware engineer, it doesn’t need to be modeled in the logic simulator, which is left to simulate the peripheral devices and the “real” hardware under design. This lets software developers run their software in a familiar environment at ISS speeds; when a peripheral part of the hardware is stimulated, execution control is seamlessly passed to the logic simulation. Co-verification also allows intelligent optimization of the execution environment. For example, in initial testing the memory subsystem would reside in the logic simulator, but once it has been fully tested, memory reads and writes can be moved over to the ISS memory subsystem for much faster execution.
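The bridging and optimization described above amount to routing each bus access by address. The following sketch makes that concrete under stated assumptions: the `LogicSimStub` class, the address map, and the `promote_to_iss` method are invented stand-ins; a real co-verification tool drives an actual HDL simulator through its programming interface.

```python
# Sketch of the co-verification bridge idea: route each memory access to
# either the fast ISS-side memory model or the (slow, cycle-accurate)
# logic simulator, and "promote" verified address ranges to the ISS side.
# All names here are hypothetical illustrations.

class LogicSimStub:
    """Stand-in for the logic simulator (in reality orders of magnitude slower)."""
    def __init__(self):
        self.mem = {}
        self.accesses = 0            # count how often the slow path is taken
    def read(self, addr):
        self.accesses += 1
        return self.mem.get(addr, 0)
    def write(self, addr, value):
        self.accesses += 1
        self.mem[addr] = value

class CoVerifBridge:
    def __init__(self):
        self.iss_mem = {}            # fast ISS-side memory model
        self.logic_sim = LogicSimStub()
        # Address ranges still owned by the logic simulator. Initially the
        # whole memory subsystem lives there.
        self.sim_ranges = [(0x0000, 0xFFFF)]

    def _in_sim(self, addr):
        return any(lo <= addr <= hi for lo, hi in self.sim_ranges)

    def read(self, addr):
        if self._in_sim(addr):
            return self.logic_sim.read(addr)
        return self.iss_mem.get(addr, 0)

    def write(self, addr, value):
        if self._in_sim(addr):
            self.logic_sim.write(addr, value)
        else:
            self.iss_mem[addr] = value

    def promote_to_iss(self, lo, hi):
        # Range fully verified: move it to the ISS side for speed,
        # carrying over any state the logic simulator already holds.
        for addr in list(self.logic_sim.mem):
            if lo <= addr <= hi:
                self.iss_mem[addr] = self.logic_sim.mem.pop(addr)
        self.sim_ranges = [(a, b) for a, b in self.sim_ranges
                           if not (a >= lo and b <= hi)]

bridge = CoVerifBridge()
bridge.write(0x1000, 42)             # initially served by the logic simulator
bridge.promote_to_iss(0x0000, 0xFFFF)
print(bridge.read(0x1000))           # now served from ISS memory: prints 42
print(bridge.logic_sim.accesses)     # slow path used only once: prints 1
```

After promotion, subsequent reads and writes to that range never touch the slow simulator, which is exactly the optimization the co-verification tools offer.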

As software development progresses, there is a need to bring all the software parts together and do real system testing. Running on real, working hardware is the ideal way to do this; however, there may still be issues with the availability and stability of that hardware.

Enter prototyping. This differs from simulation in that it doesn’t always require a true representation of the processor architecture; it is focused more at the software and system level. What it provides is a simulated software platform incorporating the appropriate real-time operating system, software stacks, man-machine interfaces, and other software system components, allowing the software to run and be tested in a complete prototype system. The heart of the prototyping system can be an ISS if it provides enough performance, but it can also run on the development host’s processor (i.e., native execution). In the latter case the executing code can run as fast as real time, limited only by the speed of the development machine’s processor.

What prototyping offers over traditional simulation is the ability to test all of the components and protocols of a complex embedded system, applying stimuli that look and feel like the real device. For example, network data can be input and output to the system, even using the networking hardware in the development station. Similarly, the user interface can be modeled using graphical toolkits and “hot buttons”. An example of a cell-phone prototype can be seen in Diagram 1 below. Once this prototyped software system has been built, it can be deployed as a “software platform” and used by all the software developers on the project, reducing the need for large quantities of hardware-based prototyping systems while still allowing developers to test their components alongside the rest of the system.
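The native-execution prototype pattern rests on the application being written against platform interfaces that the prototype implements on the host. The following cell-phone-flavored sketch is purely illustrative: the `Display`, `LoopbackNetwork`, and `PhoneApp` classes are invented, and a real prototype would back them with a GUI toolkit and the host's actual network interface.

```python
# Sketch of a prototyping platform: the application code is unchanged
# between prototype and target; only the platform objects it is handed
# differ. All class and method names here are hypothetical.

class Display:
    """Prototype of the device screen: draws into a text buffer instead
    of an LCD controller; a real prototype might use a GUI toolkit."""
    def __init__(self):
        self.lines = []
    def show(self, text):
        self.lines.append(text)

class LoopbackNetwork:
    """Prototype network stack: frames loop back locally; a real prototype
    could forward them through the development host's network hardware."""
    def __init__(self):
        self.queue = []
    def send(self, frame):
        self.queue.append(frame)
    def receive(self):
        return self.queue.pop(0) if self.queue else None

class PhoneApp:
    """Application under test, written only against the platform interfaces."""
    def __init__(self, display, network):
        self.display = display
        self.network = network
    def dial(self, number):
        self.network.send(("CALL", number))
        self.display.show(f"Dialing {number}...")
    def on_frame(self):
        frame = self.network.receive()
        if frame and frame[0] == "CALL":
            self.display.show(f"Connected to {frame[1]}")

# Run the app against the prototype platform at host-native speed.
app = PhoneApp(Display(), LoopbackNetwork())
app.dial("555-0100")
app.on_frame()
print(app.display.lines)  # ['Dialing 555-0100...', 'Connected to 555-0100']
```

Because `PhoneApp` depends only on the interfaces, the same application code can later be relinked against the target's real display driver and network stack, which is what lets every developer on the project share one deployed software platform.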

Diagram 1 – a prototype of a simple cell phone

In part 2 of this series, we will look at the challenges surrounding real-time operating systems and development environments, and how some of today’s technologies are making this challenge easier to deal with.
