
Not Your Grandmother’s Embedded Systems

As I mentioned in a recent column — [Electronic] Ghosts of Christmas Past and Future — the fact that we have just drifted into the year 2020, which means we are now 1/5th the way through the 21st century, is giving me pause for thought about how fast technology is changing. Take a moment to think about all the developments that have taken place over the past 100 years or so. How do you think things will change over the next 100 years?

In a little while, I will make some bold predictions about the embedded systems space in 10 to 100 years’ time, but first…

Hindsight — The One Exact Science

I find it strange how many people seem to have an inherent belief that things aren’t going to change much technology-wise in the future. Of course, I think it’s fair to say that almost everyone expects “small improvements,” but few anticipate seismic transformations, despite being well aware of how rapidly technology has advanced over just the past couple of years.

I’m reminded of the distinguished Roman aristocrat Sextus Julius Frontinus who, in 98 AD, proclaimed that “Inventions have long since reached their limit, and I see no hope for further development.” This sort of thing never ends; in 1888, the Canadian-American astronomer Simon Newcomb shared his thoughts that “There is little left in the heavens to discover” (at that time, of course, astronomers still believed that the entire universe comprised only our Milky Way galaxy). Not to be outdone, in 1894, the American physicist Albert Michelson informed the world that “The most important fundamental laws and facts of physical science have all been discovered” (this was just three years before J. J. Thomson discovered the electron in 1897).

When this topic comes up, many people also quote Charles H. Duell, the Commissioner of the U.S. Patent Office, as saying, “Everything that can be invented has been invented” in 1899, but this has been debunked as apocryphal. What Duell actually said in 1902 was: “In my opinion, all previous advances in the various lines of invention will appear totally insignificant when compared with those which the present century will witness. I almost wish that I might live my life over again to see the wonders which are at the threshold.” I know how he felt.

What If?

I sometimes contemplate “what if” scenarios based on real historical events. For example, an aeolipile, also known as a Hero’s engine, is a simple bladeless radial steam turbine that spins when the central water container is heated. Torque is produced by steam jets exiting the turbine. The device was described in the 1st century AD by Hero of Alexandria in Roman Egypt, and many sources give him the credit for its invention. So, what if the people of Hero’s time had taken this further — how would history have changed if the Roman Empire had developed steam engines and trains?

Charles Babbage first proposed his steam-powered Analytical Engine in 1837. It is often said that Babbage was a hundred years ahead of his time and that the technology of the day was inadequate for the task. However, in his book, Engines of the Mind, Joel Shurkin stated that: “One of Babbage’s most serious flaws was his inability to stop tinkering. No sooner would he send a drawing to the machine shop than he would find a better way to perform the task and would order work stopped until he had finished pursuing the new line. By and large this flaw kept Babbage from ever finishing anything.” How would history have changed had the Victorians had access to steam-powered digital computers?

The legendary American inventor, Thomas Alva Edison, demonstrated his incandescent bulb in 1879. Contrary to what Hollywood would have us believe, however, this wasn’t the first such artifact. As fate would have it, the English physicist and electrician Sir Joseph Wilson Swan successfully demonstrated a true incandescent bulb in 1878, which was a year earlier than Edison (but we — the English — aren’t bitter). In 1883, William Hammer (an engineer working with Edison) observed that he could detect a current flowing from the lighted filament to a metal plate mounted inside an incandescent light bulb. Rather unfairly, this phenomenon became known as the “Edison Effect.” Unfortunately, Edison didn’t do anything with this discovery, and it wasn’t until 1904 that the English electrical engineer John Ambrose Fleming created the first vacuum tube diode (rectifier), followed in 1907 by the first vacuum tube triode (amplifier), which was created by the American inventor Lee de Forest. How would history have changed if vacuum tube technology had become available circa the mid-1880s?

I could keep this up for hours, but I’m afraid we have other poisson à frire (fish to fry).

The Last 100 Years (Give or Take)

My dad was born in 1915, which was about a third of the way through WWI. It was also only 12 years after Wilbur and Orville Wright had flown their first powered aircraft at Kitty Hawk, and five years before the world’s first commercial radio broadcast.

The term “computer” derives directly from the Latin words computus and computare, both of which mean the same as the English verb compute; that is, “to determine by mathematical means.” According to the Oxford English Dictionary, the first known use of the word “computer” was in 1613, in a book called The Yong Mans Gleanings by English writer Richard Braithwait. When my dad was a “yong man,” this usage of the term typically referred to a human computer; that is, a person who carried out calculations or computations. It wasn’t until around the middle of the 20th century that the modern use of the term to mean a “programmable digital electronic computer” really started to take hold.

I wish I had the ready wit of Groucho Marx, who always managed to come up with quotable quotes, like when he said: “I find television very educating. Every time somebody turns on the set, I go into the other room and read a book.” All joking aside, I think Groucho’s views were tainted by the fact that — at the time of his utterance — he’d not been fortunate enough to see Doctor Who, the first episode of which aired on 23 November 1963 (I was six years old, and I watched this from a safe location behind the sofa with only the whites of my eyes remaining visible).

Way back in 1876, the Welsh electrical engineer and inventor Sir William Henry Preece, who was the chief engineer of the British Post Office at the time, uttered the immortal words: “The Americans may have need of the telephone, but we do not. We have plenty of messenger boys.” Telephones are ubiquitous these days, and many people tend to assume that they caught on really quickly, but such was not the case. For example, in 1946, when my mother started college at the age of 16, one of the courses was on “How to use the telephone.” This may sound a little silly now, but no one in my mother’s circle had ever used one, plus they had to learn the correct forms of address for business usage.

It wasn’t until 1971 that it became possible to direct-dial transatlantic phone calls from England to North America (the USA and Canada), and vice versa. Two years later, in 1973, standing on a Manhattan street corner (on 6th Ave. between 53rd and 54th Streets), Motorola’s Martin Cooper placed the world’s first mobile phone call to his rival Joel Engel at Bell Labs. While I was chatting with Martin a couple of years ago, he told me that other people walking by gave him a wide berth because they weren’t sure if he was mentally deranged (from their point of view, he appeared to be talking into something resembling a house brick). Martin also told me that, following his conversation with Joel, he decided to call his mother. While dialing, without thinking, he inadvertently stepped off the curb into the street and almost became the world’s first mobile phone fatality.

1971 also saw the release of the first commercially available microprocessor — the Intel 4004 — with 2,300 transistors, a 4-bit data bus, and a 12-bit address bus (the device was pin-limited, so the data and address were multiplexed over the same four pins). The 4004 was quickly followed by the Intel 8008 in 1972, the Intel 8080 and Motorola 6800 in 1974, the MOS Technology 6502 in 1975, and the Zilog Z80 in 1976.
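As an aside, for anyone who has never run into a multiplexed bus, the little C sketch below is purely illustrative (it is not real 4004 firmware or a hardware description, and the function and variable names are my own inventions). It simply shows the basic idea: a 12-bit address is split into three 4-bit nibbles that are presented on the shared four-pin bus over three successive cycles.

#include <stdint.h>
#include <stdio.h>

/* Illustrative only: split a 12-bit address into the three 4-bit
 * nibbles that a pin-limited part like the 4004 would present on its
 * shared four-pin bus, one nibble per cycle (low nibble first). */
static void send_address_nibbles(uint16_t addr12, uint8_t nibbles[3])
{
    nibbles[0] = addr12 & 0x0F;         /* first cycle: low nibble   */
    nibbles[1] = (addr12 >> 4) & 0x0F;  /* second cycle: mid nibble  */
    nibbles[2] = (addr12 >> 8) & 0x0F;  /* third cycle: high nibble  */
}

int main(void)
{
    uint8_t n[3];
    send_address_nibbles(0xABC, n);     /* an example 12-bit address */
    printf("cycle 1 = 0x%X, cycle 2 = 0x%X, cycle 3 = 0x%X\n",
           n[0], n[1], n[2]);
    return 0;
}

The payoff for this sort of multiplexing was a tiny 16-pin package; the price was extra clock cycles for every memory access.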

I graduated from high school and started university in the summer of 1975. At that time, the only computer in the engineering department was analog in nature. We did have access to a digital computer in another building (it filled the other building). We created our rudimentary programs in FORTRAN using a Teletype terminal with a punched card writer in the engineering building, then we carried our deck of punched cards to the computer building. We left our card decks to be added to the queue, and we were given instructions to return the following week. When we did return, it was only to find a rubber band holding a scrap of paper on top of our cards saying something like, “Missing comma, Line 2.” Arrggggh (and I mean that most sincerely). So, we trudged back to the engineering department building, added a card containing the comma, and slogged back to the computer building to start the process all over again. It typically took an entire semester to get even the simplest of programs up and running.

My grandparents shortly after they got married (Image source: Max Maxfield)

I have so much I could say about the first color televisions, pagers, cell phones, high-definition televisions, the evolution of computers, the introduction of the internet, the development of the global positioning system (GPS), the proliferation of wireless networks, smartphones, tablet computers, and… the list goes on. However, I fear you know all of this stuff already, so let us instead turn our attention to the future…

The Next 100 Years (Give or Take)

Based on past experience, it would take a brave man or woman to predict the future of technology. If I look back to where we were 50 years ago, I wouldn’t have come close to predicting the way things are today. Even as recently as 10 years ago, I would not have anticipated the current state of play with regard to things like virtual reality, augmented reality, and artificial intelligence.

One thing I would bet my best friend’s life savings on is that, ten years from now, experts will still be predicting the demise of the 8-bit microcontroller. To be honest, I’ve been hearing this prediction for the past 40 years, so it wouldn’t surprise me if — in 100 years’ time — an article in a 2120 issue of an online electronics magazine has a title along the lines of “The End of the 8-Bit Processor!”

I’ve said it before, and I’ll say it again — I think the combination of mixed reality (MR) with artificial intelligence (AI) is going to change the way we interface with the world, our systems, and each other (see also What the FAQ are VR, MR, AR, DR, AV, and HR? and What the FAQ are AI, ANNs, ML, DL, and DNNs?). Having said this, I really don’t think any of us have a clue as to what the future holds technology-wise.

The American science fiction author and professor of biochemistry Isaac Asimov started his legendary Foundation series in the early 1940s. This is set in a future in which humans have a Galactic Empire that spans the Milky Way. They also have hyperdrive spaceships equipped with hyperwave communications. In many ways, Asimov was a futurist, but he also dropped the ball on occasion. At one point in the tale, for example, a hyperwave message arrives for the captain of a starship. This message is transferred onto a thin metal strip, which is then rolled up and inserted into a metal “egg.” Someone then carries the “egg” to the captain’s cabin, at which point he opens it using his thumbprint and feeds the metal strip into a reader. If only Asimov could see us now, I bet he would have tweaked that passage a little.

When my mom was growing up, they had one cold water tap in the house (see also The Times They Are a-Changin’). They cooked their food and heated the house using a coal-fired stove in the family room (they didn’t have a kitchen). They didn’t get electricity in the house until around 1943 when my mom was 13 years old. Prior to that, lighting was provided by gas mantles mounted on the walls. When they weren’t in use, my grandmother covered the holes in the power sockets on the wall with sticky tape “to stop the electricity leaking out.” This wasn’t unreasonable considering her experience with gas.

A few years ago, while visiting my mom in England, I gave her an iPad Pro. I also set her up with an Amazon account and preloaded it with a gift card. We were all sitting in my brother’s family room at the time. “What should I buy?” asked my mom. “What do you want?” I replied. Eventually, she decided that a matching electric kettle and toaster set would be nice, so I showed her how to order one on Amazon.

Once we’d placed the order, I asked my mother, “What would my grandmother have thought to see you using your iPad and a wireless network to order an electric kettle and toaster?” My mother thought about this for a while, and then replied, “Your grandmother wouldn’t have understood any of this. What would really have amazed her was the thought of technologies like an electric kettle and an electric toaster.”

I don’t know about you, but this really gave me something to think about. Now, when I look around at all of the technological wonders that surround me, I often say to myself, “These aren’t your grandmother’s embedded systems!”

3 thoughts on “Not Your Grandmother’s Embedded Systems”

    1. These days we are being bombarded with propaganda telling us that the massive servers in the cloud will be the future of everything from AI to EDA.
      Once upon a time the telling fact was that our desktop PCs represented 0% of the total computing power.
      While my old SOC grad professor expressed doubt that there would ever be a Kinko’s for your specific need in chips, I think there may come a day when your ASIC phone represents 0% of the ASICs used in computing.
      Like the days of analog computing before the microprocessor, we are only beginning to use the massively parallel power available in real math-modeled, non-AI solutions. Take the example of the 737 MAX, with control reverting to a manual switch when the engine model fails. There is no correlation between logic synthesis and the clever use of sensors to control a process. Control will be much more deterministic with achievable latency for accelerated parallel computation of “all things considered.” The idea that we will remain dumbed down by reliance on AI is a cop-out. You can’t achieve a reliable solution without verification, and that throws everything back into the analytical realm.
      FPGAs, eASICs, and MPSOCs are the gateway to the ASIC future of computing.

      1. I remember when people first started talking about “the cloud” — I never thought it would catch on LOL — now, of course, it’s easy to see how useful it can be for a lot of things — but it’s not the “be all and end all” — there’s currently a push to do a lot of processing in the fog, at the edge, and at the extreme edge — as always, it’s going to be interesting to see how things change over the next 10 years or so.

