Sometimes people say things that are both obvious and unexpectedly insightful. “Unexpected,” in the sense that I hadn’t thought of it myself—yet once I heard it, I was surprised it hadn’t already occurred to me. I’ll explain what kicked off my current cogitations in a moment. First, however, I’d like to waffle a bit, as is my wont.
I hail from the city of Sheffield, in the county of Yorkshire, in merry old England. My grandfather (on my mother’s side) was born in the 1890s. At that time, the best public transportation system Sheffield had to offer was horse-drawn trams. By 1910, Sheffield had developed one of the most comprehensive electric tram networks in the country. I can imagine older people in 1910s Sheffield saying, “There are kids growing up today who will never have known a world without electric trams.”
My dad was born in 1915, which was five years before the first commercial radio broadcast. By 1935, when Dad was 20 years old, there were hundreds of radio stations operating across America. Meanwhile, England, along with many other European countries, had spawned national broadcasting systems. For example, the BBC (British Broadcasting Corporation) offered a daily schedule of news, music, educational talks, radio plays, and children’s programming, all broadcast without advertising. I can imagine older people in 1935 saying, “There are kids growing up today who will never have known a world without radio.”
My mum was born in 1930, which was nine years before the start of WWII. Although residential electricity had been around for a long time, it was typically available only in wealthier households. My mum didn’t get electricity in her house until 1941. The reason they got electricity was to power a radio. My granddad was away fighting the war, and my grandma was deaf. She’d lost her hearing around the age of 17 after catching meningitis, but she could read lips like a diva. It was my mum’s job to listen to the evening news on the radio and relay the events of the day to my grandma. Mum did this throughout the war, which ended in 1945. I can imagine older people in her area of Sheffield around that time saying, “There are kids growing up today who will never have known a world without electricity in the house.”
I was born in the 1950s, not too long after the end of World War II and just a few short years after television began to take hold as a mass medium. Although the BBC had begun television broadcasts in the 1930s, the service was suspended during the war, and regular public television in the UK didn’t resume until 1946. By the time I was a child, television was becoming a fixture in living rooms across Britain and America. The BBC offered a mix of news, drama, comedy, and children’s programming, all in glorious black and white. I can imagine older people in the 1950s saying, “There are kids growing up today who will never have known a world without television.”
I graduated from high school and started at university in 1975, just as personal computers were beginning to appear on the scene. Those machines were primitive by today’s standards—some didn’t even have screens, just rows of blinking lights—but they were the seeds of a revolution. By the early 1980s, computers like the Apple II, BBC Micro, and Commodore 64 were making their way into homes, classrooms, and offices. The idea of having a computer in your house still seemed a bit odd to many people, but for those of us who were hooked, it was the start of something big. I can still hear my friends in the 1980s saying, “There are kids growing up today who will never have known a world without personal computers.”
I watched the rise of the internet from the inside, as it evolved from an academic curiosity into the digital lifeblood of modern life. In the early 1990s, you needed a dial-up modem, a lot of patience, and a taste for plain-text bulletin boards and painfully slow websites. But by the late 1990s, the internet had woven itself into the fabric of daily existence—email, web browsing, online shopping, digital news. The world suddenly felt smaller, more connected, and more immediate. I can remember people at the turn of the millennium saying, “There are kids growing up today who will never have known a world without the internet.”
And then came things like wireless connectivity in the form we know and love it today, coupled with things like smartphones. When Apple launched the iPhone in 2007, it wasn’t the first mobile phone or even the first to offer internet access, but it changed everything. In a matter of years, we went from basic calls and texts to carrying entire digital universes in our pockets. Today’s smartphones perform the roles of camera, calendar, map, encyclopedia, music player, bank, and social lifeline all rolled into one. It wasn’t long before I heard people saying things like, “There are kids growing up today who will never have known a world without smartphones, or without being connected, wirelessly, all the time.”
I could keep this up for hours. For example, I remember when newspapers were black-and-white (if you were lucky). It wasn’t until the late 1970s and early 1980s that we started to see the occasional splash of color, which was so remarkable at that time that older people talked about it with the same reverence as landing people on the moon. The real breakthrough came with Eddy Shah’s launch of Today in 1986. This was the first UK national newspaper to be printed in full color, and it marked the start of a shift toward color in mainstream newsprint. By the 1990s, people could happily proclaim, “There are kids growing up today who will never have known a world without color newspapers.”
The sad addendum to this last item is that, in the United States, where I currently hang my hat, newspapers are closing at a rate of roughly two to three per week (the U.S. has lost nearly 3,000 newspapers since 2005). This prompts me to pontificate that it won’t be too long before we are all saying, “There are kids growing up today who will never have known a world with print newspapers.” (Now I’m wearing my sad face.)
So, what triggered my rambling ruminations above? Well, I was just chatting with Dave Beasley, who is the Executive Vice President (EVP) for Sales and Marketing at Virtium. The purported purpose of our conversation was to discuss Virtium’s new offering in the artificial intelligence (AI) space (where no one can hear you scream).
As we all know, AI is becoming ubiquitous. Recently, for example, I’ve posted columns on AI-powered holograms, AI co-hosts on video podcasts, AI-powered circuit design, AI-powered verification, AI-powered PCB layout, AI-based PCBA supply chain verification, AI-powered documentation generation, and… the list goes on.
In my AI co-host column, I told how the human co-host, Ashraf “Ash” Amin, uses AI to help him structure everything from daily routines to long-term goals across all the major areas of his life—health, business, learning, spirituality, and family. I mentioned this to Dave, who responded that one of his friends had recently been struggling with raising his 12-year-old daughter, and he’d turned to ChatGPT for advice. Just a few short years ago, this would have struck me as strange. Now, by comparison, I find myself having more of a “ho hum” reaction.
Of course, just talking about this reminds me of The Contractual Obligation Implementation episode of The Big Bang Theory. This was the one where Leonard, Howard, and Sheldon discover that—as part of their contract with the university—they are required to serve on a committee that promotes science among young women. This almost leads to Sheldon making a very unfortunate mistake.
But we digress… Early in our chat, Dave commented, “There are kids growing up today who will never have known a world without artificial intelligence.” As I alluded to in the opening paragraph of this column, this manages to be both obvious and unexpectedly insightful at the same time. On the one hand, “Duh.” On the other hand, “Wow, that’s right!”
But none of this is what I set out to talk about…
What I wanted to say is that all sorts of companies are diving deep into the AI waters. Some, like OpenAI, are creating large language models (LLMs). Some, like DeGirum, are creating Model Zoos (collections of pre-trained AI models that can be easily deployed in edge applications). Some, like Kinara (soon to become part of NXP), are creating AI-enabled system-on-chip (SoC) devices. And some, like Alif Semiconductor, are creating AI-enabled microcontrollers for edge applications.
So, what’s all this got to do with the chaps and chapesses at Virtium? As you may be aware, Virtium is famous in the embedded and industrial sectors for offering both SSDs and memory modules, with an emphasis on ruggedness, reliability, and customization. Virtium stands out for its focus on industrial environments, U.S.-based design and manufacturing (they also have design and manufacturing operations in Vietnam), and value-added features like firmware support and power-loss protection.
However, you may have missed the news that Virtium acquired the Swedish company Embedded Artists in December 2024. With a stellar 20+ years in the industry, Embedded Artists develops embedded hardware platforms with AI capabilities, especially through systems-on-module (SOMs) based on NXP processors with integrated neural processing units (NPUs). They focus on enabling edge AI in industrial, smart vision, and IoT applications, offering both off-the-shelf modules and custom design support.
Now, the guys and gals at Virtium Embedded Artists are entering the arena with a new AI-focused SOM. Earlier this week, they announced the launch of their iMX8M Mini DX-M1 SOM, which integrates a quad-core application processor and a 25 TOPS AI hardware accelerator chip with associated memory on a board with a compact 82mm x 50mm footprint.
Meet the iMX8M Mini DX-M1 SOM (Source: Virtium Embedded Artists)
By integrating an advanced NPU in the form of a DEEPX DX-M1 AI accelerator into the SOM, Virtium Embedded Artists eliminates the need for embedded device manufacturers who want to implement AI functions to plug a discrete AI processor module into their systems. This provides multiple benefits to OEMs and designers, including board space savings, simplified system design, a streamlined bill of materials, and faster time to market.
The new Virtium Embedded Artists SOM is based on the i.MX 8M Mini from NXP Semiconductors, an application processor that features four 1.6GHz/1.8GHz Arm Cortex-A53 CPU cores backed by 2GB of LPDDR4 memory, as well as a 400MHz Cortex-M4 controller core. The i.MX 8M Mini provides strong support for video and imaging applications thanks to its video engine with 1080p codec, 2D/3D graphics engine, and 4-lane MIPI-DSI interface, as well as a Gigabit Ethernet interface.
Working in tandem with the DEEPX DX-M1 NPU, the i.MX 8M Mini provides the image processing capability required to support AI-enabled vision systems used in popular and emerging vision applications. These include drones, security and surveillance, automated inspection and monitoring, transportation, MedTech, and AgTech.
The high-performance DX-M1 AI accelerator provides 25 TOPS of throughput at an average power consumption of 5W, making this SOM ideal for power-constrained edge computing applications. In this new Virtium Embedded Artists implementation, the SOM provides 4GB of LPDDR5 memory for the AI processor accessed via a 64-bit, 4-channel data bus, thereby allowing the DX-M1 to run multiple AI models concurrently without performance degradation.
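To put those two headline numbers into perspective, here’s a quick back-of-the-envelope sketch in Python. The 25 TOPS and 5W figures come from the announcement above; the 10 giga-ops-per-frame vision model is a purely hypothetical stand-in (not a published DX-M1 benchmark), and real pipelines will land well below the theoretical ceiling.

```python
# Back-of-the-envelope budgeting for an edge vision pipeline.
# The 25 TOPS and 5 W figures are quoted above; the per-frame workload
# is a hypothetical example, not a published DX-M1 benchmark.

PEAK_TOPS = 25.0      # peak accelerator throughput (tera-operations/second)
AVG_POWER_W = 5.0     # average power consumption (watts)

efficiency = PEAK_TOPS / AVG_POWER_W
print(f"Efficiency: {efficiency:.0f} TOPS per watt")                 # 5 TOPS/W

OPS_PER_FRAME = 10e9  # hypothetical model: 10 giga-ops per processed frame
ceiling_fps = (PEAK_TOPS * 1e12) / OPS_PER_FRAME
print(f"Theoretical ceiling: {ceiling_fps:,.0f} frames per second")  # 2,500 fps

# Memory bandwidth, pre/post-processing, and model utilization all eat
# into this, so treat these numbers as upper bounds, not promises.
```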
One final point that’s well worth reemphasizing in these days of despondency and doubt fueled by tariffs and trade restrictions is the fact that Virtium Embedded Artists has design and manufacturing operations both in the USA (which is ideal for American companies wishing to purchase these SOMs) and in Vietnam, which is advantageous for the rest of the world, including the European and Asian markets.
I can’t help myself. I’m left wondering what parental guidance questions my parents would have asked of ChatGPT had such a thing been around when I was growing up. But then I realize that it’s silly to speculate on such matters. My parents wouldn’t have had any questions because I was an absolute delight.
What say you? Do you have any thoughts you’d care to share on anything you’ve read here? For example, can you add to my introductory collection of “There are kids growing up today who will never have known a world without…” topics?
There are kids growing up now, and for some time into the future, who will never understand the meme power of marketing ploys.
There are kids who believe that we have systems that are somehow intelligent, and they don’t know that they are wrong.
There are adults who cannot understand the concept that “computers cannot think” and that computers are nothing like how the brain works.
Machine learning is not intelligent, and mining text to generate text is not a function of intelligence; it is actually wrong, and I hope one day that this stupid world will wake up and realise that it has been misled.
There will be kids who grow up knowing the history of the great AI con, and they will call it EI (ersatz intelligence) because it’s fake, and because building intelligence into systems is about “mining” sensed reality and interacting with it. Not something you can do with just a computer.
EI (Ersatz Intelligence)
The main thing is that you are staying cheerful, and you haven’t succumbed to bitterness LOL. On the one hand, I agree with your “computers cannot think” point. On the other hand, when they have the ability to sift through billions of inputs and present relevant responses to queries, it can certainly feel “close enough for government work.”
I also agree that mining text to generate text is not a function of intelligence. You actually missed a point here, because current/future generations of AI are/will be trained on content, much of which was generated by AI. Logically, over time, this should make the AIs less good at what they do.
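To make that concern a little more concrete, here’s a deliberately crude toy in Python. It has nothing to do with how any real LLM is trained; it just fits a simple statistical “model” (a mean and a spread) to data, generates from that fit, discards the rarest samples (a stand-in for generative models underweighting unusual content), and repeats.

```python
# Toy illustration of training on generated data, generation after generation.
# This is an analogy only; real model training is nothing this simple.

import numpy as np

rng = np.random.default_rng(1)

# Generation 0: "human-written" data with mean 0 and standard deviation 1.
data = rng.normal(loc=0.0, scale=1.0, size=1_000)

for generation in range(1, 11):
    mu, sigma = data.mean(), data.std()          # fit the "model" to last round's data
    samples = rng.normal(mu, sigma, size=1_000)  # the model generates new "content"
    # Stand-in for underweighting rare events: drop the most extreme 5 percent.
    lo, hi = np.percentile(samples, [2.5, 97.5])
    data = samples[(samples >= lo) & (samples <= hi)]
    print(f"Generation {generation:2d}: spread = {data.std():.3f}")

# The spread shrinks every generation: each pass loses a little more of the
# "tails" of the original data, which is the flavor of degradation described above.
```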
On the other hand, irrespective of whether you call it AI or EI, it’s becoming increasingly useful in many different areas of life. It’s like having a pocket calculator: very useful just so long as you know how to do the calculations by hand if you need to, and you don’t believe the results without cross-checking. (In the calculator case, I typically start by doing a rough calculation to get a “feel” for the expected result. That way, if the calculator gives me a result that’s off by an order of magnitude, I know something is wrong somewhere.)
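For what it’s worth, the same habit is easy to encode. Here’s a minimal sketch (my own illustrative helper, not anything from a real library) that compares a rough mental estimate against the precise answer and complains if they disagree by more than an order of magnitude.

```python
# A minimal "rough calculation first" check: flag any precise result that
# is more than a factor of ten away from the back-of-the-envelope estimate.

import math

def plausible(estimate: float, precise: float, factor: float = 10.0) -> bool:
    """Return True if the precise value is within `factor` of the estimate."""
    if estimate <= 0 or precise <= 0:
        raise ValueError("this check expects positive magnitudes")
    return abs(math.log10(precise / estimate)) <= math.log10(factor)

# Example: 37 items at $19.95 each. Mental estimate: 40 * 20 = 800.
rough = 40 * 20
exact = 37 * 19.95
print(exact, "looks plausible" if plausible(rough, exact) else "recheck the inputs!")
```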
“There are kids growing up now, and for some time into the future, who will never understand the meme power of marketing ploys. There are kids who believe that we have systems that are somehow intelligent, and they don’t know that they are wrong.”
True.
“There are adults who cannot understand the concept that ‘computers cannot think’ and that computers are nothing like how the brain works.”
This statement is based on the idea that “Only a brain can think.” This is true if you define the terms that way (a common problem, due to the limitations of semantics, which crops up in many complex subjects such as quantum mechanics). In the broadest terms, we don’t know exactly what thinking is, or even how a brain works (I assume when you use the term brain, you’re referring to a “healthy” human brain of a specific level of development). If we ignore the problem of language and base the question on defined effects, we may or may not create something that “thinks” in a way we can relate to. Alien possibilities (such as fungi) may generate an effect on us, but not one we can necessarily relate to as caused by “thinking.” On the simplest level, the human brain and the desktop computer both operate on binary manipulations, synaptic versus semiconductor, with the synaptic operation occurring at chemical speed (changing the sodium level) and the semiconductor at electromagnetic speed (limited, of course, by the clock function). Above that, the architecture is quite different. This does not mean we can’t emulate how a thought process is done, but it does mean we probably have a long way to go – if it is ever achieved.
Having said that, the question is still one of operance. The Turing Test simply implies that if you can’t tell the difference between a computer response and a human response, there isn’t a functional one. This ignores the limits of the person (or machine) doing the comparison. The final issue is one of effects. If the smartest, wisest, etc., person or people can’t tell the difference, the effects of the comparison will be equal.
“Machine learning is not intelligent, and mining text to generate text is not a function of intelligence; it is actually wrong, and I hope one day that this stupid world will wake up and realise that it has been misled.”
When you say “intelligent,” I have to assume you mean cognizant? That’s the big one. Current AI is not true “synthetic cognizance,” which these days is usually differentiated from the usual “fake AI” with the term AGI (Artificial General Intelligence). We’re still trying to define “cognizant” well enough to apply the term in some sort of practical way. Normally this means “a creature aware that it is different from all the stuff that surrounds it.” My dogs seem to be aware and, within limits, are “intelligent.” Can a computer be brought to any level of awareness – eventually? It may still be a real bad idea, too. Due to the limitations of semiconductor physics, work is currently being done on “wet chips” (biologically built, using DNA or such). Can such a creation still be called a computer? To answer such a question, we’re back to the limitations of semantics…
Naturally, the first question I had was, “Do their boards come in any other colors than, you know, red?”
My 19” rack is mostly a matte, crinkle-finish black, as befits a sober Protestant system, and red boards look too much like they’re there to have fun, rather than the hard work that builds moral fiber and true character. Oh sure, red is suitable for libertines and certain higher-income Catholics, but I want MY AI to take me seriously.
“…I want MY AI to take me seriously.”
I have bad news, I’m afraid, because my AI says that your AI doesn’t take you seriously at all.
I’ll bet your AI is really some guy in Bangalore with a bad haircut…
Harsh words, old friend 🙂