I’m sure I’ve asked you this before, but have you ever read Dirk Gently’s Holistic Detective Agency by the late, great Douglas Adams? If so, I’m sure you’ll agree that it’s hard to beat the author’s own description of this tale as a “thumping good detective-ghost-horror-whodunnit-timetravel-romantic-musical-comedy-epic.”
This was followed by The Long Dark Tea-Time of the Soul. There was supposed to be a third novel in the series—The Salmon of Doubt—but, sadly, Douglas passed away before completing it.
Dirk is a detective. His real name is Svlad Cjelli. He changed it because he felt this name didn’t inspire much confidence for someone trying to run a detective agency. He picked “Dirk” (where a “dirk” is a type of dagger) to sound forceful or authoritative, perhaps even a bit threatening or mysterious, but then added “Gently” to soften the blow (no pun intended) and make him seem more approachable.
The guiding principle underlying Dirk’s detective methods is his unfailing belief in “the fundamental interconnectedness of all things.” Rather than looking for direct evidence or following traditional leads, Dirk proceeds on the assumption that everything in the universe is linked, so any clue, no matter how trivial or bizarre, could ultimately prove relevant.
Thus, rather than following a neat logical trail, Dirk’s investigations often involve wild tangents—like time travel, ghosts, poetry, or an Electric Monk riding a horse through a bathroom. At its heart, the concept is both a satire of causality and a whimsical nod to chaos theory: small events can have enormous consequences, and you can’t always predict which ones will matter. In short: everything is connected, even if it seems absurd (especially if it seems absurd).
You may scoff at the idea of the interconnectedness of things—take your time; I’ll wait until you’ve finished—but I constantly run into this sort of thing in my own life.
Around the beginning of May 2025, for example, I was on a video conference call with Jinger Zeng, who is the Community Manager at Hackster.io. Avnet has positioned Hackster.io as the flagship project-based, hardware-focused developer network under its community umbrella, which also includes platforms like element14 and Avnet Boards.
The purpose of our call was to discuss the possibility of my writing a regular “Throwback Thursdays” column talking about technologies of yesteryear (as an aside, I’m happy to report that my first two columns have already gone live: Bodacious Wooden Breadboards and Springing Into Action with Spring Connectors).
The reason I mention this here is that our conversation took place the week before this year’s Embedded Online Conference (see Only the Most Epic Embedded Online Conference Ever!) at which I was destined to give a presentation (see AI in Embedded Systems and Life Writ Large), so I asked Jinger if she was planning to attend.
Jinger replied that she wasn’t able to make this year’s event due to previous commitments, but she mentioned that her friend, Kwabena Agyeman, Co-Founder and President at OpenMV, would also be presenting at the conference (see How AI Accelerated Microcontrollers Will Change Embedded Systems), and she said Kwabena’s talk promised to be a cracking good presentation.
To be honest, I’m almost invariably up to my armpits in alligators work-wise, and I spend much of my days running around in ever-decreasing circles shouting “Don’t Panic!” As a result, I probably wouldn’t have taken the time to watch Kwabena’s presentation prior to Jinger’s recommendation, but I’m so glad I did.
Following Kwabena’s talk, I reached out to him, and we ended up having a conversation of our own. We began by discussing Zephyr, a real-time operating system (RTOS) designed for devices with as little as 8 KB of RAM. Zephyr is increasingly being recognized as an important part of the future of embedded systems, especially for resource-constrained, secure, and connected devices. Kwabena made it clear that he’s not trying to start a culture war, but for the kind of work the folks at OpenMV are doing, he believes that running MicroPython bare-metal is the better approach.
And what are the folks at OpenMV doing? Well, the clue lies in the name because OpenMV stands for “Open Machine Vision.” OpenMV’s goal is to make machine vision accessible, affordable, and programmable for makers, engineers, and educators—the way Arduino made microcontrollers accessible to everyone. The name reflects the following core ideas:
Open: The project is open source:
- The firmware, software (OpenMV IDE), and hardware designs are published under permissive licenses.
- Community-driven contributions are encouraged.
Machine Vision: The core capability:
- OpenMV boards let microcontrollers perform computer vision tasks like face detection, object tracking, QR code recognition, AprilTag detection, and more, all without needing a PC or GPU.
- These tasks are written in MicroPython and run directly on embedded hardware (Arm Cortex-M or AI-enabled microcontrollers like the Alif Ensemble E3); see the minimal sketch below.
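To give you a feel for what this looks like in practice, here’s a minimal sketch of the sort of MicroPython script you’d run on an OpenMV camera. It uses the sensor and time modules from OpenMV’s firmware to grab frames and flag any QR codes it finds (treat this as illustrative; the exact calls may vary a little between firmware versions).

```python
# Minimal OpenMV-style MicroPython sketch: detect QR codes in the live feed.
# Runs entirely on the camera; no PC or GPU required.
import sensor
import time

sensor.reset()                       # Initialize the camera sensor
sensor.set_pixformat(sensor.RGB565)  # 16-bit color
sensor.set_framesize(sensor.QVGA)    # 320x240 resolution
sensor.skip_frames(time=2000)        # Give the sensor time to settle

clock = time.clock()                 # Track frames per second

while True:
    clock.tick()
    img = sensor.snapshot()
    for code in img.find_qrcodes():  # Built-in QR code detection
        img.draw_rectangle(code.rect(), color=(255, 0, 0))
        print("Payload:", code.payload())
    print("FPS:", clock.fps())
```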
Now, I know I’m a bit biased, but I have to say that I love the AI-enabled microcontrollers coming out of Alif Semiconductor. In the case of the Ensemble E3, for example, we’re talking about an ultra-low-power postage-stamp-sized microcontroller that contains:
- A dual-core Arm Cortex-M55 configuration, with one core running at 400 MHz and the other at 160 MHz.
- 2 × Arm Ethos-U55 microNPUs, one delivering ~204 GOP/s (256 MACs/cycle) and the other ~46 GOP/s (128 MACs/cycle), plus a 2D GPU, enabling real-time machine vision tasks like YOLO object detection at ~30 fps (I unpack that GOP/s math below).
- Crucially, it also includes 13.5 MB of on-chip SRAM, which is a massive amount of memory for an MCU.
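In case you’re wondering where numbers like ~204 GOP/s come from, the arithmetic is straightforward (assuming the larger microNPU is clocked at the same 400 MHz as its companion CPU core): 256 MACs/cycle × 400 MHz ≈ 102.4 billion MACs per second, and since each multiply-accumulate counts as two operations, that works out to ~204.8 GOP/s.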
Historically, machine vision has been locked behind expensive, complex, proprietary industrial systems. OpenMV changes that by:
- Offering compact, low-power modules.
- Running real-time vision on the edge.
- Giving users a Python scripting environment (MicroPython) to interact with camera feeds (see the face-detection sketch below).
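As a quick taste of that scripting environment, the following sketch is along the lines of OpenMV’s classic face-detection example, using the frontal-face Haar cascade built into the firmware (again, a sketch rather than gospel; check the OpenMV documentation for your firmware version):

```python
# Minimal OpenMV-style MicroPython sketch: frontal-face detection
# using the Haar cascade built into the OpenMV firmware.
import sensor
import image
import time

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)  # Haar cascades operate on grayscale
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=2000)

# Load the built-in frontal-face cascade (more stages = fewer false positives)
face_cascade = image.HaarCascade("frontalface", stages=25)

clock = time.clock()
while True:
    clock.tick()
    img = sensor.snapshot()
    # Search the frame for face-like features
    for face in img.find_features(face_cascade, threshold=0.75, scale_factor=1.25):
        img.draw_rectangle(face)  # 'face' is an (x, y, w, h) bounding box
    print("FPS:", clock.fps())
```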
OpenMV’s latest and greatest offerings are two powerful AI vision cameras that were unveiled in March 2025: the OpenMV N6 and OpenMV AE3. In a crunchy nutshell, these bodacious beauties can be summarized as follows:
OpenMV N6 ($120 at the time of this writing)
- Processor: STM32N6 with an 800 MHz Cortex-M55 CPU and a 1 GHz NPU delivering ~600 GOP/s.
- Camera & Power: 1 MP global-shutter camera at 120 fps, hardware H.264/JPEG encoder; under 0.75 W consumption.
- Connectivity: Wi‑Fi, BLE, gigabit Ethernet, and plenty of GPIO.
- Performance: Up to 600× faster ML performance compared to older OpenMV models.
The OpenMV N6 (Source: OpenMV)
OpenMV AE3 ($80 at the time of this writing)
- MCU: Alif Ensemble E3 (dual-core Cortex-M55 at 400 MHz + 160 MHz) with dual Ethos-U55 NPUs (~250 GOP/s combined).
- Camera & Power: 1 MP global-shutter camera; ultra-low power (<0.25 W at full operation, <2.5 mW deep sleep).
- Extras: IMU, microphone, 8×8 distance sensor, Wi‑Fi, and BLE onboard.
- Form Factor & Performance: Just 1″ × 1″ and capable of running YOLO-style algorithms at ~30 fps.
The OpenMV AE3 (Source: OpenMV)
Common N6/AE3 Highlights
- MicroPython support via the OpenMV IDE: Write Python code to interface with vision models directly.
- ULAB module: Enables efficient NumPy-like processing for AI workflows (a small example follows this list).
- Training integrations: Partnerships with Roboflow and Edge Impulse for seamless AI model development and deployment.
- Extensible ecosystem: Compatible accessories include thermal (FLIR Lepton, Boson) and event cameras, plus upcoming LTE-M, NB-IoT, and GPS connectivity shields, as well as rugged enclosures.
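With regard to that ULAB module, here’s a tiny example of the sort of NumPy-like post-processing you might perform on-device. The logits are made-up values purely for illustration, but the softmax math is the real deal:

```python
# ulab provides NumPy-like array math in MicroPython.
from ulab import numpy as np

# Made-up raw outputs (logits) from a hypothetical on-device classifier.
logits = np.array([2.0, 0.5, -1.0, 0.1])

# Softmax: subtract the max first for numerical stability,
# then exponentiate and normalize, all as vectorized operations.
exps = np.exp(logits - np.max(logits))
probs = exps / np.sum(exps)

print("Class probabilities:", probs)
print("Top class index:", np.argmax(probs))
```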
Between the two, the N6 is the high-performance flagship (1 GHz NPU, better connectivity), while the AE3 offers a stunningly low-power, ultra-compact alternative. Both enable advanced edge-AI vision tasks.
I don’t know about you, but I can envision a lot of machine vision applications in my future. And, speaking of you, what do you think about all this? Do you have any thoughts you’d care to share with the rest of us?