feature article archive

Cadence Brings Clarity to EMI

EMI is the ghost in our machines, the phantom of our electronic operas. We create our systems with a specific purpose, and our engineering efforts aim to hone and optimize toward that goal. At the same time, lurking in the copper traces and wayward return paths are silent specters seeking to derail our plans. They haunt our designs undetected, biding their time until the final day of reckoning, when they make their presence known, trashing our schedules and wreaking havoc on our best-laid plans.

They say there are two … Read More → "Cadence Brings Clarity to EMI"

Best and Worst Smart Home Interfaces

“‘Easy to use’ is easy to say.” – Jeff Garber

My tech-savvy kids are preparing to move to another country, so I just inherited a lot of “smart home” gadgets that won’t work in their new place. Now I’ve got a box full of smart connected electrical outlets, Wi-Fi wall switches, wireless security cameras, and other gizmos. Hey, free toys! But configuring them all back to back was an exercise in frustration – with a few notable exceptions. 

Naturally, every single device … Read More → "Best and Worst Smart Home Interfaces"

Updating the Centuries-Old On/Off Switch

“I’m so fast that last night I turned off the light switch in my hotel room and was in bed before the room was dark.” — Muhammad Ali

In the Beginning, there were electrons. Then came wire. Then the switch. And it was good. 

At some level, wires and switches are all we need to make digital electronic systems. Submicron transistors are the switches and stacked metal layers are the wires. Progress, eh? 

But some systems … Read More → "Updating the Centuries-Old On/Off Switch"

If AMD Buys Xilinx

Six years ago, we speculated about what would happen if Intel were to buy Altera. A year later, they did, and a lot of our speculation came true – Intel has leveraged Altera technology to defend their dominance of the data center, the FPGA market has changed direction, and the Altera culture has been largely assimilated into the larger Intel pool. The decades-long feud between Xilinx and Altera has cooled.

Two years ago, we … Read More → "If AMD Buys Xilinx"

What Is a Compiler, Anyway?

“We still have judgement here, that we but teach bloody instructions which, being taught, return to plague th’inventor.” – Macbeth, 1.7

Today we dive into Computer Programming 101. 

Computers don’t speak English. Or Mandarin, or German, or Spanish, or any other human language. Despite how Siri and Alexa may appear, computers and other gadgets are native speakers of their own binary tongues, languages that we can’t understand. 

That means that if you want to program a … Read More → "What Is a Compiler, Anyway?"
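The article's point about binary tongues is easy to see for yourself. This snippet isn't from the article, but Python's standard `dis` module lets you watch the interpreter's own compiler translate human-readable source into the bytecode instructions the machine actually executes:

```python
import dis

# A trivial function: the CPython compiler translates its source into
# bytecode instructions, the interpreter's own "binary tongue".
def add(a, b):
    return a + b

# Prints the compiled instruction stream; exact opcode names vary by
# Python version (e.g., BINARY_ADD in older releases, BINARY_OP in newer).
dis.dis(add)
```

The disassembly is several steps removed from the x86 or Arm machine code a C compiler would emit, but the translation idea is the same: source text in, opcodes out.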

The JOYCE Project to Equip Machines with Human-Like Perception

Did you ever watch the British television science fiction comedy Red Dwarf? The stage for this tale is the eponymous spaceship Red Dwarf, which is an enormous mining vessel that is 6 miles (10 km) long, 5 miles (8 km) tall, and 4 miles (6 km) wide. Series 1 through Series 8 originally aired on BBC 2 between 1988 and 1999 (somewhat reminiscent of Whac-A-Mole, there were reboots in 2009, 2012, 2016, 2017, and 2020).

The underlying premise follows low-ranking technician Dave Lister, who awakens after being in suspended animation for three million years to find … Read More → "The JOYCE Project to Equip Machines with Human-Like Perception"

Weird Instructions I Have Loved

“Simplify and add lightness.” – Colin Chapman

If you don’t write assembly-language programs, you’ll miss out on some strange, interesting, wonderful, or uniquely powerful instructions lurking inside your processor. Some are immensely helpful. Others are just… weird. 

Any processor can add and subtract, and most can multiply integers. A few can even do integer division. And some have a floating-point unit (FPU) for dealing with fractions. 

But can your chip do 4-dimensional transforms? Table lookups? … Read More → "Weird Instructions I Have Loved"
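For a taste of what such special-purpose instructions buy you (this example is ours, not the article's): population count — how many 1 bits are set in a word — is a single instruction on many CPUs (POPCNT on x86), but in plain software it takes a loop:

```python
# Population count: how many 1 bits are in a word. Many CPUs provide a
# single instruction for this (e.g., x86 POPCNT); without it, software
# falls back to a loop or a lookup table.
def popcount(x: int) -> int:
    count = 0
    while x:
        x &= x - 1   # clear the lowest set bit (Kernighan's trick)
        count += 1
    return count

print(popcount(0b1011_0010))  # → 4
```

Recent Python (3.10+) exposes the same operation directly as `int.bit_count()`, which is the kind of convenience that hints at the hardware instruction underneath.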

Using Not-a-Blockchain to Secure Embedded Devices

“Most people want security in this world, not liberty.” — H. L. Mencken

Mention “distributed ledger technology” and you’ll either get a blank stare or someone will shout, “Blockchain!” The underlying mathematics behind cryptocurrencies and blockchain can, like any technology, be used for different purposes. The headlines tend to emphasize those that affect consumers, but there are a lot of other ways we can employ distributed-ledger technology without setting up massive cryptocurrency server farms. 

Iota Foundation … Read More → "Using Not-a-Blockchain to Secure Embedded Devices"
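None of this code appears in the article, but the core ledger idea it alludes to can be sketched in a few lines: each entry carries a hash of its predecessor, so tampering with any past entry breaks every link that follows.

```python
import hashlib

def entry_hash(prev_hash: str, payload: str) -> str:
    """Hash an entry together with its predecessor's hash, forming a chain."""
    return hashlib.sha256((prev_hash + payload).encode()).hexdigest()

# Build a tiny append-only ledger of (payload, hash) pairs.
ledger = []
prev = "0" * 64                       # genesis entry: all-zero hash
for payload in ["sensor=21.5C", "sensor=21.7C", "door=open"]:
    prev = entry_hash(prev, payload)
    ledger.append((payload, prev))

def verify(ledger) -> bool:
    """Recompute the chain from genesis; any tampered entry breaks it."""
    prev = "0" * 64
    for payload, h in ledger:
        prev = entry_hash(prev, payload)
        if prev != h:
            return False
    return True

print(verify(ledger))  # → True
```

A real distributed ledger adds consensus among many nodes on top of this chaining, but the tamper-evidence property starts here, with or without a cryptocurrency attached.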

Pulling Yourself up by Your Bootstraps

A couple of days ago as I pen these words, I received a message on LinkedIn from someone asking, “Can you advise me about what books a beginner can peruse for learning NASM?” To be honest, this was a bit of a tricky one, not least because I didn’t have a clue what NASM was, so I did what I usually do in a time of crisis, which is to have a surreptitious Google.

You can only imagine my surprise and delight to discover that NASM stands for “Netwide Assembler,” which is an assembler and disassembler … Read More → "Pulling Yourself up by Your Bootstraps"

featured blogs
Oct 20, 2020
Voltus™ IC Power Integrity Solution is a power integrity analysis and signoff solution that is integrated with Cadence's full suite of design implementation and signoff tools to deliver the... [Click on the title to access the full blog on the Cadence Community site...]
Oct 19, 2020
Have you ever wondered if there may be another world hidden behind the facade of the one we know and love? If so, would you like to go there for a visit?...
Oct 16, 2020
Another event popular in the tech event circuit is PCI-SIG® DevCon. While DevCon events are usually in-person around the globe, this year, like so many other events, PCI-SIG DevCon is going virtual. PCI-SIG DevCons are member-driven events that provide an opportunity to le...
Oct 16, 2020
[From the last episode: We put together many of the ideas we've been describing to show the basics of how in-memory compute works.] I'm going to take a sec for some commentary before we continue with the last few steps of in-memory compute. The whole point of this web...