No Apparent Loss, But What Does It Mean?
Synopsys has joined an illustrious list of high-profile names in a growing club: companies that have been hacked.
This week, they announced an unauthorized breach of their “EDA, IP and optical products and product license files through its customer-facing license and product delivery system.” They were careful to note that “no customer project or design data in this system was accessed, and there is no indication that this incident affected any other Synopsys systems.” And, critically, “The license and product delivery system does not store personally identifiable information (PII) or payment card information (PCI).” They also noted that they’ve closed the means of access that was used.
I was able to chat with Synopsys’s Dave DeMaria to get a better understanding of what happened. So let’s start by laying out the situation.
Altera Kicks Up the Tools
It can be tough being in the lead. For years, one of the brightest spots in Altera Corporation's FPGA market battle with archrival Xilinx, Inc. has been their Quartus II tool suite. It’s no secret that Xilinx struggled with their old ISE tools, ultimately leading to a complete, ground-up redesign. Altera’s Quartus II continued doing its job, while Xilinx had the “new shiny object” - with all the excitement and problems that go along with re-designed tools.
But, what then do you do if you’re Altera? You’ve been ahead in tools for years, your customers are pretty happy with Quartus II, and they’ve invested a lot learning to use it effectively. Your competitor (and their customers) have bitten the bullet and started from scratch with a new but modern tool suite. You don’t want to disrupt the good thing you’ve got going with Quartus II, but neither do you want to have the perception that you’re the ones with the “old” tools. You’re in a bit of a pickle.
The Key to Optimizing the Value of Hardware Design and Verification Engineers
The electronics and semiconductor markets have always been very competitive, and the ongoing consolidation trend has raised the stakes even higher. Additionally, significant investments must be made and requirements set well before the first unit ships. This up-front effort contributes significantly to the electronics value chain, where concepts and algorithms are invented and implemented as hardware or software. The hardware side is especially challenging due to its permanence and per-unit variable cost. The task of the hardware engineering team is to implement those concepts and algorithms in competitive and cost-effective silicon and to exhaustively verify that the silicon will function as intended in every scenario.
Audio Weaver Claims Big Development Savings
One way of looking at the Internet of Things (IoT) is to think of it as endowing machines with human characteristics – in particular with respect to our ability to sense the world. To some extent, past efforts have served to sense things that we’re not so good at sensing precisely ourselves. Like temperature or pressure or orientation.
That helps humans, who can then combine their own senses with machine senses for better decisions. But if we further enable our machines to do what we can do, then we can have the machines do without us. (I know… not so good for jobs…). So cameras and microphones will replace our eyes and ears, playing an increasingly important role as our ability to process their data improves.
A Path to Flexible System Implementation
Firstly – if you are an existing FPGA user, you may not find much that is new in this piece, but really, it is not aimed at you. What would be useful is for you to share it with your system architect colleagues and your software colleagues, for whom much of this may well be new and useful.
You are beginning a new project – let's say a motor control system. You could assemble components on a board – possibly a processor, a DSP, an FPGA for peripherals, and a networking ASIC. The result is a relatively large board, with inherent reliability issues and a high BoM cost.
Intel… Then What?
We wrote a lot in these pages (even long before it happened) about the market and technology trends and pressures that led to Intel’s bid to acquire Altera. We dove into the data center and dug up the game-changing combination of conventional processors with FPGA fabric that can form a heterogeneous computing engine, which could conquer the current plague of power problems that limit the size and capacity of server farms around the world. We argued that Intel needed Altera - as a strategic defense to protect its dominance of the data center in a future world where FPGA-accelerated processing was the norm rather than the exception.
Intel came, offered, negotiated, and eventually won. Now, pending approval of various worldwide regulatory bodies, Altera will most likely become part of the world’s largest semiconductor company. But what then?
Synopsys Announcements at ITC Time
One of the notable skills repeatedly demonstrated by director Robert Altman was the ability to take a number of smaller stories and knit them together into a larger story. You know, one narrative that pulls the distinct, seemingly unrelated bits together into a larger truth.
Today, we’ve got a few – three, to be specific – little stories. How to weave them together? What’s the common theme? Well, test, for one. But that’s a blah theme – on reserve in case there’s nothing else.
Cadence Tensilica Vision P5 Lets the Light In
The Internet of People has cameras - literally billions of them. They are in smartphones, laptops, tablets, WiFi devices - it sometimes seems they’re watching our every move. This incredible volume of information is then (somewhat) intelligently analyzed, edited, and moderated by the vast visual computing power of the enormous array of human brains behind these cameras. The amount of computation required to filter, process, and interpret this image data is staggering. The end result is, of course, an almost infinite wasteland of cat videos on Facebook and YouTube. But video processing has higher purposes as well.
Blue Pearl Brings Sanity to Debug
We don’t need no stinkin’ FPGA debug tools! Debug tools are for those “other” engineers. You know, the ones who make mistakes.
As engineers, it’s hard to admit we’re fallible. Most of us have spent our careers, and maybe our whole lives, being lauded for our technical prowess. We pride ourselves on our ability to solve problems quickly, to design things that are robust and reliable, and to anticipate the twists and turns that the real world will throw at our creations.
That’s why I feel bad for the people at Blue Pearl Software who have to sell their tools. They have to begin the discussion at a place that’s already uncomfortable for most engineers. We don’t want to talk about our mistakes. We don’t really want to admit that we ever make them. And, on the rare occasion when we do have a problem, we try to forget about it as soon as possible. Our hindsight goggles may be 20/20 on the good stuff, but we can all be a little forgetful about our missteps.
Oh, and Ships Their First 16nm Fin-FET Zynq Device
Sure, the announcement that Xilinx is now “shipping” their first 16nm FinFET-based super-amazing Zynq UltraScale+ All Programmable MPSoCs is kinda a big deal. Zynq UltraScale+ is unquestionably the most capable SoC we’ve ever seen, and it is difficult to even imagine the game-changing applications that will be built with this device.
Just as a refresher, Zynq UltraScale+ is a multi-core heterogeneous computing device that includes quad-core, 64-bit ARM Cortex-A53s, dual-core Cortex-R5 real-time processors, a Mali-400MP graphics processor, enormous amounts of advanced FPGA fabric, a hardened H.265/H.264 codec unit, an “Advanced Dynamic Power Management Unit” for ASIC/ASSP-grade application power management, a configuration security unit to help lock down your design, forward-looking DDR4/LPDDR4 memory interface support, and copious amounts of on-chip ultra-high-speed “UltraRAM” for buffering and so forth.
Which Designers Need to be Schooled?
A simple, straightforward EDA whitepaper recently got me asking some demographic questions. The answers say something about where analog and digital designs are being done – and where they’re coming together. (And yes – sorry, but the Internet of Things (IoT) is involved.)
Whitepapers are popular these days. Nothing new about the whitepaper concept, but how many of them get written during a given period of time (shall we call it the “whitepaper density”?) varies with the economic cycle. And the density is high at the moment.
Whitepapers can be a really useful way to get a technical message out without it sounding like marketing – if done right. You can talk about your product, unlike in an editorial article. But if you keep the tone analytical and professional, it doesn’t come across as marketing. You hold back on the, “OUR PRODUCT EFFING ROCKS” enthusiasm. More like, “Given these numbers and the trends in the industry, there’s a high likelihood that our product would be perceived as effing rocking.”
What Is a Second and How Do You Measure It?
What is a second and how do you measure it?
It’s a bit like the chicken and the egg question. Do we improve accuracy in time-keeping in response to the needs of a new technology, or do we get new technologies because we can be more accurate in measuring time?
Early rural societies didn't need accuracy much greater than morning, afternoon, dinnertime, etc. As things got more sophisticated, accuracy became more important. Urban societies required more coordination, and so public clocks, often with bells to toll the hour and later the quarter hour, were set up. Long sea journeys, particularly those driven by the commercial and military needs of North America, drove the improvement in chronometers, where an accuracy of ±2 seconds a day was sufficient to avoid shipwreck.
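To get a feel for why a couple of seconds a day mattered, it's worth doing the arithmetic: the Earth rotates 360° in a day, so one second of clock error corresponds to 15 arcseconds of longitude. Here's a rough back-of-the-envelope sketch (assuming a spherical Earth and measuring distance at the equator – both simplifications of mine, not the author's):

```python
# Rough sketch: how a ship's clock error translates into a longitude error.
# Assumes a spherical Earth; distances are measured along the equator.

EARTH_ROTATION_DEG_PER_SEC = 360.0 / 86400.0   # 15 arcseconds of rotation per second
KM_PER_DEG_AT_EQUATOR = 40075.0 / 360.0        # equatorial circumference / 360 degrees

def position_error_km(clock_error_seconds: float) -> float:
    """Longitude error (in km at the equator) caused by a given clock error."""
    longitude_error_deg = clock_error_seconds * EARTH_ROTATION_DEG_PER_SEC
    return longitude_error_deg * KM_PER_DEG_AT_EQUATOR

# One second of clock error puts you off by roughly half a kilometer:
print(round(position_error_km(1), 2))    # → 0.46

# A chronometer drifting 2 seconds/day, accumulated over a six-week voyage:
print(round(position_error_km(2 * 42)))  # → 39 (km)
```

So a chronometer holding ±2 seconds a day keeps the accumulated position error on a long voyage down to tens of kilometers – close enough, with a lookout, to avoid the rocks.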
Synopsys Introduces HAPS-80
Moore’s Law, Moore’s Law, Moore’s Law... Up and to the right on a log scale. More of everything forever. Constantly getting bigger, faster and better; more complex, harder to design, more expensive to build, and… way, way harder to verify and debug.
Moore’s Law can make you dizzy standing still.
One thing we have learned in the past couple of decades is that prototyping complex systems before committing to silicon is basically mandatory. And programmable logic is by far the best and most widely accepted underlying technology for hardware prototyping. For most of the history of FPGA-based prototyping, the majority of companies have either built their own prototypes from scratch, using off-the-shelf FPGAs on boards they develop in-house, or assembled ad-hoc solutions comprising pre-made hardware from one company, design software from other companies, and the remainder of the bits and pieces scrounged together from whatever source was available.
Transforming the Family Business
You know that old Italian restaurant down the road a ways? Mamma – that’s what we called her; no one knew her actual name – ran that place for longer than anyone can remember. The recipes were secret. The books were… well, we don’t ask about the books. Money comes in, money goes out, shuddup and eat yer gnocchis, okay? Somehow, the bills got paid and the employees got paid and the customers kept coming.
And then something happened to Mamma. And the one person holding the whole thing together was no longer there to hold everything together. And now what happens? Folks know how to go on autopilot, so the plates of pasta keep coming, but sooner or later either Mamma’s secrets must all be unearthed or everyone simply has to find a new way. And that new way should learn from the challenges that arise when too much of the business is transacted inside the mind of one individual.
A Review of "Moore's Law: The Life of Gordon Moore"
More years ago than I want to think about, I expressed interest in a PR job at Intel. In a preliminary meeting, things were going well until I was asked how I dealt with confrontational situations. My reply, that I worked hard beforehand to make sure that the need for confrontation didn't arise, was clearly the wrong answer, and so - probably just as well - I never worked for Intel. After reading Moore's Law: The Life of Gordon Moore, Silicon Valley's Quiet Revolutionary, I now know why this was an important question.
Gordon Moore is the only widely known name from the founding fathers of the silicon age. This is in the main because of Moore's Law, which is misunderstood and misquoted daily. And, unlike many of the other founding fathers, he is still alive, and the company he founded is still a major (if not the major) player. So a biography of a living legend should be more than welcome.