
Synopsys Hacked

No Apparent Loss, But What Does It Mean?

Synopsys has joined an illustrious and growing club of high-value names: companies that have been hacked.

This week they announced a breach of their “EDA, IP and optical products and product license files through its customer-facing license and product delivery system.” They were careful to note that “no customer project or design data in this system was accessed, and there is no indication that this incident affected any other Synopsys systems.” And, critically, “The license and product delivery system does not store personally identifiable information (PII) or payment card information (PCI).” They also noted that they’ve closed the means of access that was used.

I was able to chat with Synopsys’s Dave DeMaria to get a better understanding of what happened. So let’s start by laying out the situation.

  • The breach involved their SolvNet portal, which is where engineers go for support, license files, and the like.
  • This network is physically distinct from other networks – in particular, from any remaining cloud computing infrastructure (which has diminished over the last couple years).
  • The attacked network is partitioned into three sections:
    • An area for downloading updates. Binaries are available here.
    • An area for accessing licenses.
    • An area for filing and managing support tickets.
  • The breach affected only the first two of those three areas.
  • No one actually purchases anything from this portal, which is why no financial artifacts were at risk.
  • This was not a cloud platform; users were executing their tools on their own company networks. There was nothing in the breach that enabled access to any company networks; this is why no customer design data was put at risk.
  • The attackers did have visibility into which customers had licenses. But given the number of companies that use Synopsys tools in at least one part of their design flow, there’s probably not a lot of news there.
  • They discovered the breach themselves based on unusual activity: a large number of licenses being downloaded in October. They then brought in an outside forensic company to figure out the parameters of the attack.
  • Law enforcement has been brought in to try to track who did it. Motivations are unknown at this point.

Part of me shrugs and says, “another day, another break-in.” But that just reflects a level of jadedness brought on by endless headlines trumpeting hacking projects with far worse results – like the wholesale sweeping up of massive numbers of credit card numbers.

Even though this seems to have been an event of far less consequence – no real consequence, in fact, if the announcement is taken at face value – it’s hard not to extrapolate to what could have happened – or might yet happen – to Synopsys or any of the other EDA guys, big or small. (Obviously bigger companies make bigger targets, depending on what you’re after…)

Synopsys was one of the original leaders in exploring the use of cloud computing for EDA. And there are really compelling reasons why leveraging the flexibility of the cloud can bring solid benefits to a project – except that those prospects are routinely nuked by concerns about security and the integrity of company IP when in the cloud. Synopsys’s reduced cloud activity reflects this challenge.

And it’s not just an EDA/chip design thing. I was once associated with a company doing software analysis, and they originally did that in the cloud. The cloud platform enabled an extraordinarily creative and useful visualization of difficult, complex software relationships. That visualization worked through the browser, eliminating the need to test the user interface (UI) on dozens of different computers. If the browser worked, then the UI would work.

In the end, however, resistance to uploading critical software apparently necessitated a non-cloud solution. Or perhaps a “private cloud” solution that leverages a company’s own server farm, under the (not always correct) assumption that their own stuff is harder to hack than a public cloud. In fact, to that last point, Synopsys found that, when sharing the news with individual customers, some of those customers voiced sympathy, since they’d suffered similar attacks.

Companies that want to leverage the cloud have to do so very carefully, and with nuanced messaging. We saw OneSpin’s approach to cloud usage for formal analysis a couple of years ago, which avoided sending any specific design data into the cloud. Instead, any needed proofs were abstracted and sent, one at a time, into the cloud to be solved, and each solution was recast into the concrete terms of the design back in the designer’s computer after it returned from the cloud.
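I can’t speak to OneSpin’s actual implementation, but the general pattern described above (anonymize before upload, solve remotely, map the answer back locally) can be sketched in a few lines. Everything here is made up for illustration: the function names, the signal names, and the trivial string-substitution “abstraction.”

```python
# Hypothetical sketch of the "abstract-then-ship" pattern: design-specific
# names never leave the premises; only an anonymized problem goes to the
# cloud, and the answer is recast into design terms back on-site.

def abstract(proof_obligation: str, symbol_table: dict) -> str:
    """Replace concrete signal names with opaque tokens before upload."""
    abstracted = proof_obligation
    for concrete, token in symbol_table.items():
        abstracted = abstracted.replace(concrete, token)
    return abstracted

def recast(result: str, symbol_table: dict) -> str:
    """Map the remote solver's answer back into the design's own terms."""
    for concrete, token in symbol_table.items():
        result = result.replace(token, concrete)
    return result

symbols = {"cpu_reset_n": "v0", "fifo_full": "v1"}
obligation = "assert !(cpu_reset_n && fifo_full)"

shipped = abstract(obligation, symbols)   # what the cloud would see
print(shipped)                            # assert !(v0 && v1)

answer = "counterexample: v0=1, v1=1"     # hypothetical solver reply
print(recast(answer, symbols))            # counterexample in design terms
```

The point of the pattern is that even a total compromise of the remote solver exposes only anonymized constraint problems, not the design itself.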

Solutions like the latter can really be pretty safe – and yet, when selling the story, you’re almost immediately on the defensive as you say you leverage the cloud and then have to quickly fill in the details even as you watch the customer’s crest fall. (I can’t speak for OneSpin’s specific sales experience here, but I can speak from my own experience trying to navigate such nuanced messages.)

To be clear, it does not sound like the Synopsys breach involved or got close to cloud infrastructure. The problem is, one can’t help wondering, “Is that because the cloud computing platform is that much more secure, or because the hackers simply didn’t look there this time?”

[rant] Part of me is constantly amazed that web infrastructure can be so easy to poke holes in. I’m no web programmer, but if two examples are any indication, there should, in fact, be no surprise. Because, as far as I can tell, the examples illustrate how the most fundamental rules we were taught about how to write good software don’t appear to apply to the web.

First example: when I learned how to write software, one of the cardinal rules was that you validate any input before acting on it. That way you’re not proceeding with nonsense input and producing unpredictable results (since you obviously didn’t include those nonsense inputs in your test suite). So you can imagine my surprise upon hearing that there were websites where you could literally place a database query into an input form and it would blithely go execute the query: the classic SQL injection attack. No questions asked, no validation to add a layer of separation between ridiculous or malicious input and the server response.
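A minimal Python/sqlite3 sketch makes the contrast concrete. The table, data, and attacker input here are hypothetical, purely for illustration:

```python
import sqlite3

# In-memory demo database (hypothetical schema and data).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

malicious = "nobody' OR '1'='1"

# Vulnerable pattern: splicing raw input into the query string.
# The attacker's OR clause makes the WHERE condition always true.
unsafe = conn.execute(
    "SELECT secret FROM users WHERE name = '%s'" % malicious
).fetchall()
print(unsafe)   # leaks the secret row: [('s3cret',)]

# Safe pattern: a parameterized query treats the input as data, not SQL.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (malicious,)
).fetchall()
print(safe)     # no row matches the literal string: []
```

The safe version is also less code, which is what makes the prevalence of the vulnerable version so baffling.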

The second example involves another cardinal rule of software design: users should be able to enter data in normal human formats (however arbitrary); the computer should take on the effort of normalizing those formats for storage and processing. Well, we seem to be going backwards on that one when it comes to the web.

Take the simple case of phone number fields. Ignoring international numbers for the moment, there are a few well-recognized formats for US phone numbers, and you’d think that, after however many years of web technology, there would be solid modules that capture all the possibilities, so that programmers don’t have to constantly rewrite this stuff and users no longer need to fret about which format to enter.

In other words, we should be improving beyond the old situation where you enter a valid phone number and the website responds with, “This is not a valid number. Please enter a valid number.” Without telling you what it’s looking for, of course. My response was always, “Yes, it’s valid; you’re just too stupid to understand it.” International numbers make it harder, but by now we should be able to handle those too. (I know, this is a tiny thing, except when you change phone numbers and have to edit 30 different website profiles, each of which has a different – and unstated – view on what constitutes a valid phone number.)

So have we improved things? No: the latest thing is that we’re supposed to magically know that we are now responsible for removing all formatting and just giving a ten-digit (or whatever) string of numbers all crammed together so that the programmers on the other end don’t have to bother with formats at all. We’re not getting better; we’re getting lazier. Rather than having the computer normalize the data, they’re making users do it.
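For the record, the server-side normalization I’m asking for is not hard. A sketch of what it could look like for US numbers; the function name and the simple rules here are my own illustration, not anyone’s production code:

```python
import re

def normalize_us_phone(raw: str):
    """Accept common human formats for a US phone number and return
    a bare 10-digit string, or None if it can't be a US number."""
    digits = re.sub(r"\D", "", raw)        # strip punctuation and spaces
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]                # drop a leading country code
    return digits if len(digits) == 10 else None

# All of these arbitrary-but-human formats normalize identically:
for s in ["(408) 555-1212", "408.555.1212", "+1 408-555-1212", "4085551212"]:
    print(normalize_us_phone(s))           # 4085551212, every time
```

Ten lines, and the user never has to guess which format the site wants.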

If these examples are any indication, it should be no surprise that the web is full of holes. Which leads to jadedness and a stifled yawn when yet another breach occurs. [/rant]

I also checked in with the other two Big Guys to see if they’d had break-ins (and what their transparency policies were). Mentor’s CIO Ananthan Thandry said, “To the best of our knowledge, no breaches have occurred at Mentor. Mentor Graphics has a policy of openly disclosing to our customers any events that could compromise proprietary or personal data they have entrusted to us.” As of this writing, I’ve had no official response from Cadence (and I’ll update if one comes in).

Does that mean that Mentor has a better website than Synopsys? I’m certainly not drawing that conclusion. Synopsys got nailed this time; who will be next? And there will be a next. Do we give up and assume that anything on the Internet will get hacked, period? That there is no such thing as true security? I know I’ve heard that expressed by pundits in the past.

Even if that’s not true, there’s no real way to demonstrate that, “Yeah, I know those guys got hacked, but we’ve done a better job with security.” It’s a tough messaging challenge for PR: would they ever fess up and say, “Yeah, we know we’ve got real security problems, and we’re hoping to find them before attackers do”? Not likely. But if they beat their chests and tout invincibility, then they’ve just announced a hackathon. And even if a company’s web code is solid, there’s no test or benchmark that you can pull out to show that this is, in fact, the case.

To that last point, it turns out that a recent Synopsys acquisition, Codenomicon, had started working with Underwriters Laboratories (UL) to address specifically this issue. Synopsys has continued that effort; the ultimate goal would be some sort of “stamp” on a website that certifies it as UL-approved, much the way toasters and electric razors are. It’s confidence that you won’t get burned, only this time figuratively.

That will perhaps help build trust in cloud-based tools and environments. Which has implications for EDA, but even more so for the public in a cloud-connected Internet of Things world. Well, except that the public, to date, has been only nominally concerned about turning over their prize jewels to the Internet. Unlike EDA customers…

 

More info:

Synopsys’s full announcement

 
