
A Nice DATE in Nice

I have to confess a weakness: I love trade shows and exhibitions. I enjoy being involved in designing, creating and setting up trade show booths, enjoy working on a booth and talking to new people, and enjoy walking the aisles and learning about new products and new companies, particularly at a crowded show with a special buzz. I believe that, properly used, a trade show is an efficient weapon in the promotional arsenal that can easily repay the exhibitor's investment, and an equally efficient way for visitors to add to their professional knowledge and expertise. Yet, apart from Germany (and, I am told, Japan), trade shows are becoming less and less popular.

Take DATE: for some years in the late '90s it was a large and flourishing show, well attended and well supported by industry, alongside a top-level conference with industry and academic input. Companies often used the exhibition and related press conferences to pre-brief on their DAC announcements, or to announce European versions of their tools. This year there was a small exhibition alongside a very flourishing conference.

The conference attracted a record number of submissions, and the papers covered the full gamut of EDA. The panels and keynotes I attended and the briefings I received tended to be on subjects clustered at the beginning and end of the IC design chain. At the end of the chain, the issues of moving to ever-smaller process nodes were reflected in renewed interest in the physics of transistors and the issues of design for manufacturing.

The starting point of the chain is increasingly being seen at the system level – ESL is becoming reality after years of hype and mirrors. This theme ran from the first keynote to one of the last workshops. In part it reflects what Wally Rhines, of Mentor, sees as a significant future issue: developing and debugging the software within ASICs and SoCs. This emphasis also reflected another aspect of today’s products: the increasing use of multiple processors in systems and the issues of programming and debugging them. The two, previously separate, worlds of designing ICs and implementing embedded systems are increasingly converging with the issues of multiple processors and software common to both. This was reflected in the conference keynote from Turing Award winner Joseph Sifakis of France’s Verimag research lab.

The other conference keynote, from Mike Muller, CTO of ARM, "Has anything changed in electronic design since 1983?", was light-hearted in its approach but serious in its message. While clearly the end products have changed (the mobile phone had just been introduced in 1983 and was the size of two house bricks, for example), one thing Mike claimed hadn't changed was: "Never trust a hardware guy when he is talking about multi-core."

While in 1983 the talk was of multi-processor rather than multi-core, the issues of programming and debugging remain the same 25 years later. There is still no magic bullet to turn legacy code into parallel code, the tools for debugging parallel systems are still under-developed, and Amdahl's Law, which says that the speedup gained by parallelising an application is limited by the execution time of its largest sequential segment, still holds good.
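Amdahl's Law is easy to see with a few lines of arithmetic. A minimal sketch (the function name and figures are mine, chosen purely for illustration):

```python
# Amdahl's Law: overall speedup is capped by the serial fraction of the work,
# no matter how many cores run the parallel portion.
def amdahl_speedup(serial_fraction, n_cores):
    """Speedup of a task when its parallel portion runs on n_cores."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cores)

# Even with 90% of the work parallelised, four cores give only about 3.1x,
# and the limit as core counts grow is 1 / 0.1 = 10x.
for cores in (2, 4, 16, 1024):
    print(cores, round(amdahl_speedup(0.1, cores), 2))
```

The last line of output makes the point: past a modest core count, the sequential 10% of the application dominates, and adding hardware buys almost nothing.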

This was also the subject of a panel session, where several companies looked at the issues from very different perspectives. ARM is looking at synthesisable multi-core processors: with the Cortex-A9, depending on the application, it is possible to create a device with between one and four processors, alongside cores for clusters such as the forthcoming Cortex-M8/9. PicoChip was talking about its picoArray, a multi-core approach that has been used for building DSPs, giving the company a strong presence in the femtocell market. ST-Ericsson is also using multi-core in communications, this time in the handset, to keep up with what seems an insatiable desire from users for more features. Another panel member was NXP, which has been using collections of different types of processor, including VLIW (Very Long Instruction Word) cores, to build television control, which gets ever more complicated. These were all hardware guys: what was Mike Muller's comment again?

A panel that left rather mixed feelings discussed whether there was still a role for start-ups. A number of senior executives from well-established companies presented what they felt were strong and logical reasons why the day when a start-up could make a difference had passed. It was faintly reminiscent of the mythical comment that there was no need for the patent office to continue since everything had already been invented. It was left to one of the audience to point out that start-ups have provided the bulk of innovative ideas throughout the history of the silicon industry, and that, historically, many of them were acquired by established companies looking for innovative ideas before they reached an IPO.

One of the interesting things about exhibitions is that, as well as catching up with companies you know, you get to meet new ones. New to me was a small French company founded by two brothers and devoted to modelling the thermal behaviour and power characteristics of ICs at the system level. The company, DOCEA Power, has a product called ACEplorer (Abstract Concept of Energy) which, working at the ESL level, above the RTL, uses an XML description to create the ACE power model of a system, either from IP-XACT and similar standards, or by hand using a state machine/hierarchical approach within a graphical interface. The model can be used to generate power intent in UPF (Unified Power Format – now IEEE P1801 if you have not been following the power saga). Thermal modelling can be from a Spice model or using DOCEA tools.

Once the system is modelled, then it is possible to explore power and thermal architectures through simulating behaviour. DOCEA claims that it can provide thermal simulation of a complex SoC in 0.1s compared to the four hours needed for simulations using finite-element methods.
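The appeal of working at this level of abstraction is that a power model reduces to simple bookkeeping over states. A toy sketch of the state-machine idea (all names, states and power figures here are mine for illustration; this is not DOCEA's ACE model format or API):

```python
# A toy state-machine power model in the spirit of ESL power exploration:
# each state of a block has an average power, and a usage scenario is a
# sequence of (state, duration) steps. Summing power x time gives the
# energy budget for the scenario, fast enough to explore many architectures.
STATE_POWER_MW = {"off": 0.0, "idle": 2.0, "active": 120.0}  # illustrative figures

def scenario_energy_mj(steps):
    """Energy in millijoules for a list of (state, seconds) steps."""
    return sum(STATE_POWER_MW[state] * seconds for state, seconds in steps)

scenario = [("active", 0.5), ("idle", 3.0), ("off", 10.0)]
print(scenario_energy_mj(scenario))  # 120*0.5 + 2*3.0 + 0 = 66.0 mJ
```

Because each evaluation is just a sum over a handful of states, sweeping hundreds of candidate scenarios or power architectures takes fractions of a second, which is the kind of speed advantage over finite-element simulation that DOCEA is claiming.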

Returning to my ruminations on the effectiveness of visiting trade shows: in three days I was exposed to a range of speakers, panels, companies and ideas. Canny exhibitors had contacted me to set up meetings – and they also set up meetings with customers and prospects. Their booths were hopping, while alongside them the booths of, possibly, equally interesting companies were standing empty, apart from a bored salesman. I also planned my visit, scheduling slots to listen to papers and attend panels and requesting meetings with exhibitors I thought looked interesting. In Germany (and, I am told, Japan), trade shows/exhibitions continue to draw visitors for whom attending a trade show in a systematic way is seen as an important part of their development as engineers. And since the visitors are coming, so are the exhibitors, with most of them not spending huge amounts of money on ego-stroking booths, but using the booth as a work-place for meetings and demos.

There is an argument that the web has replaced the trade show, and there is some strength in this. But at a good show you can ask questions, see demonstrations, and look the sales guy straight in the eye. You can repeat the process as many times as necessary when evaluating competing products.

Still to come is this year's very late DAC. (It is the last week in July.) The chair of DAC, Andrew Kahng, was active in Nice and making a good pitch for the event, particularly as both Cadence and Magma are back as exhibitors. But at the time of writing, the exhibitor list still looks a little thin, especially considering the event is in San Francisco, with easy access from across the world as well as from Silicon Valley. There are some twenty companies (including DOCEA) exhibiting for the first time. My feeling: if you are not going to the conference, think about going to the show. Plan your visit in advance, and, if you make appointments (and I urge you to), make sure that they are, as far as possible, in logical progression around the hall. It is a long walk from one end to the other!

Give it a try and then let me know what you think.
