posted by Dick Selwood
The Imitation Game
We rarely do film reviews - mainly because we rarely see films that are relevant - but the Imitation Game is about Alan Turing, the breaking of the German Enigma code and the invention of the computer. Or is it?
Turing was a mathematical genius, probably well along the Asperger's spectrum; he worked as a code-breaker and was gay. Whether he invented the computer or was the catalyst that helped others define one is one of history's great debates.
The film works through flashbacks from 1952, when he is being interrogated for homosexual acts - then illegal - to around 1940 and the Bletchley Park code-breaking centre, and also to 1929/30 when he was at school. While the film may have some aspects of Turing as a person correct (although downplaying his sexuality), the writer felt it necessary to over-dramatise events. So while Turing was already working on cryptography and the German Enigma machine in 1939, he is shown as arrogantly demanding to work at Bletchley in 1940. He hand-builds the first Bombe, appealing to Churchill for support to complete it. He was indeed part of a group who appealed to Churchill for more resources, but only after a number of Bombes had been built by professional engineers and used to break codes; the group wanted more machines to keep up with the flood of messages.
However, the film avoids the incredible crudity of the film U-571, in which a US warship is shown retrieving code books and an Enigma machine from a German U-boat in 1942. The incident was based on reality – just. Sailors from HMS Bulldog retrieved an Enigma and code books from U-110 in April 1941.
It barely mentions that he died of self-inflicted cyanide poisoning rather than continue with the chemical castration imposed after his conviction. With all these caveats, it is an entertaining way to spend around two hours. Then go out and read about the real Turing; http://www.turing.org.uk is a good starting point.
And the film title? The Imitation Game was Turing's own name for what we would now call the Turing Test: can you, through question and answer, determine whether you are talking to a human or a computer?
posted by Bryon Moyer
There’s a bit of repositioning going on in EDA-land. It involves Synopsys’s popular Verdi tool, acquired through the SpringSoft purchase. Conceived as a flexible debug tool, it also has an open scripting environment that gives engineers access to data in the fast signal database (FSDB) file. With that capability, folks have been bolting analysis utilities onto Verdi for a while on an ad hoc basis.
This hasn’t gone unnoticed at Synopsys, and they’re now in the process of repositioning Verdi: it’s not just for debug anymore. While it obviously still includes debug in its expanded portfolio, Synopsys is adding features that don’t necessarily fit the debug profile.
One of those is Verdi Coverage. This is intended to help build and track a verification plan that is tightly synchronized with the design requirements. The concept might be familiar to any of you who have seen similar tools in the software space from companies like LDRA.
The assumption here is that verification tests spring from requirements. (If it’s not required, then why are you testing it?) And all requirements should be documented in a requirements document. Verdi Coverage lets you tie tests to requirements and tick off coverage at the requirements level.
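To make the idea concrete, here is a minimal sketch of what "tying tests to requirements and ticking off coverage at the requirements level" could look like. None of these names come from the Synopsys API; the `Requirement` class, `coverage_report` function, and the example requirement IDs are all hypothetical illustrations of the concept.

```python
# Hypothetical sketch of the requirements-to-tests linkage behind a tool
# like Verdi Coverage. All names here are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str
    text: str
    tests: list = field(default_factory=list)  # tests linked to this requirement

    def covered(self, passed_tests):
        # A requirement counts as covered only if it has at least one
        # linked test and every linked test has passed.
        return bool(self.tests) and all(t in passed_tests for t in self.tests)

def coverage_report(requirements, passed_tests):
    """Return the fraction of requirements whose linked tests all passed."""
    covered = [r for r in requirements if r.covered(passed_tests)]
    return len(covered) / len(requirements)

reqs = [
    Requirement("R1", "FIFO shall not overflow", ["test_fifo_full"]),
    Requirement("R2", "Reset clears all counters", ["test_reset"]),
    Requirement("R3", "CRC matches the spec", []),  # no test linked yet!
]
print(coverage_report(reqs, {"test_fifo_full", "test_reset"}))
```

The point of the third requirement is the interesting one: an untested requirement shows up as a coverage gap at the plan level, which is exactly the "if it's not required, why are you testing it?" discipline turned around into "if it's required, where is the test?"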
Where this can be particularly helpful is when requirements change. Yeah, it’s a thing; it happens. Who knew. Verdi Coverage tracks the requirements documents and can notice when changes occur. This allows you to go in and modify the verification plan accordingly, if needed.
How do they do that? They rely on the PDF file format for the document. Best practice is to use an outline structure in that document. They capture the text from the document, along with some meta-information about where the text is to be found.
And when the document changes? How can they pinpoint the changes? Diff technology. Off the shelf, actually. Apparently the ability to diff two files has gotten pretty good these days. (It’s not as easy as you might think: as soon as one thing changes, then everything after it might seem different unless you can identify the type and scope of the change and then get back on track with unchanged text.) The important thing is this: there’s no special formatting you need to do so that this will work. Write a well-organized, well-structured document (so that a human can process it well) and Verdi Coverage will be able to handle it.
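The "get back on track with unchanged text" behaviour described above is the standard hard part of any diff algorithm. As a toy illustration (using Python's standard-library difflib, not whatever engine Synopsys actually uses), a sequence matcher flags only the changed requirement and resynchronizes on the untouched text that follows. The requirement strings here are invented for the example.

```python
# Toy illustration of diffing two versions of a requirements outline.
# difflib.SequenceMatcher resynchronizes on unchanged lines after a change,
# which is the behaviour the article describes.

import difflib

old = ["1. Reset behaviour", "2. FIFO depth is 16", "3. CRC per spec"]
new = ["1. Reset behaviour", "2. FIFO depth is 32", "3. CRC per spec"]

sm = difflib.SequenceMatcher(a=old, b=new)
for tag, i1, i2, j1, j2 in sm.get_opcodes():
    if tag != "equal":
        # Only the changed span is reported; item 3 matches again.
        print(tag, old[i1:i2], "->", new[j1:j2])
```

Only requirement 2 is flagged as a replacement; the matcher picks the comparison back up at item 3, so a single edit does not make everything after it look changed.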
Far from being a debug thing, this becomes a planning tool up front, creating a specific link between requirements and the elements of the verification plan. It applies across verification technologies (formal, simulation, etc.). As long as the requirements document is being kept up to date, there’s no reason for the verification plan to get out of synch with it.
You can find out more in their release.
posted by Bryon Moyer
The last couple weeks have involved two events with sensors center-stage. The MEMS Executive Congress is a confab of executives from the MEMS industry (and some non-MEMS companies), put on by the MEMS Industry Group (MIG). It’s been around for years.
TSensors, by contrast, started last year as a push by MEMS luminary Janusz Bryzek to identify and eliminate roadblocks to achieving sensor volumes in the trillions (the "T" in "TSensors" stands for "trillion"). While the MIG event tends to follow a conventional conference pace (hopefully with unconventional new ideas), TSensors involves two days of rapid-fire presentations (18 minutes to present, 2 minutes of questions… the timer is ticking!).
I learned lots of new things at both events, and I’ll be rolling out details over time. But, backing up a level, I wanted to take note of the tone taken in particular by TSensors.
The TSensors theme was “Abundance,” leveraging the popular book by Peter Diamandis. First of all, the tone of the book (which I’ll freely admit I haven’t read myself) is said to be highly optimistic – a refreshing take in a time when things don’t always feel like they’re going well.
But the other thing that I came away with was a renewed sense of engineering doing things that help the world. Frankly, some of the goals – like access by all to health care – might be viewed as problematic in some corners. Be that as it may, it felt good to think about the impact of our work on real people.
It’s not like money left the equation; heck, one of the repeated themes was the need to reduce sensor costs so that we can do these things while still rewarding folks for their innovations; it won’t work otherwise. But the difference was that the money, while necessary and important, wasn’t the be-all and end-all in and of itself. It’s an enabler, not the final result.
Whether the bean counters strip all the hippie-dippy crap by the time this turns from PowerPoint to a business plan remains to be seen. But it’s good to look around occasionally and notice that we do some good work.
And there’s lots more good work to be done.