
Jubilee Technologies

You may have heard – in Britain we had a major Royal event a few weeks ago. It was celebrating 60 years of Elizabeth’s reign but, as many commentators probably told you, although she became Queen in February 1952, when her father died, she was not crowned until June 1953. And, like many other people, I had my first experience of television watching the Coronation in 1953. Along with about fifteen adults and six other children, I was crammed into the kitchen of my grandparents’ neighbours. (The front room was kept for “best”.)

We watched a black-and-white picture on a screen no more than nine inches across the diagonal. (Although this set didn’t have one, you could buy magnifying sheets, which narrowed the viewing angle even further.) The whole set was in a wooden cabinet that stood about three feet high, and, below the screen, was a speaker aperture with an elaborate wooden grille over a woven cover. In fact, for much of the day, the whole television was covered by a tablecloth with a plant pot on top.

The contrast and brightness were both so limited that, to see the picture, the curtains had to be closed. For several more years in Britain, there was only one channel, broadcasting for a few hours each evening, much of it live. And, of course, the set took an age to come up to speed, as the vacuum tubes and the cathode ray tube “warmed up” – the first silicon transistors were still a year away. (Although the television took only slightly longer to warm up than it seems to take some of today’s digital set-top boxes to boot up.)

The logistical arrangements that brought us together that wet day were conducted by post – in those days, a letter posted in the morning was delivered locally the same afternoon. Telephones were rare. In our street of newly built houses, I don’t think there was a single telephone at that time (and only two or three cars). In an emergency, if you had to make a call and there was no neighbour with a phone, then you used a red telephone box. And telephone calls were expensive and complicated. It was possible to dial numbers on the same exchange, but all other calls, even to a nearby town, had to be put through by an operator, and longer-distance calls often had to be booked in advance. It was normal to ask the operator to call you back after the call was completed to tell you what it had cost. (If you used this regularly, you knew to ask for an ADC call – Advice of Duration and Charge.)

Fifty-nine years later, the single television set has almost totally disappeared. As well as there being multiple receivers in many homes, TV programmes are streamed across the net and delivered through cable and satellite, with analogue terrestrial transmission being replaced by digital broadcasting (the digital switch-over is almost complete in Britain). People are watching on their computers, their iPads and even their phones. And telephones are ubiquitous: I have four handsets on my desk – a Skype set for free calls over the internet; a cell phone with a package that gives me free calls to most UK landlines and to other cells on the same network; and two landline phones from different suppliers, linked to broadband connections, both of which give me extensive free calls within Britain and low-cost (in some cases free) calls to the rest of the world.

A few years after the Coronation, I heard about computers. My headmaster’s brother worked for a strange company called International Business Machines, and he talked to us about these magic boxes. A point he persistently made was that these magic boxes were nothing more than “dumb clerks” that could do only what humans told them to do. After school, a short spell in an engineering company’s R&D section brought me into contact with computers again. At this time, most of the popular writing talked equally about analogue and digital computers, with analogue machines using exotic technologies like pneumatic pressure. A team in the company was evaluating the use of digital computers to assist in planning large-scale projects. One such project was being managed using the new technique of critical path analysis, and the critical path diagram was laid out over the entire floor of a large office, with changes being carried out by eraser and pencil. (The project came in on schedule and within budget, though.)
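For anyone who has not met critical path analysis, the core of the idea fits in a few lines. Here is a minimal sketch in Python – the tasks, durations and dependencies are entirely made up for illustration, not taken from that floor-sized diagram:

```python
# Minimal critical path analysis over a made-up task graph.
# Durations are in weeks; dependencies name the tasks that must finish first.
durations = {"design": 4, "procure": 6, "build": 8, "test": 3, "install": 2}
depends_on = {
    "design": [],
    "procure": ["design"],
    "build": ["design"],
    "test": ["procure", "build"],
    "install": ["test"],
}

earliest_finish = {}   # earliest time each task can be finished
critical_pred = {}     # the predecessor that forces that finish time

def finish(task):
    """Earliest finish time of a task, memoised over the dependency graph."""
    if task not in earliest_finish:
        preds = depends_on[task]
        if preds:
            critical_pred[task] = max(preds, key=finish)
            start = finish(critical_pred[task])
        else:
            critical_pred[task] = None
            start = 0
        earliest_finish[task] = start + durations[task]
    return earliest_finish[task]

# The project duration is the latest finish of any task; walking back
# through the forcing predecessors recovers the critical path.
last = max(durations, key=finish)
path = [last]
while critical_pred[path[-1]]:
    path.append(critical_pred[path[-1]])
print("critical path:", " -> ".join(reversed(path)))
print("project duration:", earliest_finish[last], "weeks")
```

Any delay to a task on that path delays the whole project, while tasks off the path have slack – which is why the technique was such a help in planning.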

A move into libraries brought me back into contact with computers, as we took the first steps towards using them to compile catalogues of library holdings. The college I worked at was a pioneer of on-line systems, using a DEC PDP-10 linked by dedicated lines to Teletype machines across the campus and in schools within about thirty miles. While the system was mainly used for maths and for the fledgling development of computer studies, one school was using a simple database system to build a longitudinal study of the village and its changes as part of a geography course. The system was constantly being upgraded, but a huge breakthrough came when DEC switched from magnetic core memory to solid state, allowing the purchase of twice the amount of main memory that had been authorised. (I think it was from 4K 36-bit words to 8K words.)

A big excitement was the arrival of a new Teletype in the library that was connected not to the computer but to a wooden box, into which you jammed a telephone handset. Before you did that, you called a local number and explained that you were to be connected to a special line and that the funny noises were intentional. After this, the line connected from London to Norway, thence via a satellite link to the east coast of the US and onwards through the early store-and-forward ARPANET to the Lockheed Dialog database. This was to allow the library to take part in a project evaluating on-line searching of scientific literature – pretty exciting stuff in the late 1970s.

The Teletype only printed onto paper when the signal, generated by hitting a key (and you did need to hit), had completed the round trip: there was an uncomfortable period in which nothing very much seemed to happen. At this time a programmer working for me discovered how to route to other computers and began a long-distance conversation with an operator in Hawaii. (Computers in those days had large operations teams working shifts around the clock.) This required yet another satellite link, from the west coast of the US to Hawaii and back, so the response time began to be measured in minutes. They enjoyed their conversations, though.
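The satellite geometry alone sets a floor under that delay. A rough back-of-the-envelope in Python – the geostationary altitude and hop count here are my own assumptions, not anything recorded at the time:

```python
# Rough propagation-delay estimate for a multi-hop satellite path.
# Assumes geostationary satellites at roughly 36,000 km altitude.
C_KM_PER_S = 299_792            # speed of light
GEO_ALTITUDE_KM = 36_000        # approximate geostationary altitude

one_way_hop_s = 2 * GEO_ALTITUDE_KM / C_KM_PER_S   # up to the satellite and back down
hops_each_way = 2               # e.g. Europe to US east coast, then west coast to Hawaii
round_trip_s = 2 * hops_each_way * one_way_hop_s

print(f"per satellite hop: {one_way_hop_s:.2f} s")
print(f"echo round trip over {hops_each_way} hops each way: {round_trip_s:.2f} s")
# Roughly a second of pure propagation; the minutes came from
# store-and-forward queuing and slow Teletype lines, not from light-speed limits.
```

Even before any queuing, every extra satellite hop adds about a quarter of a second each way, which is why the Hawaii detour was so noticeable.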

There were other computers around the campus, not then linked into a network, and a milestone was passed when the venerable Elliott 803 was decommissioned and replaced by the first of what became multiple DEC PDP-11s, which at least had the theoretical potential of talking to the PDP-10.

At the time of the Queen’s Silver Jubilee in 1977, the very first small desktop computers were making their appearance. They usually used proprietary processor architectures, with differing, and sometimes idiosyncratic, instruction sets. Transferring anything, data or programs, between machines was almost impossible. The company I was by then working for decided that it would try to create an operating system that would present the same interface to users, despite the differing underlying architectures. The term “middleware” did not yet exist, and each implementation was individually tailored to run on the underlying instruction set – very hard work, which created enormous support problems. The project was dropped after a considerable investment.

Telephones in the UK, by this stage, offered direct dialling, although calls still routed through mechanical Strowger exchanges. (The first experiments with solid-state switching in Britain were started in 1962, before all the appropriate technology was fully developed, and the memory of this failure haunted the General Post Office and its successor, British Telecom (BT), for many years.) It was also possible to dial an increasing number of overseas countries, although transatlantic calls routed through satellites frequently had terrible echoes. Television was also passing through satellites, although there were very few direct-to-home broadcasts, the satellites being used to connect broadcast hubs to local broadcast relay stations.

Moving to this month’s celebrations, television was, as we have discussed, ubiquitous, with the live broadcasts reaching people through a raft of different channels and being viewed on an equally wide range of platforms. What was also ubiquitous was people at the live events watching them, not directly, but through the viewfinders or screens of their phones or cameras as they recorded everything for later viewing (or for posting on YouTube).

On a side rant – this abstraction of the real experience is one of the most marked features of the current decade. Headsets mask out the sounds of the surrounding world, and then the camera is used to filter the sights, further protecting the user from real life. At Mont St Michel last year, hordes of tourists were walking through the narrow medieval streets and then the monastic buildings, not looking directly but merely through the viewfinder of their recording equipment. It seems that, unless you have the recorded evidence, in some way you weren’t actually there. And where do they find the time to view these things?

The huge difference that sixty years of technology has made is something that we know about in general, but sitting down and tying together the threads brings it home in specific ways. It feels like riding a roller coaster, powered by the consequences of Moore’s Law.
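To put a very rough number on that roller coaster, here is a tiny back-of-the-envelope sketch – the 1965 starting date and the two-year doubling period are my own assumptions, standing in for Moore’s original observation:

```python
# Back-of-the-envelope feel for the compounding behind Moore's Law.
# Assumes transistor counts double roughly every two years from 1965
# (Moore's original paper) to the Diamond Jubilee year of 2012.
DOUBLING_PERIOD_YEARS = 2
doublings = (2012 - 1965) / DOUBLING_PERIOD_YEARS
growth = 2 ** doublings
print(f"{doublings:.1f} doublings, roughly a {growth:,.0f}-fold increase")
# On the order of ten million times more transistors on a chip over that span.
```

Whatever the exact figures, anything compounding at that rate for half a century is going to feel like a roller coaster.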

Douglas Adams once wrote:

“I’ve come up with a set of rules that describe our reactions to technologies:
1. Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.
2. Anything that’s invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.
3. Anything invented after you’re thirty-five is against the natural order of things.”

It is difficult to disagree with the great man, but I am well over thirty-five, and the technology just keeps rolling on, still new and still exciting.

