When you are next in a hospital, watch the medical staff as they get into a lift. The odds are they will call out the floor they want and someone non-medical will press the button. This is not because they are lazy but because studies have shown that hospital lift buttons carry more bacteria than toilet seats. Clearly a sensor that recognises a wave of the hand or a pushing gesture as a signal to choose a floor would be a good replacement for a physical button, but how do you know whether your gesture has been recognised? Imagine instead that, as you made the pushing gesture, your hand could feel a “button”.
While gestures are a relatively common control mechanism in game consoles and Augmented/Virtual Reality, they tend to be big and sweeping movements and lack clear definition. A new British company aims to change this. Meet Ultrahaptics.
The company is less than three years old, but it has attracted over £12 million in venture capital and a European Union grant, and it has engagements with many tier-1 companies around the world. These are, in the main, confidential, but two names that are in the public domain are vehicle manufacturer Jaguar Land Rover and high-end audio company Harman.
The basis of the technology is something that we all learned in elementary physics: sound travels through the air as a pressure wave. Normally we don’t feel sound in the human hearing range as pressure on the skin (unless you are standing in front of a speaker stack at a heavy metal gig), but as long ago as the 1970s it was found that ultrasound in the 40 kHz range (human hearing tops out at around 20 kHz) can be felt on the skin.
Fast forward to early 2009, when Tom Carter, a student in Computer Science at the University of Bristol, was looking for a topic for his Master’s dissertation and was steered towards ultrasound as a haptic (touch) technology. Since the 1970s, ultrasound transmitters have become widely used in sensing applications – for example in parking a car – and so are cheap and easily available. Tom focused the output of an array of transmitters so that they produced a strong, localised sensation on the skin. Varying the focus and the drive signals can produce the feeling of different shapes, from a stream of bubbles to 3D forms such as a pyramid. Adding this feedback to existing gesture-capture systems dramatically increased the accuracy of the interaction.
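To get a feel for the principle, here is a minimal sketch (my own illustration, not Ultrahaptics’ code) of how a phased array can focus 40 kHz ultrasound at a point in space: drive each transducer with a phase delay proportional to its distance from the focal point, so that all the wavefronts arrive in step and the pressure adds up there. The array geometry and focal height below are made-up numbers used purely for the example.

    # Minimal sketch: per-transducer phases that focus 40 kHz ultrasound
    # at a chosen point in air. Illustrative only; not Ultrahaptics' algorithm.
    import numpy as np

    SPEED_OF_SOUND = 343.0              # m/s in air at roughly 20 degrees C
    FREQUENCY = 40_000.0                # Hz, the ultrasound carrier mentioned above
    WAVELENGTH = SPEED_OF_SOUND / FREQUENCY  # ~8.6 mm

    def focus_phases(transducer_positions, focal_point):
        """Return the drive phase (radians) for each transducer so that the
        emitted waves all arrive in phase at focal_point."""
        distances = np.linalg.norm(transducer_positions - focal_point, axis=1)
        # A wave travelling distance d accumulates phase 2*pi*d/wavelength;
        # driving each element with the negative of that phase aligns the arrivals.
        return (-2 * np.pi * distances / WAVELENGTH) % (2 * np.pi)

    # Example: a 16 x 16 grid of transducers at 10 mm pitch, focusing 20 cm
    # above the centre of the array (all numbers are assumptions, not from the article).
    pitch = 0.01
    xs, ys = np.meshgrid(np.arange(16) * pitch, np.arange(16) * pitch)
    positions = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)])
    positions[:, :2] -= positions[:, :2].mean(axis=0)   # centre the array on the origin
    phases = focus_phases(positions, np.array([0.0, 0.0, 0.2]))
    print(phases[:4])

In a real system those phases would be recomputed continuously as the focal point is steered to follow the hand, which is where the control software discussed later comes in.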
In parallel with starting his Doctorate, Tom, with his supervisor and with the backing of the University, started a company called Ultrahaptics.
One thing that has worked in Ultrahaptics’ favour is that Tom was in Bristol. Since the arrival of my old employer INMOS in 1979, Bristol and nearby Bath have been at the centre of significant growth in electronics. The University of Bristol has a long tradition of supporting spin-out companies and, along with the universities of Bath, Exeter, Southampton and Surrey, is part of the SETsquared Partnership, which was last year ranked first in the world as a university business incubator. In Bristol, SETsquared is based in Engine Shed – part of the mid-nineteenth-century station built by the visionary engineer Brunel – which is crammed with other incubators and business services, and this is where Ultrahaptics began commercial operations. The fledgling company was introduced to experienced manager Steve Cliffe, who joined as CEO and has brought on board the seasoned management team needed to drive the company from a very exciting idea to a commercial success. With the Round A funding earlier this year, they have left the SETsquared nest and set up home a few hundred yards away, in Bristol’s Enterprise Zone. (Enterprise Zones are government-designated areas for start-up companies and growing organisations, where planning rules are relaxed, local property taxes (business rates) are waived within certain limits, there are tax breaks on capital investment, and there is super-fast broadband.)
The company has recently passed 40 employees (from 12 nationalities) and is actively recruiting. Most people work in an open-plan area, but an unusual feature is the company kitchen, which is big enough to hold all the staff. If you are lunching in the office, using the kitchen is mandatory; eating at desks is forbidden. Food can be brought in (there is a food-truck area only a few yards away), and Steve Cliffe sees this as an important part of team building.
What are all these people doing? The company has been selling evaluation kits to customers, and these require significant work to customise to each customer’s needs. The EU grant, from a part of the Horizon 2020 programme aimed at helping new SMEs (small and medium-sized enterprises), is to help productise the system and make adoption easier: the plan is to create an API. There are also people beginning to develop reference designs for some of the target application areas. And, of course, there is the marketing and sales team, and even some administrative people.
The companies that have bought evaluation kits are moving towards using the technology in production products, and a first licence is now in place, which will start bringing in royalty revenue at the end of 2016. Apparently each licence, at least initially, will be individually negotiated, with royalties based on the value of the end product.
What is being licensed is the control software for the array of transmitters, which runs on an ARM Cortex-based microcontroller or an FPGA. (The evaluation platforms are powered by silicon from fellow University of Bristol spin-off XMOS.) The control software is entirely agnostic about the motion-capture system used, and the company is in conversation with a number of vendors.
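As far as I know Ultrahaptics has not published its programming interfaces, so purely as an illustration of that separation of concerns, here is a sketch of how a motion-capture-agnostic haptics layer might be wired up; every class and method name below is hypothetical, not taken from the company’s software.

    # Hypothetical sketch only: the names below are invented to illustrate how
    # the haptic control layer can be kept independent of the hand tracker.
    from dataclasses import dataclass
    from typing import Protocol, Sequence, Tuple

    Point3D = Tuple[float, float, float]

    class HandTracker(Protocol):
        """Any motion-capture vendor could sit behind this interface."""
        def fingertip_positions(self) -> Sequence[Point3D]: ...

    @dataclass
    class HapticRenderer:
        """Stand-in for the licensed control software running on a Cortex-class
        micro or an FPGA; update_rate_hz is an illustrative number."""
        update_rate_hz: int = 200

        def render_point(self, focus: Point3D, intensity: float) -> None:
            # A real implementation would compute per-transducer phases (as in
            # the earlier sketch) and push them to the array drive electronics.
            print(f"focusing at {focus} with intensity {intensity:.2f}")

    class FixedTracker:
        """Dummy tracker reporting one static fingertip, 20 cm above the array."""
        def fingertip_positions(self) -> Sequence[Point3D]:
            return [(0.0, 0.0, 0.2)]

    def haptic_update(tracker: HandTracker, renderer: HapticRenderer) -> None:
        # Project a tactile point onto whichever fingertips the tracker reports.
        for tip in tracker.fingertip_positions():
            renderer.render_point(tip, intensity=0.8)

    haptic_update(FixedTracker(), HapticRenderer())

The point of the interface here is simply that swapping in a different tracking vendor would not touch the haptics side, which mirrors the vendor-agnostic positioning described above.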
There has been interest from a wide range of companies, but Ultrahaptics has decided to concentrate initially on a small number of target markets. The first is automotive, where it is responding to considerable interest. The increasing use of a single touchscreen for infotainment – including GPS navigation – as well as for cabin environment control and other minor matters, means that you can no longer just reach over and turn a knob or press a button. Instead you have to look, taking your eyes off the road for, arguably, longer than is safe. With the Ultrahaptics technology, you can still reach for and turn a knob, press a button or push a slider. A video of the possibilities (a part of which is on a YouTube compilation) shows a driver selecting a radio channel and controlling the volume, all while driving a Tesla.
Domestic appliances are another market where control without touching has an immediate attraction. The demo for this market is a cooker hob with glass heating rings. A quick push against the “control” turns a ring on or off, whilst a rotating movement increases or decreases the temperature. Now we need someone to add ultrahaptics to the famous internet toaster.
The companies developing Virtual and Augmented Reality and gaming consoles are a natural market. There have been gloves and finger clips, but they have required complex cabling to transmit signals and provide the power to activate pistons or other pressure generators. The only, perhaps minor, criticism is that, while the system can generate the feel of an object, it can’t stop you, for example, from pushing your hand through a wall.
Another demonstration has a DJ’s mixing console operated entirely through virtual controls. More prosaic, but still fascinating to most people who experience it, is the ultrahaptic loudspeaker demonstrated by Harman at CES earlier this year. The interface allows you to select a source, choose a track on that source and set the volume at which the music is played, all through fairly compact hand movements.
As a package, Ultrahaptics, the company, has a lot going for it: an intriguing technology, a deeply experienced and competent management team, a good level of investment (in fact the A round was oversubscribed and had to be scaled back), already a significant cash flow (the company is predicting $2.5 million in sales in 2016) and, at least at the moment, more potential customers than they can service.
I suppose at this point I should be sitting back and, with an expression of disbelief, saying, “And yet…” But so far, beyond the innate cynicism any hack should have, which says that if something looks that good there must be a flaw somewhere, I haven’t found that flaw, and, to be honest, I hope there isn’t one.