
Power, Sensors, Clock Trees, Multicore and Compression Algorithms

September, at least in the Northern Hemisphere, is often seen as the real start of the year. Companies are returning from their summer holidays and revving up with new promotional activities and, particularly in even-numbered years in Europe, starting to work towards the huge techno-fest that is electronica in Munich in November. This may be an interesting observation, but why is it relevant? Well, in the last few weeks, I have been exposed to a raft of interesting things, many of which would be worth a whole article in their own right, but, given the limitations of space and time, I have decided to bundle together several stories from across a wide spectrum of electronics.

Power supplies

Much of what is written in the electronics media concentrates on digital chips and their design and manufacture. We are probably as guilty as most in focusing on these areas, but, after designs have been implemented in these ever-more-challenging process nodes, the chips have to go onto a board, and then they require power. Given the huge efforts that go into reducing the power consumed within SoCs, ASICs, and microcontrollers, it is perhaps surprising that getting the power to them as efficiently as possible doesn't get as much public attention. I recently sat through presentations from companies with power supply products at a number of levels. Clearly, if you are doing power conversion, the energy that the converter itself uses is going to be a critical factor: firstly, higher energy use comes with a cost, and, secondly, the wasted energy appears as heat, which has to be dissipated.
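As a back-of-the-envelope sketch (my own arithmetic, not from any of the vendors), the relationship between efficiency, input power, and waste heat looks like this:

```python
def converter_loss(p_out_w: float, efficiency: float) -> tuple[float, float]:
    """Return (input power, dissipated heat) in watts for a converter
    delivering p_out_w watts at the given efficiency (0 to 1)."""
    p_in = p_out_w / efficiency
    return p_in, p_in - p_out_w

# A 60W module at 89% efficiency (the figures XP Power quotes for the
# ECE60 family, below) draws ~67.4W and must shed ~7.4W as heat.
p_in, heat = converter_loss(60.0, 0.89)
print(f"input: {p_in:.1f} W, heat: {heat:.1f} W")
```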

Intersil was talking, amongst other things, about its range of automotive power products. Automotive environments are particularly nasty, and automotive applications cover a huge range of technology. While 12 volts is still the market standard for car batteries, some European car makers are working towards using 48V. At the same time, the electronics (the engine control units, the infotainment systems, and other electronic systems) run on "standard" 5V or 3.3V. To match this, Intersil is launching a range of devices. One is a 55V synchronous buck controller (ISL78268), supporting output currents from 5A to over 25A from 12V, 24V, and 48V supplies. At almost the other end of the scale is a device to manage the battery for the e-call safety system. In Europe, all new cars have to be fitted with a system that will call the emergency services after an accident. Clearly this has to be powered by an independent battery, and this battery needs to be kept in peak charge, even after the car has been standing for a long time and the e-call system has been dormant. The ISL78692 is designed for just that task, in a 3×3mm package (pass the magnifying glass).
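To see why such a wide input range is a design challenge, consider the ideal duty-cycle equation for a buck converter, D = Vout/Vin: stepping 48V down to 3.3V means the high-side switch conducts for under 7% of each switching period. A minimal sketch (an idealised, lossless model of my own, nothing to do with the internals of Intersil's controller):

```python
def buck_duty_cycle(v_in: float, v_out: float) -> float:
    """Ideal (lossless) duty cycle of a synchronous buck converter:
    the fraction of each switching period the high-side switch conducts."""
    if not 0 < v_out <= v_in:
        raise ValueError("a buck converter can only step voltage down")
    return v_out / v_in

for v_in in (12, 24, 48):
    print(f"{v_in:>2}V in -> 3.3V out: D = {buck_duty_cycle(v_in, 3.3):.1%}")
# 12V: 27.5%, 24V: 13.8%, 48V: 6.9%
```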

XP Power is working in a different domain: AC-DC power modules and, in particular, a new family that can be PCB, chassis, or DIN-rail mounted. The ECE60 family provides 60W at output voltages from 3.3V to 48V, with efficiencies of up to 89%, in a convection-cooled package of only 91.4 × 38.1 × 28mm.

XP will be launching other power products before electronica, as will Exar.

Timing

Getting timing issues resolved on a single chip can be a problem, so think of the timing problems for the Internet of Things. Look at all the different elements, from the things themselves, through hubs, switches, and networks, to the racks of servers, all with their own clocks, and some with multiple clocks. As many of these clocks are supplied by Silicon Labs, the company has developed Clock Tree Expert, a tool for creating clock trees for Internet infrastructure, from top to bottom, including networking, telecoms, and data centres.
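As a toy illustration of what such a tool has to keep track of (my own sketch; Clock Tree Expert itself also handles jitter budgets, part selection, and much more), a clock tree is essentially a reference oscillator from which frequencies are derived through PLL multipliers and dividers:

```python
from dataclasses import dataclass, field

@dataclass
class ClockNode:
    """One node in a clock tree: a frequency plus the clocks derived from it."""
    name: str
    freq_hz: float
    children: list["ClockNode"] = field(default_factory=list)

    def derive(self, name: str, multiply: int = 1, divide: int = 1) -> "ClockNode":
        child = ClockNode(name, self.freq_hz * multiply / divide)
        self.children.append(child)
        return child

    def dump(self, indent: int = 0) -> None:
        print(" " * indent + f"{self.name}: {self.freq_hz / 1e6:g} MHz")
        for child in self.children:
            child.dump(indent + 2)

# A 25 MHz reference feeding an Ethernet PHY and a PCIe reference clock.
ref = ClockNode("XO", 25e6)
pll = ref.derive("PLL", multiply=40)   # 1 GHz
pll.derive("eth_phy", divide=8)        # 125 MHz
pll.derive("pcie_ref", divide=10)      # 100 MHz
ref.dump()
```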

Sensors

TSensors Summit is an organisation trying to pull together roadmaps for the trillion sensors that will, in time, form the external interfaces to the IoT, amongst other things. According to the organisation, the number of sensors in consumer applications alone has already risen from 10 million in 2007 to 10 billion in 2014. I rather naively expected the sessions on roadmaps to actually put forward roadmaps indicating what different technologies were going to offer to different markets, and in what timescales. What I hadn't realised is how early we are in the roadmap process; instead, a number of the sessions were on building the roadmaps.

Once I wrenched my sense of perspective around, things got interesting and, at times, mind-blowing. For example, Ira Feldman, a consultant, looked at the Global Challenges of hunger, healthcare, a clean environment, clean energy, and education. For each of these, it is apparent that the use of sensor technology can contribute to meeting the challenge. For hunger, smart food production, such as improved aquaculture, vertical farming (where huge growing trays are stacked above each other to make the best use of limited space), deploying robots in the field (literally in the fields), and other techniques can help to increase the amount of food grown and possibly improve its quality. All these applications are going to rely heavily on sensors. What this part of the roadmap is missing is the studies that will define the problems more precisely and then look at the technologies, including sensors, that can help to solve them. This will also identify gaps in the current technologies and look at how they can be filled.

In other sessions, I came across a new piece of jargon for this work: Technology Readiness Level (TRL, a new TLA). The levels run from 1 (an area identified for lab research) to 9 (technology actually deployed in real-life applications). Assessing the TRL of a technology allows the developers of a roadmap to arrive at a more realistic timescale for when it can actually be useful.
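For the record, here is the scale as it is commonly defined (I am paraphrasing the widely used NASA-derived wording; the summit's exact definitions may differ), along with the sort of crude scheduling heuristic a roadmap builder might apply:

```python
# Condensed TRL definitions (paraphrased from the common NASA-derived scale).
TRL = {
    1: "basic principles observed: an area identified for lab research",
    2: "technology concept formulated",
    3: "experimental proof of concept",
    4: "technology validated in the lab",
    5: "technology validated in a relevant environment",
    6: "prototype demonstrated in a relevant environment",
    7: "prototype demonstrated in an operational environment",
    8: "system complete and qualified",
    9: "technology deployed in real-life applications",
}

def years_to_deployment(trl: int, years_per_level: float = 1.5) -> float:
    """A deliberately crude heuristic (my own assumption, purely
    illustrative): time to TRL 9 as a fixed cost per remaining level."""
    if trl not in TRL:
        raise ValueError("TRL runs from 1 to 9")
    return (9 - trl) * years_per_level

print(TRL[3], "->", years_to_deployment(3), "years to deployment")
```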

The same analysis process is going ahead, not just for the other Global Challenges, but also for the industrial, automotive, and consumer marketplaces.

I will be tracking TSensors Summit and reporting back on it in more depth in due course.

Multicore

Multicore wasn’t invented by Intel in 2006 – it has been achievable since INMOS launched the transputer in 1984 (although we called it parallel processing – looking at the activity rather than the objects that were needed to achieve it), and it was certainly discussed for years before that. Despite this, there is still ferocious debate over architectures, programming models, and languages, as well as implementation details. However, last month’s Multicore Challenge in Bristol was less ferocious than many, perhaps because many of the speakers were – either directly or indirectly – ex-INMOS. (Declaration of interest – that covers me as well).

The discussion has moved on in some areas: multicore is now heterogeneous rather than homogeneous, for example. Your cell phone has multiple processors, some with conventional architectures, others labelled as Graphics Processing Units. Imagination Technologies, seen very much as a GPU IP company, changed the game when it bought MIPS, and it used the meeting to describe the new i6400 core (or family of cores) with multithreading and virtualisation. The cores can also be used in multiples in SoCs and ASICs. Jim Turley reviewed it a few weeks ago. XMOS, Infineon, and Huawei also discussed new products. None of them, however, gave marketing presentations; instead, they used their products as examples of particular approaches to multicore architecture.

But David May (the implementer of the transputer, Professor of Computing at Bristol University and founder of XMOS) tried to bring us back to some of the basic issues in a keynote. He argued that we are working in a wide range of different computing areas, but we are still mainly reliant on an architecture that was developed to run the desktop computer. It is time to go back to a simpler model, both of architectures and of the tools to use them. To gain the processing power that the large-scale computers are going to need will also require an investment in development tools and in training users to develop efficient working practices. Today’s inefficient approaches to programming are not going to be good enough.

The same message came up, more or less in reverse, in the closing panel discussion, where the incredible complexity of many of today's systems was seen as a barrier to creativity. My favourite quote of the day came from May: “We have focused a huge amount of investment on chasing Moore’s Law. If a tiny fraction of that money had been spent on looking at software and algorithms, one would have thought we would have made a lot more progress.”

Compression algorithms

One of the things that is interesting about Europe is the number of small companies that are getting on with important work but don’t make a huge song and dance about it. (As opposed to some big companies that make a huge song and dance about announcements that are, frankly, insignificant.)

One I recently met is Westwood Rock. This is a hardware and software design consultancy with a bunch of bright people and a track record of successful projects, only some of which they can talk about; others are still covered by confidentiality agreements. Thanks to the Internet, they all work from home offices, using Skype and the occasional face-to-face meeting. This removes a huge overhead from their business.

The reason for meeting them was that they have developed IP for the silicon implementation of a video compression standard. So what?

Well, the standard is unusual. It was developed by the BBC (the British Broadcasting Corporation, the UK’s public broadcaster and a technology innovator for nearly 100 years) to cope with shipping the vast quantities of data that make up a high-resolution TV signal, both around the studios and from outside broadcasts. The end result is Dirac, a video compression system built with the objectives of high image quality and low latency in a small form factor, and which is also standardised, non-proprietary, and royalty free. It is recognised as SMPTE standard ST 2042-1:2012 and is also called VC-2.
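Under the hood, Dirac/VC-2 is wavelet-based. As a hand-waving illustration of the principle (this is a plain Haar transform of my own, not the standard's exact lifting filters or quantisation), a wavelet stage splits a signal into a half-rate "smooth" band and a "detail" band; smooth image regions produce near-zero detail coefficients that compress well:

```python
def haar_1d(samples: list[float]) -> tuple[list[float], list[float]]:
    """One level of a Haar wavelet transform: split a signal into a
    half-rate low-pass band (pair averages) and a high-pass band (pair
    differences). Wavelet codecs apply this repeatedly in 2D, then
    quantise and entropy-code the detail bands."""
    assert len(samples) % 2 == 0
    low = [(a + b) / 2 for a, b in zip(samples[::2], samples[1::2])]
    high = [(a - b) / 2 for a, b in zip(samples[::2], samples[1::2])]
    return low, high

def haar_inverse(low: list[float], high: list[float]) -> list[float]:
    """Perfect reconstruction from the two bands."""
    out: list[float] = []
    for l, h in zip(low, high):
        out += [l + h, l - h]
    return out

row = [10, 12, 200, 202, 50, 54, 50, 46]
low, high = haar_1d(row)
print(low, high)   # smooth areas yield tiny detail coefficients
assert haar_inverse(low, high) == row
```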

Dirac achieves a compression factor of 8, so the signals for HD-SDI television, which run at 1.5Gbit/s, can be carried over a 270Mbit/s link, and, at the same time, a standard monitor can be used to view the picture in real time, albeit at reduced quality.
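Checking the arithmetic (mine, not Westwood Rock's):

```python
hd_sdi = 1.5e9            # uncompressed HD-SDI signal, bits per second
sd_link = 270e6           # a standard 270Mbit/s (SD-SDI rate) link
compressed = hd_sdi / 8   # Dirac's 8x compression factor
print(f"{compressed / 1e6:.1f} Mbit/s compressed fits the "
      f"{sd_link / 1e6:.0f} Mbit/s link with "
      f"{(sd_link - compressed) / 1e6:.1f} Mbit/s to spare")
```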

A broadcaster may compress and decompress a signal through six to eight generations as recording, storage, editing, and transmission take place, so Dirac is tuned for minimum loss of quality.

Dirac is designed to be as future-proof as possible, so it is not tied to a specific picture format. At the same time, the latency is at worst a single frame and may be only a few lines.
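To put that in context (my own figures, for an assumed 1080-line, 50-frames-per-second raster, not Westwood Rock's numbers):

```python
fps, lines_per_frame = 50, 1125        # 1080 active lines in a 1125-line raster
frame_latency_ms = 1000 / fps                       # worst case: one frame
line_latency_us = 1e6 / (fps * lines_per_frame)     # one raster line
print(f"one frame: {frame_latency_ms:.0f} ms; "
      f"8 lines: {8 * line_latency_us:.0f} us")     # 20 ms vs ~142 us
```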

Westwood Rock has taken this standard and implemented a codec in hardware. The IP has been developed under a rigorous process, so it can be certified for high-reliability applications. The implementation is also deterministic, again helping it to achieve certification. It has been implemented in an FPGA in a relatively small number of gates, with no need for extra memory, and is also available as IP for ASIC or SoC implementations, although Westwood Rock could carry out a complete design for a customer, if required.

OK, not earth-shattering, but still a nice piece of engineering, the sort of thing that is a pleasure to learn about and share.
