
Scopes

Much More than Just a Wriggly Line

If you are really up-to-date on what is happening in the world of oscilloscopes, then I am afraid that this Embedded Technology Journal Update is not for you – unless you want to go to our comments page and add your two cents’ worth of correction. But if, like me, you were vaguely aware that things are changing in the measurement field, then brace yourself.

The cathode ray tube, with its wriggly signal (OK, with its waveform), was so much the shorthand for “electronics” that The Plessey Company, for a while Britain’s leading electronics company, used a stylized screen trace as its logo. With the rise of digital systems, another form of analysis tool, the logic analyzer, was developed to look at the zeros and ones and provide, as its name suggests, some analysis of what was happening. These two sat beside each other on the bench, but the two boxes are now increasingly merging into just one.

The last few years have seen an explosion in the use of high-speed serial buses. For moving large quantities of data around a system, or between different systems, serial communications dominate. USB 3, HDMI, SATA, and PCI Express (in its various incarnations) are all driving communication forward. (At lower bandwidths, buses like CAN, I2C, and FlexRay are now well established.)

Debugging systems that use these high-speed buses is a significant challenge, and the oscilloscope manufacturers are rising to it, not just with improved hardware but also with software, since today’s oscilloscopes are, from one perspective, effectively high-end, special-purpose PCs. They have evolved from the all-analog circuits driving the CRT to all-digital systems with fast processors running an operating system (normally Windows), supported by hard disks, high-resolution screens, and their own high-speed communications links. Built on top of these is the hardware that is specific to an oscilloscope: the analog-to-digital converter (ADC) devices (often many of them, interleaved to reach the speed needed for each channel) and the links from the ADCs to the system under test. Although these are still called probes, hooking into a board brings its own issues. Behind the ADCs sits specialist high-speed memory and, since engineers like to twiddle knobs, circuitry to link the array of knobs on the front of the box to the system.

(Some companies argue, “Why buy yet another PC? Instead, buy one of the PC add-on boxes we sell.” We will discuss this approach later.)

Normally, you use an oscilloscope to identify, and then to remove or correct, noise, jitter, and timing issues. In oscilloscope specification terms, this means looking at the technology for capturing, storing, displaying, and analyzing the signals.

For capture, the oscilloscope’s bandwidth and sampling rates are what matter, and the bandwidth needs to be around two to three times the clock rate of the signal under test. This has led to a spec-sheet war between oscilloscope suppliers such as LeCroy, Agilent, and Tektronix. At the top end of the data sheets you will find around 30 GHz of bandwidth, with sample rates 2.5 to 3 times that; for example, 80 GS/s (gigasamples per second). One data sheet also cites memory sufficient for 512 megapoints of analysis and edge triggering at greater than 15 GHz.
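To make the arithmetic concrete, here is a minimal sketch of those rules of thumb in Python. The factors and the 10 GHz example clock are assumptions for illustration, not figures from any particular data sheet:

```python
# Back-of-the-envelope sizing using the rules of thumb quoted above
# (assumed, not vendor-specific):
#   bandwidth   ~ 2-3 x the clock rate of the signal under test
#   sample rate ~ 2.5-3 x the bandwidth

def suggest_scope(clock_hz, bw_factor=3.0, sample_factor=2.5):
    """Return (bandwidth_hz, sample_rate_sa_per_s) for a given clock."""
    bandwidth = bw_factor * clock_hz
    sample_rate = sample_factor * bandwidth
    return bandwidth, sample_rate

# Example: a 10 GHz serial clock (hypothetical figure)
bw, fs = suggest_scope(10e9)
print(f"Bandwidth  : {bw / 1e9:.0f} GHz")   # 30 GHz
print(f"Sample rate: {fs / 1e9:.0f} GS/s")  # 75 GS/s
```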

These oscilloscopes will have multiple channels for both analog and digital data and large screens (up to 15”) for data display. The large screens are necessary, not just for a clearer, easier-to-read display of the analog signal, but also to make it easier to display analog and digital at the same time.

For these high speeds, it has been a long time since probes were simple, passive wires. Clearly, as soon as you attach a probe to a circuit, it becomes part of the circuit and exerts an influence on the very signal that is under test. So probes themselves are now complex, active subsystems with characteristics that need to be closely matched to the signals and devices they measure.

Larger memory, as well as helping move the torrent of data from the system under test to the display screen, also provides a huge boost to problem solving. For example, setting a trigger that is activated when a specific state occurs is a common test activity. With large amounts of memory, it is now possible to examine not just the system state when the trigger is activated but also the activity before the trigger state; the amount of activity that can be displayed is obviously related to the amount of memory. Since the root cause of an event can take place several seconds before the event occurs (which, for a high-speed channel, can represent many megabytes of data in both the digital and analog streams), the greater the depth of memory, the greater the chance that the cause can be identified.
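To see what those memory figures actually buy, note that the capture window is simply memory depth divided by sample rate. A quick sketch, using the headline numbers quoted earlier (the five-second requirement is an invented example):

```python
# Capture window = memory depth / sample rate.
# Figures below are the headline specs quoted earlier, not any
# particular instrument's guarantee.

memory_points = 512e6      # 512 megapoints of acquisition memory
sample_rate   = 80e9       # 80 GS/s

window_s = memory_points / sample_rate
print(f"Capture window at full rate: {window_s * 1e3:.1f} ms")  # 6.4 ms

# To hold several seconds of pre-trigger history, the scope must
# sample more slowly (or decimate):
needed_seconds = 5.0
max_rate = memory_points / needed_seconds
print(f"Max sample rate for {needed_seconds:.0f} s of history: "
      f"{max_rate / 1e6:.0f} MS/s")  # ~102 MS/s
```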

Those of you with long memories may recall Polaroid camera attachments. These were mounted onto the front of the CRT and, if you pressed the button at exactly the right time, captured a particular image permanently. Younger readers may remember printing oscilloscopes that drew traces on scrolling paper, either with nibs at the ends of arms (think of the lie detector in many movies) or with an array of inkjet heads. Today, both captured and analyzed data are stored on a hard drive and transmitted via USB, Ethernet, or proprietary buses to “ordinary” PCs for further analysis and for project documentation and archiving. There is clearly significant value in being able to access the actual test and conformance data at a later date, for certification or for field support.

Ethernet capability, as well as providing communication for data storage, also allows live analysis to be seen by others outside the test lab and the oscilloscope to be controlled remotely.
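In practice, that remote control is usually done with SCPI commands over a VISA connection. A minimal sketch using the PyVISA library follows; the IP address is made up, and anything beyond the standard *IDN? identification query is vendor specific:

```python
# Minimal remote-control sketch using PyVISA over Ethernet.
# The IP address is hypothetical; consult the instrument's manual
# for its actual VISA resource string and SCPI command set.
import pyvisa

rm = pyvisa.ResourceManager()
scope = rm.open_resource("TCPIP0::192.168.1.50::INSTR")
scope.timeout = 5000  # milliseconds

# *IDN? is a standard IEEE 488.2 query supported by most instruments.
print(scope.query("*IDN?"))

# Vendor-specific commands (channel setup, acquisition, waveform
# transfer) would follow here; they differ between manufacturers.
scope.close()
```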

Since the upper-end oscilloscopes have significant processing capability, it is possible to add functionality through software. One of the most significant additions is protocol identification and analysis. The ability to correlate one part of a message, say the message-length identifier in a header packet, with the other elements of the signal can be extremely valuable in helping to identify anomalies.
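In software terms, that correlation is a decode-and-check step applied to the recovered byte stream. Here is a toy sketch; the frame layout (start marker, length byte, payload, checksum) is invented for illustration and does not correspond to any real bus protocol:

```python
# Toy protocol check: does the length field in the header match the
# payload actually seen on the wire?  Frame format (invented):
#   [0xA5 marker][length][payload...][checksum]

def check_frame(frame: bytes) -> str:
    if len(frame) < 3 or frame[0] != 0xA5:
        return "no valid start marker"
    declared_len = frame[1]
    payload = frame[2:-1]
    checksum = frame[-1]
    if len(payload) != declared_len:
        return f"length mismatch: header says {declared_len}, saw {len(payload)}"
    if checksum != (sum(payload) & 0xFF):
        return "checksum error"
    return "ok"

# Example frames "recovered from a capture" (made-up data)
good = bytes([0xA5, 3, 0x10, 0x20, 0x30, (0x10 + 0x20 + 0x30) & 0xFF])
bad  = bytes([0xA5, 4, 0x10, 0x20, 0x30, 0x60])
print(check_frame(good))  # ok
print(check_frame(bad))   # length mismatch: header says 4, saw 3
```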

Protocol understanding can also assist in protocol-conformance testing for new controller chips. The larger manufacturers provide software that reads the data transmitted by a device under test and evaluates whether it matches the defined standard. For receiving devices, the oscilloscope generates traffic and then measures how the device copes.
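One heavily simplified example of such a check is timing conformance: compare each recovered bit interval against the nominal unit interval and an allowed tolerance. The sketch below uses illustrative numbers (5 Gb/s, 5 percent) rather than figures from any published standard:

```python
# Very simplified conformance-style check: do the recovered bit
# intervals stay within a tolerance of the nominal unit interval?
# The 5 Gb/s rate and 5% tolerance are illustrative only.

NOMINAL_UI = 1 / 5e9      # 200 ps unit interval at 5 Gb/s
TOLERANCE  = 0.05         # +/- 5 %

def check_unit_intervals(edge_times_s):
    """edge_times_s: sorted timestamps (seconds) of signal transitions."""
    failures = []
    for a, b in zip(edge_times_s, edge_times_s[1:]):
        ui = b - a
        # Round to the nearest whole number of UIs (runs of identical bits)
        n = max(1, round(ui / NOMINAL_UI))
        error = abs(ui / n - NOMINAL_UI) / NOMINAL_UI
        if error > TOLERANCE:
            failures.append((a, error))
    return failures

# Edge timestamps from a made-up capture: one interval is 12% long.
edges = [0.0, 200e-12, 400e-12, 624e-12, 824e-12]
for t, err in check_unit_intervals(edges):
    print(f"edge at {t * 1e12:.0f} ps: {err * 100:.0f}% off nominal")
```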

The functionality developed for these extreme machines is now migrating down to the midrange. While midrange machines can neither reach the ultra-high speeds of the top-end machines nor match their sampling rates or memory depth, they are more than adequate for measuring and providing conformance testing for the established serial standards and for a wide range of other applications.

The lower-end products from the larger manufacturers are also benefiting from trickle-down, although with fewer channels or perhaps only analog capability. The bigger, more established companies are using their feature-rich oscilloscopes to maintain their sales in the face of very low-cost competition from the Far East, particularly China. These instruments, originally not much more than digital versions of the old analog CRT boxes, are beginning to provide very cost-effective options for simple testing or production-line use.

So far we have talked only about dedicated boxes, but there is significant growth in “add-in oscilloscopes.” These are external boxes that connect to a PC (usually through USB) and carry out the oscilloscope-specific work of capture and measurement, leaving display, storage, and user interface to the PC. They are often viewed as just a low-performance, low-cost option, particularly useful for field support, since field staff will normally carry a laptop. But external oscilloscopes are now pushing well into the mid- to upper-range field, with the top end offering bandwidths exceeding 10 GHz. They have to be a serious option for a wide range of test and measurement activities.

You can spend serious money on top-end oscilloscopes. By the time you have added complex probes, options, and software packages, you can easily blow a quarter of a million dollars. (Yes, I do mean $250,000.) On the other hand, for a couple of hundred dollars you can have a very competent piece of kit, with far higher speeds and functionality than were generally available only a few years ago.

Good oscilloscopes are still the defining badge of the electronics engineer. Matching oscilloscope capabilities, such as core hardware, accessories, probes, and software, against the system under development can provide otherwise unobtainable information about the system. And knowing the capability of the oscilloscopes that will be used in device and system test and bearing those capabilities in mind during system development and board layout can play a significant part, not just in improving testing but in reducing time-to-market and improving end-product quality. And those top-end oscilloscopes are just amazing.
