
Fishing for Signal Integrity

SerDes Tuning Basics

Pieter has been a fisherman all his life.  When he was a boy, he would spend every spare moment at the pier with his grampa’s old fishing rig, catching whatever would wander near the dock, all the while watching the fishing fleet leave and return under the bridge that spanned the entrance to the small harbor.  He never liked cleaning the fish, however, so he would catch and release them, returning home with wildly exaggerated stories of the giant fish he had landed and then returned.  Since no one but Pieter ever saw those fish, and since he never bothered to actually weigh or measure them, his exaggerated claims gradually became the truth – to him, at least.

Karl was Pieter’s older brother.  Karl never saw the sense in Pieter’s passion for fishing.  He rolled his eyes every day when his younger sibling would return spinning his obviously too-tall tales of conquests at the pier.  Karl was practical-minded.  He became a businessman – a distributor of fresh seafood products.  When his younger brother Pieter bought a fishing boat and set out to fish for a living, Karl would buy all the fish his brother could provide.  In order to protect his brother’s pride, he would normally credit him for almost double the weight he brought in, making the difference up by paying him a lower price per pound. 

Karl was applying equalization on the receive end of the fish transfer from his brother.  Karl knew the catch would come in with a lower actual weight than Pieter claimed, so he would “equalize” up to the weight his brother believed was appropriate for his day’s work. 

Mason had been a fisherman in the same village as Pieter and Karl for his entire life.  He wasn’t a passionate soul like Pieter, though.  Mason was a tough and pragmatic sailor who worked hard to bring in a good catch and even harder to take care of his customers.  He knew that a good deal of his catch could be lost on a bad day due to sloppy handling by the dock hands and pessimistic weighing practices on the part of some of his stingier customers.  When a customer ordered 200 lbs of fish, Mason would always send 240, just to be on the safe side.

Mason was applying transmit pre-emphasis to his seafood sales.  He knew there would be some loss, so he tried to apply the right amount of extra boost on his end to make sure the correct amount of product arrived at the destination.

Karl and Mason had each optimized signal integrity by applying the appropriate techniques for their particular situation.  Karl, by applying equalization on the receive end, could compensate for the poor performance of his brother Pieter.  Mason, by applying pre-emphasis on the transmit end, could offset the inevitable losses by the unscrupulous parties on the receiving end of his services. 

There came a time, however, when Pieter’s boat needed to be repaired.  Karl was forced to look for another source of fish, and he went to Mason.  Karl applied the same equalization he would have used with Pieter and ordered 400 lbs of fish, when he really only needed 200.  Mason obliged with his normal 20% transmit-end pre-emphasis and delivered 480 lbs.  Karl’s company, out of habit, cut Mason a check for a little less than the market price of 200 lbs.  That day, in our little fishing village, we learned that even though pre-emphasis and equalization can both be good things in salvaging signal integrity, using them together can yield less than ideal results.  We also learned that, if our village had just practiced good weighing and handling procedures in the first place, much of this copious over-compensation could have been averted.

Every small fishing village is unique, of course, even though they share common characteristics.  The world of signal integrity for high-speed serial connectivity works pretty much the same way.  When we want to send chip-to-chip signals through a board, across a backplane, or (cough cough) through a cable, we face serious challenges in getting our data to arrive reliably at its destination.  While high-speed serial connections eliminate the clock-synchronization issues associated with wide, high-speed parallel buses, they introduce their own unique set of signal integrity challenges.

As high-speed signals travel from your transceiver, they are attenuated at every step, beginning with the bonding wire (if you’re using a wire-bond technology), through the physical pin or solder ball, and through the run of FR-4 (or whatever your trace material might be).  Even more loss happens at connectors, backplanes, and cables.  In general, the high-frequency components are attenuated more than the lower ones.  Past the 1GHz mark, attenuation can be fairly severe, owing primarily to skin effect and dielectric loss.  The result of this attenuation is increasing intersymbol interference (ISI), where temporal spreading, reflections, and other effects cause previous bits to interfere with the current one.
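
To make that frequency dependence concrete, here is a back-of-the-envelope sketch (not part of any real design flow, and the loss coefficients are invented placeholders rather than values for any actual trace): it models total insertion loss as a skin-effect term growing with the square root of frequency plus a dielectric term growing linearly with frequency.

```python
# Back-of-the-envelope channel loss model -- the coefficients below are
# invented placeholders, not measured values; real numbers come from
# S-parameters or a field-solver extraction of your actual trace.
import math

def channel_loss_db(f_hz, k_skin=1.0e-4, k_diel=3.0e-9):
    """Approximate insertion loss in dB at frequency f_hz."""
    skin_loss = k_skin * math.sqrt(f_hz)   # conductor (skin-effect) loss grows as sqrt(f)
    dielectric_loss = k_diel * f_hz        # dielectric loss grows linearly with f
    return skin_loss + dielectric_loss

# Loss climbs quickly with frequency, so the fast edges of a bit stream
# (its high-frequency content) arrive much weaker than the low-frequency bulk.
for f in (1e9, 3e9, 5e9, 10e9):
    print(f"{f / 1e9:4.0f} GHz: {channel_loss_db(f):5.1f} dB")
```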

It makes sense, then, to boost those high-frequency components at the source.  Like Mason, we know that there will be losses in the process, and we apply transmit-end pre-emphasis to compensate, boosting the high-frequency components to allow for our expected losses.
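
As a toy illustration (not any vendor's implementation), transmit pre-emphasis can be modeled as a short FIR filter with a main tap and a negative post-cursor tap, so that the bit right after a transition is driven harder than repeated bits.  The tap values below are arbitrary; real transceivers expose them as de-emphasis settings in dB or as driver DAC codes.

```python
# Toy 2-tap transmit pre-emphasis: main cursor plus a negative post-cursor.
# Tap values are arbitrary illustrations, not settings for any real device.
import numpy as np

def preemphasize(symbols, main=1.0, post=-0.25):
    """Apply a main/post-cursor FIR to a +/-1 symbol stream."""
    taps = np.array([main, post])
    return np.convolve(symbols, taps)[:len(symbols)]

bits = np.array([1, 1, 1, -1, -1, 1, -1, 1], dtype=float)
print(preemphasize(bits))
# Bits right after a transition come out at 1.25x amplitude, while repeated
# bits are backed off to 0.75x -- the transmitter "shouts" its high-frequency
# content to make up for the channel's frequency-dependent loss.
```

Many real parts implement the same idea as de-emphasis: rather than pushing transitions above the maximum swing, repeated bits are attenuated, which amounts to the same frequency shaping apart from overall amplitude.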

Most FPGA-based transceivers have programmable pre-emphasis.  If you’re creating your own connection to a non-programmable or third-party receiver, transmit pre-emphasis is your primary weapon in offsetting attenuation losses.  In order to set pre-emphasis intelligently, however, you’ll need some pretty detailed information on the path from your transmitter to the receiver.  One way to accomplish this is to perform an HSPICE simulation of the channel.  You’ll need models of your specific transceiver and S-parameters for the PCB route your signal is traversing.  These measurements can either be extracted from a physical board or estimated from the PCB layout using EDA tools.  FPGA vendors also offer signal integrity tools that can help to expedite this characterization and simulation process.
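
As a simplified illustration of what turning a channel characterization into tap settings can look like, the sketch below takes a made-up pulse response sampled once per unit interval (in a real flow this would be exported from your HSPICE or IBIS-AMI simulation) and solves a small least-squares, zero-forcing-style problem for three candidate FIR tap values that flatten the worst of the ISI.  It is a stand-in for what the vendor signal integrity tools do, not a substitute for them.

```python
# Simplified tap selection from a simulated pulse response.  The pulse values
# are made up for illustration; in practice they would come from an
# HSPICE / IBIS-AMI run of your actual channel.
import numpy as np

pulse = np.array([0.05, 1.00, 0.30, 0.10])   # pre-cursor, main, two post-cursors (per UI)

n_taps = 3                                   # one pre-, one main, one post-cursor tap
rows = len(pulse) + n_taps - 1
A = np.zeros((rows, n_taps))
for k in range(n_taps):                      # convolution matrix: A @ taps == conv(taps, pulse)
    A[k:k + len(pulse), k] = pulse

target = np.zeros(rows)
target[2] = 1.0                              # want a single clean cursor after equalization

taps, *_ = np.linalg.lstsq(A, target, rcond=None)
print("candidate FIR taps:", np.round(taps, 3))
print("equalized pulse   :", np.round(A @ taps, 3))
```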

Pre-emphasis is not free, of course.  The primary penalty is increased power consumption.  Since SerDes connections are already something of a power bottleneck in many systems, you’ll want to be judicious and avoid over-using pre-emphasis.  Crosstalk and EMI also increase as more pre-emphasis is applied.

If you are managing only the listening end of the connection and the transmitter is out of your control, you’ll have to rely on equalization to salvage the signal.  Equalization passes the incoming signal through a high-pass filter that offsets the unequal attenuation of different frequency ranges.  Most FPGA-based transceivers also allow programmable equalization.  Equalization parameters can typically be specified at configuration time and even adjusted on-the-fly during operation.  As with pre-emphasis, a high-quality analog simulation is the key to setting equalization properly.  Obviously, the penalty for equalization is overall attenuation of the signal.
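
The sketch below shows the receive-side idea in the same toy style (invented tap values, not any device's CTLE or DFE settings): a short FIR whose response is tilted toward high frequencies restores relative edge energy, at the cost of attenuating the flat, low-frequency runs.

```python
# Toy linear receive equalizer: a 3-tap FIR tilted toward high frequencies.
# Tap values are invented for illustration; real receivers use CTLE stages
# and/or adaptive DFE taps tuned against the actual channel.
import numpy as np

def equalize(rx_samples, taps=(-0.2, 0.9, -0.2)):
    """Boost edges relative to flat runs (DC gain 0.5, Nyquist gain 1.3)."""
    return np.convolve(rx_samples, np.array(taps), mode="same")

# Received samples whose edges have been smeared by the channel:
rx = np.array([0.2, 0.8, 1.0, 1.0, 0.6, -0.4, -0.9, -1.0])
print(np.round(equalize(rx), 2))
# The flat runs shrink (the overall-attenuation penalty noted above), but the
# transitions keep relatively more of their swing, which is what opens the eye.
```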

For many of the more ubiquitous SerDes standards, such as PCI Express, the vendor will often supply pre-emphasis and/or equalization settings for you based on their own characterization of their boards against compliance test setups.  These settings will generally provide a good starting point, and you can then make incremental adjustments for best performance on your own board.

Often, programmable pre-emphasis and equalization are used as crutches to get around poor design practices.  Poor termination, badly routed and mismatched differential traces, and problems with vias and connectors can have a profound effect on signal integrity.  Considering signal integrity at board layout time, using the correct connectors, and taking advantage of features such as on-chip termination can improve signal integrity dramatically, reducing the need for excessive compensation with pre-emphasis and equalization.

Some of the signal integrity solution comes from the components and packages you choose as well.  Each transceiver has a transmit jitter specification (how much jitter it typically generates) and a receiving-end jitter-tolerance specification.  The lower the jitter, and the higher the jitter-tolerance, the easier time you’ll have plucking the signal out of the noise.
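
To make that trade-off concrete, here is a rough jitter-budget check with invented numbers (real ones come from the transceiver datasheet and your channel simulation; simply summing the terms is a conservative shortcut, since random and deterministic jitter are normally combined more carefully):

```python
# Rough jitter-budget check.  All numbers are invented for illustration, not
# taken from any datasheet.
unit_interval_ps = 200.0        # e.g. a 5 Gb/s lane: one UI = 200 ps
tx_jitter_ui = 0.15             # transmitter total-jitter spec, in UI
channel_jitter_ui = 0.45        # ISI / crosstalk jitter from channel simulation, in UI
rx_tolerance_ui = 0.65          # receiver jitter-tolerance spec, in UI

total_ui = tx_jitter_ui + channel_jitter_ui
margin_ui = rx_tolerance_ui - total_ui
print(f"total jitter: {total_ui:.2f} UI ({total_ui * unit_interval_ps:.0f} ps)")
print(f"margin      : {margin_ui:.2f} UI ({margin_ui * unit_interval_ps:.0f} ps)")
# A lower-jitter transmitter or a more tolerant receiver widens this margin,
# which is exactly why the part and package you pick matter.
```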

Choosing the right part with the right transceivers and the right package, doing careful PCB layout, and making proper use of the pre-emphasis and equalization adjustments won’t necessarily help Karl with his oversupply of smelly fish, or Pieter with his broken fishing boat, or Mason with his careless dock hands – but it will have you well on the way to reliable high-speed serial communications.
