Understanding S-Parameter Measurement and Fixture De-Embedding Variation Across Multiple Teams and Tool Sets

Keysight

As an electrical engineer, I like being on the cutting edge of technology. That’s why I’m excited to work on the new IEEE P370 standard regarding the electrical characterization of printed circuit boards and related interconnects.


To build a standard, you have to understand user error. Recently, I worked on a study that evaluated how our fellow electrical engineers measure S-parameters and de-embed fixture data from high-frequency, high-data-rate PCBs. Our goal was to understand how measurements and de-embedding results vary across multiple teams when testing the same circuit.

Studying Variations in S-Parameter Measurement Results

My colleagues and I weren’t interested in finding the best combination of DUTs (devices under test), fixtures, and VNAs (vector network analyzers) to perform such tests. In fact, the measurement equipment and fixture de-embedding tools were anonymized.
 

After all, modern test equipment and de-embedding algorithms are fairly accurate in themselves. Variations emerge depending on how a test is set up, the quality of the cables and adapters used, and whether environmental factors have been incorporated into the test methodology, among other things.

More than ever, repeatable measurements are vital to our industry. The era of a single engineer performing all the measurements, simulations, and de-embedding is long gone. Instead, multiple teams of engineers—often working in different facilities, cities, or even countries—share component S-parameter files and use different equipment and fixture de-embedding methods to test the same circuit.
 

The findings of our study will help define the IEEE P370 standard regarding best practices and methodology for custom calibration reference planes, such as those on a PCB for measuring components and interconnects.

Assembling Our Test Kits

Many RF and high-speed digital engineers have an innate trust in measured S-parameters for passive components. However, they view simulations of these same passive components with skepticism. That’s because VNAs are calibrated annually using NIST-traceable standards, whereas simulation tools are not.
 

Unfortunately, there are no NIST-traceable PCB calibration kits, and therefore engineers have to trust that the PCB calibration structures used are accurate and that the algorithms employed are appropriate.


For the purposes of this exercise, we used an early prototype of Signal Microwave’s plug-and-play PCB test coupon kit, designed specifically to support the establishment of the IEEE P370 standard. Two different DUTs were measured: the first with 2.92 mm connectors measured to 40 GHz, and the second with 1.85 mm connectors measured to 60 GHz.
 

Our goal was to understand the variations that arise when different teams use different tools and methodologies to measure the same DUT. As a result, a number of VNAs and de-embedding software packages were used. VNA equipment providers included Anritsu, Keysight Technologies, and Rohde & Schwarz. De-embedding software included Keysight Technologies’ Physical Layer Test System (PLTS) software, AtaiTec ISD, Smart Fixture De-Embedding, and a custom reference tool developed for the P370 standard.
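
As a concrete picture of what gets shared between teams, here is a minimal sketch, assuming the open-source scikit-rf package and hypothetical file names, of how S21 traces from several teams’ Touchstone files might be overlaid for a first visual comparison:

```python
# A minimal sketch (not the study's actual analysis code) of overlaying S21
# from several teams' measurements of the same DUT, using scikit-rf.
# The file names are hypothetical placeholders.
import skrf as rf
import matplotlib.pyplot as plt

files = ["team_a_dut1.s2p", "team_b_dut1.s2p", "team_c_dut1.s2p"]

fig, ax = plt.subplots()
for path in files:
    ntwk = rf.Network(path)          # load measured 2-port S-parameters
    ax.plot(ntwk.f / 1e9,            # frequency axis in GHz
            ntwk.s_db[:, 1, 0],      # |S21| in dB
            label=path)

ax.set_xlabel("Frequency (GHz)")
ax.set_ylabel("|S21| (dB)")
ax.legend()
plt.show()
```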

Keeping Guidelines to a Minimum

We also provided generic guidelines for VNA measurements, measured data checking, de-embedding, and de-embedding verification. However, there were no specific requirements on the measurement setup other than the required maximum measurement frequency.
 

We kept our instructions to a minimum. This created a degree of uncertainty that mimicked real-world signal integrity laboratory measurements. It was up to each team to select the appropriate measurement setup, and to make sure it was performing as expected. Teams were also free to select the people doing the measuring.
 

In real-world tests, measured data is often received with an indication of the test equipment used, but with no information about the most recent calibration date or the type of measurement cables used. Real-world results also don’t specify the level of experience of the people performing the tests, and we’ve all experienced instances of different engineers and technicians providing different results. We wanted to mirror all of this in our study.

Finding Unexpected Variations

We used four different methods to analyze the test data: the width-of-a-pen tests and vector error analyses proposed by study co-author Eric Bogatin, as well as the time-domain and frequency-domain S-parameter quality metrics that are part of the emerging IEEE P370 standard.
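
The P370 documents define these metrics precisely; as a rough illustration of the vector-error idea, you can take the magnitude of the complex difference between two measurements of the same S-parameter and express it in dB. A minimal sketch, assuming two equally sampled measurements and hypothetical file names:

```python
# Rough illustration of a vector error between two measurements of the same
# DUT (not the exact P370 metric). Assumes both files share a frequency grid.
import numpy as np
import skrf as rf

meas_a = rf.Network("team_a_dut1.s2p")   # hypothetical file names
meas_b = rf.Network("team_b_dut1.s2p")

# The complex (vector) difference of S21 captures magnitude AND phase errors
diff = meas_a.s[:, 1, 0] - meas_b.s[:, 1, 0]
err_db = 20 * np.log10(np.abs(diff))

print(f"worst-case vector error: {err_db.max():.1f} dB "
      f"at {meas_a.f[err_db.argmax()] / 1e9:.2f} GHz")
```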
 

I expected some minor variations in the test results. When we started to graph the measurements, everything seemed to be in order, but when we looked at the actual numbers, some of the differences were extreme. For the most part, the worst-case errors stayed below -15 dB, which still corresponds to variations approaching 18%. In one case, however, the variation spiked upward by 20 dB, to roughly 170%. For reference, the acceptable variation is -20 dB, or 10%.
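
To keep the dB and percentage figures straight: an error level in dB converts to a percentage as 10^(dB/20) × 100. A quick sanity check:

```python
# Converting an error level in dB to a percentage: pct = 10**(dB/20) * 100
def db_to_percent(db: float) -> float:
    return 10 ** (db / 20) * 100

for level in (-20.0, -15.0, 4.6):
    print(f"{level:+6.1f} dB -> {db_to_percent(level):6.1f} %")
# -20.0 dB ->  10.0 %   (the acceptable limit)
# -15.0 dB ->  17.8 %   (the typical worst case we saw)
#  +4.6 dB -> 169.8 %   (the level of the one extreme spike, ~170%)
```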

Looks can be deceiving. Analyze the numbers, not just the graph.


Fortunately, we were able to pinpoint the cause. There was a significant S21 phase error between the signal going into the board and the signal coming out. It could have been a loose coaxial connector, or the test engineer may have created a longer signal path by adding an extra adapter after everything else had been calibrated. 
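
It takes surprisingly little to cause an error like that. As a back-of-the-envelope sketch (illustrative values, not data from the study), an adapter of physical length L and velocity factor VF added after calibration contributes roughly 360·f·L/(c·VF) degrees of S21 phase:

```python
# Back-of-the-envelope: extra S21 phase from an adapter added after calibration.
# The length and velocity factor below are illustrative assumptions.
C0 = 299_792_458.0  # speed of light in vacuum, m/s

def phase_shift_deg(freq_hz: float, length_m: float, vf: float = 0.7) -> float:
    """Phase delay, in degrees, of a coax section with velocity factor vf."""
    return 360.0 * freq_hz * length_m / (C0 * vf)

# A 10 mm adapter at 40 GHz adds almost two full wavelengths of phase:
print(f"{phase_shift_deg(40e9, 10e-3):.0f} degrees")  # ~686 degrees
```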

Establishing Best Practices to Minimize Errors

As you can see, human error plays a huge part in the accuracy of test results, as does the quality of the components used, including the fixtures, cables, and connectors. This study is helping us establish the set of best practices for testing DUTs that will be codified in the IEEE P370 standard in the coming year.


Some of our measurement guidelines include:

  • Calibrating VNAs per manufacturer guidelines (yearly or every two years)
  • Visually inspecting all connections for measurement stability and replacing them as needed
  • Using proper adapters
  • Choosing the right number of IFFT points and an appropriate IF bandwidth (see the sketch after this list)
  • Using “unknown-thru” VNA calibration technology
  • Performing a post-calibration stability check
  • Comparing results against a known reference
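
On the IFFT-points guideline above: for a linear sweep from near DC to f_max, the frequency step Δf = f_max/(N−1) sets the alias-free span of the time-domain transform, t_alias = 1/Δf, so the point count has to cover the electrical length of the fixture plus the DUT. A minimal sketch of that textbook relation:

```python
# Why the number of frequency points matters for time-domain (IFFT) analysis:
# the frequency step sets the alias-free time span, t_alias = 1 / delta_f.
def alias_free_span_ns(f_max_hz: float, n_points: int) -> float:
    delta_f = f_max_hz / (n_points - 1)  # step of a linear DC-to-f_max sweep
    return 1e9 / delta_f                 # alias-free span in nanoseconds

for n in (201, 801, 1601):
    print(f"{n:5d} points to 40 GHz -> "
          f"{alias_free_span_ns(40e9, n):5.0f} ns alias-free")
# 201 -> 5 ns, 801 -> 20 ns, 1601 -> 40 ns
```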

We also recommend using a plug-and-play reference kit, like the one used in these tests, along with one of the de-embedding tools we used. Finally, we recommend that labs start using IEEE P370-compliant electrical fixtures and consistency checks once the standard is finalized.


Our committee just published a draft of the standard for a yearlong, industry-wide community review, so there will be some updates and modifications.

Improving Measurement Outcomes Through Education

One issue I’d like to address—but which isn’t part of the standard—is the lack of temperature and humidity monitoring in today’s labs.


When I started in this business, our lab had a thermometer and a humidity sensor, which we checked every four hours. We’d write down the readings, and if they weren’t in an acceptable range, we would either recalibrate our equipment or call maintenance.

 

These days, everyone’s pressed for time. It seems like we’re constantly racing the clock, and some measurement best practices have fallen by the wayside. One of the leading proponents of proper testing procedures is Keysight Technologies, whose Metrology and Measurement education team provides online and in-person digital and RF measurement education through webinars, seminars, classes, and white papers.
 

Most of this training is free, and it covers everything from the basics to advanced techniques and the latest developments in the field. Electrical engineers at every stage of their careers can acquire new knowledge and skills through Keysight’s educational initiative, and our entire industry benefits from improved measurement methodologies.
 

Keysight’s Heidi Barnes, senior applications engineer and 2017 DesignCon Engineer of the Year, is an invaluable member of the P370 committee. She is a signal integrity engineer who works on the design and simulation team at Keysight, providing signal integrity simulation and analysis solutions in PathWave Advanced Design System (ADS). Heidi helps us see measurement standards from an end user’s perspective. Her contribution helps us to create a better methodology, and to build better tools that are accessible and visible to users. 

Effecting Behavioral Change in Measurement Methodologies

We hope that a new behavioral model will emerge in frequency-domain measurement, and that we can then transfer it to the time domain. We also hope to minimize the issues that arise when interpolating between high- and low-frequency data points. Current hardware and software tools have no standard way of dealing with this, so we’d like to establish one. Perhaps we’ll see the fruit of our research integrated into new technologies, like Keysight’s PathWave design and test software platform.
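
As a toy illustration of why this matters (made-up values, not part of the standard): two common interpolation schemes applied to the same sparse S-parameter samples already disagree between the grid points, so every tool that resamples data quietly makes its own choice:

```python
# Toy illustration: two common interpolation schemes disagree between sparsely
# sampled S-parameter points. All values here are made up for illustration.
import numpy as np
from scipy.interpolate import interp1d

f_meas = np.array([0.1e9, 5e9, 10e9, 20e9, 40e9])         # sparse grid, Hz
s21 = np.array([0.99, 0.90, 0.75, 0.50, 0.20]) * np.exp(   # fictitious S21 with
    -1j * 2 * np.pi * f_meas * 50e-12)                      # a 50 ps delay term

f_fine = np.linspace(0.1e9, 40e9, 801)
lin = interp1d(f_meas, s21, kind="linear")(f_fine)
cub = interp1d(f_meas, s21, kind="cubic")(f_fine)

print(f"max |linear - cubic| = {np.abs(lin - cub).max():.3f}")
```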


As I said at the beginning, I enjoy being on the cutting edge of technology, and I’m proud to work with my industry colleagues to establish the IEEE P370 standard. This working group came together in 2015, and is truly a labor of love. All of us are volunteering our time, since this is on top of our day jobs. We really enjoy working with one another, and it’s always a pleasure to come together and see what we can build.


It is a joy to put out an idea and to have one of your colleagues say, “Oh, yeah! I’ll take that on.” After all, standards—like good research methodologies—are built on trust and collaboration.