feature article

Xilinx and Samsung Collaborate on 5G

Key Conspiracy Features Still Lacking

These are crazy times. Even though most of the world is locked down in quarantine right now due to the COVID-19 pandemic, engineering teams around the world are still working – through a variety of clever means – to continue making progress on the deployment of 5G infrastructure. 5G is by far the most ambitious rollout of wireless networking technology ever attempted, and it brings with it daunting engineering challenges. We are seeing in real time the enormous value the current global network infrastructure is delivering at this time of crisis, and the motivation to get 5G up and running at scale, with all the advantages it brings, has become even stronger. 

At the same time, small groups of misinformed conspiracy theorists around the globe are creating chaos and confusion with absurd notions about 5G technology. Google “5G” and you’ll find more misinformation than fact, with the search results sadly cluttered with stories of cell towers being burned, ridiculous claims about wireless technology spreading viruses, and crazy notions of global conspiracies of evil that would put most Bond villains to shame.

As many of our readers are the people actually creating 5G technology, you understand firsthand the frustration of dealing with this kind of idiotic misinformation campaign focused on your professional work. So, let’s look instead at the technology challenges, and perhaps the social issues will fade into obscurity once the doomsday scenarios fail to materialize. As most of us know, one of the most difficult problems to solve in 5G deployment – particularly for the higher-bandwidth standards – is beamforming for massive MIMO (multiple-input multiple-output) antennas, which allows spectrum to be shared by multiple users simultaneously and dramatically increases the efficiency with which the system uses its frequency bands.
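The core idea behind beamforming is easy to sketch, even if doing it at 5G scale is not: apply a phase shift to each antenna element so that signals add constructively in the desired direction. Here is a minimal illustration for a uniform linear array (generic textbook math, not Xilinx’s or Samsung’s implementation; the element count and steering angle are arbitrary):

```python
import cmath
import math

def steering_weights(n_antennas, theta_deg, spacing=0.5):
    """Conjugate-phase weights that steer a uniform linear array toward
    theta_deg (degrees from broadside). spacing is element spacing in
    wavelengths; half-wavelength is typical."""
    theta = math.radians(theta_deg)
    # Element n sees a path-length phase of 2*pi*spacing*n*sin(theta);
    # weighting by the conjugate aligns all elements in that direction.
    return [cmath.exp(-1j * 2 * math.pi * spacing * n * math.sin(theta))
            for n in range(n_antennas)]

def array_gain(weights, theta_deg, spacing=0.5):
    """Normalized magnitude of the array response in direction theta_deg."""
    theta = math.radians(theta_deg)
    resp = sum(w * cmath.exp(1j * 2 * math.pi * spacing * n * math.sin(theta))
               for n, w in enumerate(weights))
    return abs(resp) / len(weights)

# Steer a 64-element array toward 30 degrees: full gain in that direction,
# and the response falls off sharply away from it.
w = steering_weights(64, 30.0)
```

Every additional user direction means another full set of weights applied across every antenna on every sample, which is exactly why the compute bill grows so quickly.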

Massive MIMO takes the cute little dual- or triple-antenna MIMO that we’ve all known and loved on our WiFi routers and kicks it into overdrive with enormous arrays of up to 128 antennas (64 transmit and 64 receive), and even multiuser MIMO (MU-MIMO), which allows the transceiver to talk to more than one receiver at a time. Easy, right? Uh, not so much. Massive MIMO demands massive compute density for the beamforming signal processing, and even the biggest, baddest FPGAs buckle under the load. Xilinx and Samsung have just announced a partnership to use Xilinx’s newest Versal devices, which include FPGA logic fabric, a powerful vector array processor, enormous amounts of DSP resources, piles of memory, a network-on-chip, and multi-core ARM-based applications processors (in addition to a laundry list of other features too long to itemize here) to drive beamforming and other core capabilities in widespread 5G deployments.
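To put rough numbers on that compute load: straightforward digital beamforming costs one complex multiply-accumulate per antenna element, per beam, per sample. A back-of-envelope sketch (the beam count and sample rate here are illustrative assumptions for the sake of arithmetic, not figures from Xilinx or Samsung):

```python
# Back-of-envelope beamforming compute load. All figures are illustrative
# assumptions, not vendor specifications.
antennas = 64            # receive side of a 64T64R massive-MIMO array
beams = 16               # simultaneous user beams (assumed)
sample_rate = 122.88e6   # complex samples/s for ~100 MHz of channel bandwidth
macs_per_s = antennas * beams * sample_rate
print(f"~{macs_per_s / 1e9:.0f} billion complex MACs per second")
```

Each complex MAC is four real multiplies plus additions, and this ignores channel estimation and weight computation entirely, so it’s easy to see why the compute density gets daunting in a hurry.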

Programmable logic devices such as Versal are ideal for 5G deployment because of their hardware flexibility. As we have discussed in the past, 5G is not actually a single wireless technology but a label for a collection of wireless standards: low band (600-700 MHz) covers large geographic areas with 5G performance in the 30 to 250 Mbps range; mid-band (2.5/3.5 GHz) delivers 100 to 900 Mbps over a smaller, several-mile radius; and high-band (millimeter wave, 24-39 GHz) delivers a blazing 1-3 Gbps within a radius of a mile or less. FPGAs can be reconfigured in-system to meet any of those demands, and they can be updated as standards evolve – extending the service life of infrastructure equipment. Versal, however, will particularly shine on the extreme demands of the millimeter-wave standards because of its unparalleled ability to parallelize beamforming DSP.
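Those tiers trade coverage for throughput, and a piece of planning code might capture the rough figures above in a lookup like this (a hypothetical summary, with the names and helper invented for illustration; real deployments follow 3GPP band definitions, not this sketch):

```python
# Rough summary of the band tiers described above. The figures are this
# article's approximations; the names and structure are invented for
# illustration and are not a 3GPP classification.
BANDS = {
    "low":  {"freq": "600-700 MHz", "radius": "wide area",      "mbps": (30, 250)},
    "mid":  {"freq": "2.5/3.5 GHz", "radius": "several miles",  "mbps": (100, 900)},
    "high": {"freq": "24-39 GHz",   "radius": "a mile or less", "mbps": (1000, 3000)},
}

def widest_tier_for(required_mbps):
    """Lowest (widest-coverage) tier whose peak rate meets the requirement."""
    for name in ("low", "mid", "high"):
        if BANDS[name]["mbps"][1] >= required_mbps:
            return name
    return None
```

For example, a 500 Mbps requirement lands on mid-band, since low band tops out around 250 Mbps.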

In case you were worried, our analysis also finds that the Xilinx devices fall far short on key conspiracy capabilities such as remote DNA re-writing, human thought monitoring, causing cancer, killing birds, and cloaking Sasquatch with invisibility protection. Yes, some of the brightest engineers in the world are working on 5G technology. But even on our best days, we engineers are just not as smart as the conspiracy theorists give us credit for. Sorry folks. World conquest will have to wait a few more G’s to achieve. 

What today’s engineers are able to achieve, however – in Xilinx’s case – are powerful vector processing arrays which can be used both for AI inference and for exactly the kind of parallel DSP required for massive MIMO beamforming. This is a unique feature of Versal (and one of the reasons Xilinx claims Versal is a new category of device – the Adaptive Compute Acceleration Platform, or ACAP – and not an FPGA). Versal brings a remarkable density of DSP resources to bear on the 5G problem with equally remarkable power efficiency. This combination means that 5G gear will be able to meet tight system constraints on form factor and power, while still meeting the demands of beamforming from the initial 4T4R 5G antennas up to the largest 64T64R antenna arrays. Sadly for conspiracy theorists, this also means that less power and less spectrum are required – and thus you’ll be exposed to lower doses of evil electromagnetic radiation than ever while streaming those conspiracy videos to your smartphones.

The Samsung socket is a major win for Xilinx, as it should pull a huge volume of their high-end devices into infrastructure build-out – particularly for the highest-density gear supporting urban areas, and those are likely to be very high-value sockets (that’s not-so-secret code for exotic chips with high margins and even higher price tags). However, Xilinx will sell plenty of these chips just for the deployment, so please don’t give them more replacement business by burning down towers. The ones you’re burning down aren’t even 5G towers anyway… Idiots.
