
R U Going to Attend the Second Annual Chiplet Summit?

As I pen these words, only a few days remain until the start of the biggest, bestest, and most bodacious second annual Chiplet Summit, which is to be held 6-8 February 2024 at the Santa Clara Convention Center in Santa Clara, California.

As an aside (yes, I know we just started and it’s a little early in the day for asides, but I’m just that sort of fellow), one of my many hats is that of Editor of Designing Electronics North America (DENA) magazine. The reason I mention this here is that DENA is an official supporting publication for the Chiplet Summit. As part of this arrangement, an extra 1,000 copies of the February 2024 issue of DENA have been printed and will be made available to attendees of the show.

Three examples of past issues of DENA that are available for free download (Source: DENA)

Yes, I did say “printed.” Well spotted! In point of fact, DENA is primarily a print publication—any engineer can register for a free subscription that will be delivered to his or her desk—although PDF downloads are also available.

There’s so much I want to tell you about this august event, but before we plunge into the fray with gusto and abandon (and aplomb, of course), I wanted to share something tremendously exciting that just crossed my desk.

For countless yonks now (and that’s a lot of yonks whichever way you count them), I’ve been waffling furiously about two major technology domains: artificial intelligence (AI) and mixed reality (MR). (See also, What the FAQ are AI, ANNs, ML, DL, and DNNs? and What the FAQ are VR, MR, AR, DR, AV, and HR?)

As I’ve said before (to anyone who couldn’t get out of the way fast enough), I believe it won’t be long before the combination of artificial intelligence and mixed reality dramatically changes the way in which we interact with the world, our systems, and each other.

Let’s assume we are all wearing some form of mixed-reality headset. One of the examples I often give is that I read a lot of books (happy face) but I spend a lot of time trying to track down nuggets of knowledge and tidbits of trivia in books I read weeks, months, or years in the past (sad face). I can imagine saying to my AI, “What was that book I was reading that mentioned Lady Ada in the context of artificial intelligence?” I can also imagine my AI responding, “That book was called ’20 Algorithms That Will Make You Squeal in Delight.’” I can then envisage myself saying, “Where is that little scamp?” And I can envision my AI retorting, “It’s on the bookshelf in the corner hiding behind a pile of books you put there last week,” while, at the same time, some sort of graphical indication appears to guide my eyes across the room to the pile of books in question (maybe even giving them a transparent appearance while highlighting the location of the book I’m searching for behind them). Later, when I retrieve the book from the shelf, the AI could whisper in my ear, “The reference to Lady Ada appears on page 42.”
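Just for fun, here’s a minimal sketch (in Python, with a made-up mini “reading history” and a naive keyword match, all purely hypothetical) of the kind of query my imagined AI librarian would be fielding behind the scenes. A real assistant would use semantic search over everything you’ve read; this toy just scores passages by overlapping words:

```python
# A toy "where did I read that?" lookup. The entries below are invented
# for illustration only.

reading_history = {
    # (book title, page): passage text
    ("20 Algorithms That Will Make You Squeal in Delight", 42):
        "Lady Ada Lovelace speculated about machines and artificial intelligence",
    ("A History of Jam Guns", 7):
        "Wallace and Gromit pioneered the domestic preserves launcher",
}

def find_passage(query: str):
    """Return the (title, page) whose passage best matches the query words."""
    words = set(query.lower().split())
    def score(item):
        _, passage = item
        return len(words & set(passage.lower().split()))
    best = max(reading_history.items(), key=score)
    return best[0]

title, page = find_passage("Lady Ada artificial intelligence")
print(f"That was '{title}', page {page}")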

My current mixed reality headset is a Meta Quest 3. I love it to bits. I can’t describe how it feels to fire my virtual Wallace and Gromit jam gun (for “jam” you may read “jelly” or “preserves,” if you wish) and see a blob of virtual jam fly across my real family room and splat against the real wall (you have to be there). On the downside, I probably wouldn’t be too enthusiastic about wearing a headset of this ilk all day. On the bright side, I’m hoping it won’t be long before we view our mixed reality world through something like the offering from Kura that looks like a pair of regular glasses but that boasts 50 million mixed reality pixels for each eye (see Kura’s AR Glasses are Meta-Droolworthy!).

But we digress… The exciting news I wish to share is the event-based gaze-tracking system for AI-enabled smart frames and VR/MR systems that was just announced by the awesome folks at Zinn Labs. These frames have an integrated forward-facing camera, while eye-trackers in the frames determine exactly what you are looking at. Suppose, for example, you are looking at a plant at a garden center. You can simply say something like, “What is this plant called, and does it need a lot of maintenance?” The AI system knows what you are looking at, identifies the plant, and answers your questions. You MUST WATCH THE VIDEO! Yes, there’s a bit of a delay (which will be addressed by mmWave 5G and 6G). Yes, the quality of the spoken response is a little “flat.” But this is just a tempting, teasing taste of what is to come.
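I have no insight into Zinn Labs’ actual software stack, but conceptually the pipeline is easy to sketch. Here’s a hedged, toy-level Python outline (every function and parameter below is a hypothetical stand-in, not a real API) showing how a gaze point from the eye-trackers could be used to crop the camera’s frame before handing your question to a vision-language model:

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float  # normalized [0, 1] horizontal position in the camera frame
    y: float  # normalized [0, 1] vertical position in the camera frame

def crop_around_gaze(frame_w: int, frame_h: int, gaze: GazeSample, box: int = 256):
    """Return a crop rectangle centered on the wearer's gaze point."""
    cx, cy = int(gaze.x * frame_w), int(gaze.y * frame_h)
    half = box // 2
    left = max(0, min(cx - half, frame_w - box))
    top = max(0, min(cy - half, frame_h - box))
    return (left, top, left + box, top + box)

def answer_question(crop_rect, question: str) -> str:
    """Stand-in for a call to a vision-language model on the cropped region."""
    return f"(model would inspect region {crop_rect} and answer: {question!r})"

# Simulate: the wearer looks slightly left of center and asks about a plant.
gaze = GazeSample(x=0.42, y=0.55)
rect = crop_around_gaze(1920, 1080, gaze)
print(answer_question(rect, "What is this plant called, and does it need a lot of maintenance?"))
```

The nice design property here is that the eye-trackers do the “pointing” for you, so the vision model only has to reason about the small patch of the world you actually care about.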

How will these systems be implemented? Well, one thing of which I’m sure is that it won’t be long before we start to see chiplet-based solutions powering our mixed reality headsets. Ooh, did you see what I just did there? (Can you spell segue?)

The concept of chiplets isn’t new. Some companies have been using them for years. Intel, for example, uses transceiver chiplets (they call them “tiles”) in their high-end Agilex FPGAs and SoC FPGAs. But Intel is an outlier because it controls all stages of the design and implementation flow. The long-term goal is for chip designers to be able to select “off-the-shelf” hard chiplet IPs from a catalog, create one or more chiplets of their own, and use this collection of chiplets to (relatively) quickly and easily implement a multi-die system. How long-term are we talking about? Who knows? But I bet it will be a shorter long-term than most people think.
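To make the “pick chiplets from a catalog” idea concrete, here’s a toy Python sketch (all names and parts are invented for illustration) of checking that a basket of catalog chiplets can actually talk to each other over a common die-to-die interface such as UCIe:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Chiplet:
    name: str
    function: str        # e.g., "compute", "SerDes", "memory"
    d2d_interface: str   # die-to-die interface, e.g., "UCIe" or "BoW"
    process_node: str    # each die can use whatever node suits it best

def check_system(chiplets):
    """A multi-die system only hangs together if every die speaks the same
    die-to-die interface (a gross simplification, but that's the core idea)."""
    interfaces = {c.d2d_interface for c in chiplets}
    if len(interfaces) != 1:
        raise ValueError(f"Incompatible die-to-die interfaces: {interfaces}")
    return f"OK: {len(chiplets)} dies linked over {interfaces.pop()}"

# A hypothetical shopping basket: your own compute die plus catalog parts.
system = [
    Chiplet("MyAccelerator", "compute", "UCIe", "3nm"),
    Chiplet("CatalogSerDes", "SerDes", "UCIe", "7nm"),
    Chiplet("CatalogHBMController", "memory", "UCIe", "12nm"),
]
print(check_system(system))
```

Note the mixed process nodes in that basket, which is a big part of the chiplet value proposition: you put only your differentiating logic on the bleeding-edge node and buy everything else off the shelf.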

I’ve written a couple of columns on this topic over the past few months.

Just yesterday at the time of this writing, I was chatting with my old chum Mick Posner, who is Vice President of Product Management for Synopsys’s High Performance Computing IP Solutions. One of the things on which we agreed is that the potential for chiplet use by a wider audience than the Intels of this world started to explode in 2023, and that things are going to get much, much more exciting very, very quickly. Our conversation ranged across topics like standard interfaces and hierarchical test strategies for chiplets and multi-die systems. The main point Mick wanted to bring to my attention (and for me to bring to your attention) is that Synopsys has all the tools, flows, and methodologies in place to design and verify multi-die systems now.

I also got to chat with Sam Salama, who is CEO at Hyperion Technologies, which is a manufacturer of microelectronic systems and advanced packaging. Sam will be giving a keynote presentation at the Chiplet Summit titled New Packaging Technology Accelerates Major Compute Tasks. In his keynote, Sam will be discussing an economical new 3D packaging technology that allows larger packages, much higher power dissipation (up to 1,000W per package), and substrates that exceed 100mm by 100mm (beyond the limitations of silicon interposers and without the warpage issues).
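To put those numbers into perspective, here’s a quick back-of-the-envelope calculation (assuming, purely for illustration, that the full 1,000W is spread uniformly across a 100mm by 100mm substrate, which Sam’s technology can actually exceed):

```python
# Back-of-the-envelope power density for the headline numbers above.
power_w = 1_000            # watts per package
side_cm = 10.0             # 100 mm = 10 cm per side
area_cm2 = side_cm ** 2    # 100 cm^2 of substrate

print(f"{power_w / area_cm2:.0f} W/cm^2")  # -> 10 W/cm^2, assuming uniform spread
```

Ten watts per square centimeter averaged over the entire substrate (and hotspots will run far higher) is serious thermal territory, which gives you a feel for why power dissipation is called out as a headline feature of the packaging technology.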

There are a bunch of other keynotes that will be “must see” events, including Creating a Vibrant Open Chiplet Economy by Bapi Vinnakota, who is the Leader of the Open Domain-Specific Architecture (ODSA) Sub-Project at the Open Compute Project (OCP); Creating the Connectivity for AI Everywhere by Tony Chan Carusone, who is CTO at Alphawave Semi; and Multi-Die Systems Set the Stage for Innovation by Abhijeet Chakraborty, who is VP of Engineering at Synopsys.

You can find a full list of keynotes on the Chiplet Summit website, along with a full conference program. Also, I just heard (literally as I’m penning these words) that the TechArena will be featuring the Chiplet Summit on its media platform next week. As you may know, the TechArena is led by industry veteran Allyson Klein, whose 22-year career at Intel included serving as GM of Data Center Marketing, and who is deeply engaged in following the growth of chiplet architectures in the industry. In fact, the TechArena has covered chiplet innovation since the announcement of UCIe in 2022, and it has featured Chiplet Summit sponsors Alphawave Semi, Arm, Intel, and the Open Compute Project among the 70+ industry leaders who have appeared on the TechArena podcast and been featured in TechArena articles.

Suffice it to say that if you see chiplets in your future, then the Chiplet Summit is the place to “see and be seen.” I am drooling with desire to attend this event myself, but, sadly, it is not to be. Maybe next year. Until then, I will have to live vicariously through you after you’ve registered to attend and you email me to tell me what you’ve seen. As always, I look forward to your insightful comments and perspicacious questions.
