
Big Data Moves to “Electronics”

Optimal+ Covers Board, System Manufacturing

Last year, right around this time, we took a look at a company called Optimal+. They were harvesting reams of semiconductor test data so that analysts could rummage through it, looking for clues to manufacturing tweaks that could improve yield or throughput without compromising quality. With latency of around 10 minutes at most, it provided something close to real-time feedback.

And any new learnings could be tested on mounds of historical data to see “what would have happened” had the new rule been in place. If good things would have happened, then they could immediately roll the new rule out.
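To sketch that "what would have happened" idea in code: a candidate rule can be replayed over historical records, tallying where it agrees and disagrees with what actually shipped. Everything here — the record fields, the leakage test, the numbers — is invented for illustration; this is not Optimal+'s actual data model or API.

```python
# Hypothetical sketch: replay a candidate screening rule over historical
# test records and count how it would have changed the outcomes.

def backtest_rule(history, rule):
    """history: iterable of dicts with at least a 'passed' field (actual outcome).
    rule: function(record) -> True if the record would pass the new rule."""
    outcomes = {"both_pass": 0, "newly_failed": 0, "newly_passed": 0, "both_fail": 0}
    for rec in history:
        old, new = rec["passed"], rule(rec)
        if old and new:
            outcomes["both_pass"] += 1
        elif old and not new:
            outcomes["newly_failed"] += 1   # yield cost of the tighter rule
        elif not old and new:
            outcomes["newly_passed"] += 1   # units the new rule would have let through
        else:
            outcomes["both_fail"] += 1
    return outcomes

# Made-up historical records and a made-up tighter leakage limit.
history = [
    {"die_id": 1, "vdd_leakage": 0.8, "passed": True},
    {"die_id": 2, "vdd_leakage": 1.4, "passed": True},
    {"die_id": 3, "vdd_leakage": 2.1, "passed": False},
]
tighter = lambda rec: rec["vdd_leakage"] < 1.0
print(backtest_rule(history, tighter))
# {'both_pass': 1, 'newly_failed': 1, 'newly_passed': 0, 'both_fail': 1}
```

If "good things would have happened" — say, `newly_failed` turns out to be mostly units that later came back as returns — the rule graduates to production.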

Of course, in the abstract, there’s nothing about these generic concepts that is tied to semiconductors. So, in the way of any ambitious, growing company, you want to expand your technical prowess to other areas. Especially if potential customers from those other areas get jealous of the semi guys and say, “Dude, I want what they’re having.”

So you launch in the direction of the next obvious step up the food chain: circuit boards. Or further yet: systems. Same as chips, right? Just on FR-4 (or whatever) instead of silicon. Right? A little bit right maybe? No?

Once you get to practical matters, chips and boards and systems are very different things – in particular because the supply chain is very different. And the most pressing problems they have may also be different. And then there is the fact that boards can be reworked, while chips can’t. Systems may get returned by end users. (And will also be used by end users… we’ll come back to that in-use thing in a moment.)

And so Optimal+ has announced Global Ops for Electronics. This is not an extension of their semiconductor product, but is a completely separate offering.

Optimal+ uses the term "electronics," which slightly confused this literal engineer. After all, if anything is an electronic item, it's a semiconductor, whose only moving parts are electrons. But no, in this case, the word "electronics" refers to systems that use electronic components… um, no, that's not helping. Heck with it; turn off brain and accept that "electronics" means systems, and chips are but inputs to the system.

Figure 1. (Image courtesy Optimal+)

Whereas the semi version focuses on test and yield, Optimal+’s version 6.5 also looks at quality and reliability – and return material authorizations (RMAs) are a good proxy for how you’re doing out there (and likely an underrepresentation, since not everyone goes through the hassle of a return).

So now, Optimal+ gives access to the entire manufacturing history of the system for use in analyzing trends. “Genealogy” is a big part of this – tracking which version of a flow or test program was used and how many rework cycles a system had. And, in a wrinkle that’s distinctly different from semis, it includes which contract manufacturer (CM) was used.
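A toy genealogy record might look something like the following. The field names and types are assumptions made for illustration, not Optimal+'s schema — the point is just that flow version, rework count, and (new for systems) the contract manufacturer all travel with the unit's history.

```python
# Illustrative-only genealogy record for a board or system.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Genealogy:
    serial_number: str
    contract_manufacturer: str        # which CM built it -- new vs. the semi flow
    test_program_version: str         # which version of the flow/test program ran
    rework_cycles: int = 0            # boards can be reworked; chips can't
    component_lots: List[str] = field(default_factory=list)  # incoming traceability

board = Genealogy("SN-0042", "CM-A", "ICT-v3.1",
                  rework_cycles=2, component_lots=["LOT-77", "LOT-91"])
```

With records like this, trend questions ("do boards with two or more rework cycles from a given CM return more often?") become simple filters over the history.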

Semis are certainly built by contracted foundries, but it’s much less common to have a single chip sourced from multiple foundries, since designs are generally optimized for a target foundry’s proprietary process. With mechanical assembly, however, using multiple vendors is common, so that adds a new parameter to a system’s history.

Their first offering in this area, called Global Ops for Electronics, targets yield using assembly house data. In the last quarter of this year, they’ll follow that up with tools for managing new product introductions (NPI) and production ramp-up. Right now, the traceability stops with incoming components, but in the future, a connection to Global Ops for Semiconductors will provide access to semiconductor data history as well. This connection is referred to as the Supplier Quality Network, and it’s still in early stages. For now, Global Ops for Semiconductors and for Electronics remain distinct, disconnected products.

Figure 2. (Image courtesy Optimal+)

One of the new angles they’re working aims to reduce expensive returns. Given the genealogy aspect, any return can, in theory, have all of its individual paperwork examined to look for early hints that there might have been a problem. For instance, some parameter might have been within limits, but only just – the tests passed, but then the system got returned. So a tighter test limit might result from such an “aha” moment.
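As a sketch of that "passed, but only just" idea: here's a hedged illustration that flags tests on a returned unit whose results sat within some margin of a limit. The 10% margin, the test names, and the limits are all arbitrary stand-ins.

```python
# Hypothetical sketch: flag tests that passed but landed close to a limit.

def near_limit_passes(test_results, margin_frac=0.10):
    """test_results: list of dicts with 'name', 'value', 'lo', 'hi' (pass limits).
    Returns names of tests that passed within margin_frac of the limit window."""
    flagged = []
    for t in test_results:
        window = t["hi"] - t["lo"]
        margin = min(t["value"] - t["lo"], t["hi"] - t["value"])
        if 0 <= margin < margin_frac * window:   # passed, but only just
            flagged.append(t["name"])
    return flagged

# Made-up paperwork for one returned unit.
rma_unit = [
    {"name": "rail_3v3", "value": 3.29, "lo": 3.10, "hi": 3.30},  # barely passed
    {"name": "rail_1v8", "value": 1.80, "lo": 1.70, "hi": 1.90},  # dead center
]
print(near_limit_passes(rma_unit))   # ['rail_3v3']
```

If flagged tests keep showing up on returned units, that's the "aha" that argues for tightening the limit.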

Machine learning generates many of these “ahas” by correlating outcomes with production statistics. It may take many, many samples before a trend becomes evident, so, by definition, this approach requires tons of data – not just current in-process data, but also historical data.
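As a toy stand-in for that correlation step: real systems use far richer models, but even a plain Pearson correlation over made-up per-lot data shows the shape of the exercise — line up a production statistic against an outcome and ask whether they move together.

```python
# Illustrative only: correlate a production statistic with field returns.
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Per-lot reflow peak temperature vs. observed RMA rate (numbers invented).
peak_temp = [243, 245, 248, 250, 252, 255]
rma_rate  = [0.4, 0.5, 0.6, 0.9, 1.1, 1.5]
r = pearson(peak_temp, rma_rate)   # close to +1 for this made-up data
```

The catch, as the paragraph notes, is sample size: with a handful of lots, a strong-looking `r` can easily be noise, which is why this only works over mountains of current and historical data.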

Exactly what that data is and how it’s represented will vary by manufacturer. So adopting Optimal+ means working with them to stitch the specific flow together. And, to be clear, it’s the OEM that’s in charge here, not the manufacturer, even though you need the cooperation of any and all CMs in your flow.

In fact, that raises a tricky issue with the data: isolation. The OEM wants to see all data from all CMs, but the CMs don’t want other CMs to see their data. So each CM has its visibility limited to itself – they can see their own data, but not that of their competition. Only the OEM gets to see the whole shebang.
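The visibility rule can be sketched as a trivial filter — purely illustrative, and certainly not how Optimal+ actually implements access control, but it captures the policy: the OEM sees everything, each CM sees only itself.

```python
# Toy access model: per-CM isolation, full visibility for the OEM.

def visible_records(records, viewer):
    """records: dicts with a 'cm' field naming the contract manufacturer.
    viewer: 'OEM' or a CM name."""
    if viewer == "OEM":
        return list(records)            # the OEM gets the whole shebang
    return [r for r in records if r["cm"] == viewer]  # CMs see only their own

records = [{"sn": "A1", "cm": "CM-A"}, {"sn": "B1", "cm": "CM-B"}]
oem_view = visible_records(records, "OEM")     # both records
cm_a_view = visible_records(records, "CM-A")   # only A1
```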

There’s one other future aspect to this tool: including in-use data in the history. What if users are “doing it wrong?” Or what if it turns out that humid days cause increased failures? Wouldn’t it be great to know that when trying to sleuth out causes of failure? What about reminding users about preventative maintenance (PM)?

Well, that's where we get into some murky waters. Folks are (not unreasonably) queasy about being tracked. So learning that every move they make with their gadgetry is being sent to Headquarters might not be welcome news. I have to confess that, a decade or so ago, I wrote off such concerns about deep data mining as belonging to the conspiracy crowd; revelations over the last few years have proven me wrong.

So before Optimal+ adds this last bit, a few things need to be in place:

  • A legal framework of what is and what isn’t OK.
  • Presumably, user anonymization, although it’s never been clear to me how a company can prove it’s doing that (a single programmer or analyst could so easily violate the best of corporate intentions).
  • Assurance to users that the data gathering will be under user control.
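To make the anonymization bullet concrete, here's one common pseudonymization approach — keyed hashing — and it illustrates exactly the worry raised above: anyone who holds the key (a single programmer or analyst, say) can re-map tokens back to users. The key and function names are hypothetical.

```python
# Illustrative pseudonymization via keyed hashing (HMAC-SHA256).
import hashlib
import hmac

SECRET_KEY = b"rotate-me-and-keep-me-away-from-analysts"  # hypothetical key

def pseudonymize(user_id: str) -> str:
    """Map a user identifier to a stable, opaque token."""
    return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()[:16]

token = pseudonymize("user@example.com")
# Same user always yields the same token, so trends can still be tracked --
# which is precisely why key custody, not the math, is the hard part.
```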

There’s no question but that this additional data would add incredible richness to the kinds of analysis that Optimal+ provides, but our entire industry has some work to do before the public will be ready for that.

 

More info:

Optimal+ Global Ops for Electronics
