
Will Intel Make It In Mobile?

Big Name’s Big Chips Haven’t Made It Big in Little Devices

If you ask the average person to name the world’s biggest computer chip company, they’ll say Intel. In reality, Intel makes only about 2% of the world’s microprocessors and microcontrollers. The rest of the business is scattered among dozens of other CPU and MCU makers. ARM-based chips, for example, are five times more popular than Intel’s x86-based chips, yet few normal people know anything about ARM.

Intel wants to change that. Not the ARM thing—the 2% thing. Intel wants a bigger slice of the embedded-processor pie, and it’s been eyeing the mobile market hungrily. ARM ate Intel’s lunch, so to speak, totally devouring cell phones and tablets, two areas where Intel has just about zero appeal. But if cell phones and tablets are going to replace PCs as our computers of choice, Intel needs to get moving. And get mobile.

To that end, Intel has been beavering away on a whole slew of x86 processors intended for mobile devices. There’s Medfield and Clover Trail, Saltwell and Silvermont, and many more codenames ahead on the roadmap. In short, Intel is deadly freakin’ serious about pushing x86 processors into low-power, portable, and handheld devices.

Can the company succeed? Or do embedded developers hate Intel too much?

The usual complaint leveled against Intel’s processors (or any x86 chip, for that matter) is that they’re too power-hungry. While ARM and MIPS and nearly everyone else spent the last 20 years designing and building smaller, lighter, and more-efficient RISC processors, Intel fired up the forge and built bigger, badder, and faster chips. That was great for PCs but exactly the wrong strategy for mobile devices. Now Intel is playing catch-up, but it’s saddled with all the extraneous circuitry it has added over the past four decades. You can’t simply pare that away to make a streamlined x86-lite. Lose just one instruction or opcode and you sacrifice binary compatibility, and then what would be the point? A few x86 clone makers tried that before, and failed miserably (and expensively). Intel and the x86 are hoist by their own petard.

But Intel has physics on its side. The company runs some of the best silicon factories on the planet, and good manufacturing can make up for a lot of architectural cruft. Its 22nm “Tri-Gate” process, in particular, will make even a Pentium seem like a Prius. That’s something the ARM and MIPS licensees can’t duplicate. They’re perennially one or two generations behind in silicon technology, so Intel is pushing its advantage to the fullest.

Another thing about software compatibility: Is that really a good thing in the mobile market? Intel fanatically preserves binary compatibility across all its x86 chips, but how does that help mobile designers? There’s precious little mobile middleware available for x86, whereas ARM code is practically falling out of the trees. Making a new tablet based on x86 would mean writing or porting almost all of the code yourself. Make an ARM-based tablet, on the other hand, and you can pick up most of the code for free. Hiring programmers would be easier, too.
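
To make that point concrete, here’s a minimal, purely illustrative sketch (not from the article) of the sort of hand-tuned inner loop that accumulates in mobile middleware. The mix_samples routine and its NEON/SSE2 fast paths are hypothetical, but they suggest why ARM-centric code doesn’t simply recompile for x86: every architecture-specific hot spot has to be rewritten and revalidated.

    /*
     * Illustrative sketch only: a saturating 8-bit sample mix, the kind of
     * codec inner loop that gets hand-tuned per architecture. The portable C
     * fallback runs anywhere; the fast paths are guarded by compiler-defined
     * architecture macros, so an x86 port means redoing each one.
     */
    #include <stddef.h>
    #include <stdint.h>

    #if defined(__ARM_NEON) || defined(__ARM_NEON__)
    #include <arm_neon.h>
    #elif defined(__SSE2__)
    #include <emmintrin.h>
    #endif

    void mix_samples(uint8_t *dst, const uint8_t *a, const uint8_t *b, size_t n)
    {
        size_t i = 0;
    #if defined(__ARM_NEON) || defined(__ARM_NEON__)
        for (; i + 16 <= n; i += 16) {              /* ARM NEON fast path */
            uint8x16_t va = vld1q_u8(a + i);
            uint8x16_t vb = vld1q_u8(b + i);
            vst1q_u8(dst + i, vqaddq_u8(va, vb));   /* saturating add */
        }
    #elif defined(__SSE2__)
        for (; i + 16 <= n; i += 16) {              /* x86 SSE2 fast path */
            __m128i va = _mm_loadu_si128((const __m128i *)(a + i));
            __m128i vb = _mm_loadu_si128((const __m128i *)(b + i));
            _mm_storeu_si128((__m128i *)(dst + i), _mm_adds_epu8(va, vb));
        }
    #endif
        for (; i < n; i++) {                        /* portable fallback */
            unsigned sum = (unsigned)a[i] + b[i];
            dst[i] = (uint8_t)(sum > 255 ? 255 : sum);
        }
    }

Multiply that by every codec, protocol stack, and graphics routine in a phone’s software load and the size of the porting job becomes clear.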

To allay developers’ fears about code availability, Intel recently shook hands very publicly with Google, and the two companies showed off Android running on Intel’s Medfield processor (an upcoming Atom chip for cell phones), while the chip company promised there’d be at least one Medfield/Android phone in 2012. “See?” Intel appeared to be saying, “we can run Android just like those other guys!”

Technology aside, Intel has a completely different business model from either ARM or MIPS. If you’re a developer and you want to use an Intel chip, you have to get it from… Intel. Or maybe AMD, but the two companies’ chips are hardly interchangeable. On the other hand, if you want an ARM-based processor, you’ve got dozens of choices. Granted, all those chips are different in some way, too, but you’ve got a heck of a lot more options. Intel has no interest in licensing its x86 architecture; the company wants to be a chip maker, not just a chip designer. That sole-source philosophy turns off a lot of embedded designers who prefer the competitive licensed marketplace and would rather not be utterly dependent on one (very large) chip company. It’s a bit of the open-source software philosophy applied to microprocessors.

What about performance? Intel is known for its fire-breathing processors, but does that translate into the mobile realm? For the most part, it does. Early Atom chips weren’t all that fast, nor were they all that appropriate for smaller handheld devices. But more-recent generations of the Atom family tree have been much better, and the upcoming Medfield and Clover Trail processors look very promising, indeed. Intel has clearly learned its lessons and is dialing in more performance at the same time that it dials back the power requirements. Roughly speaking, upcoming Atom chips will be on a performance par with ARM’s Cortex-A9 and perhaps even Cortex-A15. And multicore? Yeah, both sides have got that.

Momentum is one thing that works against Intel. Ironically, the world’s best-known chip company is a virtual outsider when it comes to mobile processors. Engineers tend to use whatever processor they used last time. Programmers tend to prefer familiar chip families, too. That means tablet designers and cell phone makers are 99% certain to use ARM-based chips again, because they used ARM-based chips last time. Why would they switch without a compelling reason?

Intel hopes to provide that compelling reason, but I’m not convinced. Even if Intel’s upcoming mobile processors are just as fast, just as power-efficient, and just as inexpensive as competing ARM-based processors (and those are all substantial assumptions), Intel would only be level-pegging. That’s not enough to get a hardware maker to switch. That’s only enough to ante into the game. And even if the hardware were competitive, that doesn’t help the software situation at all. Nearly every portable wireless device in the world uses an ARM-based processor, so there’s a huge body of ARM code supporting that infrastructure: code that’s already been tested, government-approved, and field-proven. The same goes for operating systems and, to some extent, even applications. Intel’s chips have very little of that. Yes, you can get Android and a handful of other operating systems. But middleware? Wireless certification? Protocol stacks? Better start coding.

And in an Et tu, Brute? moment, Microsoft has loudly proclaimed that Windows 8 will run on ARM processors as well as Intel chips. This isn’t the first time Microsoft has strayed from its sometime twin (remember Windows NT on Alpha and MIPS?), but this time the knife stroke comes at a particularly sensitive time. Windows, at least, was an x86-only option; if you wanted Windows you had to talk to Intel (and/or AMD). Now even that thread of hope is gone. Windows is no longer an assurance of “Intel Inside” or vice versa. Embedded designers can use Windows 8 on their mobile devices while still choosing an ARM-based chip to run it on. Whether or not Windows 8 ever becomes popular for embedded and mobile devices is an entirely different discussion. For another day. 

