
Will Intel Make It In Mobile?

Big Name’s Big Chips Haven’t Made It Big in Little Devices

If you ask the average person to name the world’s biggest computer chip company, they’ll say Intel. In reality, Intel makes only about 2% of the world’s microprocessors and microcontrollers. The rest of the business is scattered among dozens of other CPU and MCU makers. ARM-based chips, for example, are five times more popular than Intel’s x86-based chips, yet few normal people know anything about ARM.

Intel wants to change that. Not the ARM thing—the 2% thing. Intel wants a bigger slice of the embedded-processor pie, and it’s been eyeing the mobile market hungrily. ARM ate Intel’s lunch, so to speak, totally devouring cell phones and tablets, two areas where Intel has just about zero appeal. But if cell phones and tablets are going to replace PCs as our computers of choice, Intel needs to get moving. And get mobile.

To that end, Intel has been beavering away on a whole slew of x86 processors intended for mobile devices. There’s Medfield and Clover Trail, Saltwell and Silvermont, and many more codenames ahead on the roadmap. In short, Intel is deadly freakin’ serious about pushing x86 processors into low-power, portable, and handheld devices.

Can the company succeed? Or do embedded developers hate Intel too much?

The usual complaint leveled against Intel’s processors (or any x86 chip, for that matter) is that they’re too power-hungry. While ARM and MIPS and nearly everyone else spent the last 20 years designing and building smaller, lighter, and more-efficient RISC processors, Intel fired up the forge and built bigger, badder, and faster chips. That was great for PCs but exactly the wrong strategy for mobile devices. Now Intel is playing catch-up, but it’s saddled with all the extraneous circuitry it added over the past four decades. You can’t simply pare that away to make a streamlined x86-lite. Lose just one instruction or opcode and you sacrifice binary compatibility, and then what would be the point? A few x86 clone makers tried that before, and failed miserably (and expensively). Intel and the x86 are hoist by their own petard.
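The binary-compatibility argument can be made concrete with a toy example. The sketch below is not real x86 encoding; it’s a hypothetical four-opcode stack machine, but it shows why a shipped binary, which is just a string of opcode numbers, stops running the moment a slimmed-down CPU drops even one instruction.

```python
# Toy illustration (not real x86): a minimal stack "VM" shows why dropping
# even one opcode breaks binary compatibility. Shipped programs encode
# opcodes as raw numbers; a slimmed-down CPU that removed opcode 3 can no
# longer run them, even though the rest of the ISA is unchanged.

FULL_ISA = {
    0: lambda st: st.append(1),                     # PUSH1: push constant 1
    1: lambda st: st.append(st.pop() + st.pop()),   # ADD
    2: lambda st: st.append(st.pop() * st.pop()),   # MUL
    3: lambda st: st.append(-st.pop()),             # NEG: the opcode we "pare away"
}

def run(program, isa):
    stack = []
    for opcode in program:
        if opcode not in isa:
            raise RuntimeError(f"illegal instruction: opcode {opcode}")
        isa[opcode](stack)
    return stack[-1]

# An already-shipped "binary": computes -(1 + 1).
legacy_binary = [0, 0, 1, 3]

print(run(legacy_binary, FULL_ISA))  # -2 on the full ISA

# A hypothetical "x86-lite" that dropped NEG:
LITE_ISA = {op: fn for op, fn in FULL_ISA.items() if op != 3}
try:
    run(legacy_binary, LITE_ISA)
except RuntimeError as e:
    print(e)  # the old binary no longer runs at all
```

The same logic applies at full x86 scale: every Windows executable ever compiled bakes its instruction choices into the binary, so a leaner chip that omits anything can’t claim compatibility.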

But Intel has physics on its side. The company runs some of the best silicon factories on the planet, and good manufacturing can make up for a lot of architectural cruft. Its 22nm “Tri-Gate” process, in particular, will make even a Pentium seem like a Prius. That’s something the ARM and MIPS licensees can’t duplicate. They’re perennially one or two generations behind in silicon technology, so Intel is pushing its advantage to the fullest.

Another thing about software compatibility: Is that really a good thing in the mobile market? Intel fanatically preserves binary compatibility across all its x86 chips, but how does that help mobile designers? There’s precious little mobile middleware available for x86; ARM code, by contrast, is practically falling out of the trees. To make a new tablet based on x86 would mean writing or porting almost all of the code yourself. Make an ARM-based tablet, on the other hand, and you can pick up most of the code for free. Hiring programmers would be easier, too.

To allay developers’ fears about code availability, Intel recently shook hands very publicly with Google, and the two companies showed off Android running on Intel’s Medfield processor (an upcoming Atom chip for cell phones), while the chip company promised there’d be at least one Medfield/Android phone in 2012. “See?” Intel appeared to be saying, “we can run Android just like those other guys!”

Technology aside, Intel has a completely different business model from either ARM or MIPS. If you’re a developer and you want to use an Intel chip, you have to get it from… Intel. Or maybe AMD, but the two companies’ chips are hardly interchangeable. On the other hand, if you want an ARM-based processor, you’ve got dozens of choices. Granted, all those chips are different in some way, too, but you’ve got a heck of a lot more options. Intel has no interest in licensing its x86 architecture; the company wants to be a chip maker, not just a chip designer. That sole-source philosophy turns off a lot of embedded designers who prefer the competitive licensed marketplace and would rather not be utterly dependent on one (very large) chip company. It’s a bit of the open-source software philosophy applied to microprocessors.

What about performance? Intel is known for its fire-breathing processors, but does that translate into the mobile realm? For the most part, it does. Early Atom chips weren’t all that fast, nor were they all that appropriate for smaller handheld devices. But more-recent generations of the Atom family tree have been much better, and the upcoming Medfield and Clover Trail processors look very promising, indeed. Intel has clearly learned its lessons and is dialing in more performance at the same time that it dials back the power requirements. Roughly speaking, upcoming Atom chips will be on a performance par with ARM’s Cortex-A9 and perhaps even Cortex-A15. And multicore? Yeah, both sides have got that.

Momentum is one thing that works against Intel. Ironically, the world’s best-known chip company is a virtual outsider when it comes to mobile processors. Engineers tend to use whatever processor they used last time. Programmers tend to prefer familiar chip families, too. That means tablet designers and cell phone makers are 99% certain to use ARM-based chips again, because they used ARM-based chips last time. Why would they switch without a compelling reason?

Intel hopes to provide that compelling reason, but I’m not convinced. Even if Intel’s upcoming mobile processors are just as fast, just as power-efficient, and just as inexpensive as competing ARM-based processors—and those are all substantial assumptions—Intel would only be level-pegging. That’s not enough to get a hardware maker to switch. That’s only enough to ante into the game. And even if the hardware were competitive, that doesn’t help the software situation at all. Nearly every portable wireless device in the world uses an ARM-based processor, so there’s a huge infrastructure of ARM code out there: code that’s already been tested, government-approved, and field-proven. The same goes for operating systems and, to some extent, even applications. Intel’s chips have very little of that. Yes, you can get Android and a handful of other operating systems. But middleware? Wireless certification? Protocol stacks? Better start coding.

And in an Et tu, Brute? moment, Microsoft has loudly proclaimed that Windows 8 will run on ARM processors as well as Intel chips. This isn’t the first time Microsoft has strayed from its sometime twin (remember Windows NT on Alpha and MIPS?), but this time the knife stroke comes at a particularly sensitive time. Windows, at least, was an x86-only option; if you wanted Windows you had to talk to Intel (and/or AMD). Now even that thread of hope is gone. Windows is no longer an assurance of “Intel Inside” or vice versa. Embedded designers can use Windows 8 on their mobile devices while still choosing an ARM-based chip to run it on. Whether or not Windows 8 ever becomes popular for embedded and mobile devices is an entirely different discussion. For another day. 

One thought on “Will Intel Make It In Mobile?”

  1. You know… I really hope that, if cellphones and tablets replace real computers, serious docking stations become available. I only use the computing on my cellphone when I absolutely have to while away from my computer. I hate trying to read on the minuscule screen, I hate thumbing, and don’t even get me started on touchscreen typing (cell or tablet).

    I want a big screen and a real keyboard (oooo, can someone make one with IBM Selectric haptic feedback??!!!) for real old-fashioned high-speed touch typing where I don’t have to worry that every other keystroke is wrong (and absolutely without autoco-wrecks). If those happen to plug into some tiny boxlet instead of a laptop (or desktop), I’m cool with that…

    Jeez I feel like a grumpy old man right now…

