
Head in the Clouds

Some Concepts Never Die

Is it just me, or is “cloud computing” the dumbest idea ever?

Oh, sure, I know IT managers and journalists and armchair pundits are getting all breathless over the cloud. The cloud is going to change everything. The cloud is going to spell the demise of Microsoft and the rise of Google, or Zynga, or something else. The cloud will rescue us all from… I’m not sure what. But it’ll be great. You just see.

To which I reply with a hearty WTF?

Did you people not live through the 1980s? Do you not remember timesharing computers, modems, and dumb terminals? Did you think PCs and laptops were a step backwards?

I guess Moore’s Law was all just a mistake, then. And Amdahl’s Law, too. And Shannon’s Law. Faster computers, more RAM, bigger disks, better graphics—that was all just a distraction, was it? Just a 30-year detour before we could get to the good stuff?

Let me get this straight: we want to move our programs and data files off of our computers, where they’re fast, local, and private, and move them all “into the cloud,” where they’re remote, slow, and susceptible to hacking. And apparently we’re doing this because PCs have become too capable, too reliable, and too cheap. Yes, I begin to see the logic now.

Maybe all this talk of cloud computing is just a mass fit of nostalgia. Evidently we’re yearning for the bygone days of gigantic timesharing computers in some remote location that we access with a dumb terminal over the telecommunications network. Instead of a TTY Model 33 or a TeleVideo 925 terminal plugged into a Hayes modem, we now have Chromebooks and Wi-Fi. Never mind that the Samsung Chromebook costs as much as a real PC—you know, the kind with an operating system, a disk drive, and a lot of RAM, one that lets you install and run programs. Heavens! A computer that can run its own applications and store gigabytes of data and access it all instantly? How quaint!

Don’t get me wrong. I’ve used cloud-based applications before, and they have their place. Google Docs is useful for sharing notes with people in real time. It’s easier than uploading a Word document, editing it, downloading the latest revision, making sure everyone else has the same version, and so on. For quick-and-dirty document sharing, it’s fine. But for real writing? Or spreadsheets? Or presentations? No, thank you.

Web-based apps have terrible user interfaces, for starters. On a local machine you can do anything you like. Each app has control of the screen, access to the keyboard and mouse, and nearly infinite local storage. Web-based apps have none of that. They have to make do with browser-based graphics, HTML-compatible input methods, and slow two-way communications. It’s like the old teletype days. Gee, I guess modern graphics was getting too good. What problem are we fixing, exactly?

When I type on my local computer, the keystrokes travel a grand total of maybe 3 inches from the keyboard to the microprocessor. When I type on a Web-based application, every single keystroke goes out through my Wi-Fi connection, through the router, out the local cable connection to some central office, and then up and down through who-knows-how-many optical and wireless backhaul links before it gets to the Web server. Every. Single. Keystroke.

When my local apps want to display something on my screen, pixels travel maybe 5-6 inches from my graphics chip to my LCD screen. With a Web-based app, some metadata describing the screen image travels untold miles over cable, optical, and/or wireless infrastructure from some windowless server farm in East Potato, Idaho, to my screen. Gosh, that’s efficient. Not to mention a colossal waste of my graphics card. You know, the one with the multiple 256-bit 3D engines, 1GB of high-speed video RAM, and enough processing power to launch an ICBM?
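To make that round trip concrete, here is a minimal, hypothetical sketch (in TypeScript, for a browser) of what a Web-based editor does under the hood. The editor element, the /sync endpoint, and the response format are all invented for illustration; real cloud apps batch and compress their traffic, but every edit still crosses the network in both directions.

// Hypothetical sketch: each keystroke goes out to a remote server, and a
// description of the updated screen comes back. The endpoint and payload
// shape are invented for illustration, not taken from any real product.
const editor = document.getElementById("editor") as HTMLElement;

editor.addEventListener("keydown", async (event: KeyboardEvent) => {
  const started = performance.now();

  // Input path: one HTTP round trip per keystroke, out through Wi-Fi,
  // the router, the ISP, and the backhaul to the server farm.
  const response = await fetch("https://example.com/sync", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ key: event.key, at: Date.now() }),
  });

  // Output path: the server replies with markup describing the new screen
  // contents, which the browser re-renders, instead of a local app simply
  // drawing pixels a few inches away.
  const { html } = (await response.json()) as { html: string };
  editor.innerHTML = html;

  // Tens of milliseconds over the network versus microseconds for a local
  // keystroke and repaint.
  console.log(`round trip: ${(performance.now() - started).toFixed(1)} ms`);
});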

Of course, my PC has an Internet connection just like everyone’s. So let’s take the least reliable part of the whole computer and put it in the critical path. Cloud computers demand an Internet connection. Normal computers work anywhere. But I guess the first kind is better because reliability is overrated. With cloud computers, everything you have—your data, your applications, your office work, your personal information—is inaccessible to you unless you’ve got a permanent (likely wireless) Internet connection. Forget working on airplanes or at the beach. Or anywhere, really, that has spotty coverage. Think dropped cell phone calls are annoying? Imagine that your job ends where the coverage does. Oops, sorry, boss. I had the payroll files here a minute ago…

And who controls that all-important connection? The telecommunications companies. That means your access to your data is utterly dependent on AT&T or Comcast or Deutsche Telekom or Italtel or whatever local carrier you’re using. Since when has the cable or phone company been your best friend?

Those airwaves are also under the jurisdiction of your government. In the United States that’s the FCC, but every country has its own equivalent. Different countries have different rules; what happens when you travel and don’t want to (or can’t) arrange roaming coverage? Who’s to say what rules or regulatory changes might affect the broadband spectrum or the use of those airwaves? And in a worst-case scenario, what if wireless broadband is found to be a health hazard? Or if bad guys manage to jam the airwaves? That’s a very thin thread connecting you to everything you do.

And security? I can’t recall the last time someone hacked the hard disk inside my laptop. Oh, that’s right—never. Compare that to Sony (to pick on one example), which has been hacked twenty times in the last year alone. More than 100 million Sony customers had their names, e-mail addresses, passwords, and in some cases credit-card information stolen. It’s like personal data is just raining out of the cloud.

Nevertheless, proponents say cloud computing is safer than storing your data on a local computer. Because, you know, you could have your laptop stolen and where would you be then, huh? Yes, but physically stealing a laptop is a lot harder than guessing someone’s password. And even if your PC does get swiped, it’s only one laptop. Hacking an online account is far more destructive, since most people use the same password for everything.

Then there are the forced software updates. On the cloud, everyone runs the same version because they have no choice. It’s all downloaded to your machine every time. Imagine how terrific that will be.

Think how much you enjoy the regular software updates you get now from Adobe, Apple, Oracle/Java, Microsoft, and the rest. Soon you can have that full-time! Never again will you be able to refuse (or even postpone) a software update. Your IT overlord will determine exactly when you update every bit of software you have.

Forget fast boot times. You’ll spend every morning watching your applications download—and then learning about the changes in the new version.

Today, if you don’t like the new features of Office 2010 or Final Cut Pro X, you don’t have to install them. Prefer to keep your old copy of Excel? No problem. But not in the cloud-based world. Every application update is forced on you whether you want it or not. New release is buggy? Everyone gets the bug. Next release patches it? Everyone gets the fix. It’ll be like watching the Jackie Gleason Show on network television all over again: we’ll all have the same shared experience to discuss around the water cooler.

Of course, IT managers love this whole idea because it’ll make support so much easier. By golly, we’ll keep those pesky users in line. No more variations. Everything homogeneous. I’m sure most dictators love the idea of centralized control, too. It’s so much simpler than individual freedom. Strength through standardization.

I remember timesharing computers from the 1970s and ’80s. They were big and slow and expensive, but they were the only way normal people could get access to a computer. Then came the personal computer, and the world’s most dynamic, profitable, and personally enabling industry was born. But I guess all that was a bad idea. Time to go back to centralized remote timesharing.

So put on your Members Only jacket, gas up the Gremlin, and head to the disco. We’re going retro. It’s time to put our heads in the cloud.  
