
Head in the Clouds

Some Concepts Never Die

Is it just me, or is “cloud computing” the dumbest idea ever?

Oh, sure, I know IT managers and journalists and armchair pundits are getting all breathless over the cloud. The cloud is going to change everything. The cloud is going to spell the demise of Microsoft and the rise of Google, or Zynga, or something else. The cloud will rescue us all from… I’m not sure what. But it’ll be great. Just you wait and see.

To which I reply with a hearty WTF?

Did you people not live through the 1980s? Do you not remember timesharing computers, modems, and dumb terminals? Did you think PCs and laptops were a step backwards?

I guess Moore’s Law was all just a mistake, then. And Amdahl’s Law, too. And Shannon’s Law. Faster computers, more RAM, bigger disks, better graphics—that was all just a distraction, was it? Just a 30-year detour before we could get to the good stuff?

Let me get this straight: we want to move our programs and data files off of our computers, where they’re fast, local, and private, and move them all “into the cloud,” where they’re remote, slow, and susceptible to hacking. And apparently we’re doing this because PCs have become too capable, too reliable, and too cheap. Yes, I begin to see the logic now.

Maybe all this talk of cloud computing is just a mass fit of nostalgia. Evidently we’re yearning for the bygone days of gigantic timesharing computers in some remote location that we access with a dumb terminal over the telecommunications network. Instead of a TTY Model 33 or a TeleVideo 925 terminal plugged into a Hayes modem, we now have Chromebooks and Wi-Fi. Never mind that the Samsung Chromebook costs as much as a real PC—you know, the kind with an operating system, a disk drive, and a lot of RAM, the kind that lets you install and run programs. Heavens! A computer that can run its own applications and store gigabytes of data and access it all instantly? How quaint!

Don’t get me wrong. I’ve used cloud-based applications before, and they have their place. Google Docs is useful for sharing notes with people in real time. It’s easier than uploading a Word document, editing it, downloading the latest revision, making sure everyone else has the same version, and so on. For quick-and-dirty document sharing, it’s fine. But for real writing? Or spreadsheets? Or presentations? No, thank you.

Web-based apps have terrible user interfaces, for starters. On a local machine you can do anything you like. Each app has control of the screen, access to the keyboard and mouse, and nearly infinite local storage. Web-based apps have none of that. They have to make do with browser-based graphics, HTML-compatible input methods, and slow two-way communications. It’s like the old teletype days. Gee, I guess modern graphics was getting too good. What problem are we fixing, exactly?

When I type on my local computer, the keystrokes travel a grand total of maybe 3 inches from the keyboard to the microprocessor. When I type on a Web-based application, every single keystroke goes out through my Wi-Fi connection, through the router, out the local cable connection to some central office, and then up and down through who-knows-how-many optical and wireless backhaul links before it gets to the Web server. Every. Single. Keystroke.

When my local apps want to display something on my screen, pixels travel maybe 5 or 6 inches from my graphics chip to my LCD screen. With a Web-based app, some metadata describing the screen image travels untold miles over cable, optical, and/or wireless infrastructure from some windowless server farm in East Potato, Idaho, to my screen. Gosh, that’s efficient. Not to mention a colossal waste of my graphics card. You know, the one with the multiple 256-bit 3D engines, 1GB of high-speed video RAM, and enough processing power to launch an ICBM?
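
If you want to put rough numbers on that round trip, here’s a minimal back-of-envelope sketch in Python. Every latency figure in it is an assumption I’ve plugged in for illustration (the Wi-Fi hop, the last mile, the backhaul, the server), not a measurement of any particular network; the only point is the order of magnitude.

```python
# Back-of-envelope comparison: local keystroke echo versus a web app that
# round-trips every keystroke to a remote server. All figures below are
# assumptions chosen for illustration, not measurements of any real network.

LOCAL_ECHO_MS = 0.05        # assumed: keyboard to CPU to screen, essentially instant
WIFI_HOP_MS = 3.0           # assumed: laptop to home router, each way
LAST_MILE_MS = 10.0         # assumed: router to ISP central office, each way
BACKHAUL_MS = 25.0          # assumed: ISP to a distant server farm, each way
SERVER_PROCESSING_MS = 5.0  # assumed: web app handling one keystroke

def cloud_echo_ms() -> float:
    """Round-trip time for one keystroke sent to the web app and echoed back."""
    one_way = WIFI_HOP_MS + LAST_MILE_MS + BACKHAUL_MS
    return 2 * one_way + SERVER_PROCESSING_MS

if __name__ == "__main__":
    cloud = cloud_echo_ms()
    print(f"Local echo: ~{LOCAL_ECHO_MS:.2f} ms per keystroke")
    print(f"Cloud echo: ~{cloud:.1f} ms per keystroke")
    print(f"That is roughly {cloud / LOCAL_ECHO_MS:,.0f} times slower.")
```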

Of course, my PC has an Internet connection just like everyone’s. So let’s take the least reliable part of the whole computer and put it in the critical path. Cloud computers demand an Internet connection. Normal computers work anywhere. But I guess the first kind is better because reliability is overrated. With cloud computers, everything you have—your data, your applications, your office work, your personal information—is inaccessible to you unless you’ve got a permanent (likely wireless) Internet connection. Forget working on airplanes or the beach. Or anywhere, really, that has spotty coverage. Think dropped cell phone calls are annoying? Imagine that your job ends where the coverage does. Oops, sorry, boss. I had the payroll files here a minute ago…

And who controls that all-important connection? The telecommunications companies. That means your access to your data is utterly dependent on AT&T or Comcast or Deutsche Telekom or Italtel or whatever local carrier you’re using. Since when has the cable or phone company been your best friend?

Those airwaves are also under the jurisdiction of your government. In the United States that’s the FCC, but every country has its own equivalent. Different rules in different countries; what happens when you travel and don’t want to (or can’t) organize a roaming agreement? Who’s to say what rules or regulatory changes might affect the broadband spectrum or the use of those airwaves? And in a worst-case scenario, what if wireless broadband is found to be a health hazard? Or if bad guys manage to jam the airwaves? That’s a very thin thread connecting you to everything you do.

And security? I can’t recall the last time someone hacked the hard disk inside my laptop. Oh, that’s right—never. Compare that to Sony (to pick on one example), which has been hacked twenty times in the last year alone. More than 100 million Sony customers had their names, e-mail addresses, passwords, and in some cases credit-card information stolen. It’s like personal data is just raining out of the cloud.

Nevertheless, proponents say cloud computing is safer than storing your data on a local computer. Because, you know, you could have your laptop stolen and where would you be then, huh? Yes, but physically stealing a laptop is a lot harder than guessing someone’s password. And even if your PC does get swiped, it’s only one laptop. Hacking an online account is far more destructive, since most people use the same password for everything.

Then there are the forced software updates. On the cloud, everyone runs the same version because they have no choice. It’s all downloaded to your machine every time. Imagine how terrific that will be.

Think how much you enjoy the regular software updates you get now from Adobe, Apple, Oracle/Java, Microsoft, and the rest. Soon you can have that full-time! Never again will you be able to refuse (or even postpone) a software update. Your IT overlord will determine exactly when you update every bit of software you have.

Forget fast boot times. You’ll spend every morning watching your applications download—and then learning about the changes in the new version.

Today, if you don’t like the new features of Office 2010 or Final Cut Pro X, you don’t have to install them. Prefer to keep your old copy of Excel? No problem. But not in the cloud-based world. Every application update is forced on you whether you want it or not. New release is buggy? Everyone gets the bug. Next release patches it? Everyone gets the fix. It’ll be like watching the Jackie Gleason Show on network television all over again: we’ll all have the same shared experience to discuss around the water cooler.

Of course, IT managers love this whole idea because it’ll make support so much easier. By golly, we’ll keep those pesky users in line. No more variations. Everything homogeneous. I’m sure most dictators love the idea of centralized control, too. It’s so much simpler than individual freedom. Strength through standardization.

I remember timesharing computers from the 1970s and ’80s. They were big and slow and expensive, but they were the only way normal people could get access to a computer. Then came the personal computer, and the world’s most dynamic, profitable, and personally enabling industry was born. But I guess all that was a bad idea. Time to go back to centralized remote timesharing.

So put on your Members Only jacket, gas up the Gremlin, and head to the disco. We’re going retro. It’s time to put our heads in the cloud.  
