
Head in the Clouds

Some Concepts Never Die

Is it just me, or is “cloud computing” the dumbest idea ever?

Oh, sure, I know IT managers and journalists and armchair pundits are getting all breathless over the cloud. The cloud is going to change everything. The cloud is going to spell the demise of Microsoft and the rise of Google, or Zynga, or something else. The cloud will rescue us all from… I’m not sure what. But it’ll be great. You’ll see.

To which I reply with a hearty WTF?

Did you people not live through the 1980s? Do you not remember timesharing computers, modems, and dumb terminals? Did you think PCs and laptops were a step backwards?

I guess Moore’s Law was all just a mistake, then. And Amdahl’s Law, too. And Shannon’s Law. Faster computers, more RAM, bigger disks, better graphics—that was all just a distraction, was it? Just a 30-year detour before we could get to the good stuff?

Let me get this straight: we want to move our programs and data files off of our computers, where they’re fast, local, and private, and move them all “into the cloud,” where they’re remote, slow, and susceptible to hacking. And apparently we’re doing this because PCs have become too capable, too reliable, and too cheap. Yes, I begin to see the logic now.

Maybe all this talk of cloud computing is just a mass fit of nostalgia. Evidently we’re yearning for the bygone days of gigantic timesharing computers in some remote location that we access with a dumb terminal over the telecommunications network. Instead of a TTY Model 33 or a TeleVideo 925 terminal plugged into a Hayes modem, we now have Chromebooks and Wi-Fi. Never mind that the Samsung Chromebook costs as much as a real PC—you know, the kind with an operating system, a disk drive, and a lot of RAM, that lets you install and run programs. Heavens! A computer that can run its own applications and store gigabytes of data and access it all instantly? How quaint!

Don’t get me wrong. I’ve used cloud-based applications before, and they have their place. Google Docs is useful for sharing notes with people in real-time. It’s easier than uploading a Word document, editing it, downloading the latest revision, making sure everyone else has the same version, and so on. For quick-and-dirty document sharing, it’s fine. But for real writing? Or spreadsheets? Or presentations? No, thank you.

Web-based apps have terrible user interfaces, for starters. On a local machine you can do anything you like. Each app has control of the screen, access to the keyboard and mouse, and nearly infinite local storage. Web-based apps have none of that. They have to make do with browser-based graphics, HTML-compatible input methods, and slow two-way communications. It’s like the old teletype days. Gee, I guess modern graphics was getting too good. What problem are we fixing, exactly?

When I type on my local computer, the keystrokes travel a grand total of maybe 3 inches from the keyboard to the microprocessor. When I type on a Web-based application, every single keystroke goes out through my Wi-Fi connection, through the router, out the local cable connection to some central office, and then up and down through who-knows-how-many optical and wireless backhaul links before it gets to the Web server. Every. Single. Keystroke.

When my local apps want to display something on my screen, pixels travel maybe 5-6 inches from my graphics chip to my LCD screen. With a Web-based app, some metadata describing the screen image travels untold miles over cable, optical, and/or wireless infrastructure from some windowless server farm in East Potato, Idaho, to my screen. Gosh, that’s efficient. Not to mention a colossal waste of my graphics card. You know, the one with the multiple 256-bit 3D engines, 1GB of high-speed video RAM, and enough processing power to launch an ICBM?
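If you want to put rough numbers on that round trip, here is a minimal sketch in Python (purely illustrative; the URL is a placeholder for whatever web app you happen to use, and the actual figures will vary with your connection) that compares what a local editor pays per keystroke with what a Web-based app pays for a single trip to the server:

import time
import urllib.request

URL = "https://example.com/"  # placeholder endpoint; substitute your own web app

# Roughly what a local editor does per keystroke: append one character to a buffer.
buffer = []
start = time.perf_counter()
for _ in range(100_000):
    buffer.append("a")
local_per_keystroke = (time.perf_counter() - start) / 100_000

# Roughly what a Web-based app pays to sync state: one full HTTP round trip.
start = time.perf_counter()
urllib.request.urlopen(URL).read()
round_trip = time.perf_counter() - start

print(f"local keystroke handling: {local_per_keystroke * 1e6:.3f} microseconds")
print(f"one network round trip:   {round_trip * 1e3:.1f} milliseconds")

On a typical home connection the first number lands in the fractions of a microsecond and the second in the tens of milliseconds, which is roughly the gap this whole argument turns on.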

Of course, my PC has an Internet connection just like everyone’s. So let’s take the least reliable part of the whole computer and put it in the critical path. Cloud computers demand an Internet connection. Normal computers work anywhere. But I guess the first kind is better because reliability is overrated. With cloud computers, everything you have—your data, your applications, your office work, your personal information—is inaccessible to you unless you’ve got a permanent (likely wireless) Internet connection. Forget working on airplanes or the beach. Or anywhere, really, that has spotty coverage. Think dropped cell phone calls are annoying? Imagine that your job ends where the coverage does. Oops, sorry, boss. I had the payroll files here a minute ago…

And who controls that all-important connection? The telecommunications companies. That means your access to your data is utterly dependent on AT&T or Comcast or Deutsche Telekom or Italtel or whatever local carrier you’re using. Since when has the cable or phone company been your best friend?

Those airwaves are also under the jurisdiction of your government. In the United States that’s the FCC, but every country has its own equivalent. Different rules in different countries; what happens when you travel and don’t want to (or can’t) organize a roaming agreement? Who’s to say what rules or regulatory changes might affect the broadband spectrum or the use of those airwaves? And in a worst-case scenario, what if wireless broadband is found to be a health hazard? Or if bad guys manage to jam the airwaves? That’s a very thin thread connecting you to everything you do.

And security? I can’t recall the last time someone hacked the hard disk inside my laptop. Oh, that’s right—never. Compare that to Sony (to pick on one example), which has been hacked twenty times in the last year alone. More than 100 million Sony customers had their names, e-mail addresses, passwords, and in some cases credit-card information stolen. It’s like personal data is just raining out of the cloud.

Nevertheless, proponents say cloud computing is safer than storing your data on a local computer. Because, you know, you could have your laptop stolen and where would you be then, huh? Yes, but physically stealing a laptop is a lot harder than guessing someone’s password. And even if your PC does get swiped, it’s only one laptop. Hacking an online account is far more destructive, since most people use the same password for everything.

Then there are the forced software updates. On the cloud, everyone runs the same version because they have no choice. It’s all downloaded to your machine every time. Imagine how terrific that will be.

Think how much you enjoy the regular software updates you get now from Adobe, Apple, Oracle/Java, Microsoft, and the rest. Soon you can have that full-time! Never again be able to refuse (or even postpone) a software update. Your IT overlord will determine exactly when you update every bit of software you have.  

Forget fast boot times. You’ll spend every morning watching your applications download—and then learning about the changes in the new version.

Today, if you don’t like the new features of Office 2010 or Final Cut Pro X, you don’t have to install them. Prefer to keep your old copy of Excel? No problem. But not in the cloud-based world. Every application update is forced on you whether you want it or not. New release is buggy? Everyone gets the bug. Next release patches it? Everyone gets the fix. It’ll be like watching the Jackie Gleason Show on network television all over again: we’ll all have the same shared experience to discuss around the water cooler.

Of course, IT managers love this whole idea because it’ll make support so much easier. By golly, we’ll keep those pesky users in line. No more variations. Everything homogeneous. I’m sure most dictators love the idea of centralized control, too. It’s so much simpler than individual freedom. Strength through standardization.

I remember timesharing computers from the 1970s and ’80s. They were big and slow and expensive, but they were the only way normal people could get access to a computer. Then came the personal computer, and the world’s most dynamic, profitable, and personally enabling industry was born. But I guess all that was a bad idea. Time to go back to centralized remote timesharing.

So put on your Members Only jacket, gas up the Gremlin, and head to the disco. We’re going retro. It’s time to put our heads in the cloud.  
