
Programming Below Decks

REST and the Furtherance of the Two-Class System

Do you remember that scene in Titanic where the lower-class passengers are trapped behind locked gates and left to drown? (To be fair, it’s more than one scene; it’s about half of the movie.) Their cabins are small, the viewing portholes are nonexistent (they’re probably below the waterline anyway), and there are no linen tablecloths, or polished silverware, or string quartets to be seen. It’s an us-versus-them world of transatlantic travel.

The same thing is happening with programmers. We’ve got the dinner jacket programmers who deal with abstract, high-level concepts. And then we’ve got the programmers working below decks, toiling away in the boiler rooms of the hardware companies that make the world go ‘round.

The first group keeps their hands and noses clean. The second group sweats and grunts and labors on the sharp and uncomfortable underside of our daily products. As with any society, you can tell the plebes from the patricians by their language. The white-tie crowd speaks in Java or HTML5, while the hoi polloi converse using assembly language or C. Those with aspirations of moving up in the world might try learning C++, but that’s often just pretentious affectation.

I’m not saying that there’s anything inherently wrong with this. Only that it didn’t used to be this way. Class distinctions have been a recurring part of history. But it used to be that we all graduated from a single class. We were all programmers. Now we’re [__fill in the blank__] programmers.

This division was driven home to me when m’colleague Bryon Moyer pointed out how often the term “RESTful” interface programming had started to crop up in conversation. Everyone used RESTful as a desirable adjective, sort of like “paid vacation” or “free beer.”

REST, for those who are not au courant, is an acronym for representational state transfer, and it means coding a Web interface using nothing but the standard HTTP methods, like GET and PUT. In a sense, you make your application look to all the world like it’s a browser.

The purported advantages of REST are that it’s scalable, since you’re relying only on functions that the whole world has already standardized; and that it’s easy to use. I mean, how hard can it be to memorize a four-word vocabulary: PUT, GET, POST, and DELETE? By definition, then, the Web itself can be called RESTful – a characterization I’m sure many nonprogrammers would dispute.
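Just to show how small that vocabulary really is, here’s a minimal sketch in Java (the dinner-jacket language of choice), using the java.net.http client that ships with Java 11 and later. The “widgets” endpoint is entirely hypothetical; any RESTful resource would do.

```java
// A minimal sketch of the four-verb vocabulary, using Java's built-in
// HTTP client (java.net.http, Java 11+). The "widgets" endpoint is
// hypothetical; it stands in for whatever resource you're exposing.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RestVerbs {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        URI widget = URI.create("https://example.com/widgets/42");

        // GET: read the resource's current representation
        HttpRequest get = HttpRequest.newBuilder(widget).GET().build();

        // PUT: replace the resource with the representation you supply
        HttpRequest put = HttpRequest.newBuilder(widget)
                .header("Content-Type", "application/json")
                .PUT(HttpRequest.BodyPublishers.ofString("{\"color\":\"blue\"}"))
                .build();

        // DELETE: remove the resource
        HttpRequest delete = HttpRequest.newBuilder(widget).DELETE().build();

        // POST: create a new resource under the collection
        HttpRequest post = HttpRequest.newBuilder(URI.create("https://example.com/widgets"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString("{\"color\":\"red\"}"))
                .build();

        // Fire the GET and print the result; the other verbs work the same way.
        HttpResponse<String> response =
                client.send(get, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```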

RESTful programming is also fault-tolerant, just like servers and browsers. RESTful calls are supposed to be idempotent and nullipotent – two words you don’t get to use very often – which basically means that you can accidentally repeat commands with no ill effects. (GET is nullipotent: it changes nothing at all. PUT and DELETE are idempotent: repeat them and the resource ends up in the same state. POST is neither, which is why it’s the one to watch.) That’s important when you’re working over a flaky Internet connection. For example, repeating a PUT command n times shouldn’t screw up the database you’re talking to.
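Here’s what that buys you in practice, again as a rough sketch against a made-up endpoint: because PUT is idempotent, the same request can be retried blindly when the connection hiccups. Try the same naive retry with POST and you’d mint a new resource on every successful attempt.

```java
// A minimal sketch of why idempotency matters on a flaky connection:
// the same PUT can be retried without bookkeeping, because every
// successful attempt leaves the resource in the same state.
// The endpoint and retry count are hypothetical.
import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class IdempotentPut {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest put = HttpRequest.newBuilder(URI.create("https://example.com/widgets/42"))
                .header("Content-Type", "application/json")
                .PUT(HttpRequest.BodyPublishers.ofString("{\"color\":\"blue\"}"))
                .build();

        // Retrying blindly is safe here only because PUT is idempotent;
        // a POST retried this way could create duplicate resources.
        for (int attempt = 1; attempt <= 3; attempt++) {
            try {
                HttpResponse<String> resp =
                        client.send(put, HttpResponse.BodyHandlers.ofString());
                System.out.println("Attempt " + attempt + ": HTTP " + resp.statusCode());
                break;  // success; no need to keep hammering the server
            } catch (IOException flakyConnection) {
                System.out.println("Attempt " + attempt + " failed; retrying...");
            }
        }
    }
}
```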

That’s all swell… but it’s not programming. That’s more like snapping LEGO blocks together and calling it architecture, or finishing a paint-by-numbers velvet painting and calling it art. The programmers who rely on REST are able to do so only because the real programmers built the underlying code that makes it all work. I’m all in favor of building atop the work of others. Just don’t call it programming. The suits parading around on the upper decks of Titanic aren’t the ones making it go.

REST makes a fine knothole for cloud-based interfaces, but lately it’s being touted as a panacea: an API for all seasons. “Hey, we can code our entire application using RESTful interfaces!” Yes, and you can also run it on a CPU with exactly one instruction. That doesn’t make it a good idea.

Managers sometimes see REST as a safety net. With only four HTTP verbs and a stateless client-server model, how much trouble can your programmers get into? They’ll create simpler, safer, and more “robust” programs (whatever that means). But if that’s the manager’s major concern, perhaps he should consider removing sharp objects from the programmers’ lab. Or just hire better programmers. Supplying them with plush toys and special jackets with really long arms would make them safer, too. But not necessarily more productive.

I recently spoke with a large European industrial firm that is shifting all of its code development to Java. The company felt that standardizing on Java, rather than C or C++, would make its programmers more productive. They’d spend less time writing (and debugging) low-level code and more time thinking about high-level problems to be solved. They want their programmers to be “subject-matter experts,” not code smiths.

That approach makes sense – for them. In this specific case, with their products and their market focus, Java is perfectly capable of doing what they want. It’s enough of a “real” language that developers can implement just about any function they could need, but also abstract enough that they can gloss over some of the low-level detail their managers are hoping to avoid. If it works out as they hope, their programmers will indeed become subject-matter experts, proficient in various application niches.

But somebody, somewhere, still has to write the Java virtual machine they’re all going to rely upon. In fact, they’ll need several JVMs, because the company’s hardware is varied and nonstandard. In short, somebody will still have to do the hands-on, down-to-the-metal programming that this team will build upon. It’s a team effort, even if you never meet the other members of the team.
