
Amazon’s Head in the Cloud

Remarkable Depth and Breadth in Cloud Computing, and an Intriguing New Service

At face value, it is a bit of a brain twister: Amazon’s goal of being the “everything store” on the one hand, and its massive cloud services business on the other. At first glance, not exactly peanut butter and chocolate. Walmart and Costco are not actively hawking their data processing capabilities—which one imagines as quite formidable—on the open market.

Turn the clock back a few years and it makes sense. Amazon developed its massive datacenters in-house because its requirements could not be readily met with existing solutions. As time passed, the company built more and more value-added differentiation. And, at some point, someone thinking well outside the box suggested, “Let’s monetize our unique datacenter capabilities by selling them in the emerging cloud computing market.” At least that is how I envision it going down; I am sure reality was more nuanced and more interesting.

Amazon Web Services (AWS) is massive in every regard. Jeff Bezos isn’t talking numbers, but the general consensus is that AWS consists of 1 to 2 million physical servers; sorry for the broad range there—again, nobody’s talking specifics.

Every bit as impressive as the sheer scale of the datacenter is the scale of the various services offered; AWS provides FAR more than raw CPU horsepower. Just a sampling:

  • Storage in many different cost-performance configurations
  • Databases from pedestrian SQL to powerful column-oriented tools
  • Complete virtual desktop environments
  • Services as specific as media transcoding

All of the above and much more are available on-demand, in limitless quantity and at remarkably low pay-as-you-go prices. And at the risk of stating the obvious, many name-brand B2C and B2B cloud services are built and run entirely atop AWS.

While AWS represents just 2% of Amazon’s revenue, it is growing at roughly 40% YoY versus roughly 25% for merchandise. And it is not hard to imagine that AWS is more profitable than Amazon’s overall business, because, well, the company’s overall business has almost never been profitable.

Integrate all of the above factoids and one concludes that AWS is very important to Amazon and continues to garner plenty of attention from the company.

Connecting a few dots, AWS recently announced a new ‘C4’ instance using Intel’s shiny new octadeca-core Xeon processor. That didn’t take very long. AWS will offer the C4 instance in multiple configurations from 2 to 36 virtual (hyperthreaded) cores and 4GB to 60GB of RAM. Almost exactly a year ago I mused:

Ideally, we want the ability to combine CPUs on multiple nodes into a super-node as the application workload demands.

While not the vector I was suggesting at the time, this new C4 instance is a solid (though less imaginative) solution to providing wide dynamic range scalability.

Speaking of imaginative, last month AWS announced a new service called Lambda. The funky name requires an explanation:

AWS Lambda is a compute service that runs your code in response to events and automatically manages the compute resources for you, making it easy to build applications that respond quickly to new information.

“Quickly” is loosely defined as milliseconds; I say “loosely” because at one point 100 milliseconds is cited. Lambda provides a remarkable level of abstraction: write your function in JavaScript (the only option for now) and Lambda handles ALL of the provisioning. No bothering with infrastructure, instances … nada. Triggering events come in all shapes and sizes: activity from other AWS services, data held in AWS … or external devices via Amazon Kinesis (“a fully managed service for real-time processing of streaming data”).
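What does that look like in practice? Here is a minimal handler sketch, written in TypeScript for the sake of type annotations even though the code deployed to Lambda today would be plain JavaScript (Node.js); the event fields are my own invention, since the real shape depends on whichever service fired the trigger:

```typescript
// A minimal Lambda handler sketch, assuming the launch-era Node.js runtime.
// Lambda invokes the exported "handler" with the triggering event and a
// context object; nothing is provisioned on our side.

interface SensorEvent {
  deviceId: string;    // hypothetical IoT device identifier
  reading: number;     // hypothetical sensor value
}

export const handler = (event: SensorEvent, context: any): void => {
  console.log(`Event from ${event.deviceId}: reading=${event.reading}`);

  // Real work goes here: call other AWS services, update a database,
  // fan out notifications, and so on.

  // Launch-era handlers signal completion through the context object.
  context.succeed(`processed reading from ${event.deviceId}`);
};
```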

Now THIS is interesting, notably for IoT. I’ve discussed hybrid mobile-cloud and hybrid IoT-cloud computing over the past year. AWS Lambda enables all sorts of creative hybrid compute applications AND hides the vast majority of the cloud complexity to boot. An IoT device can trigger an event that executes complex analytics well beyond what is possible with the local microcontroller horsepower. IoT streams can interact with massive databases (my augmented reality application, as discussed in the first link in this paragraph). The aforementioned cornucopia of AWS services can be brought to bear.
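To make the IoT-to-cloud plumbing concrete, here is a rough sketch of the device (or gateway) side: pushing a reading into a Kinesis stream, from which Lambda can then be invoked. The stream name, region, and payload fields are mine; the putRecord call is the standard Kinesis ingestion API in the AWS SDK.

```typescript
import * as AWS from "aws-sdk";

// Hypothetical device/gateway code: push a sensor reading into a Kinesis
// stream; Lambda functions subscribed to that stream do the heavy lifting.
const kinesis = new AWS.Kinesis({ region: "us-east-1" });

function publishReading(deviceId: string, reading: number): void {
  const params = {
    StreamName: "sensor-readings",   // hypothetical stream name
    PartitionKey: deviceId,          // keeps one device's records ordered
    Data: JSON.stringify({
      deviceId,
      reading,
      timestamp: new Date().toISOString(),
    }),
  };

  kinesis.putRecord(params, (err, data) => {
    if (err) {
      console.error("putRecord failed:", err);
    } else {
      console.log("record accepted, sequence number:", data.SequenceNumber);
    }
  });
}

publishReading("front-door-camera-01", 42);
```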

And AWS Lambda appears to be extraordinarily cost-effective:

  • $0.20 per million event triggers
  • $0.0000021 per second of execution time with 128MB of RAM

I say “appears” because the clever marketing person who developed the pricing made certain that all of the numbers are VERY small (jillioniths of a penny!) while not exactly providing all of the detail one needs (execution time on what flavor of instance?). Let’s hope that the result is NOT AT ALL like the passenger who recently racked up a $1,200 bill for transpacific Wi-Fi.
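For what it’s worth, the back-of-the-envelope math is easy enough using only the rates quoted above; the workload volumes below are entirely made up:

```typescript
// Back-of-the-envelope Lambda cost estimate using the quoted rates.
// The workload numbers are hypothetical.
const PRICE_PER_MILLION_TRIGGERS = 0.20;     // dollars
const PRICE_PER_SECOND_AT_128MB = 0.0000021; // dollars per second, 128MB of RAM

const triggersPerMonth = 1_000_000;   // hypothetical: roughly 23 events per minute
const avgExecutionSeconds = 0.2;      // hypothetical: 200ms per invocation

const requestCost = (triggersPerMonth / 1_000_000) * PRICE_PER_MILLION_TRIGGERS;
const computeCost = triggersPerMonth * avgExecutionSeconds * PRICE_PER_SECOND_AT_128MB;

console.log(`request charges: $${requestCost.toFixed(2)}`);                 // $0.20
console.log(`compute charges: $${computeCost.toFixed(2)}`);                 // $0.42
console.log(`monthly total:   $${(requestCost + computeCost).toFixed(2)}`); // $0.62
```

A million triggers for well under a dollar a month, at least on paper; the open question is what happens when the function runs longer, needs more memory, or fires far more often than planned.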

How might we employ AWS Lambda in an IoT application? Picking one crazy idea at random here: imagine your teenager comes home late, triggers the security system AND forgets to disarm it. This creates a trigger to AWS Lambda:

  • Cameras at the front door and foyer snap photos in response and upload the images
  • AWS performs facial recognition and matches against a “white list” of known not-intruders
  • Assuming we get a STRONG match against your teenager, a full alarm is averted and replaced by a stern verbal warning. (Alternatively, the IoT deadbolt could simply lock him or her in the house.)
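For the curious, the glue code might look something like the sketch below. The event shape, the confidence threshold, and especially matchAgainstWhitelist() are all hypothetical stand-ins for whatever facial-recognition service gets bolted on; only the handler structure is standard Lambda.

```typescript
// Sketch of the "teenager vs. intruder" handler. Everything here is
// hypothetical glue; matchAgainstWhitelist() stands in for whatever
// facial-recognition service or library does the real work.

interface AlarmEvent {
  houseId: string;
  imageUrls: string[];   // photos uploaded by the door and foyer cameras
}

// Returns a 0..1 confidence score for the best match against the
// "white list" of known not-intruders.
function matchAgainstWhitelist(
  imageUrls: string[],
  done: (confidence: number) => void
): void {
  // ... hand the images to the recognition service ...
  done(0.97); // pretend the teenager was recognized
}

export const handler = (event: AlarmEvent, context: any): void => {
  matchAgainstWhitelist(event.imageUrls, (confidence) => {
    if (confidence > 0.9) {
      // Strong match: avert the siren, deliver the stern verbal warning.
      context.succeed("known resident -- downgrade to verbal warning");
    } else {
      // No match: let the full alarm proceed.
      context.succeed("unknown person -- full alarm");
    }
  });
};
```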

This local/cloud workload partition plays to each side’s strengths: the facial recognition would demand a fair amount of compute horsepower if run locally, and given how (one should hope) infrequently the algorithm runs, it is far more efficient to pay as you go in the cloud. This is a CRAZY idea, as noted above, especially given my near-paranoia on all things security-related. Yet, if we use elliptic-curve cryptography to secure … topic for another day. In any case, I am VERY keen to see how AWS Lambda is used in real-world IoT applications, especially after a bit of experimentation and Darwinian selection.
