Amazon Launches High Performance Cloud – Hackers in Love

New service enables ultrafast number crunching (and password cracking)

Calling it a "nuclear-powered bulldozer," Amazon yesterday announced and blogged about its newest cloud infrastructure service, the "Cluster GPU Instance," which delivers supercomputer-class calculation power for as little as $2.10 per hour.  The new instance type employs the same NVIDIA Tesla processor used in three of the five fastest supercomputers.  Each Tesla is rated at 515 gigaflops (515 billion double-precision floating-point calculations per second), and each instance employs two of them, giving it more than one teraflop of processing power.  Amazon further allows instances to be clustered "up through and above 128 nodes" for even more power.

Theoretically, a 128-node cluster of the new Amazon EC2 instances would qualify as the 50th fastest computer in the world.  The new instance type enables a wide variety of calculation-intensive workloads for applications that include energy exploration, weather prediction, graphics rendering, and video transcoding.  And, oh, it is also good for enabling encryption code breaking and identity theft.
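A quick back-of-the-envelope check of those figures, sketched in Python. The per-GPU rating comes from Amazon's announcement; the 128-node total assumes ideal linear scaling, which real clusters never quite achieve:

```python
# Back-of-the-envelope check of Amazon's published numbers.
# Idealized: assumes perfect linear scaling across GPUs and nodes.

GFLOPS_PER_GPU = 515        # NVIDIA Tesla, double precision, per the announcement
GPUS_PER_INSTANCE = 2
NODES = 128                 # the cluster size Amazon cites

per_instance_tflops = GPUS_PER_INSTANCE * GFLOPS_PER_GPU / 1000.0
cluster_tflops = NODES * per_instance_tflops

print(per_instance_tflops)  # just over one teraflop per instance
print(cluster_tflops)       # ~132 teraflops for a 128-node cluster
```

At roughly 132 teraflops, such a cluster would indeed have sat in the neighborhood of 50th place on the contemporary TOP500 list, which is the basis of the comparison above.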

Amazon CTO Werner Vogels spoke at both 1st and 2nd Cloud Expos

I doubt Amazon's founder Jeff Bezos or its web services evangelist Jeff Barr are feeling like Oppenheimer did when he witnessed the first test of his creation, the atom bomb, and famously quoted Hindu scripture about becoming "the destroyer of worlds", but maybe they should feel that way.  This is a big moment for good and evil.

First, the Good

Not long ago, computing power of this magnitude was a very precious resource, only available to the large, well-funded companies, government agencies, and academic institutions who could afford to buy and manage expensive supercomputers from companies like Cray and IBM.

Then, things started to change with the growth of computer gaming, scientific visualization, computer animation, and media streaming, which drove the development and volume production of processing chips called "graphics processing units" (GPUs) by companies like NVIDIA and ATI.

In their namesake application of graphics processing, GPUs perform the enormous volumes of "floating point" decimal arithmetic needed to render and manipulate highly detailed graphics and photorealistic computer-generated imagery, or "CGI".  (Conventional CPU chips, like the x86 and ARM families, have only a handful of floating-point units apiece and process such calculations largely serially, whereas a GPU packs hundreds of simpler cores that run them in parallel, which is what makes CPUs ill-suited for efficient graphics processing.)

But there are many other, non-graphical applications that also require floating-point calculations, and it was not long before GPUs were being used as mathematical co-processors in high-end scientific workstations and aggregated in servers.  Although they are much cheaper than first-generation supercomputers, these systems are still quite expensive, with workstations costing $10K and up and servers going for multiples of that.

Yesterday that all changed.  Now the tiniest company, even the lone quant, can have the same computational power, for as little as $2.10 per hour, with no up-front investment and access from anywhere in the world.

Amazon Cluster GPU Instances can lower the cost and accelerate the progress of fighting famine and disease, building safer, more fuel efficient vehicles and aircraft, finding and exploiting new sources of energy, and, of course, producing breathtaking visual entertainment.  We have not yet begun to imagine the new businesses and research projects this kind of cloud computing will make possible.

In their blog entry titled "A Couple More Nails in the Coffin of the Private Compute Cluster," large-scale computing specialists Cycle Computing provide a very detailed picture of how they have used this technology to build a value-added computation service in the Amazon public cloud to support these kinds of applications.

And, that is definitely all good.  But, every innovation has a dark side and this is no exception.

Then, the Evil

At the same time Amazon was announcing the general availability of the EC2 Cluster GPU Instance, German programmer Thomas Roth, writing on his Stacksmashing.net blog, was showing how he had used it to build a password "hash cracker" that cracked a six-character password in 49 minutes ($1.71 paid to Amazon).

The password he cracked was hashed with the SHA1 scheme, designed by the National Security Agency and published by NIST as a Federal Information Processing Standard.  In 2005, SHA1 was found to contain a mathematical weakness that could enable security vulnerabilities and was deprecated accordingly, but not before it came to be employed in a number of widely used security applications and protocols.  His cracker could also be used against the MD4, MD5, and NTLM hashing schemes.  Like SHA1, these have been deprecated or replaced due to similar weaknesses, but, also like SHA1, only after becoming widely deployed.

So, the password Roth cracked so quickly was short and hashed with a deprecated method, correctly suggesting that it would have been much more difficult to use the Amazon service to crack a longer, better-protected password.  But, remember, he used only one cluster node, and he was just fooling around.  He seems unfazed by how hard it might be to take it further.
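The keyspace arithmetic behind that caveat is easy to work out. A minimal sketch, assuming a 62-symbol alphabet (upper case, lower case, digits) and, purely for illustration, an attacker whose hash rate stays fixed at the level implied by Roth's 49-minute, single-node run:

```python
# Keyspace arithmetic behind the "short password" caveat.
# Illustrative only: assumes a 62-symbol alphabet and a fixed hash rate.

ALPHABET = 26 + 26 + 10            # lower case + upper case + digits = 62 symbols

six_char = ALPHABET ** 6           # candidates for a 6-character password
eight_char = ALPHABET ** 8         # candidates for an 8-character password

growth = eight_char // six_char    # two extra characters multiply the work by 62**2

minutes_for_eight = 49 * growth    # at the single-node rate implied by Roth's run

print(six_char)                    # 56800235584 candidates (~5.7e10)
print(growth)                      # 3844x more work
print(minutes_for_eight)           # ~188,000 minutes, roughly 131 days
```

Of course, that 131 days assumes one node; the whole point of the new service is that an attacker can rent 128 of them, which collapses the figure back toward a single day.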

"This just shows one more time that SHA1 for password hashing is deprecated.  You really don't want to use it anymore!  Instead, use something like scrypt or PBKDF2!  Just imagine a whole cluster of this machines (Which is now easy to do for anybody thanks to Amazon) cracking passwords for you, pretty comfortable.  Large scaling password cracking for everybody! [...]

"If I find the time, I'll write a tool which uses the AWS-API to launch on-demand password-cracking instances with a preconfigured AMI. Stay tuned either via RSS or via Twitter."
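Roth's recommendation of scrypt or PBKDF2 is straightforward to follow in practice. Here is a minimal sketch using only Python's standard library; the function names, iteration count, and salt size are illustrative choices, not authoritative guidance:

```python
# Minimal sketch of password storage with a deliberately slow key-derivation
# function (PBKDF2 here; scrypt is a similar option), per Roth's own advice.
# Iteration count and salt size are illustrative, not a recommendation.
import hashlib
import hmac
import os

def hash_password(password, iterations=100_000):
    """Return (salt, derived_key). The per-user random salt defeats
    precomputed tables; the iteration count slows brute-force attacks."""
    salt = os.urandom(16)
    dk = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, dk

def verify_password(password, salt, expected, iterations=100_000):
    dk = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(dk, expected)   # constant-time comparison

salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("guess", salt, stored))                         # False
```

The point of the iteration count is exactly the economics discussed above: it makes each guess cost the attacker 100,000 hash operations instead of one, eroding the GPU's price-per-guess advantage.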

I am not sure what color this guy's hat is, but his sang-froid is unsettling.  And, Amazon says that users can expect these clusters to scale at about 90% efficiency, and developers can expect the imminent availability of a variety of programming aids that will simplify the process of exploiting and scaling the GPU clusters.  So, Mr. Roth is not vamping.

Don't Be Shiva

Again, every innovation has its dark potential.  In this case, the innovation is not technical; GPUs have been used for nefarious purposes of the above kind for years.  This is an economic innovation, one that takes considerable cost and time out of a kind of criminality that can be extremely rewarding: identity and data theft.  It cannot be stopped any more than digital piracy or other forms of highly leveraged electronic misbehavior; it can only be slowed down, and only if Amazon and others like them drive against it.  Will they?

I'm not sure what it will take to make sure that Amazon doesn't let their new yellowcake get into the wrong hands, but I suspect it will more likely be a result of regulation or litigation after a disaster than of altruistic foresight before trouble strikes.  As I mentioned in my article "SMB Cloud is a Hacker's Paradise" a few months back, large cloud service providers, including Amazon, have so far not demonstrated striking speed and initiative in getting and staying ahead of the bad guys, whose resolve for mischief and mayhem is boundless.

Optimism is no defense.  One of the main reasons cyber-crime is so out of control now is that the World Wide Web was built on a foundation of magical thinking in the form of the fraternal optimism of academics.  As bad as security risks have been in the Web 1.0 era, despite the definite improvements in prevention and hygiene that have been made, they still may pale by comparison with what could be coming.  Cloud computing has multiplied many good things, like cost savings and business agility, by 1-2 orders of magnitude.  It can do the same for many bad things, if we let it.  Let's not let it.

More Stories By Tim Negris

Tim Negris is SVP, Marketing & Sales at Yottamine Analytics, a pioneering Big Data machine learning software company. He occasionally authors software industry news analysis and insights on Ulitzer.com and is a 25-year technology industry veteran with expertise in software development, database, networking, social media, cloud computing, mobile apps, analytics, and other enabling technologies.

He is recognized for his ability to rapidly translate complex technical information and concepts into compelling, actionable knowledge. He is also widely credited with coining the term and co-developing the concept of the “Thin Client” computing model while working for Larry Ellison in the early days of Oracle.

Tim has also held a variety of executive and consulting roles in numerous start-ups and several established companies, including Sybase, Oracle, HP, Dell, and IBM. He is a frequent contributor to a number of publications and sites focusing on technologies and their applications, and has written a number of advanced software applications for social media, video streaming, and music education.
