
Why the Obama Administration Should Have Paid More Attention to Load Testing

What needs to be understood here is that it’s important to test early and often

October 1, 2013, was the most anticipated date for the Obama administration since the president's re-election. It was to be the day every American would have access to health care through one centralized website: HealthCare.gov. However, according to at least one report, only six people enrolled in Obamacare on the first day. Shortly after, the entire website crashed along with its infrastructure.

The massive crash happened because HealthCare.gov received more than 14.6 million unique views within the first 10 days of launch, a volume that neither the Obama administration nor the testers were prepared for.

The website should have been able to handle tens of thousands of users at once, but in a trial test before the launch, a mere 500 users caused it to crash. In testimony before the U.S. Congress, the contractors responsible for HealthCare.gov said they didn't have enough time to fully test the website. The failure to properly load test the website well before the October 1 launch date led to one of the worst federal website debacles of all time.
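To make that concrete, here is a minimal sketch of the kind of pre-launch check that would have surfaced the problem early. It uses the open-source Locust load-testing tool; the staging host, endpoints, and traffic mix are hypothetical illustrations, not details of the actual HealthCare.gov stack.

```python
# loadtest_smoke.py - minimal Locust smoke test (host and endpoints are hypothetical)
from locust import HttpUser, task, between

class SiteVisitor(HttpUser):
    """Simulates a visitor loading the home page and starting registration."""
    wait_time = between(1, 3)  # seconds of think time between requests

    @task(3)
    def landing_page(self):
        self.client.get("/")  # most traffic just hits the landing page

    @task(1)
    def start_registration(self):
        self.client.get("/create-account")  # hypothetical registration entry point

# Run headless, ramping to the 500 concurrent users that sank the real trial:
#   locust -f loadtest_smoke.py --host https://staging.example.gov \
#          --headless -u 500 -r 25 --run-time 10m
# If errors or response times spike long before 500 users, the site is
# nowhere near ready for tens of thousands of concurrent visitors.
```

A test like this costs a few hours to write and can run on every build, which is the point: the 500-user failure should have been routine news months before launch, not a surprise days before it.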

What Went Wrong
The website was designed to provide Americans with a simple solution, a one-stop shop for health care insurance, but, as we all know, it wasn't that simple.

The site was built by 55 contractors and is considered one of the most complex software projects ever undertaken for the federal government, which might be where the problems started.

According to Louis Woodhill, a contributor to Forbes, the Obamacare website effort was comparable to Soviet central planning: "In their effort to build an IT system to implement Obamacare, the U.S. Department of Health and Human Services was trying to do the same thing as the USSR's Gosplan agency: elicit coordinated, purposeful action from a collection of entities that don't know each other, don't trust each other, have conflicting objectives, and face diverging incentives."

Mixing contractors wasn't the only issue; the Obama administration went on to make a series of rookie mistakes that led to the demise of the website.

Incorrectly Assessing User Behavior. First, the administrators in charge of the website decided in late September to drop the feature that would let people shop for health plans before registering for an online account. This led to a bottleneck because more people than expected had to go through the registration process before they could even browse plans.

Broken Systems Integration. Second, the registration process itself was flawed. The consumer was supposed to enter basic account information, a security question, and so on, but the communication between the systems responsible for storing this information wasn't working properly. As a result, thousands of users were unable to successfully create an account.
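This kind of breakage is exactly what a small end-to-end integration test catches before launch. The sketch below, written with Python's requests library against hypothetical staging endpoints and payload fields, simply verifies that an account created through the front end can be read back from the downstream store.

```python
# test_account_roundtrip.py - end-to-end account-creation check
# (endpoints and payload fields are hypothetical illustrations)
import requests

BASE = "https://staging.example.gov"

def test_account_creation_round_trip():
    session = requests.Session()

    # Step 1: submit basic account information through the front-end API
    created = session.post(
        f"{BASE}/api/accounts",
        json={"username": "testuser01", "security_question": "first pet"},
        timeout=10,
    )
    assert created.status_code == 201, f"create failed: {created.status_code}"
    account_id = created.json()["id"]

    # Step 2: confirm the system that stores the data actually persisted it
    stored = session.get(f"{BASE}/api/accounts/{account_id}", timeout=10)
    assert stored.status_code == 200, "account was accepted but never stored"
```

A check like this run nightly would have flagged the front-end-to-storage handoff failure long before thousands of real consumers hit it.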

Rebuilding Components from Scratch When Proven Systems Were Available. Last, the Data Services Hub, a proven identity service available to the government for consumer applications, was surprisingly not used to its full extent. Instead, the website's builders created new software systems meant to do exactly the same thing. An article on Mashable emphasizes that if the site had fully leveraged the Data Hub, it wouldn't have been such a mess.

Whatever weight each of these missteps carried, what is known is that HealthCare.gov was overwhelmed by the sheer number of visitors to a single site.

Why the Government Should Have Made Load Testing a Priority
It seems those responsible for deploying the site didn't appreciate the importance of load testing, which is especially surprising when you consider that the website had failed a pre-launch load test miserably. Of course, politics came into play: the deadline for the website was non-negotiable. But with all the red flags warning of failure, load testing should have played a much more critical role, and here's why:

Prioritization of Problems and Fixes
A big issue with HealthCare.gov was that the contractors claimed they didn't have enough time and felt extreme pressure to roll out the website before it was properly tested. If load testing had occurred earlier in the development phase, testers would have been able to identify the parts of the website that were not working properly.

The major pain point of the entire website was the registration process that millions of Americans attempted to complete. Had the team load tested the website months ahead of the launch, they would have been able to identify the root causes of performance issues and determine whether they lay in the application code or in the app servers and infrastructure components.
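One way testers can localize a bottleneck like this is to time each step of the registration flow separately, so a failed run points at a specific transaction rather than "the website is slow." The following Locust sketch marks any step that exceeds its latency budget as a failure; the endpoints and budget are hypothetical.

```python
# loadtest_registration.py - per-step latency budgets for the registration flow
# (paths and the budget value are illustrative assumptions)
from locust import HttpUser, task, between

BUDGET_SECONDS = 2.0  # illustrative per-step response-time budget

class RegistrationFlow(HttpUser):
    wait_time = between(1, 3)

    @task
    def register(self):
        # Each step is named and reported separately, so the results show
        # exactly which transaction degrades first as load increases.
        for name, method, path in [
            ("load_form", "GET", "/create-account"),
            ("submit_account", "POST", "/api/accounts"),
        ]:
            with self.client.request(
                method, path, name=name, catch_response=True
            ) as resp:
                if resp.elapsed.total_seconds() > BUDGET_SECONDS:
                    resp.failure(f"{name} exceeded {BUDGET_SECONDS}s budget")
```

With per-step results in hand, a slow "submit_account" step points toward the backend account store, while a slow "load_form" step points toward the web tier, which is precisely the application-versus-infrastructure distinction the team needed to make.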

Earlier Identification of Issues
The cost of fixing a bug rises steeply with each stage of development. A bug caught at the operation stage can cost the paying client more than 150 times as much as one caught at the requirements stage; a defect that costs $100 to fix during requirements, for example, could cost upward of $15,000 to fix in production.

Had the testers broken their testing down into smaller test cases earlier, the administration might have taken the time to listen and understand that these small bugs needed to be fixed before the public launch.
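Breaking the testing down can be as simple as maintaining a list of small, named test cases, each with its own pass/fail threshold, so every run produces a short, actionable report instead of one monolithic verdict. A minimal sketch, with made-up transaction names, budgets, and measurements:

```python
# report_budgets.py - turn raw load-test results into a prioritized bug list
# (transaction names, budgets, and measurements are illustrative)

P95_BUDGETS_MS = {
    "load_login_page": 1000,
    "create_account": 2000,
    "browse_plans": 1500,
}

def failing_cases(measured_p95_ms: dict) -> list[tuple[str, int, int]]:
    """Return (name, measured, budget) for every case over its budget,
    worst offender first."""
    over = [
        (name, measured, P95_BUDGETS_MS[name])
        for name, measured in measured_p95_ms.items()
        if name in P95_BUDGETS_MS and measured > P95_BUDGETS_MS[name]
    ]
    return sorted(over, key=lambda row: row[1] - row[2], reverse=True)

if __name__ == "__main__":
    # Example numbers standing in for a real test run's output
    results = {"load_login_page": 900, "create_account": 7400, "browse_plans": 2100}
    for name, measured, budget in failing_cases(results):
        print(f"FAIL {name}: p95 {measured} ms exceeds {budget} ms budget")
```

A one-page report like "create_account is 5.4 seconds over budget" is far harder for decision-makers to wave away than a vague warning that the site isn't ready.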

Decisions Made from Intelligence on the Ground
We know the tension between testers and business owners can be intense. The funders of a website want it up and running right away, while testers want enough time to properly identify errors and fix the issues that arise.

The administration decided to completely ignore the classic project management triangle, in which scope, schedule, and resources form the three sides and quality is constrained by all of them.

The only way to increase the scope of a project without changing the due date would be to add more resources. Since the administration was rigid on all three sides of the triangle, the quality of the website suffered.

It's no wonder this website failed. The dynamics between the testers and the heads of HealthCare.gov were strained, and it appears the Obama administration chose to ignore testers who knew the website was not ready.

HealthCare.gov Today
The website isn't out of the woods just yet. According to The Washington Post, more than 22,000 people have filed appeals trying to correct errors the system made when they were signing up for a new federally mandated health care plan.

Apparently, federal workers aren't able to access consumer data manually. "An unknown number of customers who are trying to get help through less formal means - by calling the health care marketplace directly - are told that HealthCare.gov's computer system isn't yet allowing federal workers to go into enrollment records and change them."

What needs to be understood here is that it's important to test early and often. If tests had been conducted throughout the entire development of the website, the Obama administration would have avoided such an embarrassing and reputation-tarnishing event.

More Stories By Tim Hinds

Tim Hinds is the Product Marketing Manager for NeoLoad at Neotys. He has a background in Agile software development, Scrum, Kanban, Continuous Integration, Continuous Delivery, and Continuous Testing practices.

Previously, Tim was Product Marketing Manager at AccuRev, a company acquired by Micro Focus, where he worked with software configuration management, issue tracking, Agile project management, continuous integration, workflow automation, and distributed version control systems.
