
Even If They Hacked It, So What? Why Threat Modeling Matters
by Erik Dietrich

Eliminating waste is by far my favorite part of the agile approach to software.

In a world where the entirety of a piece of software is designed up front, I might ship and learn only after the fact that nobody ever uses the software's WhizBang feature.

That's brutal - the entire development team spent three months on that. But in an iterative world where we ship every week or two, we'd have shipped a small slice of that feature, seen that it wasn't getting used, and pivoted to something that provided more value to the users.

The same concept can also be applied to the security of your software. This doesn't mean you'll implement security and ditch it after two weeks if no one tries to break in, or that you should wait until something bad happens to implement security functionality.

But it does mean you should assess threats from a likelihood and impact perspective and then prioritize them accordingly. And, like delivering features, you should do this as close to the start of the project as possible.

This is where threat modeling comes in.

What is threat modeling?
Threat modeling involves being deliberate about identifying who would want to attack the system you are building and how those people would go about conducting the attack. Significantly, as Margaret Rouse points out, it is also about determining "where the most effort should be applied to keep a system secure."

Threat modeling isn't just a brief brainstorming session followed by items hastily added to a team's backlog. Teams define entire processes around it, such as the one described by Microsoft, with varying levels of formalism. The key is to identify all manner of threats and then to tie them to business significance - and thus, priority. Implementing countermeasures then becomes a first-class feature for the team.
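
To make that output concrete, here is a minimal sketch of what a scored threat register might look like, assuming a simple likelihood-times-impact scheme; the scales, threat entries, and names are illustrative, not part of any particular methodology:

```python
# A minimal sketch of a threat register scored by likelihood x impact.
# The 1-5 scales and the example threats are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Threat:
    description: str
    likelihood: int   # 1 (rare) .. 5 (almost certain)
    impact: int       # 1 (negligible) .. 5 (catastrophic)

    @property
    def risk_score(self) -> int:
        return self.likelihood * self.impact

threats = [
    Threat("Payment flow gamed to obtain free goods", likelihood=4, impact=4),
    Threat("Admin console defaced with political message", likelihood=2, impact=3),
    Threat("Ex-employee retains exported customer data", likelihood=3, impact=5),
]

# Highest-risk items surface first, ready to become backlog features.
for t in sorted(threats, key=lambda t: t.risk_score, reverse=True):
    print(f"{t.risk_score:>2}  {t.description}")
```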

Some obvious examples come to mind.
If you run an e-commerce site, financially motivated thieves will want to game the payment system to get free things. Perhaps politically motivated attackers would want to gain administrative access to publicly vandalize the site with some kind of message. Or perhaps one user would want to target another, gaining inappropriate access to see what that person buys and sells.

But there are also less obvious examples that are equally important. If the site is storing personal information, which operational or IT administrative employees have access to it? Would it be possible for them to record that information and retain it, even after leaving the company? What would happen if they did?

It's not just your employees, either.
Do you have an agency that you're subcontracting operational work to, who, in turn, might be subcontracting it even further? What sort of access do those folks have, and what could they do with it? If your relationship turned sour, could they try to hold your operations hostage somehow?

It is by answering these questions that you start to understand what could happen and how to prioritize countermeasures. You also gain a sense of how likely a threat is, how important it is to protect each piece of your infrastructure, and whether a threat really matters at all.

As strange as it may sound, not all threats are equally important.

Security in a Vacuum
Software developers are conscientious people who take pride in their work and knowledge. In a vacuum, this can lead to situations where they make decisions on the basis of doing "the right thing" as a matter of principle rather than profit. They might say to themselves, "What kind of programmer would I be if I didn't create a login for our site and salt and hash the passwords being stored in the database?" This would obviously be followed by implementing such a scheme.
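
For reference, here is a minimal sketch of the kind of salt-and-hash scheme that developer has in mind, using only Python's standard-library PBKDF2; the iteration count and helper names are illustrative assumptions, not a recommendation from the article:

```python
# A minimal sketch of salted password hashing with PBKDF2-HMAC-SHA256.
# Iteration count and function names are illustrative assumptions.
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest) for storage alongside the user record."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return hmac.compare_digest(candidate, digest)
```

It's perfectly reasonable work - the question threat modeling asks is whether it's the most valuable work for this particular application.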

But what if the application consisted only of publicly available functionality and content, such as a simple web-based calculator? Should the developers have done "the right thing," or should they have skipped implementing signup/login altogether?

Testers are similarly conscientious people who take pride in their work and knowledge - and who can slip up in the same way. "What kind of tester would I be if I didn't test a scenario where the user doesn't actually enter a password?" They might test this scenario and submit a defect, noting that users could bypass signup and get straight to the content by leaving the password field blank. But if all of the content is publicly available, who cares?

Left to their own devices, these folks will run with "the right thing" due to their sense of professionalism and previous battle scars. That is why threat modeling is so important - it moves security decisions from being made ad hoc at the individual level to being made deliberately at the team or project level.

Threat Modeling for Business Value
In the example above of creating a spurious login page, nobody ever stopped to ask, "If they hacked it, so what?" If a hacker were able to spoof an existing user and log in, they would...do what, exactly? Use the calculator? Who cares?

If threat modeling had been performed, and those questions had been asked, creation of a login page would either have been deprioritized or, more likely, jettisoned altogether. It would not have come across a developer's desk as something to implement in the first place, and a developer would be unlikely to take it upon herself to do it as gold plating. After all, the team would already have been through the threat modeling process and would thus be aware that "unauthorized" access to the calculator is a non-issue.

Instead, the team might have come away understanding that distributed denial of service (DDoS) attacks were a threat to the calculator's core business. They would then be able to plan high availability and threat mitigation as properties of the system from the beginning. In this fashion, the team is working not just on the highest-value user features but also on the highest-value security features.
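
As one small illustration of what such a decision might look like in code, here is a toy per-client rate limiter (a token bucket); this is an assumption about one application-level measure a team could prioritize, and real DDoS defenses would lean heavily on network- and CDN-level protections as well:

```python
# A toy sketch of per-client rate limiting via a token bucket.
# Rates, capacities, and the handler shape are illustrative assumptions.
import time
from collections import defaultdict

class TokenBucket:
    def __init__(self, rate: float, capacity: float):
        self.rate = rate          # tokens refilled per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity
        self.updated = time.monotonic()

    def allow(self) -> bool:
        # Refill tokens based on elapsed time, then spend one if available.
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.updated) * self.rate)
        self.updated = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

buckets = defaultdict(lambda: TokenBucket(rate=5, capacity=10))

def handle_request(client_ip: str) -> int:
    """Return an HTTP-style status: 200 if served, 429 if throttled."""
    return 200 if buckets[client_ip].allow() else 429
```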

Assumption of Control
With both user features and security properties of the system, it's important to prioritize the work with the highest "bang for the buck" factor. But there's an additional component to the specific value of threat modeling.

The attack vectors for a piece of software or a system are conceptually infinite, as the black hat hackers of the world are constantly dreaming up creative new ways to ruin your day. What's not infinite is the time and resources of the team - those are often highly constrained.

Right out of the gate you face a tall order. The only hope of keeping up is to identify the most damaging, likely threats and design your system in a way that mitigates these.

In a world where attacks on your application are a constant way of life, threat modeling gives you the best fighting chance to sleep easily, knowing that you've got a good strategy for preventing the worst attacks and that you've spent your money wisely.
