SDN: Concrete or Concept?

Is SDN a concept, or a concrete architectural construct? Does it really matter?

Now, if we look at the benefits, we can attempt to infer the problems SDN is trying to solve:

Benefit: Programmability
Problem: Network components today, particularly hardware-based components, run specific feature sets that can only be modified by the vendor, which happens on a 12-18 month schedule, security and hot-fixes notwithstanding. New features and functions can only be added by the vendor based on its prioritization, not the customer's.

Benefit: Automation
Problem: Manual configuration of network components is time-consuming, costly, and introduces a higher risk of human error that can result in outages, poor performance, or security risks.

Benefit: Network control
Problem: The network today doesn't adapt rapidly to changing conditions or events. While some protocols simulate such adaptability, they can't autonomously route around outages or failures or easily modify existing policies.

These are certainly problems for IT organizations of a variety of sizes and composition. The question then is, how does SDN uniquely solve those problems?

The answer is that as a concrete solution (i.e. components, software, and architectures) it does not uniquely solve those problems. As a concept, however, it does.

Someone is no doubt quite upset at that statement right now. Let's explain it before someone's head explodes.

CONCEPT versus CONCRETE

The concept of separating the data and control planes enables programmability. Without that separation, we have what we have today – static, inflexible networking components. But the concept of separating data and control planes isn't unique to solutions labeled specifically SDN. ADN is a good example of this (you saw that coming, didn't you?).

A network component can – and this may surprise some people – internally decouple control and data planes. Yeah, I know, right? And doing so enables a platform that looks a whole lot like SDN diagrams, doesn't it – with plug-ins and programmability. This occurs in full-proxy architectures when there exist dual stacks – one on the client side, one on the server side. Where traffic transitions from one stack to the other, there exists an opportunity to inspect, manipulate, and modify the traffic. Because the architecture requires acting as an endpoint to clients (and conversely as the point of origin for the server side), protocols can even be implemented in this "no man's land" between client and server. That enables protocol transitioning, such as enabling SPDY on the outside while still speaking HTTP on the inside, or IPv4 to servers while supporting IPv6 on the client (and vice versa).
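To make the dual-stack idea concrete, here's a minimal sketch of a full proxy with separate client-side and server-side stacks. All class and hook names are illustrative, not any vendor's API; the "no man's land" between the two stacks is where inspection and manipulation hooks run.

```python
# Toy sketch of a full-proxy dual-stack design (names are illustrative,
# not any vendor's API). The client-side and server-side stacks are
# separate objects; traffic crossing between them passes through hooks.

class ClientSideStack:
    """Terminates the client connection; speaks the client's protocol."""
    def receive(self, raw: str) -> dict:
        # Parse an HTTP/2-style request line into a neutral message form.
        method, path, _version = raw.split()
        return {"method": method, "path": path}

class ServerSideStack:
    """Originates a fresh connection to the server in its own protocol."""
    def send(self, msg: dict) -> str:
        # Re-serialize as plain HTTP/1.1 for the server side.
        return f"{msg['method']} {msg['path']} HTTP/1.1"

class FullProxy:
    def __init__(self, hooks=None):
        self.client_side = ClientSideStack()
        self.server_side = ServerSideStack()
        self.hooks = hooks or []          # inspection/manipulation plug-ins

    def forward(self, raw: str) -> str:
        msg = self.client_side.receive(raw)
        for hook in self.hooks:           # the "no man's land" between stacks
            msg = hook(msg)
        return self.server_side.send(msg)

# One hook that normalizes trailing slashes, standing in for any
# mid-proxy inspection or manipulation logic.
proxy = FullProxy(hooks=[lambda m: {**m, "path": m["path"].rstrip("/") or "/"}])
print(proxy.forward("GET /app/ HTTP/2"))  # → GET /app HTTP/1.1
```

The protocol transition (HTTP/2-style in, HTTP/1.1 out) happens precisely because each side has its own stack; neither client nor server ever sees the other's protocol.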

Where the separation occurs is not necessarily as important as the fact that it exists – unless you're focused on concrete, SDN-labeled solutions as being the only solutions that can provide the flexibility that programmability offers.

Automation occurs by exposing the management plane through an API (or implementing a specific API, such as OpenFlow) such that operational tasks and configuration can be achieved through tools instead of time.
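As a sketch of what "tools instead of time" looks like, the fragment below drives a virtual-server configuration through a management-plane API rather than a manual CLI session. The endpoint path and payload schema are hypothetical; real devices each define their own (REST, OpenFlow, or otherwise).

```python
# Hedged sketch: configuring a device via a management-plane API instead
# of manual CLI work. The /mgmt/config endpoint and payload fields are
# invented for illustration, not a real product's API.
import json

def build_vip_config(name: str, address: str, members: list) -> str:
    """Serialize one 'virtual server' definition as an API payload."""
    return json.dumps({
        "name": name,
        "destination": address,
        "pool": [{"member": m} for m in members],
    }, sort_keys=True)

def apply_config(transport, device: str, payload: str) -> int:
    """POST the payload to the device's (hypothetical) config endpoint."""
    return transport(f"https://{device}/mgmt/config", payload)

# In real automation `transport` would be an HTTP client; a stub shows the flow.
applied = {}
def fake_post(url, body):
    applied[url] = body
    return 200

payload = build_vip_config("web_vip", "10.0.0.5:443", ["10.0.1.10", "10.0.1.11"])
status = apply_config(fake_post, "adc.example.net", payload)
print(status)  # → 200
```

Because the configuration is now just data flowing through a function, it can be generated, versioned, reviewed, and replayed by tooling, which is where the error-reduction benefit comes from.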

Between automation and programmability, you realize network control.

Now, this is not SDN, at least not in terms of protocol support and concrete architecture. But it is software-defined, and it is networking, so does it count?

I guess it depends. ADN has always approached layers 4-7 with an eye toward extensibility, programmability and control that enables agility in the network. We didn't call it SDN and I don't see the industry deciding to "SDN-wash" existing ADN solutions as SDN just because a new term came along and became the TLA du jour.

What I do see is that ADN, and specifically full-proxy-based ADCs (application delivery controllers), already offer the same benefits using the same concepts as SDN. Consider again the core characteristics of SDN:

1. Control and data planes are decoupled

2. Intelligence and state are logically centralized

3. Underlying network infrastructure is abstracted from applications
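The three characteristics above can be modeled in a few lines. In this entirely illustrative sketch, a logically centralized controller holds all forwarding intelligence and state, while "switches" only match cached flow entries and punt unknown traffic upward – the essence of decoupled control and data planes.

```python
# Toy model of the three SDN characteristics: centralized intelligence,
# decoupled planes, and forwarding elements that hold no policy of their
# own. Names and the policy format are invented for illustration.

class Controller:
    """Logically centralized intelligence and state (the control plane)."""
    def __init__(self, policy: dict):
        self.policy = policy              # e.g. {dst_prefix: out_port}

    def decide(self, dst: str) -> int:
        for prefix, port in self.policy.items():
            if dst.startswith(prefix):
                return port
        return 0                          # default/drop port

class Switch:
    """Dumb forwarding element (the data plane)."""
    def __init__(self, controller: Controller):
        self.controller = controller
        self.flow_table = {}

    def forward(self, dst: str) -> int:
        if dst not in self.flow_table:    # table miss -> ask the controller
            self.flow_table[dst] = self.controller.decide(dst)
        return self.flow_table[dst]

ctrl = Controller({"10.0.": 1, "192.168.": 2})
sw = Switch(ctrl)
print(sw.forward("10.0.3.7"))    # → 1 (decided centrally, then cached)
print(sw.forward("192.168.1.1")) # → 2
```

Changing behavior means changing the controller's policy once, not reconfiguring every switch – which is the network-control benefit from the table earlier.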

All of these characteristics are present in an ADN. The ability to leverage network-side scripting on the control plane side of the equation enables extensibility, rapid innovation, and the ability to adapt to support new protocols, new applications, and new business requirements – all without involving the vendor. Which is exactly one of the benefits cited for SDN solutions and specifically OpenFlow-enabled architectures.
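Network-side scripting can be sketched as operators registering small event handlers that change traffic handling without waiting on a vendor release. The event name and API below are invented for illustration; they merely echo the event-driven style such scripting environments tend to use.

```python
# Illustrative sketch of network-side scripting: operators attach handlers
# to traffic events to add behavior without a vendor firmware release.
# The event names and API are invented, not a real product's interface.

class ScriptableDataPath:
    def __init__(self):
        self.handlers = {}

    def when(self, event: str):
        """Decorator that registers a handler for a traffic event."""
        def register(fn):
            self.handlers.setdefault(event, []).append(fn)
            return fn
        return register

    def http_request(self, req: dict) -> dict:
        # Run every registered handler for this event, in order.
        for fn in self.handlers.get("HTTP_REQUEST", []):
            req = fn(req)
        return req

dp = ScriptableDataPath()

@dp.when("HTTP_REQUEST")
def steer_api_traffic(req):
    # A new business rule, deployed by the operator, no vendor involved.
    req["pool"] = "api_pool" if req["uri"].startswith("/api") else "web_pool"
    return req

print(dp.http_request({"uri": "/api/v2/users"})["pool"])  # → api_pool
```

The point is the deployment model: the rule above ships on the operator's schedule, not on a 12-18 month vendor release cycle.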

So the question really is, does it matter if a solution to the problem of "agility in the network" is a concrete or conceptual SDN solution if it ultimately solves the same set of problems?


More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
