Resolving the Questions Surrounding the SDDC

You can't have a successful software-defined model with a hardware-defined mentality

Question 1. Vendors are racing to lead the movement towards a software-defined data centre. Where are we up to in this journey, and how far are we from seeing this trend widely adopted?

Considering most organisations have still not fully virtualised or moved towards a true private cloud model, the SDDC is still in its infancy in terms of mainstream adoption, and adopting it certainly won't be an overnight process. While typical early adopters are advancing quickly down the software-defined route, these are mostly organisations with large-scale, multi-site data centres that are already mature in terms of their IT processes. Such organisations are not the norm, and while the SDDC is certainly on the minds of senior IT executives, establishing such a model means overcoming several key challenges.

Typical environments are still characterised by numerous silos, complex and static configurations, and partially virtualised initiatives. Isolated component and operational silos need to be replaced with expertise that covers the whole infrastructure, so that organisations can focus on defining their business policies. In this instance the converged infrastructure model is ideal, as it enables the infrastructure to be managed, maintained and optimised as a single entity by a single team. Such environments also need to dramatically rearrange their IT processes to accommodate features such as orchestration, automation, metering and billing, as these all have a knock-on effect on service delivery, activation and assurance, as well as on change management and release management procedures. The SDDC necessitates a cultural shift in IT as much as a technical one, and cultural change historically takes longer. It could still be several years before we really see the SDDC widely adopted, but it's definitely being discussed and planned for the future.

Question 2. Looking at all the components of a data centre, which one poses the most challenges to being virtualized and software-defined?
The majority of data centre components have seen considerable technological advancement in the past few years. Yet in comparison to networking, compute and the hypervisor, storage arrays still haven't seen many drastic changes beyond features such as auto-tiering, thin provisioning, deduplication and the introduction of EFDs (enterprise flash drives). Moreover, the focus of software-defined is applications, and dynamically meeting the changing requirements of an application and service offering. Beyond quality-of-service monitoring based on IOPS and backend/frontend processor utilisation, there are still considerable limitations with storage arrays in terms of application awareness.

Additionally, with automation being integral to a software-defined strategy that can dynamically shift resources based on application requirements, automation technologies within storage arrays are to date still very limited. While storage features such as dynamic tiering may be automated, they are still not based on real-time metrics and are consequently not responsive to real-time requirements.

Added to this, storage itself has moved beyond the array and now comes in numerous forms, such as HDD, flash, PCM and NVRAM, each with its own characteristics, benefits and challenges. The challenge as yet unmet is a software layer that can abstract all of these formats as a single resource pool. The objective should be that regardless of where these formats reside, whether within the server, the array cache or the backend of the array, data can still be shifted dynamically across platforms to meet application needs as well as to provide resiliency and high availability.
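
To make this concrete, below is a minimal sketch of what such an abstraction layer might look like: a pool that hides whether data lives on NVRAM, flash or HDD and places volumes purely by application requirements. The tier names, figures and placement policy are hypothetical illustrations, not any vendor's implementation.

    from dataclasses import dataclass

    @dataclass
    class Tier:
        """One physical storage format abstracted into the pool (hypothetical)."""
        name: str            # e.g. "nvram", "flash", "hdd"
        latency_ms: float    # typical access latency
        capacity_gb: int
        used_gb: int = 0

        def free_gb(self) -> int:
            return self.capacity_gb - self.used_gb

    @dataclass
    class Volume:
        name: str
        size_gb: int
        max_latency_ms: float  # the application requirement the pool must honour

    class StoragePool:
        """Abstracts heterogeneous media (HDD, flash, PCM, NVRAM) as one pool."""

        def __init__(self, tiers):
            # Fastest tiers first, so placement can fall back to cheaper media.
            self.tiers = sorted(tiers, key=lambda t: t.latency_ms)
            self.placement = {}

        def place(self, vol):
            """Choose the cheapest tier that still meets the volume's latency need."""
            candidates = [t for t in self.tiers
                          if t.latency_ms <= vol.max_latency_ms
                          and t.free_gb() >= vol.size_gb]
            if not candidates:
                raise RuntimeError(f"no tier can satisfy {vol.name}")
            tier = candidates[-1]        # slowest (cheapest) acceptable tier
            tier.used_gb += vol.size_gb
            self.placement[vol.name] = tier
            return tier

    # Usage: the pool, not the administrator, decides where data lives.
    pool = StoragePool([Tier("nvram", 0.01, 64), Tier("flash", 0.2, 2048),
                        Tier("hdd", 8.0, 20480)])
    print(pool.place(Volume("oltp-db", 500, max_latency_ms=0.5)).name)    # -> flash
    print(pool.place(Volume("archive", 5000, max_latency_ms=20.0)).name)  # -> hdd

A real implementation would also re-run this placement continuously against live metrics, which is exactly the real-time responsiveness today's arrays still lack.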

Question 3. Why has there been confusion about how software-defined should be interpreted, and how has this affected the market?
Similar to when the cloud concept first emerged, the industry's understanding of the software-defined model quickly became blurred as the marketing departments of traditional infrastructure vendors jumped on the bandwagon. While they were quick to attach the software-defined label to their offerings, there was little if anything different about their products or product strategy. This led to various misconceptions: that software-defined was just another term for cloud, that anything virtualised was software-defined, or, even more ludicrously, that software-defined meant the non-existence or removal of hardware.

To elaborate, all hardware components need software of some kind to function, but this does not make them software-defined. For example, storage arrays use various software technologies such as replication, snapshotting, auto-tiering and dynamic provisioning. Some storage vendors even have the capability of virtualising third-party arrays behind their own, or via appliances, consequently abstracting the storage completely from the hardware so that an end user is merely looking at a resource pool. But this in itself does not make the array software-defined, and herein lies the confusion that some end users face as they struggle to understand the latest trend being directed at them by their C-level execs.

Question 4. The idea of a software-defined data centre (virtualising and automating the entire infrastructure) wildly disrupts the make-up of a traditional IT team. How can CIOs handle the inevitable resistance some of their IT employees will put up?
First and foremost, you can't have a successful software-defined model if your team still has a hardware-defined mentality. Change is inevitable, and whether it's embraced or not it will happen. For experienced CIOs this is not the first time they've seen such a technological, and consequently cultural, change in IT. There was resistance to change from the mainframe team when open systems took off; there was no such thing as a virtualisation team when VMware was first introduced; and only now are we seeing converged infrastructure teams being established, despite the CI market having been around for more than three years. For traditional IT teams to accept this change, they need to recognise how it will inevitably benefit them.

Market research is unanimous in its conclusion that IT administrators are currently far too busy with maintenance tasks that amount to firefighting and "keeping the lights on" exercises. Figures generally point to around 77% of IT administrators' overall time being spent on mundane maintenance and routine tasks, with very little time spent on innovation, optimisation and delivering value to the business. For these teams the software-defined model offers the opportunity to move away from such tasks and free up their time, enabling them to be proactive as opposed to reactive. With the benefits of orchestration and automation, IT admins can focus on the things they are trained and specialised in, such as performance optimisation, understanding application requirements, and aligning their services and work to business value.

Question 5. To what extent does a software-defined model negate the need to deploy the public cloud? What effect will this have on the market?
The software-defined model shouldn't, and most likely won't, negate the public cloud; if anything it will make its use case even clearer. The SDDC is a natural evolution of cloud, and particularly the private cloud. The private cloud is all about the consumption and delivery of IT services, whether layered upon converged infrastructure or self-assembled infrastructures. Those that have already deployed a private cloud and are also utilising the public cloud have done so with an understanding and assessment of their data: its security and, most typically, its criticality. The software-defined model introduces a greater level of intelligence via software, where application awareness and requirements linked to business service levels are met automatically and dynamically. Here the demand is dictated by the workload, and the software is the enabler that provisions the adequate resources for that requirement.

Consequently organisations will have a greater level of flexibility and agility than with previous private cloud and even public cloud deployments, thus providing more clarity in the differentiation between the private and public cloud. Instead of needing to request permission from a cloud provider, the software-defined model will give organisations on-demand access to their data as well as letting them independently dictate the level of security. While this may not completely negate the requirement for a public cloud, it will certainly diminish the immediate benefits and advantages associated with it.

Question 6. For CIOs looking for pure bottom-line incentives they can take to senior management, what is the true value of a software-defined infrastructure?
The true value of a software-defined model is that it empowers IT to be a true business enabler. Most business executives still see IT as an expensive overhead as opposed to a business enabler, typically because of IT's inability to respond quickly enough to the ever-changing service requirements, market trends and new project roll-outs that the business demands. Much of this is caused by the deeply entrenched organisational silos within IT, where typical infrastructure deployments can take months. While converged infrastructure solutions have gone some way towards solving this challenge, the software-defined model builds on this by providing further speed and agility, to the extent that organisations can encapsulate their business requirements into business delivery processes. Infrastructure management processes become inherently linked to business rules that incorporate compliance requirements, performance metrics and business policies. In turn, via automation and orchestration, these business rules dynamically drive and provision storage, networking and compute resources in real time for the workloads the business demands.
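
As a purely illustrative sketch of that last point, the rule and function below show how a business policy might be encoded and used to compute allocations as demand changes; the class, thresholds and service names are hypothetical examples rather than any actual orchestration product's API.

    from dataclasses import dataclass

    @dataclass
    class BusinessRule:
        """A business policy expressed as resource floors (all values hypothetical)."""
        service: str          # e.g. "online-payments"
        min_iops: int         # storage performance floor
        min_vcpus: int        # compute floor
        bandwidth_mbps: int   # network floor
        compliance: str       # e.g. "PCI-DSS": constrains where the workload may run

    def provision(rule: BusinessRule, observed_load: float) -> dict:
        """Translate a business rule plus live demand into a concrete allocation.

        In a real SDDC this would call the storage, network and compute
        controllers; here we simply return the computed request.
        """
        scale = max(1.0, observed_load)  # grow allocations as demand rises
        return {
            "service": rule.service,
            "iops": int(rule.min_iops * scale),
            "vcpus": int(rule.min_vcpus * scale),
            "bandwidth_mbps": int(rule.bandwidth_mbps * scale),
            "placement": "compliant-zone" if rule.compliance else "any",
        }

    # The same rule yields different allocations as real-time demand changes.
    payments = BusinessRule("online-payments", min_iops=5000, min_vcpus=8,
                            bandwidth_mbps=500, compliance="PCI-DSS")
    print(provision(payments, observed_load=1.0))  # quiet period: floors apply
    print(provision(payments, observed_load=2.5))  # peak trading: 2.5x resources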

Question 7. To what extent will a software-defined infrastructure change the way end-users should approach security in the data centre?
A software-defined model will change the way data centre security is approached in several ways. Traditional physical data centre security architecture is renowned for being inflexible and complex due to its reliance on numerous dedicated appliances for requirements such as load balancing, gateways, firewalls and wire sniffers. Within a software-defined model, security can potentially be delivered not only as a flexible and agile service but also as a feature built into the architecture. Whether the approach embeds security within the servers, storage or network, a software-defined model has to take advantage of being able to dynamically distribute security policies and resources that are logically managed and scaled via a single pane of glass.

From a security perspective the SDDC provides immediate benefits. Imagine how much simpler things become when automation can be used to restructure infrastructure components that have become vulnerable to security threats. Even the automated isolation of malware-infected network endpoints will drastically simplify typical security procedures, but it will consequently need to be planned for differently.
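
To illustrate, automated isolation could be as simple as a security alert handler pushing a quarantine policy to the network controller's API. The controller URL, endpoint and policy schema below are invented for the example; the point is that what used to be a manual, switch-by-switch task becomes a single programmatic call.

    import requests  # generic HTTP client; the controller API below is hypothetical

    CONTROLLER = "https://sdn-controller.example.local/api/v1"  # hypothetical

    def quarantine_endpoint(mac_address: str, alert_id: str) -> bool:
        """Push a deny-all policy for an infected endpoint to the controller."""
        policy = {
            "name": f"quarantine-{alert_id}",
            "match": {"src_mac": mac_address},
            "action": "drop",                 # drop all traffic from the endpoint
            "except": ["remediation-vlan"],   # but keep a path to clean-up services
        }
        resp = requests.post(f"{CONTROLLER}/policies", json=policy, timeout=5)
        return resp.status_code == 201

    # An IDS or SIEM alert handler would invoke this as soon as an infection is flagged:
    # quarantine_endpoint("00:1b:44:11:3a:b7", alert_id="ids-20140512-0042")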

Part of that planning is acknowledging not just the benefits but also the new types of risk they inevitably introduce. For example, abstracting the security control plane from the security processing and forwarding planes means that any configuration errors or security issues can have far more complex consequences than in the traditional data centre. Furthermore, centralising the architecture ultimately means a greater security threat should that central control be compromised. These are some of the security challenges that organisations will face, and there are already movements in the software-defined security space to cater for this.

Question 8. Where do you see the software-defined market going over the next couple of years?
The concept of the SDDC is going to gain even more visibility and acceptance within the industry, and the technological advances that have already come about with software-defined networking will certainly galvanise this. Vendors that have adopted the software-defined tagline will have to mature their product offerings and roadmaps to fit such a model, as growing industry awareness will empower organisations to distinguish between genuine features and marketing hyperbole.

For organisations that have already heavily virtualised and built private clouds, the SDDC is the next natural progression. For those that have adopted the converged infrastructure model, this transition will be even easier, as they will already have put in place the IT processes and models needed to run their infrastructure as a fully automated, centrally managed and optimised baseline from which the SDDC will emanate. It is fair to say that it won't be a surprise to see many of the organisations that embraced the converged infrastructure model also be the pioneers of a successful SDDC.

The above interview with Archie Hendryx is taken from the May 2014 issue of Information Age.

More Stories By Archie Hendryx

SAN, NAS, Backup/Recovery & Virtualisation Specialist.
