
Virtual Strategy - Virtually Right

With a private cloud strategy and dynamic data center you can quickly respond to rapid business fluctuations

With a private cloud strategy and dynamic data center you can quickly respond to rapid business fluctuations. But how do you get there?

This post was originally published as a Thanksgiving weekend special at virtual-strategy.com.
In the article I discussed approaches for building a dynamic data center that not only addresses complexity and reduces cost, but also accelerates business response time, ensuring the organization realizes the true promise of cloud computing: business agility and customer responsiveness.

Cloud computing presents an appealing model for offering and managing IT services through shared and often virtualized infrastructure. It’s great for new business start-ups that don’t want the risk of a large on-premise technology investment, or for organizations that can’t easily predict future demand for their services. But for most of us with existing infrastructure and resources, the picture is very different. We want to capitalize on the benefits of the cloud ― on-demand, low-risk, affordable computing ― but we’ve spent years investing in rooms stacked high with hardware and software to run our daily mission-critical jobs and services.

So how do organizations in this situation make the shift from straightforward server consolidation to a dynamic, self-service virtualized data center? How do they reach the peak of standardized IT service delivery and agility that is in step with the needs of the business? Many virtualization deployments stall as organizations stop to deal with challenges like added complexity, staffing requirements, SLA management, or departmental politics. This “VM stall” tends to coincide with different stages in the virtualization maturity lifecycle, such as the transition from tier 2/3 server consolidation to mission-critical tier 1 applications, and from basic provisioning automation to a private/hybrid cloud approach.

The virtualization maturity lifecycle
The simple answer is to take it step-by-step, learning as you go, building maturity at every step. This will earn you the skills, knowledge, and experience needed to progress from an entry-level virtualization project to a mature dynamic data center and private cloud strategy.

It’s called the virtualization maturity lifecycle, and it builds in four steps. Just like pilots start their training on small planes (going full cycle from take-off to landing) before they move on to large commercial jets, it is advisable for organizations to implement these virtualization maturity steps iteratively. For example, start a full maturity cycle on test and development servers before moving to mission-critical servers and applications.
Start easy: consolidate servers to increase utilization and reduce your current carbon footprint. To ensure deep insight and continuity in support of the migration from physical to virtual, you might want to leverage image backup and physical-to-virtual restore tools that let you move your physical IBM, Dell, and HP images directly to ready-to-run VM images for VMware, Sun, Citrix, and Microsoft.
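The consolidation step above can be sketched in code. This is a minimal, illustrative model only ― the server names, utilization threshold, and the `p2v_migrate` helper are hypothetical placeholders, not a real vendor API:

```python
# Illustrative sketch of the first maturity step: pick under-utilized
# physical servers and convert them to VM images (P2V). All names and
# the migration helper are hypothetical assumptions for this example.
from dataclasses import dataclass

@dataclass
class PhysicalServer:
    hostname: str
    vendor: str          # e.g. "IBM", "Dell", "HP"
    utilization: float   # average CPU utilization, 0.0-1.0

def select_consolidation_candidates(servers, threshold=0.3):
    """Under-utilized servers are the easiest first consolidation targets."""
    return [s for s in servers if s.utilization < threshold]

def p2v_migrate(server, hypervisor):
    """Placeholder for an image-backup + physical-to-virtual restore step."""
    return {"vm_name": f"{server.hostname}-vm",
            "hypervisor": hypervisor,
            "source": server.hostname}

fleet = [PhysicalServer("web01", "Dell", 0.12),
         PhysicalServer("db01", "IBM", 0.85),
         PhysicalServer("app01", "HP", 0.22)]

candidates = select_consolidation_candidates(fleet)
vms = [p2v_migrate(s, hypervisor="VMware") for s in candidates]
```

The heavily loaded `db01` is deliberately left alone: mission-critical, high-utilization servers come later in the lifecycle, after a full cycle has been completed on easier workloads.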

The next step involves optimizing the infrastructure. Apart from maintaining consistency, efficiency, and compliance across the virtual resources (which is fast proving to be even more complex in virtual than in physical environments), we analyze, monitor, (re-)distribute, and tune our applications and services.

While optimizing, we also discover and document the rules we will automate in the next phase: which applications fit best together, which areas are suitable for self-service, and which types of services are most important. As you can imagine, the answers to this last question will be very different for a nuclear plant (safety first) than for an online video rental service (customers first), which is why this is such an important step. If you skip this stage and go straight into automation, you’ll likely end up in the same situation that you’re in today, just automated.
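Capturing those discovered rules as data, before automating them, might look like the following sketch. The rule sets, service names, and priorities are invented examples ― the point is that policy lives in one reviewable place rather than in ad-hoc scripts:

```python
# Minimal sketch of documenting placement rules and service priorities
# discovered during the optimization phase. All services and rules here
# are hypothetical examples, not recommendations.

affinity_rules = {
    # applications that should never share a host (anti-affinity)
    "anti_affinity": [{"web-frontend", "web-frontend-replica"}],
    # applications that benefit from co-location
    "affinity": [{"app-server", "cache"}],
}

service_priorities = {
    # a safety-first operator would rank these very differently
    # than a customer-first one
    "safety-monitoring": 1,   # highest priority, never preempted
    "billing": 2,
    "video-streaming": 3,
}

def placement_allowed(host_services, new_service):
    """Reject a placement that would violate an anti-affinity rule."""
    for group in affinity_rules["anti_affinity"]:
        if new_service in group and host_services & group:
            return False
    return True

# A replica must not land on the host already running its primary:
print(placement_allowed({"web-frontend"}, "web-frontend-replica"))  # False
```

Because the rules are plain data, the automation phase can consume them directly instead of re-discovering them implicitly.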

A successful cloud strategy is all about agility and flexibility, and the next step in the virtualization maturity lifecycle helps take care of automation and the orchestration of your (now) virtual services. You can empower users to help themselves ― industrialize processes ― without calling IT for every service request. Automation has many advantages here. It is the catalyst to standardize your virtual infrastructure, integrate and orchestrate processes across IT silos, and accelerate the provisioning of virtual cloud services. Once the industrialized provisioning process is live, automation technologies can then also be used to monitor demand volumes, utilization levels and application response times and to assist root-cause analytics to help isolate and remediate virtual environment issues.

The final stage is the centerpiece of a cloud strategy, a position which allows you to manage the definition, demand, and deployment of IT services: the dynamic data center. Your now agile infrastructure, delivered from a secure, highly available data center, enables you to quickly respond to rapid business fluctuations. To reach a dynamic data center, you need to automate the entire process of service delivery from request to fulfilment. This includes centralized service requests, automating the approval process so that department heads can quickly approve or reject requests, a standard and repeatable provisioning process, and standard configurations.
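The request-to-fulfilment flow described above can be sketched as a small state machine. The states, the approval hook, and the provisioning step are illustrative assumptions, not a specific product’s workflow:

```python
# Sketch of an automated request-to-fulfilment flow for a dynamic data
# center: centralized request, department-head approval, then a standard,
# repeatable provisioning step. States and names are illustrative.
from enum import Enum

class State(Enum):
    REQUESTED = "requested"
    APPROVED = "approved"
    REJECTED = "rejected"
    PROVISIONED = "provisioned"

class ServiceRequest:
    def __init__(self, requester, template):
        self.requester = requester
        self.template = template      # a standard configuration, not ad hoc
        self.state = State.REQUESTED

    def approve(self, approver):
        # department heads quickly approve or reject centrally queued requests
        self.state = State.APPROVED

    def reject(self, approver):
        self.state = State.REJECTED

    def provision(self):
        # only approved requests reach the repeatable provisioning step
        if self.state is not State.APPROVED:
            raise RuntimeError("request not approved")
        self.state = State.PROVISIONED
        return f"deployed {self.template} for {self.requester}"

req = ServiceRequest("marketing", template="small-web-vm")
req.approve("dept-head")
result = req.provision()
```

Standard templates are what make the last step repeatable: every approved request deploys the same known-good configuration instead of a hand-built one.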

This goes much further than the traditional dream of a “lights out” data center, which basically was a static conveyor belt-like factory where all labor was automated away. The dynamic data center is like a modern car factory, where robots perform almost all tasks, but in ever changing sequences and configurations, guided by supply-chain-lead orchestration.

The new normal
As we all know, technology changes fast. This advancement in technology is creating a “new normal” where relationships with customers are increasingly in a digital form and technology is no longer an enabler or accelerator of the business ― it has become the business.

This is a theme picked up by Peter Hinssen, one of Europe's thought leaders on the impact of technology on our society. He evangelizes this new normal, arguing that in a digital world there will be new rules that define what is acceptable for IT, including zero tolerance for digital failure, an era of “good enough” functionality (60% functionality in six weeks rather than 90% in six months), and the need to move your architectures―including your new cloud architecture―from “built to last” to “designed to change”.
The lifecycle approach described earlier may be just what you need to help align your IT organization with what Hinssen calls the new normal. First you determine where opportunities exist for consolidation and rationalization across your physical and virtual environments ― assessing what you have in your data center and establishing a baseline for decisions that take you to the next stage. Next, to achieve agility, you have to automate the provisioning and de-provisioning of virtualized resources, including essential elements such as identities and other management policies such as access rights.
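The point about identities and access rights deserves emphasis: de-provisioning a resource must tear down everything created alongside it. A minimal sketch, with all resource and account names invented for illustration:

```python
# Sketch of paired provisioning/de-provisioning. The key idea from the
# text: identities and access rights are managed alongside the VM itself,
# so de-provisioning leaves nothing orphaned. All names are hypothetical.

provisioned = {}   # resource -> associated identities and access policies

def provision(resource, owner):
    provisioned[resource] = {
        "identities": [f"{owner}-svc-account"],
        "access": [f"{owner}:admin"],
    }

def deprovision(resource):
    # Removing only the VM would leave orphaned accounts and rights behind;
    # popping the whole record keeps the environment clean.
    return provisioned.pop(resource, None)

provision("vm-42", owner="finance")
released = deprovision("vm-42")
```

Orphaned service accounts are a common audit finding, which is why the text lists identities as an "essential element" of automated de-provisioning.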

The next step in delivering an on-time, risk-free (zero-failure) cloud computing strategy is service assurance. You need to manage IT service quality and delivery based on business impact and priority ― top-to-bottom and end-to-end. That includes, for example, delivering a superior online end-user experience with low-overhead application performance management, and end-to-end visibility into traffic flows and device performance. The new normal also needs to be secure. IT security management technologies must be applied in line with current regulations and end-user needs, making the virtual layer more secure.

All these factors combined ultimately lead to agile IT service delivery. With agility, you can build and optimize scalable, reliable resources and entire applications quickly. By embarking on the virtualization maturity roadmap, you can move closer to a dynamic data center and successful cloud strategy.

Any shortcuts?
This evolutionary approach may sound very procedural (and safe). You may also be thinking: is this the only way? What if I need it now? Is there no revolutionary approach to help me get straight to a private cloud much more quickly? Just as developing countries skipped the wired POTS phone system and moved directly to 100% wireless infrastructure, a revolutionary approach does exist. The secret lies in the fact that, in addition to the application itself, the infrastructure required to deploy an application can be virtualized: load balancers, firewalls, NAS gateways, monitoring tools, and so on. This entire entity ― the application and the infrastructure it needs to be successfully deployed ― can then be managed as a single object. Want to deploy a copy of the application? Simply load the object, and all of the associated virtual appliances are automatically loaded, networked, secured, and made ready. This is called an application-centric cloud.

With traditional virtualization, the servers are the parts that are virtualized, but afterward these virtual servers, networks, routers, load balancers, and more still need to be managed and configured to work with the other parts of the data center ― a task as complex and daunting as it was before. This is an infrastructure-centric cloud. With full application-centric clouds, the whole business service (with all its components) is virtualized, becoming a virtual service (instead of a bunch of virtual servers), which significantly reduces the complexity of managing these services.

As a result, application-centric clouds can model, configure, deploy, and manage complex, composite applications as if they were a single object. This enables operators to use a visual model of an application and the required infrastructure, and to store that model in an integrated repository. Users or customers can then pull that model out of the repository, reuse it, and deploy it to any data center around the world with the click of a button. Interestingly, users can deploy these services to a private cloud or to an MSP, depending on who offers the best conditions at that moment. Sound too futuristic? Far from it. Several innovative service providers, like DNS Europe, Radix Technologies, and ScaleUp, are already doing exactly this on a daily basis.
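The application-as-a-single-object idea can be made concrete with a small blueprint sketch. The component list and target names are invented for illustration; real application-centric platforms model far richer topologies:

```python
# Sketch of the application-centric idea: the application and the
# infrastructure it needs (load balancer, firewall, NAS gateway,
# monitoring) are modeled and deployed as ONE object, to any target.
# Component and target names are hypothetical examples.
from dataclasses import dataclass, field

@dataclass
class ApplicationBlueprint:
    name: str
    app_image: str
    appliances: list = field(default_factory=lambda: [
        "load-balancer", "firewall", "nas-gateway", "monitoring"])

    def deploy(self, target):
        """Deploying the single object brings up every component together,
        whether the target is a private cloud or a service provider."""
        return [f"{target}: started {c}"
                for c in self.appliances + [self.app_image]]

blueprint = ApplicationBlueprint("webshop", app_image="webshop-vm")
private = blueprint.deploy("private-cloud")
msp = blueprint.deploy("msp-eu")   # same model, different destination
```

Because the same blueprint deploys unchanged to either target, choosing between a private cloud and an MSP becomes a commercial decision rather than a re-engineering effort ― which is the contrast with the infrastructure-centric approach described above.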

For many enterprises, governments and service provider organizations, the mission for IT today is no longer just about keeping the infrastructure running. It’s about the critical need to quickly create new services and revenue streams and improve the competitive position of their organization.
Some parts of your organization may not have time to evolve into a private cloud. For them, the revolutionary (or greenfield) approach may be best, while for existing revenue streams an evolutionary approach offers investment protection. In the end, customers will be able to choose the approach that best fits the task at hand, finding the right mix of evolutionary and revolutionary to meet their individual needs.


More Stories By Gregor Petri

Gregor Petri is a regular expert and keynote speaker at industry events throughout Europe and wrote the cloud primer “Shedding Light on Cloud Computing”. He was also a columnist at ITSM Portal, a contributing author to the Dutch “Over Cloud Computing” book, and a member of the Computable expert panel, and his LeanITmanager blog is syndicated across many sites worldwide. Gregor was named by Cloud Computing Journal as one of The Top 100 Bloggers on Cloud Computing.

Follow him on Twitter @GregorPetri or read his blog at blog.gregorpetri.com
