By Patrick Kerpan
October 18, 2012
Cloud computing is so alluring. The public cloud economizes infrastructure resources and creates a scalable, on-demand source for compute capacity. Additionally, the cloud can be a strategic asset for enterprises that know how to migrate, integrate and govern deployments securely.
Apple co-founder Steve Wozniak recently said, "A lot of people feel 'Oh, everything is really on my computer,' but I say the more we transfer everything onto the web, onto the cloud, the less we're going to have control over it."
In fact, according to an IDG Enterprise Cloud Computing Study, more than 70% of IT professionals worry about cloud security.
Boiled down, security, access and connectivity are really issues of control.
Like any prudent cloud user, the application owner layers on security features of their own, such as disk encryption and port filtering. But do these layers of security overlap or conflict? What happens to ownership after migration? Do solutions really have to be re-architected before and after deployment?
Take an application-focused approach to security from the beginning. Application-controlled, application-owned security layers ease the decision to deploy, test, and develop in the cloud, and save on IT training and time down the road.
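As a toy illustration of what "application-owned" port filtering means, the sketch below shows an allowlist check an application could apply before accepting a connection. The names and port choices are hypothetical, not a real firewall API; a production deployment would enforce this in the OS firewall (e.g., netfilter/iptables) rather than in application code:

```python
# Hypothetical application-owned port filter: the application owner,
# not the cloud provider, decides which ports accept traffic.
ALLOWED_PORTS = {443, 8443}  # assumed TLS-only endpoints for this example

def accept_connection(port: int) -> bool:
    """Return True only for ports the application owner has allowed."""
    return port in ALLOWED_PORTS

if __name__ == "__main__":
    for port in (22, 443, 8080, 8443):
        status = "accept" if accept_connection(port) else "drop"
        print(f"port {port}: {status}")
```

The point is the locus of control: because the allowlist lives with the application, it survives migration between providers unchanged.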
Control of Security: Who Has It?
Part of the "magic" cloud providers and vendors supply is wrapped up in layers of ownership and control: firewalls, isolation, and the cloud edge. Most enterprise application owners hope these layers will cover any gaps in security after migration. Unfortunately, enterprises need security controls they can attest to, and the provider ultimately owns and controls these features.
Unfortunately, the needs and concerns of the cloud service provider are distinctly different from those of the enterprise cloud service user (the application topology deployed to the cloud, and its owner). Security loopholes can open up in the gaps between what users and what providers control and own. That boundary - between what the cloud user can view and control and what the cloud provider can view and control - is the root of enterprise executives' concerns with the public cloud.
The provider-owned, provider-controlled features (the cloud edge, cloud isolation), the provider-owned, user-controlled features (the multi-tenant, API-controlled router/hypervisor), and the app-owned, app-controlled features (OS port filtering and disk encryption) can be combined in an overlay network to give the user ultimate control of security.
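To make the three spheres concrete, here is a minimal sketch of the ownership/control matrix. The layer names follow the article; the data structure and helper function are assumptions for illustration only:

```python
# Hypothetical model of the ownership/control layers described above.
# Each entry maps a security feature to (owner, controller).
LAYERS = {
    "cloud edge":            ("provider", "provider"),
    "cloud isolation":       ("provider", "provider"),
    "API-controlled router": ("provider", "user"),
    "hypervisor":            ("provider", "user"),
    "OS port filtering":     ("app owner", "app owner"),
    "disk encryption":       ("app owner", "app owner"),
}

def attestable(layers):
    """Features the application owner both owns and controls --
    the only ones an enterprise can fully attest to in an audit."""
    return sorted(feature for feature, (owner, ctrl) in layers.items()
                  if owner == "app owner" and ctrl == "app owner")

print(attestable(LAYERS))
```

Everything outside that attestable set is exactly where the user/provider gaps described above can open up, which is why the overlay network matters: it extends application-owned controls across the other two spheres.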
Application-to-cloud migration tools and software-defined networking (SDN) capabilities offer additional, overlapping layers of control and security that span the traditional cloud layers.
In order for cloud projects to succeed, IT executives need methods and tools they can attest to and that can pass audit. Understanding the perimeter of access, control, and visibility between the application layer and the cloud provider's layers is the first step toward a less painful cloud migration. With this knowledge, enterprises can design a migration process that fits their use case and deploy application topologies to the public cloud in a secure, controlled fashion.
Three Migration Rules We Recommend Breaking
Today's migration "rules" create more hurdles than solutions. Rapid industry changes, lack of standard security approaches, and the confusion on the proper steps to cloud deployment cause enterprises to overlook the issues of application-level control.
In fact, application-centric concerns are often not addressed at all. Popular migration advice urges enterprises to tackle huge hurdles before and during migration: deploying all at once, re-architecting before migration, and postponing the cost benefits of the cloud.
Break the following three migration rules and it is possible to renovate more efficiently, capitalize on the cloud's economies of scale, and quickly, easily, and securely control enterprise networks and applications in the cloud.
Rule 1: Deploy all at once or not at all
Unlike the proverbial lemmings jumping in head first, most enterprises require time to analyze and adjust to new technologies before committing serious time and effort. Employees, customers, and shareholders would not be happy if companies jumped into new technologies without first proving value. Thankfully, enough enterprises, organizations, and governments have already seized the benefits of the cloud's flexibility, cost savings, and connectivity.
Now, the challenge for IT professionals is to find the cloud architecture and provider(s) that fit their enterprise's needs, without having to reinvent the cloud to do so. With proven solutions in the market, enterprises can skip the bare-metal-to-virtual-to-test-cloud development life cycle: simply deploy directly to any cloud environment; develop, test, then release to speed time to market.
Rule 2: Re-architect before migration
Most providers and brokers want enterprises to spend time and effort rebuilding IT systems - and, as a result, re-learning and re-training - before migration. Advice articles list migration steps of parsing applications, virtualizing, re-architecting, and only then migrating. Cloud pundits advise IT professionals to be wary of all cloud security and to take valuable time to renovate before migrating - which slows the process and postpones, or even wipes out, the financial benefits of the cloud.
The traditional datacenter has too much knowledge flowing vertically, from application to infrastructure and from infrastructure to application. Migrating to the cloud before the renovate, design, or innovate steps cuts down on upfront hassle by removing the burden of re-architecting and re-learning skills before migration. Saving time and IT resources, and forgoing arduous re-training, speeds up the migration itself and, ultimately, how quickly the organization capitalizes on the cloud's flexibility.
Rule 3: Pay upfront for design and renovation costs
Why stop at the cloud's physical economies of scale when there are savings to be had on IT overhead as well? The same time and effort spent chasing "design economies of scale" can be redirected to cutting major overhead costs. A single migration, rather than a backup, re-architect, then migrate sequence, is more cost-effective. Why wait for cost savings until after migration when there is an option to realize faster deployment and speed to market?
The added customization and control needed to migrate in a logical set of steps puts the control and security solidly back into the application layer.
Enterprises will likely face a long, slow migration to the cloud, but with tools that capture the efficiency of migrating through logical steps before redesigning, the process can be significantly less painful.
Conventional wisdom overlooks the importance of application-layer security and control in the cloud. So only one migration question remains: why take the stairs when you can take the elevator?