
The Cloud and Cybersecurity

The cloud-first policy and the incremental approach to cloud adoption will make IT reform real

As part of federal CIO Vivek Kundra’s 25-point plan to reform federal IT management, announced last December, federal agencies must adopt a “cloud-first” policy that requires them to move three applications to the cloud over the next 12 to 18 months. Agencies must identify the three “must move” services within three months, move one of those services to the cloud within 12 months, and move the remaining two within 18 months.

This cloud-first policy and the incremental approach to cloud adoption will make IT reform real and should result in substantial (30-50 percent) savings in federal IT budgets. One specific and measurable goal laid out in the plan calls for reducing the number of government data centers from the current 2,094 to fewer than 800 by 2015. Already, 50 percent of government agencies are moving to private clouds, but to realize the full potential of the cloud, the government needs to move from many small clouds to fewer large, shared clouds. Of course, federal acquisition policies and authorities must be modified before agencies can fully embrace this strategy. The Federal Risk and Authorization Management Program (FedRAMP) begins to address this issue.

FedRAMP provides joint authorization and continuous security monitoring services for government and commercial cloud computing systems intended for multi-agency use. Joint authorization of cloud providers results in a common security risk model that can be leveraged across the federal government. This common security risk model provides a consistent baseline for cloud-based technologies, ensuring that their benefits are effectively integrated across the various cloud computing solutions currently proposed within the government. The risk model also enables the government to “approve once, use often” by ensuring multiple agencies gain the benefit and insight of a FedRAMP authorization and access to service providers’ authorization packages.
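
To make the “approve once, use often” idea concrete, here is a minimal, hypothetical sketch; the class and field names are illustrative inventions, not part of FedRAMP. It simply records one joint authorization and lets multiple agencies leverage the same package instead of re-assessing the provider.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class AuthorizationPackage:
    """One FedRAMP-style joint authorization, granted once for a cloud service."""
    provider: str
    service: str
    baseline: str              # the common security baseline it was assessed against
    authorized_on: date
    leveraging_agencies: set = field(default_factory=set)

    def leverage(self, agency: str) -> None:
        """An agency reuses the existing authorization instead of re-assessing."""
        self.leveraging_agencies.add(agency)

# Approve once...
package = AuthorizationPackage("ExampleCloud Inc.", "IaaS", "Moderate baseline", date(2011, 6, 1))

# ...use often: multiple agencies leverage the same package.
package.leverage("Department A")
package.leverage("Department B")
print(sorted(package.leveraging_agencies))
```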

There are still many challenges that federal agencies need to work out with the cloud (data sovereignty, privacy and security, funding models, and so on), but it is clear that the cloud model will allow government to operate more efficiently and effectively. Nonetheless, there persists the nagging perception that the cloud is inherently unsafe. Government agencies are uncomfortable handing over control of their data to other agencies, vendors or third parties. They are right to be concerned: reported cyber attacks against federal systems increased by 39 percent during the last fiscal year compared to the year before, according to the annual report on agency implementation of the Federal Information Security Management Act (FISMA). The report, posted online last month by the Office of Management and Budget (FY2010 FISMA Report), finds that federal agencies reported 41,776 cyber incidents during fiscal 2010. In 2009, agencies reported close to 30,000 incidents.
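
A quick back-of-the-envelope check of that growth rate, using the figures cited above (the report gives an exact FY2010 count, while “close to 30,000” is the approximation used here for 2009):

```python
# Rough sanity check of the reported ~39% year-over-year increase in incidents.
incidents_fy2010 = 41_776          # from the FY2010 FISMA report
incidents_fy2009_approx = 30_000   # "close to 30,000" per the report summary

increase = (incidents_fy2010 - incidents_fy2009_approx) / incidents_fy2009_approx
print(f"Approximate increase: {increase:.0%}")   # about 39%
```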

Despite the grim outlook, we believe the security of the federal enterprise, as well as its functionality, can be significantly enhanced by smartly implementing cloud computing. The following are some key principles that can facilitate this:

  • The importance of mission-focused engineering. Private clouds inside the federal enterprise can enhance mission support, but mission-focused engineering should be a first step in this pursuit.
  • The continual need for security, including data confidentiality, integrity and availability. All federal computing approaches must be engineered in full consonance with information assurance (IA) guidelines to protect federal information, information systems and information infrastructure. Cloud computing, when engineered right, dramatically improves the mission assurance posture of the federal enterprise: it enables stronger endpoint security, better data protection, and the use of thin clients with the many security benefits they provide. Identity management and encryption remain of critical importance.
  • The need for always-available, instantly accessible backup of data in the cloud. Assured availability under all circumstances is a key benefit of smart cloud computing approaches.
  • The continual need for open source and open standards. Most cloud infrastructure today is based on open source (Linux, Solaris, MySQL, GlassFish, Hadoop), and this positive trend will help net-centric approaches. According to IDC, open source software (OSS) is “the most significant, all-encompassing and long-term trend that the software industry has seen since the early 1980s.” Gartner projects that by 2012, 90 percent of the world’s companies will be using open source software. All of this indicates that open source and open standards should be a key principle for federal cloud computing and other net-centric approaches.
  • The continual need to evaluate both low barriers to entry and low barriers to exit. As approaches to cloud computing are evaluated, the cost of exiting an approach is too frequently ignored, locking agencies into a capability that may soon be inefficient. Agencies should adopt cloud computing capabilities that do not result in lock-in (see the portability sketch after this list).
  • The need for open standards. Cloud computing contributes to enhanced functionality for the federal workforce and increases interoperability, because the code, APIs and interfaces for cloud computing are secure yet widely published for all participants to build against. Federal involvement in open source and open standards communities should continue and accelerate, since cloud computing open standards are increasingly being discussed and designed by standards bodies like the W3C, OASIS, the IETF and the Liberty Alliance. Document and other formats used by federal cloud computing activities should be open and available to all authorized users on all devices.
  • The need to understand the cost of “private clouds.” For at least the near term, the federal government will remain a provider of “private cloud” capabilities where security dictates ownership-level control over compute power. This means the federal enterprise must continually engineer for change and technology insertion, which underscores the need for low barriers to exit in design criteria.
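
As a hypothetical illustration of keeping the barrier to exit low, the sketch below wraps storage behind a small, provider-neutral interface so an agency can swap back ends without rewriting applications. The names and methods are invented for illustration and do not come from any particular product.

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """Provider-neutral interface; applications code against this, not a vendor API."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(ObjectStore):
    """Stand-in back end; a real deployment would wrap a specific provider's SDK."""
    def __init__(self) -> None:
        self._objects: dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        self._objects[key] = data

    def get(self, key: str) -> bytes:
        return self._objects[key]

def archive_record(store: ObjectStore, record_id: str, payload: bytes) -> None:
    # The application depends only on the ObjectStore interface, so exiting one
    # provider means writing a new adapter, not rewriting applications.
    store.put(f"records/{record_id}", payload)

archive_record(InMemoryStore(), "2011-0001", b"example payload")
```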

Regarding security, cloud computing holds the potential to dramatically change the losing game of continual workstation patching and IT device remediation by reducing the number of applications on desktops and changing the desktop device from a fat client to a thin client. Devices can now have their entire memory and operating system flashed out to them from private clouds, with the power of the cloud presented to users as if they were on an old-fashioned desktop. This can be done in a way that never requires IT departments to visit the workstation to patch and configure it. And since all data is stored on private clouds, it can be encrypted and access provided only to authorized users; no data is lost when a laptop is stolen or a desktop is attacked by unauthorized users. Security through well-engineered use of cloud computing with thin clients, or with smart fat clients, is dramatically enhanced.
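
As a minimal sketch of the “encrypt in the private cloud, release only to authorized users” idea, the snippet below assumes the third-party cryptography package and an illustrative in-memory access list; none of these names come from the original post, and real deployments would use the cloud’s key-management and identity services.

```python
# pip install cryptography
from cryptography.fernet import Fernet

AUTHORIZED_USERS = {"analyst_a", "analyst_b"}   # illustrative access list

key = Fernet.generate_key()   # in practice, held by the private cloud's key-management service
cipher = Fernet(key)

def store(record: bytes) -> bytes:
    """Data written to the private cloud is encrypted at rest."""
    return cipher.encrypt(record)

def retrieve(user: str, ciphertext: bytes) -> bytes:
    """Plaintext is released only to authorized users; the endpoint never holds the key."""
    if user not in AUTHORIZED_USERS:
        raise PermissionError(f"{user} is not authorized to read this record")
    return cipher.decrypt(ciphertext)

blob = store(b"mission data")
print(retrieve("analyst_a", blob))   # b'mission data'
```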

This all leads to a key conclusion for the federal enterprise: as we move forward with cloud computing in support of the mission, the federal enterprise should continue to strengthen formal processes to ensure that lessons learned from both industry and the government’s own successful cloud computing initiatives are continually examined and broadly adopted across the enterprise.

Crucial Point associates Dillon Behr, Alex Olesker, Bob Gourley and Chris Barnes contributed to this post.

This post sponsored by the Enterprise CIO Forum and HP.

More Stories By Bob Gourley

Bob Gourley, former CTO of the Defense Intelligence Agency (DIA), is Founder and CTO of Crucial Point LLC, a technology research and advisory firm providing fact based technology reviews in support of venture capital, private equity and emerging technology firms. He has extensive industry experience in intelligence and security and was awarded an intelligence community meritorious achievement award by AFCEA in 2008, and has also been recognized as an Infoworld Top 25 CTO and as one of the most fascinating communicators in Government IT by GovFresh.
