By Ted Alford and Gwen Morton
October 26, 2009 02:00 AM EDT
Government Cloud Computing on Ulitzer
The President's budget for fiscal year 2010 (FY10) includes $75.8B in information technology (IT) spending, a 7 percent increase over FY09. Of this, at least $20B will be spent on IT infrastructure investments. The FY11 budget for IT is projected to be nearly $88B. The government is actively seeking ways to reduce IT costs, and the FY10 budget request highlights opportunities for the federal government to achieve significant long-term cost savings through the adoption of cloud computing technologies:
"Of the investments that will involve up-front costs to be recouped in outyear savings, cloud-computing is a prime case in point. The Federal Government will transform its Information Technology Infrastructure by virtualizing data centers, consolidating data centers and operations, and ultimately adopting a cloud-computing business model. Initial pilots conducted in collaboration with Federal agencies will serve as test beds to demonstrate capabilities, including appropriate security and privacy protection at or exceeding current best practices, developing standards, gathering data, and benchmarking costs and performance. The pilots will evolve into migrations of major agency capabilities from agency computing platforms to base agency IT processes and data in the cloud. Expected savings in the outyears, as more agencies reduce their costs of hosting systems in their own data centers, should be many times the original investment in this area." 
The language in the budget makes three key points: (1) up-front investment will be made in cloud computing, (2) long-term savings are expected, and (3) the savings are expected to be significantly greater than the investment costs.
Booz Allen Hamilton has created a detailed cost model that can create life-cycle cost (LCC) estimates of public, private, and hybrid clouds. We used this model, and our extensive experience in economic analysis of IT programs, to arrive at a first-order estimate of each of the three key points in the President's budget. Overall, it appears likely that the expectations highlighted in the budget can be met, but several factors could affect the overall degree of economic benefit.
The government's adoption of this new IT model warrants careful consideration of the model's broad economic implications, including the potential long-term benefits in terms of cost savings and avoidance as well as the near-term costs and other impacts of a transition from the current environment. Factors such as the number and rate of federal agencies adopting cloud computing, the length of their transitions to cloud computing, and the cloud computing deployment model (public, private, or hybrid) all will affect the total costs, potential benefits, and time required for the expected benefits to offset the investment costs.
Booz Allen developed a first-order economic analysis by considering how agencies might migrate to a cloud-based environment and what the costs and potential savings might be under a variety of scenarios. Specifically, given long-standing efforts to protect the privacy and security of the federal government's data and systems, a key variable will be whether agencies take advantage of public clouds, build their own private clouds, or adopt a hybrid approach. The focus was on cloud computing infrastructure services, as these tend to represent a relatively consistent set of costs, investments, and operating requirements across all agencies. We made some high-level, simplifying assumptions in our initial analysis:
- An existing, currently operational data center (or centers) serves as the baseline for economic comparison with migrating to a cloud environment.
- Existing application software will migrate with the infrastructure to the cloud. Application software support costs remain out of scope.
- Migration decisions will be made at the department or agency (rather than bureau) level in order to aggregate demand and drive scale efficiencies.
- The perceived sensitivity of an agency's mission and data will be a primary factor (though by no means the only factor) driving its decisions on which path to follow.
Next, we developed three high-level scenarios that represent potential migration paths. The three scenarios are as follows:
Scenario 1: Public Cloud Adopters
Key Agency Characteristic: Migrates low-sensitivity data to an existing public cloud.
Assumptions: Transition to the new cloud environment will occur steadily over 3 years; workload remains constant (i.e., no increase in capacity demand).
Scenario 2: Hybrid Cloud Adopters
Key Agency Characteristic: Uses a private cloud solution to handle the majority of its IT workload; also uses a public cloud solution to provide "surge" support and/or support for low-sensitivity data.
Assumptions: Seventy-five percent of the IT server workload will migrate to a private cloud, and the remaining 25 percent will transition to a public cloud; transition to the new cloud environments will occur steadily over 3 years; existing facilities will be used (i.e., no new investment is required in physical facilities); workload remains constant (i.e., no increase in capacity demand).
Scenario 3: Private Cloud Adopters
Key Agency Characteristic: Builds its own private cloud solution or participates in an interagency cloud solution (i.e., community cloud). Broad mission sensitivity results in the need to maintain control of infrastructure and data.
Assumptions: Transition to the new cloud environment will occur steadily over 3 years; existing facilities will be used (i.e., no new investment is required in physical facilities); workload remains constant (i.e., no increase in capacity demand).
Agencies publicly report only their "consolidated" IT infrastructure expenditures, which include end-user support systems (e.g., desktops, laptops) and telecommunications. Additional spending on application-specific IT infrastructure is typically rolled up into individual IT investments. In an effort to isolate data center costs, we extrapolated findings based on our experience with actual federal data centers. Specifically, we developed a "representative" agency data center profile that serves as a useful proxy for other agencies and enables us to explore the potential savings of a migration to cloud computing under the scenarios described above. Although agencies of similar size can have very different IT infrastructure profiles, we modeled an agency with a classic standards-based web application infrastructure. For our representative agency, we began with an assumption that a Status Quo (SQ) data center containing 1,000 servers with no virtualization is already operational.  The results at different scales are shown in our analysis.
Using a Booz Allen proprietary cloud computing cost and economic model that employs data collected internally, data from industry, and parametric estimating techniques, we estimated the LCCs for our representative agency to migrate its IT infrastructure (i.e., its server hardware and software) to the cloud under each of the three scenarios described above. We compared these costs to the LCCs of the SQ scenario (i.e., no cloud migration).  We also calculated three common metrics to analyze each scenario's potential economic benefits. These metrics allowed us to evaluate the three elements of the business case in the President's budget and estimate the absolute and relative benefits, as well as the time over which the outyear savings will pay back the investment costs.
The three key metrics in our analysis are as follows:
- Net Present Value (NPV) is calculated as each cloud scenario's discounted net benefits (i.e., the cloud scenario's reduced operations and support [O&S] costs relative to the SQ environment's O&S costs) minus the cloud's discounted one-time investment costs. A positive dollar figure indicates a positive economic benefit versus the SQ environment. NPV is an absolute economic metric.
- Benefit-to-Cost Ratio (BCR) is calculated as each cloud scenario's discounted net benefits divided by its discounted investment costs. A number greater than 1.0 indicates a positive economic benefit versus the SQ environment. BCR is a relative economic metric.
- Discounted Payback Period (DPP) reflects the number of years (from FY10) it takes for each scenario's accumulated annual benefits to equal its total investment costs.
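To make the three metrics concrete, the following sketch computes NPV, BCR, and DPP from a stream of annual investment costs and net benefits. The cash flows and discount rate below are purely hypothetical placeholders, not outputs of the Booz Allen model:

```python
def discount(values, rate=0.027):
    """Discount a list of annual cash flows (year 0 = FY10) to present value."""
    return [v / (1 + rate) ** t for t, v in enumerate(values)]

def npv_bcr_dpp(investments, net_benefits, rate=0.027):
    """Compute NPV, BCR, and Discounted Payback Period (years from FY10).

    investments  -- one-time investment cost per year ($M)
    net_benefits -- O&S savings versus the Status Quo per year ($M)
    Both lists cover the same fiscal years, starting with FY10.
    """
    d_inv = discount(investments, rate)
    d_ben = discount(net_benefits, rate)
    npv = sum(d_ben) - sum(d_inv)
    bcr = sum(d_ben) / sum(d_inv)
    # DPP: first year in which cumulative discounted benefits cover
    # the total discounted investment.
    total_inv = sum(d_inv)
    cum, dpp = 0.0, None
    for t, b in enumerate(d_ben):
        cum += b
        if cum >= total_inv:
            dpp = t + 1
            break
    return npv, bcr, dpp

# Hypothetical 13-year profile: 3-year investment phase, savings ramp up.
inv = [10, 8, 6] + [0] * 10
ben = [2, 5, 9] + [12] * 10
npv, bcr, dpp = npv_bcr_dpp(inv, ben)
```

With these placeholder numbers the payback lands in year four, consistent with the multi-year payback periods discussed later in the article; real results depend entirely on an agency's actual cost profile.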
The top portion of Exhibit 1 shows the analysis results. This exhibit presents the one-time investment phase costs as well as the recurring O&S phase costs for each scenario with a 13-year life cycle (3-year investment phase and 10-year steady-state O&S phase) from FY10 through FY22.
Assuming a 3-year transition period for each scenario, investment costs are expected to be incurred from FY10 to FY12 and include (depending on the scenario) hardware procurement and commercial off-the-shelf (COTS) software license fees; contractor labor required for installation, configuration, and testing; and technical and planning support (i.e., system engineering and program management costs) before and during the cloud migration. Because the SQ reflects an operational steady state, no investment costs are estimated for that scenario. Although the public cloud scenario does not present any up-front investment costs for hardware or software procurement, it does require program planning and technical support, support for porting applications over to the new cloud environment, and testing support to ensure programs and applications are working correctly in the new environment.
Recurring O&S costs "ramp up" for all cloud scenarios beginning in FY10 and enter steady state in FY13, continuing through FY22. For private clouds, these costs include hardware and software maintenance, periodic replacement/license renewal costs, system operations labor support costs, and IT power and cooling costs. For hybrid clouds, the O&S costs include the same items as the private cloud (albeit on a reduced scale), as well as the unit consumption costs of IT services procured from the public cloud. For public cloud scenarios, the O&S costs are the unit costs of services procured from the cloud provider and a small amount of IT support labor for the cloud provider to communicate any service changes or problems. In all three cloud scenarios, a significant portion of the O&S costs are incurred while phasing out the SQ environment during the transition. The SQ phase-out costs "ramp down" from FY10 to FY12, dove-tailing with the ramp up of the new clouds' O&S costs. Not surprisingly, the total LCCs are lowest for the public cloud scenario and highest for the private cloud scenario, with the hybrid cloud scenario's LCCs falling in the middle.
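The dovetailing of SQ phase-out costs and cloud ramp-up costs described above can be sketched as a simple linear transition model. The dollar figures below are illustrative assumptions only, not outputs of our cost model:

```python
def transition_costs(sq_annual, cloud_annual, years_transition, years_total):
    """Annual O&S cost during and after a linear cloud migration.

    sq_annual    -- steady-state O&S cost of the Status Quo environment ($M)
    cloud_annual -- steady-state O&S cost after full migration ($M)
    In each transition year the migrated fraction grows linearly, so the
    agency pays for both environments in proportion to the workload split.
    """
    costs = []
    for year in range(years_total):
        frac = min((year + 1) / years_transition, 1.0)  # share migrated
        costs.append(frac * cloud_annual + (1 - frac) * sq_annual)
    return costs

# Illustrative: $100M/year Status Quo, $30M/year in the cloud, 13-year horizon.
three_yr = transition_costs(100.0, 30.0, 3, 13)
five_yr = transition_costs(100.0, 30.0, 5, 13)
# A longer transition keeps more of the costly SQ environment running,
# so its cumulative life-cycle cost is higher.
assert sum(five_yr) > sum(three_yr)
```

This toy model captures why parallel operations dominate transition economics: every extra year of migration is a year of paying partially for both environments.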
The economic analysis confirms that the projected NPV and BCR for all three scenarios are significant relative to the SQ environment. Once the cloud migrations are completed, our model estimates annual O&S savings in the 65-85 percent range, with the lower end corresponding to the private cloud scenario and the upper end corresponding to the public cloud scenario. These percentages can be applied to overall federal IT spending for data centers to estimate the potential absolute savings across the federal government. (As part of the Information Technology Infrastructure Line of Business [ITI LoB] initiative, General Services Administration [GSA] is coordinating a benchmarking effort across the government. If those figures are made public, a total dollar savings estimate will be possible).
Our model shows that the net benefits and payback periods for agencies adopting the hybrid cloud scenario are closer to those for the private cloud than the public cloud. This variation is largely a result of our assumption that 75 percent of the current server workload would migrate to a private cloud and only 25 percent would transition to the public cloud. If we were instead to assume the opposite mix (i.e., 25 percent of the workload migrating to a private cloud and 75 percent to a public cloud), the hybrid scenario economic results would be closer to the public cloud results.
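To see why the hybrid results track the private cloud under a 75/25 split, consider a first-order blend of the two life-cycle costs. The index values below (Status Quo = 100) are illustrative, not model outputs:

```python
# Illustrative life-cycle cost indices relative to Status Quo = 100.
PRIVATE_LCC = 60.0
PUBLIC_LCC = 25.0

def hybrid_lcc(private_share):
    """First-order hybrid life-cycle cost as a workload-weighted blend."""
    return private_share * PRIVATE_LCC + (1 - private_share) * PUBLIC_LCC

base = hybrid_lcc(0.75)     # 75% private workload: result near the private figure
flipped = hybrid_lcc(0.25)  # 25% private workload: result near the public figure
```

Under this weighting, `base` sits much closer to `PRIVATE_LCC` than to `PUBLIC_LCC`, and flipping the mix pulls the blended cost toward the public cloud result, mirroring the sensitivity described above.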
We conducted a sensitivity analysis on several of the variables in our cost model to determine the major drivers for cloud economics. The two most influential factors driving the economic benefits are (1) the reduction in hardware as a smaller number of virtualized servers in the cloud replace physical servers in the SQ data center and (2) the length of the cloud migration schedule. Exhibits 2, 3, and 4 show the results of varying these factors.
In practice, several factors could cause agencies to realize lower economic benefits than our estimates suggest. One factor is underestimation of the costs associated with the investment or O&S phase for the cloud scenarios. Another factor is server utilization rates (both in the current environment and the new cloud environment). Our analysis assumes an average utilization rate of 12 percent of available CPU capacity in the SQ environment and 60 percent in the virtualized cloud scenarios. This difference in server utilization, in turn, enables a large reduction in the number of servers (and their associated support costs) required in a cloud environment to process the same workload relative to the SQ environment. Agencies with server utilization rates that are already relatively high should expect lower potential savings from a virtualized cloud environment.
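The server-reduction arithmetic implied by these utilization rates is simple to sketch, ignoring headroom, redundancy, and workload variability (all of which would reduce the consolidation ratio in practice):

```python
import math

sq_servers = 1000
sq_utilization = 0.12      # average CPU utilization, Status Quo
cloud_utilization = 0.60   # target utilization for virtualized servers

# The same total workload, packed onto better-utilized virtual hosts:
workload = sq_servers * sq_utilization                 # 120 "server-equivalents"
cloud_servers = math.ceil(workload / cloud_utilization)
reduction = 1 - cloud_servers / sq_servers

print(cloud_servers)        # 200
print(f"{reduction:.0%}")   # 80%
```

For our representative 1,000-server agency, the assumed utilization jump from 12 to 60 percent implies roughly a five-to-one consolidation; an agency starting at, say, 40 percent utilization would see a far smaller reduction.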
The charts indicate two key takeaways:
- Scale is important: The economic benefit increases as virtualized servers replace larger numbers of underutilized servers.
- Time is money: Because of the cost of parallel IT operations (i.e., cloud and non-cloud), the shorter the server migration schedule, the greater the economic benefits.
These findings, in turn, lead us to the following recommendations for agencies and policymakers contemplating a cloud migration:
- It is more cost-effective to group smaller existing data centers together into as large a cloud as possible, rather than creating several smaller clouds.
- To reduce the cost of running parallel operations, organizations should properly plan for and then migrate to the new cloud environment as quickly as possible. The three lines in Exhibit 5 show (in this case, for the public cloud) that the BCR drops rapidly and the DPP increases as the transition time lengthens.
A few agencies are already moving quickly to explore cloud computing solutions and are even redirecting existing funds to begin implementations. However, for most of the federal government, the timeframe for redirecting IT funding to support cloud migrations is likely to be at least 1-2 years, given that agencies formulate budgets 18 months before receiving appropriations.
Specifically, an agency develops IT investment requests each spring and submits them to OMB in September, along with the agency's program budget request, for the following government fiscal year. OMB reviews agency submissions in the fall and can implement funding changes via passback decisions (generally in late November) before submitting the President's budget to the Congress in February. Theoretically, the earliest opportunity for OMB to push agencies to revise their IT budgets to support a transition to the cloud will be fall 2009; however, agencies typically only have about 1 month to incorporate changes to their IT portfolios during passback. To give GSA and OMB time to develop more detailed guidance, as well as necessary procurement mechanisms and vehicles, it is more likely that OMB will direct or encourage agencies to plan for cloud migrations during the FY12 budget cycle (starting in the spring of 2010).
Other Considerations with Potential Economic Effects
When deciding whether to move to the cloud, agencies need to consider some additional technical aspects of cloud computing and their potential impact on the organization. Such areas include, but are not limited to, data security, software migration, technical architectures, and the skill set of the IT workforce.
Data Security
All government organizations struggle to ensure that their data remains secure and adheres to current policies and regulations. Because data security is such a critical issue, cloud providers will be required to address it in their products and services, and should be able to tailor the level of security to meet demand. Additionally, by centralizing data and servers, a cloud environment will allow for easier detection and investigation of incidents, enabling IT staff to replicate and address them efficiently.
However, there are currently no security standards for cloud computing. Until such standards are developed, and used effectively to measure provider services and enforce accountability, responsibility for any failures will fall on the agency's in-house IT organization. Aware of this reality, organizations should be careful about putting mission-critical and core processes into a public cloud, and private cloud architectures should be designed to minimize security concerns while realizing the benefits of cloud optimization.
Service Oriented Architecture
As the government moves toward embracing Service-Oriented Architecture (SOA), cloud computing will amplify the benefits of those investments. Cloud computing is inherently service oriented, and implementing private clouds will provide more control over data, security, and privacy.
Migration of Applications to the Cloud
This article identifies the financial benefits of migrating IT infrastructure to the cloud. As noted in our assumptions, existing application software is assumed to migrate with that infrastructure, but application software support costs remain out of scope.
IT Workforce Skills
Cloud architectures and service delivery models will change the technical skills needed across agencies' IT workforces. CIOs will need to conduct or refresh workforce assessments and training, and set aside the necessary funding, to ensure technical staff are trained in cloud architecture, implementation, and operations.
Economic Influence on Policy
From an economic perspective, GSA and OMB can take a number of steps to maximize the probability that the cloud computing business model can work in the federal government; i.e., that it can achieve its objective of enabling significant cost savings. These steps promote information sharing and transparency in the realistic costs and benefits of various cloud models, as well as establishing the necessary policy and contracting frameworks. Because scale is a key variable affecting both costs and benefits, policy guidance regarding scale considerations will be particularly critical (e.g., determining how much flexibility, if any, agencies and departments have to create private clouds at the bureau and/or interagency level).
As a cloud "storefront," GSA should conduct due diligence reviews to establish that public cloud providers, once identified, indeed offer highly efficient, highly scalable (both up and down) usage-based pricing beyond traditional managed services (e.g., by comparing proposed rates against commercial benchmarks). GSA should also work with potential providers to ensure agencies can readily understand service definitions, service levels, terms, conditions, and pricing. These steps will provide transparency to facilitate agencies' ability to compare potential provider pricing against their legacy operations costs, an essential component of building a credible business case for any type of cloud migration. In earlier shared services initiatives, such as financial management, the lack of such standardized information on pricing and service levels in the first few years proved a major impediment to progress, as agencies faced decisions about alternative solutions that were often based on unreliable cost data from potential vendors.
Finally, GSA will need to establish and communicate its own schedule for cloud services, grounded in the pricing negotiated with the various cloud vendors.
Summary of Key Observations
Although cloud computing offers potentially significant savings to federal agencies by reducing their expenditures on server hardware and associated support costs, chief information officers, policymakers, and other interested parties should bear in mind a number of practical considerations:
- It will take, on average, 18-24 months for most agencies to redirect funding to support this transition, given the budget process.
- Some up-front investment will be required, even for agencies seeking to take advantage of public cloud options.
- Implementations may take several years, depending on the size of the agency and the complexity of the cloud model it selects (i.e., public, private, or hybrid).
- It could take as long as 4 years for the accumulated savings from agency investments in cloud computing to offset the initial investment costs; this timeframe could be longer if implementations are improperly planned or inefficiently executed.
Given these observations, we offer the following recommendations:
- OMB, GSA, and other organizations, such as the National Institute of Standards and Technology (NIST), should provide timely, well-coordinated support, in the form of necessary standards, guidance, policy decisions, and issue resolution, to ensure agencies have the necessary tools to efficiently plan and carry out migrations to cloud environments. As the length of the migration period increases, the potential economic benefits of the migration decrease.
- OMB and GSA should seek to identify those agencies with the highest near-term IT costs and expedite their migration to the cloud.
- To encourage steady progress, OMB should establish a combination of incentives and disincentives; e.g., consider allowing agencies to retain a small percentage of any savings realized from cloud computing for investments in future initiatives. To monitor progress and heighten transparency and accountability, OMB could incorporate cloud-related metrics into the new government-wide IT dashboard.
- Agencies should consider which of the high-level scenarios described in this article best suits their needs, with the understanding that regardless of scenario chosen, proper planning and efficient execution are critical success factors from an economic perspective.
- Given the significant impact of scale efficiencies, agencies selecting a private cloud approach should fully explore the potential for interdepartmental and interagency collaboration and investment (consistent with emerging OMB and GSA guidance). This, in effect, leads to the fourth cloud deployment model-the community cloud. A community cloud is a collaboration between private cloud operators to share resources and services.
- Agencies should identify the aspects of their current IT workload that can be transitioned to the cloud in the near term to yield "early wins" to help build momentum and support for the migration to cloud computing.
Cloud computing has received executive backing and offers clear opportunities for agencies to significantly reduce their growing data center and IT hardware expenditures. However, for the government to achieve the envisioned savings, organizations charged with oversight, such as OMB, NIST, and GSA, will have to facilitate progress, and departments and agencies will have to carefully select and plan for future cloud scenarios that yield the best tradeoffs among their respective costs, benefits, and risks.
- Figures from INPUT data for the FY10 President's budget; of the $20B in expenditures categorized as office automation and IT infrastructure spending, about $12.2B is spent on major IT investments, with the remainder on non-majors. Additional expenditures on application-specific IT infrastructure are typically reported as part of individual IT investments.
- President's budget, FY10 (Analytical Perspectives).
- The 1,000 servers are broken down in our cost model by server processing capacity (small, medium, and large) based on proportions consistent with our experience.
- Our model focuses on the costs that a cloud migration will most likely directly affect; i.e., costs for server hardware (and associated support hardware, such as internal routers and switches, rack hardware, cabling, etc.), basic server software (OS software, standard backup management, and security software), associated contractor labor for engineering and planning support during the transition phase, hardware and software maintenance, IT operations labor, and IT power/cooling costs. It does not address other costs that would be less likely to vary significantly between cloud scenarios, such as storage, application software, telecommunications, or WAN/LAN. In addition, it does not include costs for government staff. Further, for simplicity, we removed facilities costs from the analysis.
Reader Comments
jhbeil (10/21/09 03:51 PM EDT):
so when is "cloudonomics" going to hit the bookshelves?
Phillip Hallam-Baker (10/20/09 09:30 PM EDT):
Looking at the numbers in the article a little further, it is assumed that the utilization rate will increase from 16% to 60% and that the reduction in the number of machines is the reason for the purported 60% cost saving.
The only way I can make those numbers work is if it is assumed that 80% of the costs in a data center are driven by nothing more than the number of machines in the data center that are powered.
This seems to be an absurdly high assumption to me.
Phillip Hallam-Baker (10/19/09 05:07 PM EDT):
I found the basic assumptions in this article to be unsupported. It is really easy to assume 65% savings from an infrastructure change if you ignore most of the costs of making the change.
I examine this in more detail on my blog.
I think this type of article will do great damage to cloud computing as it sets out claims that are simply ludicrous and will not be believed. It is entirely credible that newly deployed software services will be cheaper when designed for cloud deployment. It is not credible that anyone should expect to save a single dollar by taking a deployed application that does not otherwise need changing and throwing it into the cloud.
Once hardware costs are sunk, they are sunk. Thus there are no savings to be won through 'migration' if you are a large corporation or a government agency. There will be real savings, but they will be modest and come gradually.
The savings from cloud computing will be for the smaller enterprise right down to the small business which does not even have a machine room let alone a data center. There the savings are real and dramatic. But let's not get cloud computing dismissed as hype with unsupported claims.
Nov. 20, 2014 01:00 PM EST Reads: 1,593
The Internet of Things is not new. Historically, smart businesses have used its basic concept of leveraging data to drive better decision making and have capitalized on those insights to realize additional revenue opportunities. So, what has changed to make the Internet of Things one of the hottest topics in tech? In his session at @ThingsExpo, Chris Gray, Director, Embedded and Internet of Things, discussed the underlying factors that are driving the economics of intelligent systems. Discover how hardware commoditization, the ubiquitous nature of connectivity, and the emergence of Big Data a...
Nov. 20, 2014 12:30 PM EST Reads: 1,804
Almost everyone sees the potential of Internet of Things but how can businesses truly unlock that potential. The key will be in the ability to discover business insight in the midst of an ocean of Big Data generated from billions of embedded devices via Systems of Discover. Businesses will also need to ensure that they can sustain that insight by leveraging the cloud for global reach, scale and elasticity.
Nov. 18, 2014 09:00 PM EST Reads: 2,024
SYS-CON Events announced today that IDenticard will exhibit at SYS-CON's 16th International Cloud Expo®, which will take place on June 9-11, 2015, at the Javits Center in New York City, NY. IDenticard™ is the security division of Brady Corp (NYSE: BRC), a $1.5 billion manufacturer of identification products. We have small-company values with the strength and stability of a major corporation. IDenticard offers local sales, support and service to our customers across the United States and Canada. Our partner network encompasses some 300 of the world's leading systems integrators and security s...
Nov. 18, 2014 08:15 PM EST Reads: 1,583
IoT is still a vague buzzword for many people. In his session at @ThingsExpo, Mike Kavis, Vice President & Principal Cloud Architect at Cloud Technology Partners, discussed the business value of IoT that goes far beyond the general public's perception that IoT is all about wearables and home consumer services. He also discussed how IoT is perceived by investors and how venture capitalist access this space. Other topics discussed were barriers to success, what is new, what is old, and what the future may hold. Mike Kavis is Vice President & Principal Cloud Architect at Cloud Technology Pa...
Nov. 18, 2014 01:30 PM EST Reads: 2,016
Cloud Expo 2014 TV commercials will feature @ThingsExpo, which was launched in June, 2014 at New York City's Javits Center as the largest 'Internet of Things' event in the world. The next @ThingsExpo will take place November 4-6, 2014, at the Santa Clara Convention Center, in Santa Clara, California. Since its launch in 2008, Cloud Expo TV commercials have been aired and CNBC, Fox News Network, and Bloomberg TV. Please enjoy our 2014 commercial.
Nov. 13, 2014 05:00 AM EST Reads: 3,549