
The Intelligence Inside: Cloud Developers Change the World of Analytics

Evidence is mounting that embedding analytics inside apps business people use every day can lead to quantifiable benefits

We live in a world that requires us to compete on our differential use of time and information, yet only a fraction of information workers today have access to the analytical capabilities they need to make better decisions. Now, with the advent of a new generation of embedded business intelligence (BI) platforms, cloud developers are disrupting the world of analytics. They are using these new BI platforms to inject more intelligence into the applications business people use every day. As a result, data-driven decision-making is finally on track to become the rule, not the exception.

The Increased Focus on Analytics
With the emphasis on data-driven decision-making, it is perhaps no surprise that the focus on analytics continues to mount. According to IDC's Dan Vesset, 2013 was poised to be the first year in which the market for data-driven decision making enabled by business analytics broke through the $100 billion mark. IT executives are also doubling down on analytics, a fact highlighted by Gartner's annual CIO survey, which has ranked analytics as the number one technology priority in three of the last five years. So, given the importance of and spend on analytics, everyone should have access to the insight they need, right?

Most Business People Still Don't Use Analytics
Amazingly, in spite of this growth in spending and focus, most information workers today do not have access to business intelligence. In fact, Cindi Howson of BI Scorecard has found that end-user adoption of BI has stagnated at about 25%. This stagnation is difficult to reconcile: how is it possible that, at best, one quarter of information workers have access to what is arguably most critical to their success in a world that runs on data?

There are a variety of reasons for stagnant end-user adoption, including the high costs associated with BI projects and an overall lack of usability. However, the biggest impediment to BI adoption has nothing to do with the technology. The reality is that the vast majority of business decision makers do not spend their day working in a BI tool - nor do they want to. Users already have their preferred tool or application: sales representatives use a CRM service; marketers use a campaign management or marketing automation platform; back-office workers spend much of their day in an ERP application; executives typically work with their preferred productivity suite; and the list goes on. Unless you are a data analyst, you are not going to want to spend much of your day in a BI tool. But just because business people prefer not to use a BI tool does not mean they don't want access to pertinent data to support better decision-making.

The Need for More Intelligence Inside Applications
What's the solution? Simply put, bring the data TO users inside their preferred applications instead of expecting them to go to a separate BI system to find the report, dashboard or visualization that's relevant to the question at hand. If we want to reach the other 75% of business people who don't have access to a standalone BI product, we have to inject intelligence inside the applications and services they use every day. It is only through more intelligent applications that organizations can benefit from broader data-driven decision-making. In fact, according to Gartner, BI will only become pervasive when it essentially becomes "invisible" to business people as part of the applications they use daily. In a 2013 report highlighting key emerging tech trends, Gartner concludes that in order "to make analytics more actionable and pervasively deployed, BI and analytics professionals must make analytics more invisible and transparent to their users." How? The report explains this will happen "through embedded analytic applications at the point of decision or action."

If the solution to pervasive BI is to deliver greater intelligence inside applications, why don't more applications embed analytics? The reality is that only a small fraction of applications built today have embedded intelligence. Sure, they might include a table or a chart, but there is no intelligent engine behind it; users typically can't personalize a report or dashboard, or self-serve to generate new visualizations on an ad-hoc basis. The culprit is that business intelligence was originally conceived as a standalone activity, not a capability designed to be embedded. Specifically, the reasons developers ignore BI platforms boil down to cost and complexity.

Cost and Complexity Are Barriers to Embedded BI
Traditionally, BI tools have carried a user-based licensing model, with licenses typically running from tens of thousands to millions of dollars. Such high per-user costs might be justified for a relatively small, predictably sized population that includes a large percentage of power users who will spend a good amount of time working with the BI tool. This user-based model, however, is totally unsuitable for the embedded use case, which is geared toward business users who will access the BI features less frequently and likely have less analytics experience than the traditional power user - in this scenario, high per-user costs simply can't be justified.

BI products are complex on a number of different levels. First, they are complex to deploy, often requiring months if not years to roll out to any reasonable number of users. Second, they are complex to use, both for the developers building the reports and dashboards as well as the business people interacting with the tool. Third, they are complex to embed. Designed as standalone products, BI tools are not architected to plug into another application.

Given the cost and complexity of traditional standalone BI offerings, it is no surprise that developers often turn to charting libraries to deliver the visualizations within their applications. The cost is low and they are relatively simple for a developer to embed. In the short term, a charting library is a reasonable solution, but it falls flat over time. The demands for more charts, dashboards and reports quickly grow, and end users begin looking for the ability to self-serve and create their own visualizations. As a result of these mounting demands, many application developers find themselves essentially building a BI tool, taking them outside their core competency and stealing precious time from advancing their own application.
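
To make the limitation concrete, here is a minimal TypeScript sketch of the charting-library approach. The data shape and the drawBarChart helper are illustrative assumptions, not any particular library's API; the point is that every new view the business asks for becomes another hard-coded function the application team must write, test and ship.

```typescript
// Typical charting-library pattern: each view is hard-coded by the developer.
interface Point { label: string; value: number; }

// Every chart the business asks for becomes another function like this one.
function renderQuarterlyRevenueChart(canvas: HTMLCanvasElement, data: Point[]): void {
  drawBarChart(canvas, {
    title: "Revenue by Quarter",
    labels: data.map(p => p.label),
    values: data.map(p => p.value),
  });
}

// Stand-in for a third-party charting call: it draws bars, but there is no
// query engine, no ad-hoc filtering, and no way for end users to build a
// different view without a code change.
function drawBarChart(
  canvas: HTMLCanvasElement,
  spec: { title: string; labels: string[]; values: number[] }
): void {
  const ctx = canvas.getContext("2d");
  if (!ctx) return;
  ctx.fillText(spec.title, 4, 12);
  const max = Math.max(...spec.values, 1);
  const barWidth = canvas.width / spec.values.length;
  spec.values.forEach((v, i) => {
    const h = (v / max) * canvas.height;
    ctx.fillRect(i * barWidth + barWidth * 0.1, canvas.height - h, barWidth * 0.8, h);
  });
}
```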

Could a New Generation of Embedded BI Provide the Solution?
Fortunately, there is a new generation of embedded analytic platforms emerging that looks set to address these challenges of cost and complexity. Wayne Eckerson, a noted BI analyst, identifies this as the third generation of embedded analytics in his article on the Evolution of Embedded BI. In summary, Eckerson describes the third generation as "moving beyond the Web to the Cloud," where developers can "rent these Cloud-based BI tools by the hour." These BI platforms can "support a full range of BI functionality including data exploration and authoring" and can be embedded through standard interfaces like REST and JavaScript. So, how does this third generation address the issues of cost and complexity?

Utility Pricing Dramatically Reduces Cost
To address the challenge of cost, a new generation of embedded analytics platforms employs a utility-based licensing model in which the software is available on a per-core, per-hour or per-gigabyte basis. From a developer's perspective, this is a much fairer model, as one only pays for what is used. At the beginning of the application lifecycle, when usage is sporadic, developers can keep costs low; as the application becomes successful and use grows, capacity can easily be scaled up. A recent report by Nucleus Research concluded that utility pricing for analytics can save organizations up to 70% of what they would pay for a traditional BI solution. I've written previously about how utility pricing will dramatically increase the availability of analytics, reaching a much broader set of organizations. The rapid adoption of Amazon's Redshift data warehousing service and Jaspersoft's reporting and analytics service on the AWS Marketplace provides rich testimony to the benefits of this model.
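
The arithmetic behind that kind of saving is easy to see. The figures in the sketch below are purely illustrative assumptions, not vendor list prices, but they show how a utility model that bills only for active hours can land roughly where the Nucleus Research comparison does.

```typescript
// Illustrative comparison of per-user licensing vs. utility (per-hour) pricing.
// All figures are hypothetical assumptions chosen for the example.

const users = 500;                    // business users needing embedded analytics
const perUserLicensePerYear = 1_000;  // $/user/year under a traditional BI license

const hourlyCost = 75;                // $/hour for the hosted BI service (assumed)
const activeHoursPerYear = 2_000;     // billed only while the instance is running

const traditionalCost = users * perUserLicensePerYear;  // $500,000 per year
const utilityCost = hourlyCost * activeHoursPerYear;    // $150,000 per year

const savings = Math.round((1 - utilityCost / traditionalCost) * 100);
console.log(`Traditional licensing: $${traditionalCost.toLocaleString()}`);
console.log(`Utility pricing:       $${utilityCost.toLocaleString()}`);
console.log(`Savings:               ${savings}%`);      // ~70% in this scenario
```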

Cloud and Web-Standard APIs Reduce Complexity
A cloud-based BI platform significantly simplifies deployment, as there is no BI server to install or configure. The Nucleus Research report found that utility-priced Cloud BI solutions could be deployed in weeks or even days, as opposed to the months commonly required for a traditional BI product.

Leveraging web-standard APIs like REST and JavaScript, the third-generation platforms also simplify the task of embedding analytics on both the front end and the back end of the application. Importantly, these APIs allow full-featured, self-service BI capabilities to be embedded, not just reports and dashboards. This increases the application's ability to respond to the ad-hoc information requests of business users.
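
As a concrete illustration, here is a minimal TypeScript sketch of what front-end embedding over such APIs might look like. The endpoint path, query parameters, response format and token handling are hypothetical assumptions made for the sketch, not any specific vendor's API; a real third-generation platform would document its own REST and JavaScript interfaces.

```typescript
// Minimal sketch: render a hosted report inside the application's own page
// via a REST call. Endpoint, parameters and response shape are assumptions.

interface ReportRequest {
  reportId: string;
  // Ad-hoc parameters chosen by the business user at the point of decision,
  // e.g. a date range or region filter, rather than a fixed, pre-built view.
  filters: Record<string, string>;
}

async function embedReport(container: HTMLElement, req: ReportRequest): Promise<void> {
  const query = new URLSearchParams(req.filters).toString();
  // The analytics service does the heavy lifting; the host application never
  // installs or operates a BI server of its own.
  const response = await fetch(
    `https://analytics.example.com/v1/reports/${req.reportId}/render?${query}`,
    { headers: { Authorization: `Bearer ${getToken()}` } }
  );
  if (!response.ok) {
    throw new Error(`Report render failed: HTTP ${response.status}`);
  }
  // Assume the service returns a self-contained HTML fragment for the view.
  container.innerHTML = await response.text();
}

// Hypothetical helper: reuse whatever authentication the host app already has.
function getToken(): string {
  return sessionStorage.getItem("analytics_token") ?? "";
}
```

Back-end embedding would follow the same pattern: the application's server calls the platform's REST endpoints to provision data sources, schedule reports or run ad-hoc queries, again without hosting any BI infrastructure itself.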

The Benefits of Embedded Intelligence
Intuitively, it would seem that, by providing analytics within the applications business people use every day, an organization should experience the benefits of more data-driven decision-making. But is there any proof?

A recent report by the Aberdeen Group, based on data from over 130 organizations, has helped shed light on some of the benefits of embedded analytics. First, as might be expected, those companies using embedded analytics saw 76% of users actively engaged in analytics versus only 11% for those with the lowest embedded BI adoption. As a result, 89% of the business people in these best-in-class companies were satisfied with their access to data versus only 21% in the industry laggards. The bottom line? Companies leading embedded BI adoption saw an average 19% increase in operating profit versus only 9% for the other companies.

Andre Gayle, who helps manage a voicemail service at British Telecom, illustrates the difference embedded analytics can make. "We had reports [before] but they had to be emailed to users, who had to wait for them, then dig through them as needed. It was inefficient and wasteful." Now, thanks to embedded analytics, British Telecom has seen huge savings in time and cost. As Gayle explains, capacity planning for the voicemail service used to be a "laborious exercise, involving several days of effort to dig up the numbers," but now it can be done "on demand, in a fact-based manner, in just a few minutes."

The evidence is mounting that embedding analytics inside the applications business people use every day can lead to quantifiable benefits. However, the protagonist here, unlike in the traditional world of analytics, must be the developer, not the analyst. A new generation of embedded BI platforms is making it easier and more cost-effective for developers to deliver the analytical capabilities needed inside the Cloud applications they are building. As developers increasingly avail themselves of these new platforms, we can hope that BI will finally become pervasive as an information service that informs day-to-day operations. As Wayne Eckerson puts it, "In many ways, embedded BI represents the fulfillment of BI's promise." Now it's up to Cloud developers to help us realize that promise.

More Stories By Karl Van den Bergh

Karl Van den Bergh is the Vice President of Product Strategy at Jaspersoft, where he is responsible for product strategy, product management and product marketing. Karl is a seasoned high-tech executive with 18 years' experience in software, hardware, open source and SaaS businesses, both startup and established.

Prior to Jaspersoft, Karl was the Vice President of Marketing and Alliances at Kickfire, a venture-funded data warehouse appliance startup. He also spent seven years at Business Objects (now part of SAP), where he held progressively senior leadership positions in product marketing, product management, corporate development and strategy – ultimately becoming the General Manager of the Information-On-Demand business. Earlier in his career, he was responsible for EMEA marketing at ASG, one of the world’s largest privately-held software companies. Karl started his career as a software engineer.
