By Rick Kawamura
March 8, 2012 06:00 AM EST
I recently presented "The Moneyball Approach to Big Data - Creating an Unfair Advantage" at the Wall Street Technology Association's Hot Technologies Forum in New York. Everyone is talking about Big Data, but when it comes to taking action, most are taking a "wait-and-see" approach, and that concerns me. Skepticism, or a "late-adopter" mentality, is understandable - if you want to forgo a low-risk, high-reward opportunity and let your competition gain the advantage.
My job is to create value for my customers, and I'd hate to see anyone miss out on the opportunity presented by Big Data.
What's the Problem?
The Corporate Executive Board identified three potential barriers to Big Data implementation:
- Information Attainability (the right information is available and easy to find)
- Information Usefulness (information is of good quality and is usable in format)
- Employee Capability (employees analyze information effectively to make good decisions)
While these are indeed three potential barriers to implementation, the main problem I see in the market comes before implementation: a "wait-and-see" mentality caused either by a feeling that Big Data is over-hyped ("maybe it will go away") or by paralysis through analysis (the enormity and complexity of Big Data make it too confusing to take action).
What Is Big Data?
The most common definition I've seen for Big Data is summarized by the three Vs:
- Volume: It's big - terabytes and petabytes of data
- Variety: It comes in many forms - internal, external, structured, and unstructured
- Velocity: It is growing and changing rapidly - making real-time capture and action hugely important
This definition is always supported by numbers showing the vastness and enormity of Big Data:
- The New York Stock Exchange creates 1 terabyte of data per day (InformationWeek)
- 10,000 payment-card transactions are made per second worldwide (American Banker)
- 30 billion pieces of content are shared on Facebook every month (McKinsey)
- Twitter feeds generate 8 terabytes of data every day (InformationWeek)
The Internet plays a huge role in the rapid growth of Big Data, giving individuals the ability to post and upload immense amounts of pictures, text, video, and mobile data. It also gives businesses a channel to offer customers and partners access through web-based applications (think Oracle, salesforce.com, social media, procurement, logistics, publishers, etc.). One way to visualize this explosion of applications and data is the Bessemer Venture Partners Cloudscape. And that's just in the cloud. Don't forget all of the apps and data behind the firewall of every organization, whether commercial, governmental, or charitable. Big Data truly is BIG.
Before you go out and buy massive amounts of storage, take a look at what you currently consume and utilize, and start from there in easily digestible portions. Forrester estimates that enterprises currently utilize less than 5% of available data. In a survey of global executives, IBM found that 33% have made decisions with inaccurate data or data they don't trust; half don't have sufficient information from across their organization to do their jobs; 75% believe more predictive information would drive better decisions; yet 87% have not even started taking advantage of the opportunities their information presents. You don't need to immediately implement a solution that gets you to 100% or even 50% of available data - 6-10% will do for now.
If there are 200 million tweets a day equaling 8 terabytes of data, but only 1,000 of those tweets relate to your product or company, do you need to store and analyze all 8 terabytes every day? Think of it this way: there's a huge difference between "I have terabytes of data - videos, satellite pictures, social media conversations, and research reports" and "I know where Public Enemy #1 is." It comes down to Data vs. Intelligence. Data is useless if you can't extract meaningful intelligence from it. And the quality of the intelligence is much less dependent on the volume of data than it is on the relevance of the data and your ability to access it.
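To make the data-versus-intelligence point concrete, here is a minimal sketch of the filtering idea: rather than storing all 8 terabytes of a stream, keep only the records relevant to your business. The company terms and sample tweets are hypothetical, and a real system would match far more robustly than simple substring checks.

```python
# Minimal sketch: reduce a raw stream to the handful of relevant records
# before storing anything. The term list is a hypothetical example.
RELEVANT_TERMS = {"acmecorp", "acme widget"}  # your product/company terms

def is_relevant(tweet_text: str) -> bool:
    """Crude relevance test: does the text mention any of our terms?"""
    text = tweet_text.lower()
    return any(term in text for term in RELEVANT_TERMS)

def filter_stream(tweets):
    """Yield only tweets worth keeping; the rest are never stored."""
    return (t for t in tweets if is_relevant(t))

sample = [
    "Just tried the new AcmeCorp dashboard - impressive",
    "Nice weather today",
    "Acme Widget pricing seems high this quarter",
]
kept = list(filter_stream(sample))
# Only the two AcmeCorp/Acme Widget mentions survive the filter.
```

The storage and analysis burden then scales with the relevant slice of the data, not with the full volume of the stream.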
The Magic Words: Relevance and Accessibility
Although Big Data lives up to its name, don't get caught up in all the massive numbers. Focus on what's relevant to your business. Consider this: Sybase published Big Data, Big Opportunity, which stated that "for the median Fortune 1000 company... a 10% increase in usability of data translates to an increase of $2.01 billion in total revenue per year, [and] a 10% increase in accessibility to data translates to an additional $65.67 million in net income per year." Just because your company may currently have access to only 5 percent of the relevant data available, don't despair. You don't have to go from 5 percent to 100 percent. You really only need to go from 5 percent to 5.5 percent to reap great rewards.
The Secret to Taming Big Data
Despite all the hype and discussion around Big Data's massiveness, I've yet to find a single article mentioning the difficulty of accessing data that is spread throughout all of the various source applications. Until recently there were no Big Data integration platforms that could deal with the exploding number of applications and all of the data they contain, as well as the speed at which both are changing. Just a glance at the daily domain statistics on www.domaintools.com gives you an idea of the volume of sites being created, deleted, and transferred every 24 hours. Not every integration solution can manage the intensity of that kind of change to give you access to the relevant data - the business intelligence - your business needs when you need it.
The whole point of gaining access to relevant data is that it must be actionable. Otherwise it's a big waste of time and effort. What's amazingly useful about Big Data, and the web-based nature of so much of it, is that with a Big Data integration platform you can access any data you can see on a website and you can just as easily transform that data, perform an operation on it, and automate a resulting action. Here's an example:
You know that consumers and even your B2B purchasers research prices online and that loyalty to any one vendor has deteriorated as buyers have more pricing knowledge just a search and mouse-click away. But you're smarter than your competitors because you're already doing the extra 10 percent. You set up automated monitoring of your competitor's pricing, and when their price drops below yours, your Big Data integration platform calculates the difference plus 10%, logs into your ecommerce site and adjusts your prices automatically, all in mere moments. The beauty is that this can all be set up in hours, if not a few days, and you don't have to bring in an army of developers or consultants to create custom code to do any of it.
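The repricing rule in that example can be sketched in a few lines. This is one plausible reading of "the difference plus 10%" - drop your price below the competitor's by 10% of the gap - and the numbers, SKUs, and any platform API call you would wire this to are purely illustrative, not any particular vendor's method.

```python
# Hedged sketch of the repricing rule described above, under one plausible
# reading: when a competitor undercuts us, reduce our price by the gap
# plus 10% of the gap, landing slightly below the competitor.

def reprice(our_price: float, competitor_price: float):
    """Return a new price if the competitor undercut us, else None."""
    if competitor_price >= our_price:
        return None                         # competitor isn't cheaper; no action
    gap = our_price - competitor_price      # how far below us they went
    return round(our_price - gap * 1.10, 2)  # "the difference plus 10%"

new_price = reprice(our_price=49.99, competitor_price=47.99)
# gap = 2.00, adjustment = 2.20, so we land at 47.79
```

In practice the platform would run this check on a schedule against freshly scraped competitor prices and push the result to your ecommerce site automatically.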
If I told you I could guarantee any application or data you can see in your web browser (customer data, bank transactions, twitter, blogs, supply chain vendors, government data, competitor prices, etc.) could be automatically accessed and loaded into the application, database, or spreadsheet of your choice, how many game-changing Big Data projects could you imagine? Understanding the point-in-time cash position of billions of dollars across 300 banks? No problem. Monitoring competitor pricing on 50,000 SKUs every day? Simple. Automating a 23-step manual invoicing process to get paid millions of dollars two days faster? Done. Real-time, automated access to the relevant data you need is the key to success with Big Data.
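The "loaded into the application, database, or spreadsheet of your choice" step can be as simple as this sketch: once a value has been extracted from a page, append it to a queryable store. The table name, schema, and SKU are illustrative assumptions, not any specific platform's design; a production system would use a persistent database rather than an in-memory one.

```python
# Sketch of the load step only: push an extracted competitor price into a
# local SQLite table for later querying. Schema is illustrative.
import sqlite3
from datetime import date

conn = sqlite3.connect(":memory:")  # swap for a file path in practice
conn.execute(
    "CREATE TABLE IF NOT EXISTS competitor_prices "
    "(sku TEXT, price REAL, observed TEXT)"
)

def record_price(sku: str, price: float) -> None:
    """Insert one observation, stamped with today's date."""
    conn.execute(
        "INSERT INTO competitor_prices VALUES (?, ?, ?)",
        (sku, price, date.today().isoformat()),
    )
    conn.commit()

record_price("SKU-1001", 47.99)
rows = conn.execute(
    "SELECT sku, price FROM competitor_prices"
).fetchall()
# rows now holds the single observation just recorded
```

Scaling the same pattern to 50,000 SKUs a day is a matter of scheduling and batching the inserts, not of changing the approach.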
Every company can benefit from Big Data in many ways, but most don't realize it. Hundreds of scenarios are possible using real-time application integration platforms that could save your company millions of dollars; grow revenue by double-digit percentages; create more personalized products that delight your customers; automate real-time feedback on your brand, products, and competitor prices; create your own custom research that allows you to see trends before your competitors do; and overall make your company a much more agile business that scales with your new-found vigor and growth. Don't let the size of Big Data paralyze you; get real-time access to the data that is relevant to your company's growth and take action.