Mobile Leads the Way in Big Data Growth–But Where Will All the Data Go and How Can It Be Tapped?

RCR Wireless Magazine

By Dr. Hossein Eslambolchi

Date: December 2011

The communications industry now handles transaction volumes in the billions of records per day, and those volumes are expected to grow rapidly over the coming years, driven largely by increased adoption of smart devices but also by the fact that any one consumer now carries a number of different devices, each generating many gigabytes of data every day. Operators are continually trying to keep up with tracking and retaining all this usage data while at the same time trying to better understand subscriber behavior and needs in order to roll out and monetize new service offerings at the optimal time.

How big is big?

Figures from a recent Cisco report indicate that global mobile data traffic will increase 26-fold by 2015. Additionally, there will be over 7.1 billion mobile-connected devices, including machine-to-machine (M2M) modules, in 2015. Mobile-connected tablets will generate as much traffic in 2015 as the entire global mobile network did in 2010. But Big Data is not just about the megabytes each consumer generates daily, which accumulate into petabytes across the subscriber base within a few months and which an operator must retain simply to generate the bill.
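
To get a feel for that scale, here is a rough back-of-the-envelope calculation; the subscriber count, device count and per-device volumes below are purely illustrative assumptions, not figures from the Cisco report.

```python
# Back-of-the-envelope estimate of retained usage data (all inputs are illustrative assumptions).
subscribers = 50 * 10**6            # assumed subscriber base
devices_per_subscriber = 2          # assumed devices (phone, tablet, M2M module) per subscriber
mb_per_device_per_day = 20          # assumed usage and detail records per device, per day (MB)
retention_days = 90                 # assumed retention window needed for billing

total_mb = subscribers * devices_per_subscriber * mb_per_device_per_day * retention_days
total_pb = total_mb / 10**9         # 1 PB = 10**9 MB in decimal units

print("Retained volume after %d days: %.0f PB" % (retention_days, total_pb))
# -> Retained volume after 90 days: 180 PB
```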


Until just a few years ago, operators were predominantly focused on two things: collecting subscriber usage data in order to produce the bill and ultimately account for revenue, and optimizing their networks with the goal of improving overall quality of service. Today the focus is much more on improving the total customer experience. A key part of that is monetizing new mobile products and services, and every day new partnerships are formed for this purpose to enable more and better functionality in today’s mobile applications.


An important shift in focus for Communication Service Providers (CSPs) is the move away from ‘after the fact’ analysis, such as comparing revenue month to month or revisiting the age-old subscriber churn question, toward predictive analysis and better use of subscriber data to anticipate what will happen. By incorporating analytics into everyday business decisions, operators can extract valuable intelligence from usage data, gaining greater insight into the preferences, popular applications and even the social network of an individual subscriber or group of subscribers. This rich data enables operators to better manage customer service expectations and ultimately improve loyalty.
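
As a simple illustration of that predictive shift, the sketch below trains a basic churn classifier on a handful of made-up usage features. The feature names, sample values and the choice of scikit-learn’s logistic regression are illustrative assumptions, not a description of any operator’s actual model.

```python
# A minimal churn-prediction sketch using scikit-learn's logistic regression.
# All feature names and values are invented for illustration.
from sklearn.linear_model import LogisticRegression

# Per-subscriber features: [calls_per_day, data_mb_per_day, support_tickets_last_30d]
X_train = [
    [12.0, 350.0, 0],
    [ 2.5,  40.0, 3],
    [ 8.0, 500.0, 1],
    [ 1.0,  10.0, 5],
    [15.0, 800.0, 0],
    [ 3.0,  60.0, 4],
]
y_train = [0, 1, 0, 1, 0, 1]  # 1 = subscriber churned within the next quarter

model = LogisticRegression()
model.fit(X_train, y_train)

# Score a current subscriber weeks before any churn event actually occurs.
candidate = [[2.0, 30.0, 2]]
churn_probability = model.predict_proba(candidate)[0][1]
print("Estimated churn probability: %.2f" % churn_probability)
```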


Gaining a deeper level of subscriber intelligence amidst the Big Data explosion, driven by a plethora of new services, applications and devices, creates a major challenge for IT in the form of data storage and management in addition to high-performance analysis. Unfortunately, even with both historical and real-time analytics in place, attempting to run fast queries against such large data sets can often feel like finding the needle in the haystack. An expensive haystack. What a CSP’s IT group requires is a cost-effective data management solution that not only compresses the data but also enables operators to quickly and accurately extract the information that is most useful to them for key business decisions.


Important considerations for IT

Amidst the growing volume of data that needs to be managed against ever-changing business demands, there are two very important considerations for IT when managing multi-structured data long-term: first, the speed at which the data needs to be ingested, and second, the need for systems to scale cost-effectively so that this data can be kept online for many months and years.

Most tier 1 providers have discovered that traditional relational (i.e., row-based) databases simply cannot keep pace with the speed of data creation. Additionally, a traditional data warehouse can be cost-prohibitive at this volume and scale over time, exceeding the budget available for the fast analysis required. The communications industry, generally speaking, deals in what has come to be known as machine-generated data, which essentially means that once a transaction record is created, it never changes. Such data therefore needs a purpose-built repository that can ingest voluminous data created at network speed, in addition to providing cost-effective storage and easy scaling for future growth. And if the repository is ingesting data off the network, it must also remain available for any ad-hoc or planned query; in essence, it must support diverse workloads and perform the job required with no downtime.
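
One way to picture such a purpose-built repository is as an append-only, time-partitioned, compressed store. The sketch below uses only the Python standard library and is a simplified illustration of the idea under assumed field names and directory layout, not a description of any particular vendor’s product.

```python
# A minimal sketch of an append-only, time-partitioned, compressed usage-record store.
# Directory layout, field names and compression choice (gzip) are illustrative assumptions.
import csv
import gzip
import os
from datetime import datetime, timezone

BASE_DIR = "usage_store"  # hypothetical root of the historical repository

def ingest(records):
    """Append usage records into hourly, gzip-compressed partitions; never rewrite old data."""
    for record in records:
        ts = datetime.fromtimestamp(record["epoch"], tz=timezone.utc)
        partition = os.path.join(BASE_DIR, ts.strftime("%Y/%m/%d/%H"))
        os.makedirs(partition, exist_ok=True)
        path = os.path.join(partition, "records.csv.gz")
        with gzip.open(path, "at", newline="") as f:
            writer = csv.writer(f)
            writer.writerow([record["epoch"], record["subscriber_id"],
                             record["event_type"], record["bytes_used"]])

# Example: ingest a couple of machine-generated events as they arrive off the network.
ingest([
    {"epoch": 1322697600, "subscriber_id": "sub-001", "event_type": "data", "bytes_used": 1048576},
    {"epoch": 1322701200, "subscriber_id": "sub-002", "event_type": "voice", "bytes_used": 0},
])
```

Because the records never change once written, the store can optimize purely for ingest speed and compression, and a query only needs to scan the partitions that fall within its time range.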

The system must also meet the query and analytics performance requirements of both business users and regulatory compliance. Today many CSPs’ research teams are taking advantage of new, innovative technology platforms such as Hadoop and MapReduce, which enable fast analysis over wide and varying data structures at significantly lower cost. We can thank the web 2.0 organizations that have led the way with this new open-source technology, where behemoths like Google, Yahoo, eBay, Facebook and others have been able to manage petabyte scale with acceptable performance levels. Many of today’s enterprises are taking advantage of the same innovative technology to turn around fast business analysis.
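
For readers unfamiliar with the MapReduce model, the sketch below shows the shape of a Hadoop Streaming job in Python that totals data usage per subscriber from raw usage records. The record layout is an assumption for illustration; in a real deployment the map and reduce roles would be passed to Hadoop Streaming as separate mapper and reducer scripts, with the framework handling the sort and shuffle between them.

```python
# An illustrative Hadoop Streaming-style job: total bytes used per subscriber.
# The mapper emits "subscriber_id<TAB>bytes_used"; Hadoop sorts by key between the phases,
# and the reducer sums bytes per subscriber. Both roles are combined here for brevity.
import sys

def mapper(lines):
    # Each input line is assumed to be: epoch,subscriber_id,event_type,bytes_used
    for line in lines:
        fields = line.strip().split(",")
        if len(fields) == 4:
            print("%s\t%s" % (fields[1], fields[3]))

def reducer(lines):
    # Input lines arrive sorted by key: subscriber_id<TAB>bytes_used
    current_id, total = None, 0
    for line in lines:
        subscriber_id, bytes_used = line.strip().split("\t")
        if subscriber_id != current_id:
            if current_id is not None:
                print("%s\t%d" % (current_id, total))
            current_id, total = subscriber_id, 0
        total += int(bytes_used)
    if current_id is not None:
        print("%s\t%d" % (current_id, total))

if __name__ == "__main__":
    # Run as: python streaming_job.py map   (or: reduce) with records on stdin.
    role = sys.argv[1] if len(sys.argv) > 1 else "map"
    (mapper if role == "map" else reducer)(sys.stdin)
```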

Organizations that have invested heavily in enterprise data warehouses, or even data marts, to satisfy business query needs can also augment them with such a dedicated solution to store long-term historical data sets, which provides unique and compelling economic benefits. Implementing specific business rules around when to offload data from, say, the central warehouse to the dedicated historical repository will certainly help improve overall performance on the primary warehouse, but more importantly it reduces the data set there and therefore the cost of maintaining it over time. Having dedicated, purpose-built data repositories that are fit for purpose should be the ultimate goal, and a key part of this is the ability to easily move data in and out of these different repositories.
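
A sketch of such an offload rule appears below. The 90-day threshold, the directory names and the file-based stand-ins for the warehouse and historical repository are illustrative assumptions; in practice the rule would run against whatever platforms the operator actually uses.

```python
# A minimal sketch of a business-rule-driven offload job: move partitions older than a
# retention threshold from the primary warehouse area to the historical repository.
# Paths and the 90-day rule are illustrative assumptions.
import os
import shutil
from datetime import datetime, timedelta, timezone

WAREHOUSE_DIR = "warehouse/usage"      # hypothetical primary warehouse partitions (YYYY-MM-DD)
HISTORICAL_DIR = "historical/usage"    # hypothetical long-term historical repository
RETENTION = timedelta(days=90)         # business rule: keep 90 days in the primary warehouse

def offload_old_partitions(today=None):
    today = today or datetime.now(timezone.utc)
    cutoff = today - RETENTION
    os.makedirs(HISTORICAL_DIR, exist_ok=True)
    for name in sorted(os.listdir(WAREHOUSE_DIR)):
        try:
            partition_date = datetime.strptime(name, "%Y-%m-%d").replace(tzinfo=timezone.utc)
        except ValueError:
            continue  # skip anything that is not a daily partition directory
        if partition_date < cutoff:
            shutil.move(os.path.join(WAREHOUSE_DIR, name),
                        os.path.join(HISTORICAL_DIR, name))
            print("Offloaded partition %s to the historical repository" % name)

if __name__ == "__main__":
    offload_old_partitions()
```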

Mobile leads the Big Data wave – so where will all the data go?

For today’s operator, there is simply no choice as to whether to keep the data online and available. Regulators require it, and now more than ever the business demands it for better intelligence and competitive advantage. Understanding exactly what is going on within your customer base can only lead to improved service levels and new revenue-generating offerings. So how you store and manage the data is the next big question, and how much you spend doing so is a key part of that. Let’s face it, IT infrastructure and all related expenditures are the cost of doing business. Fortunately, many innovative database technologies have emerged in just the last five years that specifically address this problem. Those built for the purpose of storing multi-structured data online for virtually unlimited timeframes, which provide a much more efficient and cost-effective alternative to traditional relational or data warehouse platforms, should be closely examined. Additionally, as with any enterprise database solution, you need one that is resilient and secure, in addition to having the aforementioned and very important compression capability, which addresses Big Data head-on.


By giving the business ongoing access to years of subscriber data, it can better predict what will happen across the base. Knowing weeks in advance of a possible churn is much more powerful than knowing weeks after the fact. Knowing which networks or groups of subscribers are influenced by churn is even more powerful. If you really think about it, the customer experience is all that matters, and better intelligence about customer behavior is a true competitive advantage. IT should look upon Big Data as a new opportunity: never before have we had so much data at our fingertips to tell us what customers are doing and what they ultimately want. Now it’s our job to figure out the best way to achieve that.
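
To make the ‘influenced by churn’ idea concrete, the short sketch below flags subscribers who frequently call recent churners in a simple call graph. The call records, the churner list and the five-call threshold are invented purely for illustration.

```python
# Flag subscribers at elevated churn risk because they frequently interact with recent churners.
# The call records, churner list and the 5-call threshold are invented for illustration.
from collections import defaultdict

# (caller, callee, number_of_calls_last_30_days)
call_records = [
    ("sub-001", "sub-002", 12),
    ("sub-003", "sub-002", 7),
    ("sub-004", "sub-005", 2),
    ("sub-006", "sub-001", 9),
]
recent_churners = {"sub-002"}
THRESHOLD = 5  # minimum calls to a churner before we flag the relationship

calls_to_churners = defaultdict(int)
for caller, callee, calls in call_records:
    if callee in recent_churners and caller not in recent_churners:
        calls_to_churners[caller] += calls
    if caller in recent_churners and callee not in recent_churners:
        calls_to_churners[callee] += calls

at_risk = [sub for sub, calls in calls_to_churners.items() if calls >= THRESHOLD]
print("Subscribers to target with retention offers:", sorted(at_risk))
# -> ['sub-001', 'sub-003']
```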


About the author

Dr. Hossein Eslambolchi is the chairman and CEO of 2020 Venture Partners. He is recognized in the scientific community as one of the foremost thought leaders and technology scientists of the high-tech age and the 21st century. Dr. Eslambolchi joined AT&T Bell Laboratories in 1986 and became global CTO, global CIO, President and CEO of AT&T Labs, and President and CEO of AT&T Global Network Services. He also served as a critical member of AT&T’s governing Executive Committee. As Chief Transformation Officer at AT&T, he developed and executed a comprehensive four-stage strategy that included Enterprise Customer Service, Network Transformation, Service Transformation and Cultural Transformation; essentially, the overhaul and remodeling of the company that SBC dubbed the “new AT&T.”