“'2016-07-24 10:06:28.31', '2016-07-24 10:06:28.30', '2016-07-24 10:06:28.31', '2', '404997711001022', '911431111232',, '6', '60',, 'internet1.nsn.com.mnc099.mcc404.gprs',,, '5', '1950', '2016-07-24 09:48:01.00', '40499', '15361…”
Crystal clear, isn’t it? You may have figured out that it’s something from an operator network, but can you tell what it means? And how would you make something useful out of this cryptic-looking piece of raw data?
This is the central question that takes us back to the fundamentals of Big Data Analytics. Analytics is about creating new information and intelligence. The underlying raw material for this process is data – data that today can come from virtually anywhere, from telecom networks to washing machines – you name it. And for several years companies have been trying to tap into various sources of data in order to learn more about their customers, to become more productive or to be able to sell more.
The same is certainly true of telecom operators. With several billion events created every day, telco networks provide an unparalleled amount of data. There’s really no lack of raw material, which leads back to our question – are telcos able to turn this huge amount of data into new value?
The short answer is yes, but to get there takes work.
Don’t let that fish get away
Let’s take an analogy from the fishing industry (bet you didn’t see that coming!). A single salmon, when caught in the North Sea, obviously has some minor value. But let’s consider the fish as raw material, which after capture will be refined through multiple steps. When the same fish is served in a fine restaurant, its value may easily have multiplied ten-fold during the process.
The same thing happens with telco data. A single item of raw data from the network has very little value on its own, but through the process of analytical refining it can become incrementally more valuable every step of the way.
Let’s take our raw data from the beginning as an example. This data is part of several tens of records created when a telco customer makes a single VoLTE call. The first thing to do is to ensure that the data in each individual record is accurate. Do all the fields have values? Are the values consistent? Are we storing everything in a secure manner? Think of it like checking that the caught fish is healthy and a reasonable size, and that it’s stored properly in a cold environment.
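To make those checks concrete, here is a minimal sketch of record-level validation. The field names and the consistency rule are hypothetical – real VoLTE CDR schemas are vendor-specific and far larger – but the timestamps follow the format visible in the raw sample above.

```python
from datetime import datetime

# Hypothetical field layout for one raw record; real VoLTE CDR
# schemas differ per vendor and contain many more fields.
REQUIRED_FIELDS = ["record_open_time", "call_start_time", "imsi", "apn"]

def parse_ts(value):
    """Parse the timestamp format seen in the raw sample, e.g. '2016-07-24 10:06:28.31'."""
    return datetime.strptime(value, "%Y-%m-%d %H:%M:%S.%f")

def validate_record(record):
    """Return a list of problems found in a single record (empty list = clean)."""
    problems = []
    # 1. Do all the fields have values?
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            problems.append(f"missing value: {field}")
    # 2. Are the values consistent? (Illustrative rule: the call cannot
    #    have started after the record was opened.)
    try:
        if parse_ts(record["call_start_time"]) > parse_ts(record["record_open_time"]):
            problems.append("call_start_time is after record_open_time")
    except (KeyError, ValueError):
        problems.append("unparseable timestamp")
    return problems

record = {
    "record_open_time": "2016-07-24 10:06:28.31",
    "call_start_time": "2016-07-24 09:48:01.00",
    "imsi": "404997711001022",
    "apn": "internet1.nsn.com.mnc099.mcc404.gprs",
}
print(validate_record(record))  # → []
```

In practice these checks run as part of the ingestion pipeline, so that broken records are quarantined before any downstream refinement sees them.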
In this case, combining all the received records tells us person A made a VoLTE call to person B. That’s nice, but doesn’t exactly break the bank for data value. So let’s imagine we can refine this further. Let’s say we could first establish the customer’s identity. Is it a postpaid or prepaid subscriber? Which data plan have they bought, which value segment do they belong to, and what make and model of LTE device do they have?
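This correlation-and-enrichment step can be sketched as follows. The record layout, the `call_id` correlation key and the profile store are all hypothetical stand-ins for what would in practice be vendor-specific records joined against a CRM or BSS system.

```python
from collections import defaultdict

# Hypothetical: each VoLTE call produces many records, one per network
# element it traverses, sharing a correlation id.
raw_records = [
    {"call_id": "c1", "node": "P-CSCF", "caller": "911431111232", "callee": "911431119999"},
    {"call_id": "c1", "node": "S-CSCF", "caller": "911431111232", "callee": "911431119999"},
]

# Hypothetical subscriber profile store (in practice a CRM/BSS lookup).
profiles = {
    "911431111232": {"plan": "postpaid", "segment": "high-value", "device": "iPhone 6"},
}

def assemble_call(records):
    """Collapse the per-node records of one call into a single enriched call event."""
    by_call = defaultdict(list)
    for r in records:
        by_call[r["call_id"]].append(r)
    calls = []
    for call_id, recs in by_call.items():
        caller = recs[0]["caller"]
        calls.append({
            "call_id": call_id,
            "caller": caller,
            "callee": recs[0]["callee"],
            "record_count": len(recs),
            # Enrich with whatever the operator knows about the subscriber.
            "caller_profile": profiles.get(caller, {}),
        })
    return calls

print(assemble_call(raw_records)[0]["caller_profile"]["segment"])  # → high-value
```

The point is the shape of the step, not the code: many low-value records in, one enriched call event out.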
Perhaps we could also see how many VoLTE calls they are making, when they make them, and how this impacts their legacy voice usage. Are they experiencing any issues such as silent calls or long connection times? We can have visibility from the last few minutes to the last year and in general observe what their overall quality of voice experience has been. And talking about experience, what if we could link this to customer care, to see how many times they have contacted the call center about VoLTE-related issues, and how good the customer service they received was?
While we’re at it, what if we knew exactly where the person was located when making the call, during the call and even before the call?
All this is a bit like having different options for refining the fish. Whether it’s fried, in soup, in salad, sushi … I’m sure you get the point already.
Through this process we can start to see valuable new information being created from the raw data. Then imagine we could do the same for all the subscribers an operator has, across all geographical areas where they are present. This opens up a powerful “cube” of new information that we can slice and dice from different angles. For example, it allows the operator to ask “who are the high-value customers with an iPhone 6 on iOS 10.2 or older, who moved along main shopping streets, who have had a bad quality of experience with VoLTE calls during the last two days, and who have not called the call center to complain?”
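The example query above can be expressed as a single filter over such a cube. This is only an illustration: the column names and row values are invented, and a real deployment would run this against a data warehouse rather than a Python list.

```python
# Hypothetical rows in the subscriber "cube"; each enrichment step from the
# text contributes one dimension. The os field is (major, minor) version.
cube = [
    {"subscriber": "A", "segment": "high-value", "device": "iPhone 6", "os": (10, 1),
     "visited_shopping_street": True, "bad_volte_qoe_48h": True, "called_care": False},
    {"subscriber": "B", "segment": "high-value", "device": "iPhone 6", "os": (10, 3),
     "visited_shopping_street": True, "bad_volte_qoe_48h": True, "called_care": False},
]

def slice_and_dice(rows):
    """The example query from the text, expressed as one filter over the cube."""
    return [
        r["subscriber"] for r in rows
        if r["segment"] == "high-value"
        and r["device"] == "iPhone 6"
        and r["os"] <= (10, 2)          # iOS 10.2 or older
        and r["visited_shopping_street"]
        and r["bad_volte_qoe_48h"]
        and not r["called_care"]        # silent sufferers: never complained
    ]

print(slice_and_dice(cube))  # → ['A']
```

Subscriber B drops out only because their iOS version is too new – every dimension of the cube narrows the segment further.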
Is all of this dream-ware? No – this is available and possible today. These types of flexible new insights establish the basis for value realization, but at the same time, it’s not yet the end point in our process. To create maximum impact, it is important to think about how we actually operationalize these new insights into concrete day-to-day usage.
No magic, just better info for better processes
Any new insight creates an impact only when it is really applied. For some time the industry has been talking about “actions” as a key step to turn insights into value. But sometimes we have a tendency to look at actions as if they are some magical new capability we need to invent. In reality, we can take a more down-to-earth view. Every day operator teams and individual employees already make multiple decisions, take action, create change, implement procedures – all guided by different work processes. Operationalizing insights is about the ability to enrich these work processes with new intelligence from analytics. In simple terms – to help people to do their best work.
Let me give you a real-life example. One Nokia analytics customer had a process of proactively reaching out to possible churners suffering from bad quality in their network home environment. The challenge was how to establish when a customer actually had a bad experience at home. Working together with the operator’s data scientists, we created an analytical model which learned from subscriber movements and usage profiles to establish when the mobile customers were actually using operator services at home. This was correlated with insight about perceived experience, based on different quality and problem measurements. The resulting insight was then operationalized into the existing work processes, which helped the team to achieve a 3x improvement in their campaign efficiency and led to a 6% reduction in their churn rate.
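The actual model is not public, but one simple heuristic in the same spirit – inferring the home environment as the cell a subscriber most often uses at night – can be sketched like this. All names and data here are illustrative assumptions, not the customer’s method.

```python
from collections import Counter

# Hypothetical usage events: (hour_of_day, serving_cell_id).
events = [
    (1, "cell_42"), (2, "cell_42"), (3, "cell_42"),   # night-time usage
    (13, "cell_07"), (14, "cell_07"),                  # daytime, e.g. at work
    (23, "cell_42"),
]

def infer_home_cell(events, night_hours=range(0, 6)):
    """Guess the 'home' cell as the one used most often during night hours."""
    night_cells = Counter(cell for hour, cell in events if hour in night_hours)
    return night_cells.most_common(1)[0][0] if night_cells else None

def at_home(event, home_cell):
    """Flag whether a single usage event happened in the home environment."""
    _, cell = event
    return cell == home_cell

home = infer_home_cell(events)
print(home)  # → cell_42
```

Once events are flagged as at-home, they can be correlated with quality measurements, which is what feeds the proactive-outreach campaign described above.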
So to answer our original question: yes, that raw data has value, but only when we go through the process of refining it. We need to be able to create real new insights in a flexible manner, and we need to ensure the new insights are operationalized into work processes. Hence, it’s important that we as an industry look at analytics not just as a science or a technical solution, but as a whole process of step-by-step refinement of the data, all the way to operationalization. This is how we can extract maximum value from our precious raw material.
To learn more about how Nokia solutions operationalize data, please visit our portfolio pages on Telecom analytics.
Share your thoughts on this topic by replying below – or join the Twitter discussion with @nokianetworks using #bigdata #analytics