Thursday, November 27, 2014

Part 4.1 - Traditional Approach to User Experience Management

To better understand how big data and analytics can dramatically improve network problem identification and resolution for Service Providers, let's look at how the customer user experience is managed by Telco operators today.

In the telecommunication industry, the management of User Experience is traditionally a well-established discipline with clear models and processes (eTOM, ITIL). It is generally based on three main business processes:
 
  • Fault Management (FM): the group of processes for collecting and managing alarms from network and service elements. Generally speaking, this process is responsible for monitoring service availability. It is highly automated.
  • Performance Management (PM): the group of processes for collecting network and service performance information and aggregating it into high-level indicators: KPIs and/or KQIs. This process is responsible for monitoring the performance (quality) of the services. It is highly automated.
  • Incident and Problem Management: the group of processes for identifying problems (root cause analysis) and making the changes needed in the network to fix them. This process is manual and time-consuming.

The idea behind this model is very simple: all services are "mapped" onto the network and are associated with aggregate performance indicators (KPIs/KQIs) which measure the quality of the service provided. The combination of these components - network alarms, KPIs/KQIs and service-network element mapping - provides the information needed to manage service quality.
  • KPI status provides a clear and integrated view of the status of the network and of the service quality provided
  • Alarms (network element failures and KPI violations) provide immediate notification of problems
  • Network-service mapping makes it possible to perform root cause analysis when alarms (network and KPI) are raised

When a network element fails, a network alarm is raised. Through the mapping between services and network elements it is possible to identify the impacted services. Similarly, when a service KPI/KQI is degraded (it goes above or below defined thresholds), a KPI violation alarm is raised, and through the service-network mapping it is possible to identify the cause of the problem. So the combination of alarms, KPIs and service mapping makes it possible to manage problems and incidents in the network and/or in the services provided to the subscribers.
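
To make the mechanics concrete, here is a minimal sketch (in Python, with hypothetical service names, elements and thresholds) of how a service-network mapping can be used both to find the services impacted by a network alarm and to shortlist the candidate root causes of a KPI violation:

```python
# Minimal sketch of alarm/KPI handling on top of a service-network mapping.
# Service names, network elements and thresholds are hypothetical examples.

SERVICE_TO_ELEMENTS = {
    "voice":    ["MSC-1", "BSC-3", "BTS-17"],
    "sms":      ["SMSC-1", "BSC-3", "BTS-17"],
    "browsing": ["GGSN-2", "SGSN-1", "BSC-3", "BTS-17"],
}

KPI_THRESHOLDS = {  # (kpi_name, "max"/"min") -> threshold
    ("call_setup_success_rate", "min"): 0.98,
    ("web_page_load_time_s", "max"): 5.0,
}

def impacted_services(failed_element):
    """Network alarm -> list of services that use the failed element."""
    return [s for s, elems in SERVICE_TO_ELEMENTS.items() if failed_element in elems]

def kpi_violation(kpi, direction, value):
    """Raise a KPI violation 'alarm' when a threshold is crossed."""
    threshold = KPI_THRESHOLDS[(kpi, direction)]
    return value < threshold if direction == "min" else value > threshold

def candidate_causes(degraded_service, active_element_alarms):
    """Abductive step: from the observed effect (a degraded service) back to
    the plausible causes (elements of that service that are also in alarm)."""
    elems = set(SERVICE_TO_ELEMENTS[degraded_service])
    return sorted(elems & set(active_element_alarms))

# Example: BSC-3 fails -> voice, sms and browsing are impacted;
# a browsing KPI violation is then traced back to BSC-3.
print(impacted_services("BSC-3"))
if kpi_violation("web_page_load_time_s", "max", 9.2):
    print(candidate_causes("browsing", active_element_alarms={"BSC-3"}))
```
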


Note:
The diagnostic approach based on the mapping between services and network elements relies on the analytic model of "abductive inference" between alarms (effects) and network elements (causes). Abductive inference is the process of reasoning from an effect to a cause. It is a form of logical inference that goes from an observation (e.g. an alarm) to a hypothesis that accounts for the observation (e.g. the network or service element which caused the problem).
 
Although this model is theoretically extremely powerful, in reality it has several strong limits:
  • the mapping can be extremely complex in large networks
  • it is able to diagnose only known problems (the ones modelled)
  • it doesn't provide user impact information
  • it can't identify problems which have no associated fault (e.g. slow downloads)
   


The picture below shows an example of a Telecom Operator Service Assurance reference architecture and the associated business processes.


Figure 1 - Example of Telecommunication reference model for Assurance and fulfillment processes


  
This model worked well until the mobile internet era. In the 2G and 2.5G world, all, or most, of the services (e.g. voice, SMS, etc.) provided by Telecommunication Operators were "network services" - that is, services provided by the network layer. These services were controlled end-to-end by the Service Provider – from the switch to the terminal. Moreover, at that time, the mobile network was quite homogeneous and terminals (phones) were quite dumb.

In the 2/2.5G scenario, the quality of the services was tightly bound to the network service level: monitoring the network – in particular the network signaling – was enough to monitor the quality of the services provided. At the same time, the strong relationship between services and network elements was the key to identifying the root cause of problems. It was the era of SLAs (Service Level Agreements) and KPIs/KQIs (Key Performance/Quality Indicators).

With 3G and the beginning of the mobile internet era this scenario changed: non-"network services" appeared. IP services (e.g. e-mail, web browsing, file download, etc.) started to become a commodity on mobile networks too. These new services had the peculiarity of not being "network services"; they are "application services" provided by applications on top of the network. Quite often these applications are not owned by Operators but are provided by external entities (application providers) using the Operator's network.

Terminals started to become more intelligent and to run part of the applications/services on the terminal itself. The first smartphones appeared. Mobile networks increased in complexity: the 3G network is a combination of packet data and circuit-switched voice networks, and its infrastructure coexists with the existing 2G/2.5G networks.

These changes have a strong impact on the user experience monitoring and assurance:
  • The new IP services are only loosely bound to the network layer: monitoring the control plane is not enough; user plane information is needed too.
  • Moreover, the relationship between services and network elements becomes less clear due to the intrinsic packet-switched behavior of IP protocols, making the problem diagnostic processes extremely complex and time-consuming.

 
Service monitoring and assurance systems evolved to switch their focus from the network to the user, and two new concepts arose: Quality of Experience (QoE) and Customer Experience Management (CEM).


Note:
Quality of Experience (QoE) is a measure of a customer's experience with a service (web browsing, phone call, TV broadcast, etc.). QoE is a purely subjective measure from the user's perspective of the overall value of the service provided. That is to say, QoE cannot be taken as simply the effective quality of the service but must also take into consideration every factor that contributes to overall user value, such as suitability, flexibility, mobility, security, cost, personalization and choice (…). Apart from being user-dependent, QoE will invariably be influenced by the user's terminal device (for example Low Definition or High Definition TV), the environment (in the car or at home), expectations (cellular or corded telephone), the nature of the content and its importance (a simple yes/no message or an orchestral concert) (from Wikipedia).

Even if QoE is a subjective measure, it is possible to identify metrics that are directly related to the quality perceived by the end user – e.g. the time to set up a new call, the time to load a new web page or to access internet content, jitter or video buffering time when watching a video, etc. These metrics can be combined in a scoring system to be used as a global indicator of the quality perceived by subscribers when they use the services, but also when they interact with the Service Provider itself – e.g. when they contact customer care.
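
As an illustration only, a per-subscriber QoE score could be built as a weighted combination of such normalized metrics; the metric names, targets and weights below are hypothetical assumptions, not a standard formula:

```python
# Hypothetical QoE scoring sketch: each metric is normalized against a target
# (1.0 = at or better than target, 0.0 = at or worse than an unacceptable value)
# and combined into a single 0-100 score per subscriber.

QOE_METRICS = {
    # name: (target, worst_acceptable, weight)
    "call_setup_time_s":     (3.0, 10.0, 0.3),
    "page_load_time_s":      (2.0, 12.0, 0.4),
    "video_buffering_ratio": (0.01, 0.20, 0.3),
}

def normalize(value, target, worst):
    """Map a 'lower is better' metric onto [0, 1]."""
    if value <= target:
        return 1.0
    if value >= worst:
        return 0.0
    return (worst - value) / (worst - target)

def qoe_score(measurements):
    score = sum(
        weight * normalize(measurements[name], target, worst)
        for name, (target, worst, weight) in QOE_METRICS.items()
    )
    return round(100 * score, 1)

# Example subscriber measurements (hypothetical)
print(qoe_score({"call_setup_time_s": 4.5,
                 "page_load_time_s": 3.0,
                 "video_buffering_ratio": 0.05}))
```
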

 

4G networks and the "apps era" completely reverse the traditional scenario: only a few services are still network services; most services are provided by applications over the network (the so-called Over-The-Top applications).
  • The first and worst consequence of this new paradigm is that Telecommunication providers no longer control the end-to-end service; in the best case they control some components of these services, but in most cases they just control the flow of application data inside their networks. At the same time, subscribers still hold Service Providers responsible for the end-to-end service. If a subscriber has a problem with YouTube, their first thought is that the network operator is not good, independently of the real reason for the problem.

  • Even in cases where the Service Provider controls the whole end-to-end service – e.g. in the case of VoLTE – providing a clear view of the real service quality perceived by the final user isn't an easy task. The coexistence of different network technologies - 4G (fully IP), 3G or 2G (traditional) and also WiFi - has dramatically increased the complexity of measuring service quality. A VoLTE call can be provided through a mix of different network technologies – the so-called 3G/4G roaming – and in this case, understanding the quality of the call is largely guesswork.

  • The user interaction has completely changed too: subscribers and their devices (more than one) are always connected to the network (always on). Customers have become extremely demanding in terms of quality and speed of connectivity.

  • An additional element of complexity in this new scenario is related to the so-called Internet of Things (IoT): in the near future the main users of the telecommunication network will no longer be humans, but sensors and devices. This is the era of machine-to-machine communication.


Obviously, service monitoring and assurance systems have evolved to keep up with this evolving scenario, for example:
  • New data sources have been introduced (e.g. the user plane) to produce new KPIs/KQIs correlating control and usage information (a minimal correlation sketch follows this list).
  • Session capture and monitoring has been extended to all subscribers.
  • Deep Packet Inspection (DPI) probes have been introduced in the network to extract IP protocol metadata, to get a better view of the user traffic at the application and protocol level (layer 7).
  • Some operators have also started to use device status information (CiQ) to extend troubleshooting up to the user's devices.
  • Other data sources, such as network topologies, logs, etc., have been added.
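
A minimal sketch of the kind of correlation these new sources enable, joining control-plane session records with user-plane (DPI) records on a subscriber key to produce a simple per-subscriber, per-application quality view; all field names and values are illustrative assumptions:

```python
# Hypothetical correlation of control-plane and user-plane (DPI) records.
# Record fields, identifiers and thresholds are illustrative only.

control_plane = [  # session setup records
    {"subscriber": "sub-01", "cell": "CELL-42", "attach_ok": True},
    {"subscriber": "sub-02", "cell": "CELL-42", "attach_ok": False},
]

user_plane = [  # DPI (layer-7) records
    {"subscriber": "sub-01", "app": "web",   "throughput_kbps": 250, "rtt_ms": 340},
    {"subscriber": "sub-01", "app": "video", "throughput_kbps": 900, "rtt_ms": 120},
]

def per_subscriber_kqi(control, usage):
    """Join the two planes on the subscriber key and flag slow sessions."""
    sessions = {r["subscriber"]: r for r in control}
    kqi = {}
    for r in usage:
        ctx = sessions.get(r["subscriber"], {})
        kqi.setdefault(r["subscriber"], []).append({
            "app": r["app"],
            "cell": ctx.get("cell", "unknown"),
            "attach_ok": ctx.get("attach_ok"),
            "slow": r["throughput_kbps"] < 500 or r["rtt_ms"] > 300,
        })
    return kqi

print(per_subscriber_kqi(control_plane, user_plane))
```
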

Unfortunately, most of these improvements have been focused on identifying and monitoring the subscriber's Quality of Experience. Relatively little effort has been directed at solving the fundamental problem of network and subscriber management: understanding why a problem or alarm has happened.


Using statistical terminology, today's Service Assurance systems are efficient QoE descriptive measurement tools (knowing the problem), but they don't provide diagnostic analysis support (helping to solve the problem): they don't help operators diagnose and solve the problem.




Friday, November 21, 2014

Part 4 - Analytics for Network, Service and Customer Experience Management



The telecommunications business faces many challenges these days. The exponential growth of mobile traffic has put a lot of pressure on telecommunication networks. The combination of declining traditional revenue sources, exponential growth in data traffic, increasing network complexity (4G, 3G, 2G, WiFi, femtocells, etc.), and the exponential growth of services and connected devices is overwhelming operators' ability to provide high-quality service assurance—a situation that can damage customer relationships and increase churn.

In this new scenario, the traditional service assurance approach, where the user experience is managed by monitoring the network, is no longer successful. The equation "good network SLAs = good services" is no longer valid. In the hypercompetitive telecommunication world of today, to meet increasing customer expectations, Service Providers must shift their focus from monitoring and managing network SLAs to monitoring and managing the end-to-end service quality as it is perceived by the final user: the customer Quality of Experience. Service Providers' service assurance organizations must face the well-known challenge of moving from SLA to QoE.

In this transition, big data plays a double role: on one side, the explosion of the data to manage is a big threat; on the other side, big data and the statistical techniques to manage and analyze it present Service Provider organizations not only with a great opportunity for growth, but also with the opportunity to better optimize and automate their business processes.

In the following chapters, I will show how the combination of big data and advanced analytics can dramatically transform network management from a manual and time-consuming process to an automatic and actionable one.

Part 3.2 - B2C marketing: the Next Best Offer or Recommendation marketing

Despite all the hype about turning themselves into customer-centric companies, most Telco organizations are still product-oriented. Marketing departments are still organized to sell products through mass campaigns and not to increase customers' expenditure by promoting personalized services. But in the hypercompetitive telecommunication world, subscribers are inclined to spend more with companies which offer personalized products and services. In order to increase their profits, Telecommunication companies must switch to a model where the customer's lifetime value is maximized; they have to embrace new solutions capable of recommending the right product at the right time in the right location: Next Best Offer solutions.



Part 3.1 - B2B Marketing or How the convergence of WiFi Network, Big Data and Mobile Payment will change our shopping experience

The adoption of a single new technology rarely generates a big change in our social habits or behaviors; generally it is the combination of two or more technologies that creates something new and that will change our everyday life. This combination is very often accidental. Nowadays there are a lot of technological trends: I would like to focus on 3 specific products that seem to have no or very little correlation among themselves, but whose evolution and combination may have a big impact on our everyday experiences in the near future.

I am referring to:
  • Large scale deployment of Wifi network in metropolitan areas
  • Adoption of Big Data by every Company and Enterprise to switch from product-oriented to customer-centric marketing.
  • Mobile payment diffusion

These technologies will dramatically change our shopping experience.

Let’s see how this will happen and why.

The first trend is the choice by Telco operators of WiFi as a new preferred network access, in addition to the traditional 2G/3G/4G.

We all like the idea of being always online, always connected to our friends, communities and, in general, to the Internet. And we all like being connected with the widest possible bandwidth to watch movies, upload/download pictures and, in general, to access any digital content on the move. The result is an insatiable appetite for mobile data that can't be sustained any longer by the traditional 3G/4G radio spectrum.

WiFi networks represent a perfect solution for telecommunication operators to satisfy the increasing bandwidth demand in metropolitan areas, venues, stadiums, etc., and also indoors, as they have a much lower cost and much more bandwidth compared to cellular networks.
According to the "Cisco Visual Networking Index: Global Mobile Data Traffic Forecast Update, 2013–2017" report, the percentage of offloaded traffic will grow from 33% in 2012 to 46% in 2017, and the trend will continue.

In the near future, a large part of the data traffic in metropolitan areas will be carried by WiFi networks.

The second big trend I want to analyze is the diffusion of the big data approach across every industry and corporation.

Customers are increasingly frustrated by the generic offers they are bombarded with; they are looking for relevant, personalized interactions, based on their situation and preferences. This is generating a Copernican revolution in corporate marketing: the end of the product-centric approach in favor of a customer-centric one. Companies' focus is moving from product sales to customers' expenditure levels.

The key message is: get to know your customer, and the key technology to realize it is Big Data and Analytics. The massive collection of information on customers (Big Data) allows companies to know their clients and to identify their preferences and needs, while Analytics allows Operators to identify the right product to propose/recommend to them.

Several market reports show that the sales efficiency of recommendation models on customer-initiated interactions is 10 times higher than that of traditional direct marketing.

Future marketing is personalized and recommendation-based.

The third and last trend is the Mobile Payment revamping.

Mobile Payment had a big hype some years ago, but it didn't grow as expected and the market forgot about it for a while. Only recently have we seen the appearance of many apps allowing users to make in-store purchases entirely with their phones, thus bypassing the store's payment terminal. And only in the last few days have big players re-entered this market – e.g. Apple has launched its new payment feature, Apple Pay, and MCX, a consortium of over 70 of the largest retailers in the US, has announced its own mobile wallet, "CurrentC".

According to a BI Intelligence report, mobile in-store payments will grow at a 154% five-year compound annual growth rate (CAGR), from $1.8 billion in 2013 to $189 billion in 2018.
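As a quick sanity check, the growth rate implied by those two endpoints can be recomputed directly:

```python
# Recompute the implied five-year CAGR from $1.8B (2013) to $189B (2018).
start, end, years = 1.8, 189.0, 5
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.0%}")   # ~154%, matching the quoted figure
```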
But more important, Mobile Payment will change the way we will pay for what we buy.

In the near future, the combination of these three technological trends can dramatically change the way we do shopping, as they will enable a closer and more intimate interaction between store companies and customers inside the shop.

In the near future, not only will WiFi provide access to the Internet, but it will also provide high-precision localization information on the users under the coverage of its access points. Using triangulation techniques similar to GPS, WiFi networks will provide a customer's position very accurately, even when the user is not logged in. Indeed, to fix the position, WiFi stations don't need users to be logged into the network; they just need WiFi to be switched on in their smartphones.
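
As an illustrative sketch only (not any vendor's actual positioning algorithm), position can be estimated from the signal strength seen at several access points: a path-loss model turns each RSSI reading into a distance, and a least-squares fit finds the point most consistent with those distances. All constants, coordinates and readings below are assumptions:

```python
import math

# Illustrative RSSI-based trilateration sketch. Access point positions,
# path-loss constants and RSSI readings are hypothetical.
ACCESS_POINTS = {"AP1": (0.0, 0.0), "AP2": (20.0, 0.0), "AP3": (0.0, 15.0)}
TX_POWER_DBM_AT_1M = -40.0   # assumed reference RSSI at 1 m
PATH_LOSS_EXPONENT = 2.5     # assumed indoor propagation exponent

def rssi_to_distance(rssi_dbm):
    """Log-distance path-loss model: RSSI -> estimated distance in metres."""
    return 10 ** ((TX_POWER_DBM_AT_1M - rssi_dbm) / (10 * PATH_LOSS_EXPONENT))

def estimate_position(rssi_readings, step=0.5):
    """Brute-force least-squares fit over a coarse grid (kept simple on purpose)."""
    dists = {ap: rssi_to_distance(r) for ap, r in rssi_readings.items()}
    best, best_err = None, float("inf")
    for i in range(0, 41):          # x in [0, 20] m
        for j in range(0, 31):      # y in [0, 15] m
            x, y = i * step, j * step
            err = sum((math.hypot(x - ax, y - ay) - dists[ap]) ** 2
                      for ap, (ax, ay) in ACCESS_POINTS.items())
            if err < best_err:
                best, best_err = (x, y), err
    return best

# Readings roughly consistent with a smartphone standing near (14, 4) metres.
print(estimate_position({"AP1": -69.0, "AP2": -61.0, "AP3": -71.0}))
```
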

Wi-Fi will emerge as the dominant indoor positioning technology for consumer-facing applications, thanks to the combination, and the consequent cost-sharing, of internet access and localization services. Other technologies – such as RFID, infrared, Beacon ... - will continue to exist in situations such as those where Wi-Fi networks don’t exist, high precision or other special features are required or desired, or applications don’t require tracking capabilities.

Thanks to the localization information provided by WiFi Access points in real time, shopkeepers will know the position of their customers inside the shop with the precision of one aisle.

Using big data and analytics techniques, shopkeepers can interact in real time with customers, showing them offers, promotions, and products close to them, or recommending products they may be interested in. These techniques allow shopkeepers to engage and delight customers by proposing the right offer with the right value at the right time, while they are still in the shop.
Indeed, by analyzing past customer behavior and transactions, big data and analytics allow shopkeepers to know what the customer's preferences and needs are and what their shopping behavior is.
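
A minimal illustration of that idea, assuming hypothetical purchase-history and aisle data, is a simple scorer that ranks the products near the customer's current position by their affinity with past purchases:

```python
# Hypothetical in-store recommendation sketch: rank products in the aisle the
# customer is standing in by overlap with categories they bought before.

PURCHASE_HISTORY = {            # customer_id -> categories bought in the past
    "cust-001": {"running shoes", "sportswear", "energy bars"},
}

AISLE_PRODUCTS = {              # aisle -> (product, category, promo_discount)
    "aisle-7": [("trail shoes X", "running shoes", 0.20),
                ("yoga mat Y", "fitness", 0.10),
                ("protein bar Z", "energy bars", 0.05)],
}

def recommend(customer_id, aisle, top_n=2):
    history = PURCHASE_HISTORY.get(customer_id, set())
    scored = []
    for product, category, discount in AISLE_PRODUCTS.get(aisle, []):
        affinity = 1.0 if category in history else 0.0
        scored.append((affinity + discount, product))   # promo depth breaks ties
    return [p for _, p in sorted(scored, reverse=True)[:top_n]]

# Customer cust-001 detected (via WiFi positioning) in aisle-7:
print(recommend("cust-001", "aisle-7"))
```
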

Retailers have been using big data and analytics for a long time, to learn more about individual consumers, but this new generation of systems will include additional features such as:

Real time Recommendation – real-time recommending capabilities will allow the app on the customer's smartphone to recommend products that they may like. Real-time customer preferences and recommendations will also be transmitted to associated or connected stores, in order to provide personalized, tailored services. Once the retailer has recognized the customer, it will provide the associate with information on the reason why the customer is visiting the store: in this way, the associate can quickly provide the customer with the desired service, creating an opportunity to delight the customer, make the sale and increase loyalty.

Social media data gathering – Retailers will look at social media data to learn more about their customers. The use of social data is imperative to keep up with the latest trends, to analyze campaign impacts, to see what people think about your brand and, more importantly, to investigate your customers' preferences. Social intelligence tools will help retailers better understand what customers want.

Data brokerage with Partners – if the store visitor is a new customer, the store systems are blind: they know nothing about him/her, so they can't start any personalized interaction. The ability to connect in real time with third parties to broker information will be an important mechanism to acquire new customers. Telecommunication operators and connectivity service providers will be the preferred partners in such brokerage, as they know who the just-entered smartphone device belongs to. The key message for retailers will not be to know everyone, but to be able to get information on anyone when it is needed.

Augmented reality – The smartphone will be used to acquire additional information on displayed products. People will use the phone camera to recognize products and to learn more about them.

Mobile payment will be the closing component of the interaction. With their mobile phones, customers will be able to receive promotions and offers, to use them, and to pay immediately with a simple scan of the product, making the in-store experience much more efficient. The whole sales cycle will be initiated and completed on the customer's smartphone.

Thanks to the combination of these technologies, it will be possible to re-create the shopping experience we had in the past in small shops, where shopkeepers and sales staff were intimately familiar with their customers and were always ready to recommend the right product. The new store systems will be able to follow their customers inside the store, to interact with them by recommending personalized promotions or products – based on the customer's purchase history and social media – to provide detailed information on customer preferences to associates before they engage the customers, and to close the sales cycle with mobile payment.

The smartphone will increase its role in our daily experience acting as a sort of virtual assistant in our shopping experience. 

Wednesday, November 12, 2014

Part 2 - Customer Experience Management (CEM) or how to manage Telco customers at 360 degrees with analytics and big data

Today, in most parts of the world, mobile devices have reached full penetration of the population. The mobile expansion era is ending; telecommunications is becoming a mature market, where maintaining customers and increasing their expenditure is a mandatory strategy for any Telco CEO.

In a mature market, indeed, the strategy of continuous growth by acquiring new customers is expensive: advertising costs, campaign management expenses, discounting, and equipment subsidies are extremely costly. Acquiring a new customer costs 5-8 times more than maintaining an existing one. In contrast, account maintenance costs decline over the lifetime of the relationship, and long-term customers tend to be less inclined to switch, less price-sensitive, and more likely to buy ancillary products.
So one of the pillars of any Telco strategy must be increasing customer loyalty.

But customer loyalty doesn't mean only good services; it involves all the aspects of the interaction between subscribers and operators during the whole customer life-cycle: from the promotion of new services and tariff plans to the use of the self-care web site, from the interaction with Customer Care to billing and charging, and obviously the quality of the service perceived. This means that increasing customer loyalty is not a simple task or an activity owned by or delegated to a single department (e.g. Network). It is a more complex activity which involves the whole company. It is a new way to re-think Telecommunication operators around their most valuable asset: their customers.

All the activities targeting the increase of the customer loyalty are grouped under the name of Customer Experience Management (CEM).

CEM is an operating model that Operators have to adopt to be more efficient in preserving and increasing their customer base. It is a shift from a Network- and IT-centric operational model (OSS and BSS) to a new Customer-centric model, where all company functions are focused on maintaining and acquiring customers. The Service Providers that are able to measure and manage the Customer Experience will be the most successful.
Unfortunately for Operators, Customer Experience is extremely complex to monitor and manage, as it is influenced by many diverse factors, factors that can occur on different channels and that depend on individual characteristics (e.g. the same event can be perceived differently by two or more users).

Consequently, to manage the Customer Experience, two elements play a crucial role:

  • The collection and analysis of all customer-operator interactions, to transform them into appropriate measures (QoE KPIs). These values are used to monitor the quality of the user interaction for each subscriber and for each interaction and channel.
  • The combination of all customer-operator interactions into analytic models which provide a comprehensive and global picture of the subscriber experience (loyalty score). These values are used to monitor and predict the churn propensity of each subscriber. Analytic models are also used to identify the best or most efficient action to avoid losing subscribers with a high churn propensity score (see the sketch after this list).
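
A minimal sketch of the second element, assuming hypothetical per-subscriber indicators (low QoE flag, care contacts, dropped calls, short tenure) and hypothetical weights, could combine them logistically into a churn propensity and pick a retention action above a threshold:

```python
import math

# Hypothetical churn-propensity sketch: combine per-subscriber indicators into
# a single score and choose a retention action above a threshold.
# Feature names, weights and thresholds are illustrative assumptions.
WEIGHTS = {"low_qoe": 1.2, "care_contacts": 0.6, "dropped_calls": 0.4, "short_tenure": 0.8}
BIAS = -2.0

def churn_propensity(features):
    """Logistic combination of (0/1 or scaled) churn indicators."""
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def next_best_action(propensity):
    if propensity > 0.7:
        return "call from retention team + personalized offer"
    if propensity > 0.4:
        return "targeted discount on next bill"
    return "no action"

subscriber = {"low_qoe": 1, "care_contacts": 3, "dropped_calls": 2, "short_tenure": 0}
p = churn_propensity(subscriber)
print(round(p, 2), next_best_action(p))
```
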
In summary, if service providers want to build strong customer relationships, they must be extremely focused on data management and analytics. The figure below shows the overall CEM life cycle.

Figure 1 – Customer Experience Management life cycle

As discussed before, Customer Experience Management isn't a single process but a combination of processes, so to fully implement CEM, all operator-subscriber interaction processes should be transformed to be data- and analytics-driven.

The table below shows the key processes and the related analytic use cases needed to implement a CEM-based company:



Tuesday, November 11, 2014

Part 1 - The essential guide to using analytics strategically in the telecommunication market

Everyone talks about how Telecommunication operators can monetize big data and analytics by generating new business and new revenue streams.
I belong to the small group of professionals who believe that, for telecommunication operators, the big data bet is not a huge business growth opportunity, but rather a dramatic improvement in optimizing and automating their business processes. Through big data and analytics, Telco operators can deeply change the way they operate: analytics – applied to big data – will be the intelligence that operates the networks and most Telco processes. Thanks to analytics, manual processes will be automated, made faster and more efficient.

Increasing efficiency and efficacy is a mandatory direction for all Telco operators who want to remain competitive in the future. Nowadays, communications companies (or Content and Service Providers - CSPs) are facing a particularly intense and disruptive period: they are squeezed by OTTs – which are successfully competing against Telcos in the new services business – and by the increase in network management costs, due to exponentially increasing bandwidth demand and network complexity, costs that are not compensated by increasing revenues. Even if cost control cannot generate new revenues, process automation remains the best way for Operators to reduce their costs and sustain higher margins. In this scenario, big data and analytics will play a fundamental role, as they will be the key to dramatically increasing Telco operational efficiency through process automation.
But the devil is in the details: even if the future presence of big data and analytics within the Telco business is clear, it is less clear which analytic use cases should be implemented first, and how to define a successful adoption strategy. The market offers hundreds of analytic use cases, but it is an extremely fragmented and siloed offer, without a vision of how to combine these use cases together. Having a clear strategy for analytic use case deployment is key to being successful.
I don't mean to draw a complete list or a taxonomy of analytic use cases in the telecom industry. But I hope I can at least provide you with an overall picture, in which the main analytic use cases are classified and described in terms of the problems they manage and of the integrated adoption paths proposed.
Let's start with the classification. Big data and analytic uses in the Telecommunication industry can be classified in several ways: I have selected a classification based on cost savings for Telcos:
  1. Network/IT Optimization – all use cases related to optimizing and/or automating network and IT processes.
  2. Marketing personalization and automation – all use cases related to automating and improving the Operator's marketing efficiency.
  3. Security and Revenue protection – use cases related to the use of analytics for revenue protection (e.g. fraud fighting, revenue assurance, credit control, etc.) and for cyber security protection (e.g. hacker attacks, cyber protection, etc.).
  4. New revenue stream generation – this area is dedicated to the use of big data and analytics to generate new revenue streams, such as information brokering, mobile advertising, real-time location-based promotions, etc.

To easily identify a group, each of the use cases belonging to a specific area is presented in the same color. Moreover, to reduce the number of cases to draw, I have included only the most relevant ones.
To describe how to implement the different use cases and when, I have used a quadrant representation.
Please note that the maturity level of the use cases changes dramatically from one to another, and in some situations it also depends on the evolution of the Telco environment (e.g. analytics for NFV). This means that the implementation of analytics has to be distributed over a longer period of time. I have represented the time needed to implement a specific use case by positioning it along a timeline (X-axis) called "time to adopt"; the complexity of implementation is represented by the position along the "adoption complexity" axis (Y-axis). The size of the use case indicates the economic impact of its introduction.
Use cases with similar goals (e.g. CEM) have been grouped, while the same use cases applied to different Telco environments have been linked together using a dotted arrow line. The result is shown in the following picture; a minimal sketch of how this kind of quadrant chart can be reproduced follows below.
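
For readers who want to reproduce this kind of chart, here is a minimal matplotlib sketch of the quadrant representation; the use cases, coordinates and bubble sizes are illustrative placeholders, not the exact values of the original figure:

```python
import matplotlib.pyplot as plt

# Illustrative quadrant chart: x = time to adopt, y = adoption complexity,
# bubble size = economic impact, colour = use-case area. Values are placeholders.
use_cases = [
    # (name, time_to_adopt, complexity, impact, area_colour)
    ("Archive optimization",          1.0, 1.5, 300, "tab:blue"),
    ("Customer Experience Assurance", 1.5, 2.0, 400, "tab:blue"),
    ("Next Best Offer",               3.5, 4.0, 600, "tab:green"),
    ("Automatic problem diagnostic",  4.0, 4.5, 700, "tab:blue"),
    ("WiFi indoor analytics",         3.0, 1.5, 250, "tab:orange"),
    ("Advanced churn management",     1.5, 4.0, 350, "tab:green"),
]

fig, ax = plt.subplots(figsize=(8, 6))
for name, x, y, size, colour in use_cases:
    ax.scatter(x, y, s=size, c=colour, alpha=0.6)
    ax.annotate(name, (x, y), fontsize=8, ha="center")

ax.axvline(2.5, linestyle="--", color="grey")   # split into the four quadrants
ax.axhline(3.0, linestyle="--", color="grey")
ax.set_xlabel("Time to adopt")
ax.set_ylabel("Adoption complexity")
ax.set_title("Analytic use cases: adoption quadrants (illustrative)")
plt.tight_layout()
plt.show()
```
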

The left-bottom quadrant is the "low hanging fruit" area. Here the use of analytics and the related benefits are quite clear. This is the area of low-cost large-archive deployments (e.g. large Hadoop implementations), Customer Experience Assurance (e.g. network assurance systems based on QoE KPIs) and fraud systems. These cases have already been "explored" and implemented by most Operators.
The right-top quadrant is the "real strategic advantage" area. The first operators to reach this area will be much more competitive on the market, thanks to high efficiency and low operational costs. This is the area of a high level of process automation: automatic network trouble management (systems automatically capable of identifying the root cause of QoE degradation), automatic personalized marketing, automatic pattern detection of cyber-attacks, actionable networks (networks capable of discovering problems and fixing them), etc. The use cases in this area are now in the experimental or PoC phase, and they will be ready within 12-36 months.
The right-bottom quadrant is the "tactical" area. The analytics in this area are "tactical" advances for Telcos. Most of the analytics for business-to-business (B2B) are in this area.

The left-top area is the “challenge” area. The benefits of the analytic cases in this quadrant are quite clear, but successful implementations are quite challenging.

Real strategic advantage
  • Next Best Offer (marketing automation)
  • Customer Care optimization
  • Next best Action engine
  • Automatic problem diagnostic
  • Actionable network PCRF 2.0 and Full NFV
  • Store of the future
  • Cyber-attack prevention

Challenge
  • IP Network usage analytic
  • Personalized Automatic promotion
  • Self-Customer care
  • Advanced Churn management
  • Subscriber preference information brokering
  • Mobile advertising
  • Subscriber Digital identity control
  • Customer care optimization (Call reduction, sentiment analysis, ..)
  • Content and Video recommendation

Low hanging fruit
  • Archive optimization
  • Customer Experience Assurance
  • IP Intelligent routing
  • Revenue Assurance and Protection

Tactical
  • Analytic as a services
  • M2M analytic
  • WiFi indoor analytic
  • Augmented reality