Culture is holding back operator adoption of open source

If open source is the holy grail for telcos, more than a few of them are getting lost trying to uncover the treasure; but why?

At a panel session featuring STC and Vodafone at Light Reading’s Software Defined Operations and the Autonomous Network event, operational culture was suggested as a significant roadblock, along with the threat to ROI from shortened lifecycles and disappearing support.

Starting with the culture side, this is a simple one to explain. The current workforce has not been configured to work with an open source mentality. This is a different way of working, a notable shift away from the status quo of proprietary technologies. Sometimes the process of incorporating open source is an arduous task, where it can be difficult to see the benefits.

When a vendor puts a working product in front of you, as well as a framework for long-term support, it can be tempting to remain in the clutches of the vendor and the dreaded lock-in situation. You can almost guarantee the code has been hardened and is scalable. It makes the concept of change seem unappealing. Human nature will largely maintain the status quo, even if the alternative might be healthier in the long-run.

The second scary aspect of open source is the idea of ROI. The sheer breadth and depth of open source groups can be overwhelming at times, though open source is only as strong as the on-going support. If code is written, supported for a couple of months and then discarded in favour of something a bit more trendy, telcos will be fearful of investment due to the ROI being difficult to realise.

Open source is a trend which is being embraced on the surface, but we suspect there are still some stubborn employees who are more charmed by the status quo than the advantage of change.

What to bin and what to keep; the big data conundrum

Figuring out what is valuable data and binning the rest has been a challenge for the telco industry, but here’s an interesting dilemma: how do you know the unknown value of data for the use cases of tomorrow?

This was broadly one of the topics of conversation at Light Reading’s Software Defined Operations & the Autonomous Network event in London. Everyone in the industry knows that data is going to be a big thing, but the influx of questions is almost as overwhelming as the number of data sets available.

“90% of the data we collect is useless 90% of the time,” said Tom Griffin of Sevone.

This opens the floodgates of questions. Why do you want to collect certain data sets? How frequently are you going to collect the data? Where is it going to be stored? What are the regulatory requirements? How in-depth does the data need to be for the desired use? What do you do with the redundant data? Will it still be redundant in the future? What is the consequence of binning data which might become valuable in the future? How long do you keep information for with the hope it will one day become useful?

For all the promise of data analytics and artificial intelligence in the industry, the telcos have barely stepped off the starting block. For Griffin and John Clowers of Cisco, identifying the specific use case is key. While this might sound very obvious, it’s amazing how many people are still floundering; but once the use case has been identified, machine learning and artificial intelligence become critically important.

As Clowers pointed out, with ML and AI data can be analysed in near real-time as it is collected, assigned to the right storage environment (public, private or traditional, depending on regulatory requirements) and then routed to the right data lakes or ponds (depending on the purpose for collecting the data in the first place). With the right algorithms in place, the process of classifying and storing information can be automated, freeing up engineers’ time to add value while also keeping an eye on costs. With the sheer volume of information being collected increasing very quickly, storage costs could rise rapidly.
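
To make the idea concrete, here is a deliberately simplified Python sketch of what automated classify-and-route logic might look like; the field names, tiers and rules are invented for illustration and stand in for whatever trained model or policy engine an operator would actually use.

```python
# Toy classify-and-route pass: every field, tier and rule here is invented for
# illustration and stands in for a trained model or policy engine.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Record:
    source: str              # e.g. "radio_kpi", "billing", "clickstream"
    contains_pii: bool       # triggers regulatory handling
    use_case: Optional[str]  # known downstream purpose, if any

def route(record: Record) -> Tuple[str, str]:
    """Return (storage_environment, destination) for a freshly collected record."""
    if record.contains_pii:
        # Regulated data stays in the private/traditional environment.
        return "private", "regulated_lake"
    if record.use_case is not None:
        # Data with a known purpose goes to a purpose-built pond for fast access.
        return "public_cloud", f"{record.use_case}_pond"
    # Everything else is archived cheaply in case it becomes valuable later.
    return "public_cloud", "cold_archive"

print(route(Record("radio_kpi", contains_pii=False, use_case="congestion_forecast")))
# -> ('public_cloud', 'congestion_forecast_pond')
```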

And this is before the 5G and IoT trends have really kicked in. If telcos are struggling with the data demands of today, how are they going to cope with the tsunami of information which is almost guaranteed in tomorrow’s digital economy?

Which brings us back to the original point. If you have to be selective with the information you keep, how do you know what information will be valuable for the use cases of tomorrow? And what will be the cost of not having this data?

Telcos are getting pretty good at impersonal communications

They might be slowly headed in the right direction, but telcos are still not great at relating to customers.

In the pursuit of relevance in the digital economy, personalised experiences are a hot topic, but the telcos are nowhere near as good on the delivery front as the internet giants. There are of course many reasons for this, but one of the most apparent is the structure of the organization, according to Intent HQ CEO Jonathan Lakin.

Here is the current state of affairs. The telcos have access to the same technology, a tsunami of information on the customer and (in theory) access to the same talent pool as the internet giants. The ingredients for success are on both the telco and OTT work surfaces, suggesting the organization itself is to blame.

The FANG companies are incredibly well-known for personalising experiences for customers. Not only does this create more opportunity to drive revenues, placing the right product in front of the right person at the right time, it creates a tie to the customer. The customer feels heard and has a stronger emotional connection to the brand, ultimately reducing churn. Both are benefits which would be of interest to telcos.

But the issue is structural. Telcos are organized in siloes, each of which is excellent at building an in-depth, narrow image of the customer. Whether it is insight on customer churn, or interaction and product history, the telco can build knowledge of the customer, but without combining all of these images personalisation will never be a reality.

A good example is product offerings to customers, a bugbear of many around the world. Whether it is offering products which have already been purchased, or ones which might be out of the customer’s price range, without combining the siloes and making more use of the swathes of information available, personalised messaging will not be achievable.
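
As a toy illustration of why the combined image matters, the sketch below (Python, with entirely made-up data and field names) merges three siloed views of one customer into a single profile and filters the product catalogue against it, so an already-owned or clearly unaffordable product is never offered.

```python
# Illustrative only: merge per-silo views of one customer into a single profile,
# then filter offers against it. All data and thresholds are invented.

billing_silo = {"customer_id": 42, "avg_monthly_spend": 35.0}
product_silo = {"customer_id": 42, "owned": {"broadband", "mobile_sim"}}
care_silo    = {"customer_id": 42, "churn_risk": "medium"}

profile = {**billing_silo, **product_silo, **care_silo}  # the "combined image"

catalogue = [
    {"name": "broadband",      "price": 30.0},  # already owned -> skip
    {"name": "tv_sports",      "price": 25.0},  # affordable -> offer
    {"name": "premium_bundle", "price": 90.0},  # well above typical spend -> skip
]

offers = [
    p["name"] for p in catalogue
    if p["name"] not in profile["owned"]
    and p["price"] <= 1.5 * profile["avg_monthly_spend"]
]
print(offers)  # ['tv_sports']
```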

The other issue for the telcos is that of priorities. Lakin pointed out that the main priority for telcos is profitability, which influences how products are developed and sold, and in turn how communication strategies and platforms evolve. This not only creates a nightmare for integration in the IT department, but reinforces the siloes. The customer sits further down the priority list, which is not going to aid the push towards personalised messaging.

Right now, the structure is not in place to create a personalised messaging culture. The ingredients are all there to create a sumptuous recipe, but the organizations are not set up right.

Sky flexes its AI muscles

Artificial intelligence might be the buzzword of 2018, but few actually know what to do with the technology. That said, Sky seems to be surging ahead of the pack.

At the Telco Data Analytics and AI conference in London, an interesting statistic was put to the audience: 60% of AI R&D spend in the telco industry is being directed towards network optimization. This is certainly a valid quest, though the problem with inward R&D investment is that it won’t prevent the slow wander towards utilitisation. To escape that fate, telcos need to be investing in projects which actually create value, drive diversification and capitalise on new revenues. This is exactly what Sky seems to be doing.

“We have a data liquidity problem,” said Rob McLaughlin, Head of Digital Decisioning and Analytics for Sky UK. “Getting data is not an issue, we get it without trying, it’s about getting value from it.”

It seems the Sky UK team has a lot of ‘nice to have’ problems, which demonstrate the effective steps forward the business is making in the intelligence-orientated world. While many telcos are struggling with the basic concepts, Sky is really setting the pace.

Aside from the overwhelming amount of data, McLaughlin complained of the management team’s attitude towards artificial intelligence. Here, the team aren’t resisting, but asking for solutions which are overly complex. McLaughlin pointed out the Sky business was missing out on the low-hanging fruit, the simple problems which AI can address; instead, the management team is looking for the top-line, super-complex solutions which can bring about revolutionary change.

As McLaughlin told the audience, this is frustrating, but at least the management team is embracing new concepts and technologies, even if they are trying to run before they can walk. This is arguably a perfect scenario, however. Change is led from the top of an organization, and McLaughlin seems to be describing a culture which is desperate to embrace change and create value.

Another interesting point made by McLaughlin was a claim there was no POC.

“We launched these projects at scale from day one,” said McLaughlin. “We didn’t want to do a POC as it was a bit of an insult to our intelligence. Why do they need to test whether data is good for the business?”

This demonstrates the much-hyped fail fast business model which has been employed so effectively by the internet giants. These companies don’t need to prove there is value in personalising services, they just need to make it work. The only way to get the algorithms to work is to get them out in the real world, trained by data, honed by machine learning and real-time experiences. This culture of creating results, not trying to prove perfection, will certainly drive value for Sky.

McLaughlin’s team are implementing AI in four different ways at Sky. Firstly, using customer information to cross-sell services and products. Secondly, increasing engagement with products and services customers have already bought. Thirdly, anticipating customer needs and problems, a project which is saving Sky millions in customer services and improving the overall NPS score. Finally, AI is being used in media optimisation to improve the advertising platform.

While these projects are still in the early days, the results are clear according to McLaughlin. NPS has been improving, cost savings are being realised and proactive selling of products through personalisation is increasing. On the cross-selling side, the results are quite remarkable. Sales of Sky Sport products are up 57% due to two simple changes: firstly, putting the product in front of the customer at the right time, Saturday afternoon rather than Friday night for example, and secondly, selling the product in the right way. If you know you are engaging a football fan, tell them about the football benefits, not Formula One.

“Just crazy we haven’t been doing this for 30 years,” said McLaughlin.

All of these initiatives are built on identity. For McLaughlin this is the most important aspect of any data analytics and AI programme, and receives more attention than anything else. If you cannot identify your customer, it is impossible to personalise services effectively. It seems simple, but it is an aspect which is often overlooked.

“If we have the opportunity to speak to someone, don’t tell them something, treat them as the person the data says they are,” said McLaughlin.

Sky might not have a reputation as a particularly innovative organization, nothing out of the ordinary at least, but this approach to data analytics and artificial intelligence is certainly worth noting. The culture is accepting and proactive, there is an attitude which is geared toward doing, not planning, and the objectives are clearly outlined. McLaughlin might have his frustrations, but if you want an example of an organization which is proving the value of intelligence, you won’t have to look much farther.

Microsoft recognises AI might screw over some employees

Artificial intelligence has been hyped as the technology which will drive profits in the next era, though few in the technology world want to recognise how painful it will be for some segments of society.

The propaganda mission from the technology world was on full display at Microsoft’s UK event Future Decoded. Of course, there are benefits from the implementation of AI. Businesses can be more productive, more intelligent and more proactive, tackling trends ahead of time and gaining an edge on competitors. There is a lot of buzz, but it might just turn out to be justified.

Despite this promise, Microsoft has seemingly done something this morning that few other technology companies around the world are brave enough to do: recognise that there will be people screwed by the deployment.

“There is a risk of leaving an entire generation behind,” said Microsoft UK CEO Cindy Rose.

The risk here is the pace of change. While previous generations might have had time to adapt to the impact of next-generation technologies, today’s environment is allowing AI to disrupt the status quo at a much more aggressive pace than ever before. Rose pointed towards the explosive growth of data, pervasiveness of the cloud and much more powerful algorithms, as factors which are accelerating the development and deployment of AI.

One question which should be asked is whether the workforce can be re-educated and reskilled fast enough to ensure society is not left behind. Yes it can, but Rose stated the UK is not doing enough to keep pace with the disruption.

Looking at statistics which support this statement, Microsoft has released research which found 41% of employees and 37% of business leaders believe older generations will get left behind. Usually when we talk about older generations and a skills gap, retirees come to mind. However, those in their late 40s or early 50s could be more negatively affected. The ability or desire to reskill might not be there, as these individuals are entering the final stages of their career before retirement, yet the risk of redundancy will still be present. How are the people who might be made redundant 3-4 years short of retirement going to be supported? This is a question which has not been answered, or even considered, by anyone.

To help with this imbalance, Microsoft UK has announced the launch of its AI Academy, which aims to train 500,000 people in AI skills. This is not just a scheme aimed at developers, but also IT professionals, those at risk of job loss and executives in both the business and public sector worlds.

As the technology industry has pointed out several times, there will be jobs created as part of the AI enthusiasm. But here is the risk: are those who are victims of job displacement suitably qualified to take these jobs? No, they are not. Think of the Uber drivers who fall victim to the firm’s efforts in autonomous driving, or the bookkeeper who will be made redundant by SAP’s powerful accounting software. These are not data scientists or developers, and they will not be able to claim a slice of the AI bonanza which is being touted today.

But perhaps the risk has been hyped because there is too much focus on the negative? KPMG’s Head of Digital Disruption Shamus Rae suggested too much attention has been given to the dystopian view of AI, instead of its potential to unlock value and capture new revenues. Confused.com CEO Louise O’Shea said one way her team implemented AI was to pair technical and non-technical staff to, firstly, allow front line employees to contribute to development and make an application which is actually useful, and secondly, remove the fear of the unknown. The technical staff educate the non-technical staff on what the technology means and why it can help.

These are interesting thoughts, and they do perhaps blunt the edge of the AI threat somewhat, but there will be those who use AI purely for productivity gains, not in the way the industry is selling it. These are not businesses which will survive in the long-term, but they will have a negative impact on employees and society in the short-term. When you are lining up in the dole queue, the promise of an intelligent, cloud-orientated future is little comfort.

Microsoft UK CEO Cindy Rose is right. AI will power the next generation and create immense value for the economy. But nowhere near enough is being done to help those at risk of job loss adapt to the new world. The aim here is not to hide the negative with an overwhelming tsunami of benefits, but to minimise the consequences as much as possible. Not enough is being done.

Inward application of tech explains dumb pipe rhetoric

Every telco fears the ‘dumb pipe’ label and the push towards commoditisation, but perhaps this trend is being compounded by an inward-looking attitude in the application of potentially revolutionary technologies.

This is the conundrum: telcos are missing out on the cash bonanza which is fuelling companies like Facebook and Google, but to keep investors happy, executives are focusing more on improving profitability than on replacing lost revenues, such as the voice and SMS cash cows of yesteryear. This might seem like quite a broad, sweeping statement, and it will not be applicable to every telco, or every department within the telcos, but the statement could be proven true at Total Telecom Congress this week.

One panel session caught our attention in particular. Featuring Turk Telecom, Elisa and Swisscom, the topic was the implementation of AI and the ability to capitalize on the potential of the technology. The focus here is on automation, predictive failure detection and improving internal processes such as legal and HR. These are all useful applications of the technology, but will only improve what is already in place.

The final panellist was Google, and this is where the difference could be seen. Google is of course focusing on improving internal processes, but the main focus of its artificial intelligence applications is to enhance products and create new services. Spam filtering in Gmail is an excellent example, though there are countless others as the Deepmind team spread their influence throughout the organization.

The difference between the two is an inward and outward application of the technology. Telcos are seemingly searching for efficiency, while Google is looking to create more value and products. One will improve profitability of what already exists, the second will capture new revenues and open the business up to new customers. One is safe, the other is adventurous. One will lead a company down a path towards utilitisation, the other will emphasise innovation and expand the business into new markets.

Of course, there are examples of telcos using artificial intelligence to enhance offerings and create new value, but it does appear there is more emphasis on making internal processes more efficient and improving profitability.

This is not to say companies should not look at processes and business models to make a more successful business, but too much of an inward focus will only lead to irrelevance. We’ve mentioned this before, but the telcos seem to be the masters of their own downfall, either through sluggishness or a fear of embracing the unknown and searching for new answers.

The panel session demonstrated the notable difference between the two business segments. The internet players are searching for new value, while telcos seem more interested in protecting themselves. Fortune favours the brave is an old saying, but it is very applicable here.

Blockchain Set to Play Key Role in Telco Operations: Analyst

Blockchain technology is set to be used by telcos in multiple applications across all areas of operations in coming years, according to an industry analyst who has delved into the potential use of the digital ledger technology (DLT) in the space.

James Crawshaw, a senior analyst at Heavy Reading, says communications service providers (CSPs) see significant potential for the use of the much-hyped technology, which is best known for underpinning cryptocurrencies such as Bitcoin.

“Today, CSPs use databases for thousands of applications. Blockchain might reach dozens of applications in the next few years. Examples include mobile number portability, SLA monitoring, or replacing CDRs for billing,” says the analyst, who describes blockchain, in essence, as a “decentralized, immutable electronic ledger; a write-once-read-many record of historical transactions, as opposed to a database that can be written over.”

Currently, CSPs are considering using blockchain in three key areas, according to Crawshaw:

  1. Fraud management: for roaming and subscription identity fraud.
  2. Identity management: storing identity transactions (network logins, purchases, etc.).
  3. IoT connectivity: a blockchain could enable secure and error-free peer-to-peer connectivity for thousands of IoT devices with cost-efficient self-managed networks.

Crawshaw examined those use cases in depth in a recent report, Blockchain Opportunities for CSPs: Separating Hype From Reality.

And while there is a certain level of marketing enthusiasm around blockchain currently, that shouldn’t get in the way of real-world tests and deployments, notes the analyst.

“Like all complex new technologies there is a degree of hype and bandwagon-jumping with blockchain. Its main purpose is as an alternative to centralized systems for recording information (primarily databases). By distributing the control, you eliminate the risk of a hack of the central controller and the information being altered fraudulently.  By using clever computer science you can replace the central controller (and the fees they normally charge) with software and get a cheaper, more reliable solution. But in most cases where we use a database today we will continue to use them in the future,” notes Crawshaw.

So which CSPs are taking the lead with the exploration of blockchain as a useful tool? Colt is one network operator that has been taking a close look at multiple ways to exploit blockchain’s potential for some time.

The operator, in collaboration with Zeetta Networks, is also set to deliver a proof-of-concept demonstration of a blockchain-based offering that enables network carriers to buy and sell network services in a secure, distributed marketplace. That PoC will be unveiled at the upcoming MEF2018 show in Los Angeles.

And Colt is one of the operators participating in a panel discussion – What Opportunities Are There For Blockchain In Telecoms & How Can These Aid Automation? – on November 8 in London as part of Light Reading’s ‘Software-Defined Operations & the Autonomous Network’ event. PCCW Global and Telefónica will also be involved in that discussion.

There are also a number of industry initiatives involving multiple CSPs; the key ones related to blockchain are:

  • The Carrier Blockchain Study Group, which counts Axiata, Etisalat, Far EasTone, KT, LG Uplus, PLDT, SoftBank, Sprint, Telin, Turkcell, Viettel and Zain among its participants
  • The Mobile Authentication Taskforce, which includes AT&T, Sprint, T-Mobile and Verizon
  • The International Telecoms Week Global Leaders’ Forum, in which BT, HGC Global Communications, Telefónica and Telstra are involved.

In time, blockchain might be joined in CSP back offices by other DLTs. “Blockchain is a particular type of DLT that uses cryptographically hashed blocks to record transactions in a time series or chain. If security is less of an issue you could use a simpler DLT. But then again, you might just use a regular database,” notes Crawshaw.
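
For readers wondering what “cryptographically hashed blocks in a chain” means in practice, the toy Python sketch below shows the core mechanic: each block carries a hash of its own contents plus the hash of the previous block, so any alteration of a ‘write-once’ record is detectable. It illustrates the principle only, with none of the consensus or peer-to-peer machinery of a real DLT, and all the data is made up.

```python
# Toy hash-chained ledger: demonstrates why tampering with a "write-once" record
# is detectable. No consensus, no peers -- illustration of the principle only.
import hashlib, json, time

def make_block(transactions, prev_hash):
    block = {"time": time.time(), "transactions": transactions, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def valid(chain):
    """Recompute every hash and check the links; any altered block breaks the chain."""
    for prev, block in zip(chain, chain[1:]):
        body = {k: v for k, v in block.items() if k != "hash"}
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev_hash"] != prev["hash"] or block["hash"] != recomputed:
            return False
    return True

chain = [make_block(["genesis"], "0")]
chain.append(make_block([{"msisdn": "07700900123", "event": "ported"}], chain[-1]["hash"]))
print(valid(chain))            # True
chain[1]["transactions"] = []  # tamper with the "write-once" record
print(valid(chain))            # False
```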

Is your automated technology a threat to customer relationships?

Telecoms.com periodically invites expert third parties to share their views on the industry’s most pressing issues. In this piece Neil Hammerton, CEO of Natterbox, looks at the pros and cons of increased automation when it comes to customer service.

We’ve all been there: trying to call our bank, GP, or local job centre, and having to press an infinite number of keys to get through to an automated voice that will make us wait on the line while letting us know that we’re number 20 in the queue. Companies claim that automating communication with the customer is making their journey much more efficient and streamlined. But is that really the case or are companies just putting a barrier between them and their customers?

It seems almost impossible nowadays for customers to get through to anyone on the phone when calling a company. Bearing in mind customers are likely to only pick up the phone when they want to sort something out quickly or they have a problem, this poor experience is probably going to have a damaging effect on brand perception and loyalty.

A Times investigation for instance recently found that Britain’s Big Six energy suppliers were taking more than 20 minutes to answer customer phone calls in some instances, prompting many to switch suppliers. In today’s competitive market, businesses cannot risk losing customers because of poor service. They need to develop good relationships rather than relying on technology to do it for them. They need to stop hiding behind automated processes and chatbots and distancing themselves from their customers.

Forcing customers to communicate with robots through several layers of filtering and recorded voices can leave them more frustrated, and make their lives more difficult, than a quick conversation with a customer service agent would. For customers, there is nothing worse than feeling like the organisation they’re trying to reach isn’t prioritising their needs. Avoiding unpleasant conversations by hiding behind technology only makes it harder for customers to trust the brand and build a positive relationship.

Fortunately, we now live in an age of unprecedented technological advancement, which means that for every pain point an organisation has, there’s usually some technology available to solve it. Advancements in telephony technology mean that businesses have the resources to make the phone experience much more enjoyable and insightful for both their customers and their staff. Businesses no longer need to see the phone as the conduit for difficult conversations, but as one for insights that benefit the business.

The first thing organisations need to keep in mind, when it comes to their customers, is that they want to have the company’s whole and undivided attention. This means a personalised experience, which entails knowing your customers well enough to provide that experience. When customers know they’re being cared for, they start thinking positively about brands, and might be inclined to expand their conversations beyond complaints or issues. When conversations become more pleasant, this is an opportunity for brands to build positive relationships with their customers and gain more insights.

Technology can also help businesses provide greater job satisfaction to customer service agents by ensuring their skills are properly used and they get the training they need. Skills-based routing can for instance allow customers to be automatically directed to an agent equipped with the skills needed for that particular customer’s profile; this is made possible thanks to artificial intelligence collecting and analysing data from previous interactions between the customer and the brand.
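
A minimal sketch of the skills-based routing idea is shown below; the Python code, skill tags and scoring rule are invented for illustration, and it assumes the caller’s needs have already been inferred (for example by a model trained on previous interactions).

```python
# Toy skills-based routing: score each available agent against the caller's
# inferred needs and pick the best match. Skill tags and data are invented.

def route_call(caller_needs, agents):
    """Return the available agent whose skill set best covers the caller's needs."""
    def score(agent):
        return len(caller_needs & agent["skills"])
    available = [a for a in agents if a["available"]]
    return max(available, key=score, default=None)

agents = [
    {"name": "Asha",  "skills": {"billing", "retention"},   "available": True},
    {"name": "Tomas", "skills": {"technical", "broadband"}, "available": True},
]

# Needs inferred from the caller's profile and interaction history.
caller_needs = {"broadband", "technical"}
print(route_call(caller_needs, agents)["name"])  # Tomas
```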

But AI can also be used to distinguish good calls from bad calls, allowing companies to then provide the necessary training for agents to best perform their job. This ultimately means less time wasted trying to redirect or inadequately answer a customer’s query, and more time available to work on building relationships – which an automated phone system cannot do by itself.

The automation of phone services does have benefits – if companies use it properly. Technology must be used in ways that will enhance the performance of contact agents, freeing them up to do their job: be the first point of contact in an organisation and build positive, lasting relationships with customers.

 

Neil Hammerton, CEO and co-founder of Natterbox, has been in IT for most of his working life. A serial entrepreneur, Neil began his career as a research laboratory apprentice with British Telecom, where he obtained an HND in electronics. The idea that became Natterbox occurred to him when he realised how poorly a big company he was working for dealt with incoming phone calls. Working with two co-founders, Neil launched Natterbox in 2010. Neil is passionate about the telecoms industry, technology and what Natterbox is helping its customers to achieve. He considers Natterbox the biggest success of his entrepreneurial career to date.

AT&T launches online advertising marketplace Xandr

Two multi-billion dollar acquisitions and a funny name later, the AT&T content business vision starts to become a bit clearer.

AT&T has announced the launch of Xandr, its new content business unit which will combine current capabilities (e.g. AT&T AdWork and ATT.net), the Time Warner and AppNexus acquisitions, as well as distribution partnerships with Altice USA and Frontier Communications into a notable advertising entity. While the initial plan is to capture a slice of the digital advertising bonanza which has been fuelling the monstrous growth at Facebook and Google, long-term ambitions are a lot grander.

“Xandr is a name that draws inspiration from AT&T’s rich history, including its founder Alexander Graham Bell, while imagining how to innovate and solve new challenges for the future of advertising,” said CEO Brian Lesser. “Our purpose is to Make Advertising Matter and to connect people with the brands and content they care about. Throughout AT&T’s 142-year history, it has innovated with data and technology, making its customers’ lives better. Xandr will bring that spirit of innovation to the advertising industry.”

In the first instance, Xandr will combine the distribution and data capabilities of AT&T, with content catalogues from Time Warner, Frontier and Altice USA and the technology platform of AppNexus to make a more complete advertising offering. With its 170 million subscriber base of mobile, broadband and OTT products, and the data collected on these customers, AT&T believes it can offer a hyper-targeted advertising solution and more effective ROI, to rival the likes of Facebook and Google.

But this is only the first step for the business. In the long-run, AT&T hopes there will be an opportunity for advertisers to bring their own data and augment it with AT&T’s customer insight to provide an even more targeted and efficient proposition. These are the foundations of what the business hopes will eventually become an advertising marketplace, where distributors, content owners and advertisers can all come together. AT&T will enrich these offerings with its own data, and even offer tie-ins with Insight Strategy Group and Advertiser Perceptions in order to understand the dynamics between consumer sentiment and the advertising experience. We might have been waiting a while for this move in the content space, but it certainly is an in-depth one.

The partnerships with Insight Strategy Group and Advertiser Perceptions are certainly interesting ones as well. Understanding the dynamics between sentiment and advertising can help advertisers place the right type of advert, in front of the right consumer, at the right time. It’s a science which leans on art, but it has the potential to be very useful.

The AppNexus acquisition was only completed in August for $1.6 billion, the intention to buy the business having been announced in June. Through AppNexus, AT&T has been able to bolster its capabilities with an advertising marketplace, which provides enterprise products for digital advertising, serving publishers, agencies and advertisers. With AT&T’s first-party data, content and distribution the offering becomes more complete, as the focus turns to creating a platform that makes linear TV buying more automated and data-driven. Of course, part of this deal relies on the successful acquisition of Time Warner, which is proving to be a more difficult business.

That said, while this is a good idea from AT&T to provide additional value to the content ecosystem, there will be complications. AT&T will have to convince competitor media companies to put their premium inventory on its network, while regulation could prove to be a hurdle as well. With data privacy a hot topic in the technology world right now, shifting around sensitive information and augmenting in such a marketplace might raise some concerns from privacy advocates.

Some have questioned AT&T’s venture into the content world, but this does look to be a comprehensive strategy, incorporating several promising aspects of the digital economy. There are of course significant hurdles for the business to overcome, but it is a creative idea, perhaps one which would have been more likely to emerge from other segments of the technology world. More importantly, it is an opportunity for AT&T to provide value above connectivity.

The telcos will always have an important place in the digital economy, providing the connectivity cornerstone, though this runs the risk of utilitisation, slipping down the value chain. Using data for the purposes of advertising has always been a sensitive issue, though should AT&T be able to negotiate the red-tape maze, Xandr will enable the company to secure ‘UnTelco’ revenue. This is a case of a telco using what it has to add value to a parallel segment, as opposed to disrupting it and attempting to steal a limited amount of revenue. It’s creating additional revenue streams and value.