4G to 5G: avoiding the gap of disappointment

Telecoms.com periodically invites expert third parties to share their views on the industry’s most pressing issues. In this piece Danny Itzigsohn, Senior Director, Technology & Strategy at TEOCO, takes a look at what can be done to increase the chances of 5G delivering on its promises.

Great excitement has surrounded the switching on of live 5G networks around the world; from South Korea to the UK to the US, many operators have reached a milestone moment in their 5G journeys. But while these launches are a step in the right direction on the road to full, commercial, interoperable 5G, they are only the tip of the iceberg.

Today’s 5G is non-standalone, which means that it relies on both the 4G core and the 4G RAN to function. In that sense, 5G today is more of a 4G-5G hybrid (what is called an NSA, or Non-Standalone, 5G implementation), and it will continue to be for some time yet. 5G is currently deployed mainly in densified “islands” of limited coverage and sometimes leverages 4G connections to achieve improved throughput (Dual Connectivity). As soon as mobile users move out of these “coverage islands”, they fall back onto 4G. As a result, mobile subscribers’ 5G experience could be largely dictated by the Quality of Service (QoS) they experience while on 4G, especially as the human mind is more likely to remember a negative experience than a positive one. For operators, this presents a challenge: how to maintain subscriber experience in a hybrid 4G/5G world?

Analytics equals success

One of the biggest challenges for operators when assuring their networks is visibility – or lack thereof. As with everything in life, you don’t know what you can’t see. For operators, a lack of network or service visibility could lead to degradations caused by faults or performance issues going undetected for some time. Maintaining consumer confidence in 4G and 5G service quality is critical to all operators – they simply can’t afford to allow network glitches to undermine network and service QoS or Quality of Experience (QoE). Consumers are becoming more demanding, and they’re not afraid to switch providers should they find their experience wanting. This has stark consequences for all operators looking to achieve 5G ROI by effectively monetizing early adopters.

To do so, operators must ensure they can monitor both 5G and 4G in a holistic manner to maximise QoS across both. We are still in the early days of 5G deployments. This means that 5G today exists in islands, or pockets, whereby only small geographical areas benefit from its enhanced broadband capabilities. But industry marketing would have consumers think otherwise. For this reason, it is critical that operators offering “5G services” are able to manage QoS and QoE as subscribers move between 4G and 5G. Subscribers will expect nothing less than a seamless experience as they move between the two; failure to manage that could have a serious impact on 5G adoption and operator churn rates.

Operators must therefore have a crystal-clear picture of their network and services using advanced analytics, including machine learning and artificial intelligence, to understand the events, trends and outliers occurring across the network and its services. Most importantly, they must move from a reactive to a proactive and predictive approach, whereby they are able to anticipate service degradations or events before they have even impacted the subscriber experience. In addition to this, operators must apply advanced analytics in a correlative fashion to ensure they understand what is occurring across both 4G and 5G networks. This will become particularly important as data volumes associated with 5G grow drastically; 5G brings with it a significant increase in monitoring complexity, which will not only make it more difficult to identify network degradations, but will also make their impact on subscriber experience more challenging to address. By leveraging advanced analytics to predict service degradations, and understand outage and event patterns, operators will be better positioned to monitor QoS closely across both 5G and 4G networks.
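As a rough illustration of the correlated monitoring described above, the sketch below flags outliers in latency series from co-located 4G and 5G cells using a simple z-score test; a degradation that appears in both layers at the same time points to a shared cause such as backhaul, rather than a radio-specific fault. All figures, the 2-sigma threshold and the z-score approach itself are illustrative assumptions, not a description of any vendor’s actual method.

```python
# Hypothetical sketch: flagging latency outliers jointly across co-located
# 4G and 5G cells with a simple z-score test. All numbers and the
# 2-sigma threshold are illustrative only.
from statistics import mean, stdev

def flag_outliers(samples, z_threshold=2.0):
    """Return indices of samples that deviate strongly from the series mean."""
    if len(samples) < 2:
        return []
    mu, sigma = mean(samples), stdev(samples)
    if sigma == 0:
        return []
    return [i for i, x in enumerate(samples)
            if abs(x - mu) / sigma > z_threshold]

# Latency (ms) reported by co-located 4G and 5G cells over the same hour.
lte_latency = [32, 31, 33, 30, 32, 31, 95, 32]   # spike at index 6
nr_latency  = [12, 11, 12, 13, 11, 12, 60, 12]   # correlated spike

# A degradation visible in BOTH layers at the same time points to a shared
# cause (e.g. backhaul), not a radio-specific fault.
shared_events = set(flag_outliers(lte_latency)) & set(flag_outliers(nr_latency))
```

In production such correlation would of course run over far richer KPI streams, but the principle is the same: joint anomalies narrow down the root cause across the hybrid network.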

Bridging the gap in a network slicing world

Advanced analytics will also prove particularly important in helping operators harness the power of network slicing. Network slicing is set to make the business of QoS monitoring even more challenging. In a network slicing world, different applications, services and end-users have varying network requirements, challenging the underlying network’s ability to address all of them efficiently. Take for example connected cars: they will rely on ultra-reliable, low-latency communication (URLLC) slices for assisted driving and road safety and on massive IoT (mIoT) slices for telematics, while passengers streaming UHD videos will require enhanced Mobile Broadband (eMBB) slices for infotainment services. Take the example of telematics services for connected commercial vehicles: it’s critical that these enjoy continuous connectivity across both 5G- and 4G-covered areas. Failure to assure the required service availability could have significant repercussions for road safety and the efficiency of fleet operations. Operators must therefore use advanced analytics to ensure optimal real-time subscriber experience on a per-slice and per-service basis. That means understanding what is happening across the entire network, but at slice level too, and correlating the information gathered to guarantee QoS for each application or service.

Subscribers are becoming increasingly demanding—they simply won’t tolerate events that impact their QoS, not even slightly. Operators need to assure experience in real time for both their 5G and 4G subscribers; those who have invested in 5G handsets will expect a 5G-level experience, while those who remain on 4G will accept no disruption to their existing service. Managing this can be a tough balancing act and requires operators to plan ahead and think carefully about the impact their new 5G networks will have on their existing infrastructure and services.

5G will undoubtedly revolutionize our digital lives, and huge opportunities are attached to this next wireless generation. But operators will only capitalize on these opportunities if they harness the right tools that will see them capable of guaranteeing QoS, and ultimately will see them succeed today, and tomorrow.

Using machine learning as a stethoscope for 5G

Telecoms.com periodically invites third parties to share their views on the industry’s most pressing issues. In this article Yuval Stein, AVP Technologies at TEOCO, looks at the use of machine learning to optimise the roll-out of 5G.

The invention of the stethoscope was thanks to shyness rather than a spark of genius. In 1816, the French doctor René Laennec felt that listening to a young woman’s heart by pressing his ear to her chest wasn’t appropriate. Instead, he rolled up some paper and found that he was able to hear much better.

The earliest designs of stethoscopes were simple wooden tubes, and it was many years before the instrument we see as emblematic of healthcare was created. But the stethoscope was more than just a useful tool; it led to a new way of doing medicine. Before, it was normal to treat symptoms rather than underlying causes—now, through this new device, doctors had insight into what was going on inside the body, and were better able to understand the diseases behind the symptoms.

This shift from treating symptoms to treating causes seems natural to us now, but at the time doctors would treat a fever with no real idea of what was causing it. A similar change is now necessary with mobile networks. Increased complexity means that we need a new way of looking at these networks, to find the root causes of faults, rather than only treating the symptoms.

Machine learning as a stethoscope

The sheer amount of data that a 5G network will produce is going to be overwhelming. More data is good, of course, because the more data we have on a network, the better we can understand the issues that may be causing problems for users. But all this data needs to be analysed. In the past, this was simple—more data meant hiring more people to analyse it and formulate actionable conclusions.

This is no longer tenable. But it’s also no longer possible to rely on simple forms of automation in order to react to regular and more obvious issues. 5G is different from previous network generations, in that many new technologies and architectural innovations are being introduced at the same time. These technologies include NFV/SDN, edge computing, new radio access technologies and more.

This new complexity means we need new tools to examine the network. And this is where machine learning becomes just like the stethoscope—not just a tool, but a shift in how things are done. The use of machine learning can identify patterns and reduce the need for human oversight—a vital means of increasing operational efficiency by reducing headcount. But the real change is shifting from fixing issues to detecting underlying issues—even those that don’t linger in the network for long.

Treating the causes, not the symptoms

The rise of virtualised, software-driven networks has meant that service assurance is more decentralised. This means more network alarms—even with automation it’s still often impossible to determine where the real problems reside. This is particularly an issue where faults are intermittent—the symptoms may last for far longer than the fault itself. Manually examining service alarms will give an engineer no real clue as to where they can start to fix the underlying problems.

Also, there is a big difference between being reactive and being proactive in maintaining a level of network assurance. A simple example would be if network bandwidth were too low to provide a certain service, and an alarm were set for when this happens. Automation would mean that the fix happens without any intervention. But a step further would be to use statistical techniques, such as trend analysis and forecasting, to detect abnormalities in the network. These tools would mean pre-empting a situation that would result in poor service, rather than reacting when the issue actually arises. This isn’t about fixing problems, but preventing them before they ever happen, addressing the underlying causes before they have a chance to take root and cause havoc.
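The trend-analysis idea can be sketched very simply: fit a linear trend to a utilisation series and forecast how long until it crosses the alarm threshold, so the operator can act before the alarm would ever fire. The figures and the 90% threshold below are invented for the example.

```python
# Illustrative sketch of trend-based pre-emption: fit a least-squares line to
# a utilisation series and forecast how many periods remain before it crosses
# the alarm threshold. All figures are invented for the example.

def linear_fit(ys):
    """Least-squares slope and intercept for evenly spaced samples."""
    n = len(ys)
    x_mean = (n - 1) / 2
    y_mean = sum(ys) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(ys))
    den = sum((x - x_mean) ** 2 for x in range(n))
    slope = num / den
    return slope, y_mean - slope * x_mean

def periods_until(ys, threshold):
    """Forecast periods until the series crosses the threshold
    (None if there is no upward trend to pre-empt)."""
    slope, intercept = linear_fit(ys)
    if slope <= 0:
        return None
    crossing = (threshold - intercept) / slope
    return max(0.0, crossing - (len(ys) - 1))

# Hourly link utilisation (%) trending upward; an alarm would fire at 90%.
utilisation = [60, 63, 65, 68, 71, 73, 76]
headroom_hours = periods_until(utilisation, 90)   # roughly 5-6 hours left
```

Real networks would use far more robust forecasting than a straight line, but the principle—acting on the forecast rather than the alarm—is the same.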

But machine learning can go further. Self-learning algorithms mean that operators can create a baseline profile that identifies when exceptions occur. Rather than determining a threshold for an alarm, this allows for the creation of adaptive thresholds. An example of this would be an area where many homes are being built—at some point there will be a lot more traffic in that area, but engineers don’t have the time to check how closely construction timelines are being followed. Instead, the network behavior should change to meet the demand automatically. While a hard-coded threshold would need to be reconfigured, machine learning means that thresholds are adjusted automatically.
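A minimal sketch of such an adaptive threshold, assuming a sliding window of recent readings and a sigma-based bound (both illustrative choices): gradual demand growth raises the learned baseline automatically, while a sudden spike is still flagged.

```python
# Minimal adaptive-threshold sketch: the alarm level is recomputed from a
# sliding window of recent traffic instead of being hard-coded, so gradual
# demand growth raises the baseline automatically. Window size and the
# 3-sigma bound are illustrative choices.
from collections import deque
from statistics import mean, stdev

class AdaptiveThreshold:
    def __init__(self, window=24, sigmas=3.0):
        self.history = deque(maxlen=window)
        self.sigmas = sigmas

    def is_anomaly(self, value):
        """Flag the value if it exceeds the learned baseline, then learn it."""
        anomalous = False
        if len(self.history) >= 2:
            mu, sigma = mean(self.history), stdev(self.history)
            anomalous = value > mu + self.sigmas * max(sigma, 1e-9)
        self.history.append(value)
        return anomalous

detector = AdaptiveThreshold(window=6)
# Traffic grows steadily as new homes come online: the threshold tracks it.
growth = [100, 110, 120, 130, 140, 150, 160]
alarms = [detector.is_anomaly(v) for v in growth]
# A sudden spike far above the learned trend is still caught.
spike_alarm = detector.is_anomaly(500)
```

A hard-coded threshold of, say, 150 would have fired on normal growth; the adaptive version stays quiet until behaviour genuinely departs from the trend.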

These examples seem fairly straightforward—but millions of similar decisions need to be made every day based on an overwhelming amount of data. Operators have known that automation is necessary for some time, but machine learning is key to decision-making in a 5G network. Without it, operators will be reduced to guesswork, lacking the tools to make the most out of their new—and expensive—networks.


Yuval Stein is the AVP of Product Management and Service Assurance Products at TEOCO. With more than 15 years of experience in the service assurance domain, Yuval has held key product management positions throughout his career. He brings his knowledge to the fault, performance and service domains, and uses his hands-on experience to adapt service assurance solutions to the industry challenges: digital services and network technologies.

5G hype: learning from previous generations

Telecoms.com periodically invites third parties to share their views on the industry’s most pressing issues. In this piece Thomas Neubauer, VP Innovations and Business Development, TEOCO, urges the industry to show restraint when talking up the 5G era.

5G is only just now becoming a reality, and some people are already heartily sick of it—at least, if some articles are to be believed. Headlines such as “Don’t fall for the 5G hype”, “Why Everyone Needs to Shut Up About 5G” and even “The ‘Race to 5G’ Is Just Mindless Marketing BS” show that there is already a lot of skepticism around this new technology.

We don’t have a 5G smartphone on the market yet, and already there’s a 5G backlash. How can people be sick of the hype around a technology that isn’t even here yet?

Tempering expectations

It’s hardly surprising, when the hype around previous mobile generations is taken into consideration. People aren’t burned out by 5G hype. They’re still exhausted by the hype from previous generations. Claims were made around 4G that consumers didn’t see for a long time, and in some cases didn’t see at all. Advertising regulators and even the courts were involved as HSPA+ networks were rebadged as LTE, and wild claims of high speed turned out to be less impressive in practice.

Over-zealous marketing departments are keen to show that their company is the fastest, the biggest, or the first, leading to a rush of announcements before a technology is ready for primetime. With 4G, this marketing push got completely out of hand, and confusion over what really counted as “4G” didn’t help. LTE and LTE-A both use OFDMA access technology, so it would make sense for both to be described as 4G. However, the ITU initially held that only LTE-A and WiMAX 2 qualified as 4G technologies—which didn’t stop marketing departments describing their LTE networks as 4G. Later, the ITU changed its mind and said that the 4G label could also apply to “other evolved 3G technologies providing a substantial level of improvement in performance and capabilities”.

Operators essentially had the go-ahead to label any network better than standard 3G as 4G—and why would they choose not to? With rivals claiming to have 4G networks, sticking scrupulously to a 3G label meant offering, in the eyes of the customer, an inferior product.

But once bitten, twice shy: consumers presented with a new upgrade that failed to wow them the last time around are less likely to fall for the same trick twice. Operators and smartphone vendors need to be very careful in how they announce their networks and new devices. If early adopters buy in to the 5G label and do not receive the speeds they are promised, the word of mouth will be that this is just 4G all over again.

There is real opportunity for operators in 5G, enabling new use cases, new services and new streams of revenue. But what consumers care about is speed and availability. The marketing of 5G needs to be smarter with its message.

Marketing 5G to consumers—or not

The biggest opportunities that 5G makes possible aren’t easily communicated with a message that simply says it’s the first in the market, has the most coverage, is the fastest, or has the lowest latency. The real question is: what are the use cases that customers will value?

Network slicing, for instance, has great potential, creating multiple end-to-end virtual networks that run on top of a physical network. These networks can be tuned for specific needs, for example for IoT applications, or for connected vehicles. Network slicing will not be part of the first 5G networks deployed—we still have at least a couple of years to go before it becomes a reality.

Capabilities like these will fall flat if they are marketed directly to consumers. The knowledge that there is a network slice dedicated to their specific needs will be compelling to a service provider or enterprise customer, but won’t be of interest to the end consumer.

5G home broadband is also a potential killer application. Verizon’s 5G Home service is already available as a wireless alternative to fixed home broadband, making it possible to get speeds similar to a fixed line without the need for physical infrastructure leading to the home. At this point in time, this is a use case that’s specific to the US, due to the cost of broadband and the spectrum available. Home wireless broadband has the potential to be better value for money and to help service providers retain customers through bundled deals. But it’s not yet clear if marketing home broadband as a 5G technology will be effective or just confusing.

The same goes for IoT. Proponents of 5G networks often see IoT as a use case, but proponents of IoT see 5G as just one potential technology among many. For now, only very few IoT use cases need 5G, though that may change when 5G makes ultra-low latency possible. Even then, it’s edge computing rather than 5G that will have the biggest effect—and it’s certainly not something that’s going to get consumers excited any time soon.

This could lead operators into the same trap as before, selling their networks as the fastest/best/first.

The answer may be that 5G should not be marketed to consumers—at least, not in the same way that 4G was. Earlier this year, Telecoms.com reported that consumers aren’t really interested in 5G—so why force them to care?

Instead of hyping 5G networks, the mobile industry should instead hype—if it must hype anything—5G services and the superior use cases made possible by this technology. 5G is not revolutionary because it’s faster, but because of what it makes possible, such as mobile VR, self-driving vehicles, industrial automation, smart cities, and more.

A 5G network is more than just a faster network. To promote it as such and to only boast about new rollouts and being the first to market is a disservice to the technology. And with few people interested in this new generation of technology—or even deeply skeptical—the best move could be not to market “5G”, but instead market the value it will bring without referring to 5G at all.


Thomas Neubauer is Vice President of Business Development & Innovations for TEOCO. Thomas was the founder and Managing Director of Symena, acquired by AIRCOM; AIRCOM was subsequently acquired by TEOCO in December 2013. He has more than eleven years of experience with automatic cell planning (ACP) and mathematical optimization, and holds a Ph.D. in telecommunications engineering from the Vienna University of Technology.

World Cup: Understanding is the key to avoid scoring own goals

Telecoms.com periodically invites third parties to share their views on the industry’s most pressing issues. In this piece Derek Canfield, General Manager, Business Analytics at TEOCO, talks about the network challenges associated with major sporting events like the World Cup.

Imagine the scene. It’s the World Cup final and 80,000 people in Moscow’s Luzhniki stadium are craning their necks from their seats to watch a referee look at his small video screen beside the pitch. Meanwhile, a worldwide TV audience expected to top one billion people will not only see the replays themselves, they’ll have experts and former referees explaining to them what is happening.

This situation is common for the American sports fan. The lead official at an NFL game can often be seen going under the hood to review a tight call on a private screen—a call that may have countless possible interpretations. And while the crowd boos or cheers the big-screen replays, the audience at home gets a detailed explanation from an ex-official or rules expert on the situation unfolding and the possible verdicts.

But is it right that the paying audience in the stadium, often including the most passionate fans, is left a little in the dark, so to speak? Of course not, and thus we see a good number of them with their smartphones out, trying to track what’s happening and get the inside scoop on the likely decision moments before it is revealed. In fact, our digital world is meant to give passionate fans the best of both worlds: the energy of the live event, enriched by readily available content and analysis on their devices.

However, unless the stadium’s communications capacity is being actively managed and flexed to allow the live audience to keep track of developments on their mobiles, it is possible and even likely the stadium will score a network own goal and leave its audience frustrated.

Simply cranking up the network capacity is only part of the solution. To really improve things for the fans on site, operators need a much better understanding of what the spectators in the stadium are actually doing. This involves tracking the apps they are using, the feeds they are watching, and understanding all of the services they are trying to access. Without access to that level and granularity of data, it will be almost impossible for the stadium service provider to really improve the quality of service being provided.

Unfortunately, the challenge in those circumstances is that the majority of the data traffic in the stadium will be encrypted. In fact, Gartner has predicted that by next year as much as 80 per cent of all web data traffic will be encrypted. So how can the operators know what’s going on, if they can’t actually see the services being used?

With advanced analytics solutions, operators have found it is possible to gain actionable insights and make their massive event preparations run smoothly on the big day. By applying machine learning and heuristics together with real-time digital analytics, operators get deeper visibility into the big blind spot of encrypted data traffic and can extract the metrics they need. Armed with this intelligence, they can make adjustments to the network and service without compromising data security and privacy.

Machine learning is used to provide sufficient visibility of the encrypted data and the traffic flow so that the network can effectively ‘self-identify’ the application or service being used—for example, on-demand video from a specific sports app streaming in HD resolution. Armed with that knowledge, the operator can then apply modelling to understand how it should adjust the network to deliver the best experience.
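As a purely hypothetical illustration of this idea: encrypted payloads hide content, but flow metadata such as packet sizes and data rates remains visible, and even a tiny nearest-centroid model trained on labelled flows can guess the service class from those features. The profiles, feature values and labels below are invented for the sketch and are not the classifier described in the article.

```python
# Purely hypothetical illustration: encrypted payloads hide content, but flow
# metadata (packet sizes, data rates) is still visible, and a tiny
# nearest-centroid model over such features can guess the service class.
# The profiles, feature values and labels are invented for the sketch.

# (mean downlink packet size in bytes, mean downlink rate in Mbit/s)
PROFILES = {
    "hd_video":  (1400.0, 8.0),
    "sd_video":  (1200.0, 2.5),
    "messaging": (200.0,  0.05),
}

def classify_flow(pkt_size, rate):
    """Assign an encrypted flow to the closest known traffic profile."""
    def dist(centroid):
        # Scale each feature so neither dominates the distance.
        return (((pkt_size - centroid[0]) / 1500) ** 2
                + ((rate - centroid[1]) / 10) ** 2)
    return min(PROFILES, key=lambda name: dist(PROFILES[name]))

label = classify_flow(1380.0, 7.2)   # large packets at high rate: HD video
```

Production systems use far richer features (inter-arrival times, TLS handshake metadata, burst patterns) and trained models, but the principle—inferring the service from what encryption cannot hide—is the same.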

At this year’s Super Bowl in Minneapolis we worked with the stadium network operator to provide real-time analysis of stadium upload and download network traffic. At any point during the game the operators could see a full picture, with subscriber-level granularity, of numerous items: the top ten apps and services being used, the balance between HD and standard definition in video streaming, how much content people were uploading, and the average data speed being achieved.
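The kind of rolling aggregation described above might look like the following sketch, reducing per-session usage records to game-day views such as top apps by volume and the HD/SD streaming split. The record fields and app names are illustrative, not the actual stadium data model.

```python
# Sketch of the rolling aggregation described above: per-session usage
# records are reduced to the views watched on game day, such as top apps by
# volume and the HD vs SD streaming split. The record fields and app names
# are illustrative, not the actual stadium data model.
from collections import Counter

records = [
    {"app": "sports_stream", "bytes": 900_000_000, "video": "HD"},
    {"app": "twitter",       "bytes": 120_000_000, "video": None},
    {"app": "sports_stream", "bytes": 400_000_000, "video": "SD"},
    {"app": "messaging",     "bytes": 30_000_000,  "video": None},
    {"app": "twitter",       "bytes": 80_000_000,  "video": None},
]

def top_apps(recs, n=10):
    """Rank applications by total bytes carried."""
    totals = Counter()
    for r in recs:
        totals[r["app"]] += r["bytes"]
    return totals.most_common(n)

def video_split(recs):
    """Count streaming sessions by resolution (HD vs SD)."""
    return Counter(r["video"] for r in recs if r["video"])
```

In a live deployment these reductions would run continuously over streaming records rather than a static list, but the shape of the answers—ranked volumes and categorical splits—is the same.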

With spectators messaging friends, capturing videos and pictures of themselves or the players to upload to Twitter, or searching for feeds showing highlights and replays, it’s vital to track all that network activity in real time. As well as maximising network performance, it becomes possible to detect any part of the stadium with connectivity issues, or even to highlight a rogue user consuming a vast amount of data by trying to live-stream the whole match.

At the World Cup, the service providers will also be dealing with roaming subscribers from all over the world, many of whom will only wish to use free operator stadium Wi-Fi. Without that full network traffic visibility in near-real time, managing the service to deliver a quality experience will be all but impossible.

But here’s the thing – spectators at an event have an increasing appetite for internet services and content. They represent a captive audience whose interests have been defined by their very presence in the venue – making them prime targets for special offers, promotions and add-on services. Those operators able to track ‘what’s going on’ within the network in real time will not only provide the best customer service, they will also have the best knowledge of customer behavior to sell additional airtime and game-related services.

By enabling users to do what they love to do, when and where they want to do it, operators have the ability to enrich life’s experiences. And in doing so, operators have the ability to create additional content services opening up new revenue opportunities. Keeping the activities and priorities in sync is fundamental for the operator to score profits while players on the field vie to score goals.


Derek Canfield is the General Manager responsible for Business Analytics at TEOCO. He is a veteran of the telecom industry, having spent the first half of his twenty-year career working for a North American operator and the second half with TEOCO. At TEOCO, Derek leads the go-to-market strategy for the analytics suite of products and services, focusing on the key tenet of aligning technology with business objectives to drive innovation and market leadership.