The iStore is a perfect example of how we can make money – Vodafone

Many around the world will use Apple as a benchmark of how to successfully engage the digitally native consumer, though it’s the development of the iStore ecosystem which can be used as a model for the telcos.

Speaking at Light Reading’s Software Defined Operations and the Autonomous Network event in London, Atul Purohit from Vodafone Group pointed towards the slow-burning, long-term investment made in creating the app store as the way to make money in the digital economy. This is ultimately the question which will frustrate a lot of telcos: how will the vast expense of 5G be recouped over the next decade?

With 5G, MEC and IoT trends starting to make more concrete impacts on the real world, Purohit suggested a platform model should be built around these technologies, with the telcos forming the centre of the ecosystem. This is not the blockbuster cash generator some investors might want, a silver bullet to recoup lost billions, but it is a sensible, sustainable business model. In fact, you can already see the benefits today.

Apple’s iStore and Google’s Play Store are excellent examples of the careful development of an ecosystem and the rewards which can be realised when a segment matures, but the smart home is another. Most would have presumed the telco would be an excellent focal point for the smart home ecosystem, primarily because the router is already an accepted fixture in the living room, though the likes of Amazon and Google have imposed the smart speaker in its place.

This is more an example of inaction than anything else, as the proactive internet giants wrestled the focus of the smart home away from the router and the cumbersome telcos, and onto the speaker. Services and products are being built around the smart speaker, and the financial rewards will be claimed by Amazon and Google. With the personalised experiences and IoT trends of tomorrow, the telco still has an opportunity to stake its claim as the focal point of these ecosystems.

This is a business model which will mature over time, requiring long-term investment and patience above all else, though it is a model proven successful time after time.

Edge Computing vs Fog Computing

Telecoms.com periodically invites expert third parties to share their views on the industry’s most pressing issues. In this piece freelance journalist Charlie Osborne explains the differences between edge and fog computing.

The Internet of Things (IoT) and connected devices, which are able to facilitate data collection and storage at the edge of networks, have the potential to transform enterprise systems and the way we manage data irrevocably. With the number of IoT devices in circulation set to rise to 20.4 billion by 2020, and IDC forecasting a 10-fold rise in worldwide data generation by 2025, new solutions at the edge will become critical to properly harnessing and managing that data.

The next generation of mobile LPWA and 5G technologies, including the rollout of NB-IoT and LTE-M networks, will become key to managing the IoT deployments of the future. Computing at the network perimeter, including edge computing and fog computing, will also play a crucial role in content delivery and workload management. The massive volume of data collected by enterprise players has given rise to edge computing, which is defined as a means to bring data collection and analysis close to the source of collection.

This bypasses the need to transfer data across centralized relays to central cloud systems, a costly endeavour which demands high bandwidth and can also negatively impact latency. Fog computing builds upon this concept. While edge computing is responsible for connecting edge gateways and connected devices, fog computing provides the intelligence and the protocols necessary for such decisions to take place. The architecture brings cloud computing to the edge of networks and provides the support required for systems to decide what information needs to be transferred to core systems, and what data can be managed at the edge. In turn, this can result in faster processing times and lower latency.
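The split described above, routine data handled locally and only the interesting cases forwarded to core systems, can be sketched in a few lines of Python. This is a toy illustration only: the sensor fields and the "normal" threshold are invented for the example, and a real fog layer would apply far richer policies.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    value: float

# Hypothetical policy: readings inside the normal range stay at the edge.
NORMAL_RANGE = (10.0, 90.0)

def triage(readings):
    """Split readings into data aggregated at the edge and data
    worth escalating to central cloud systems."""
    keep_local, send_to_core = [], []
    for r in readings:
        if NORMAL_RANGE[0] <= r.value <= NORMAL_RANGE[1]:
            keep_local.append(r)    # routine reading: process locally
        else:
            send_to_core.append(r)  # anomaly: forward to the core
    return keep_local, send_to_core

local, core = triage([Reading("t1", 42.0), Reading("t2", 97.5)])
# Only the anomalous reading ("t2") crosses the network to the core.
```

The point of the sketch is simply that the decision logic lives next to the data source, so routine traffic never consumes backhaul bandwidth.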

Edge and fog computing have found a place in systems ranging from corporate networks to industrial settings. Whenever an organization needs to manage IoT device deployments, whether on the factory floor for equipment monitoring, in enterprise platforms working with data analytics in real time, or across smart city connected devices, connected cars, and home mesh networks, these technologies can lessen the strain on cloud environments and bandwidth requirements.

When it comes to telecoms and 5G network deployment, some experts believe that fog and edge computing will provide the missing link between cloud architectures and end-users. 5G networks, due to roll out in 2020, aim to reach mobile speeds of up to 1Gbps and reduce latency to sub-millisecond levels.

In order to achieve this, 5G will need to utilize edge and fog computing to support dense networks and eradicate the risk of bottlenecks caused by the high bandwidth demands of transferring data to centralized cloud environments. While 5G will be driven by high-frequency radio waves, fog computing is a necessity because of the latency introduced when cloud-based application requests must travel from core networks to the cloud.

Without a means to bring processing power and compute close to the edge, 5G networks will still suffer from high latency problems. Edge and fog computing platforms are likely to be adopted at speed in the coming years as enterprises, telecoms, and industrial players understand the opportunities for growth and efficiency improvements created by utilizing not just core cloud environments, but peripheral networks.

IoT devices, next-generation communications protocols, and networks, coupled with our desire to make everything from the factory floor to our smartphones more intelligent, will all require a stable backbone to perform. Computing at the edge can supply the support we need.

 

Charlie Osborne is a professional journalist based in London, UK. She is a freelance editor, educational material creator and contributes to IoT World News as a feature writer with a focus on consumer technology, innovation, smart technology, mobility, edtech, and security.

 

Edge Computing Congress returns to the German capital. Meet the entire edge ecosystem and discover how cloud computing, 5G and IoT connected services can provide seamless connections at the network edge.

The benefits of edge computing, IoT for mobile operators in the smart city space

Telecoms.com periodically invites expert third parties to share their views on the industry’s most pressing issues. In this piece freelance journalist Charlie Osborne explores the role of edge computing in enabling the smart city.

Mobility has paved the way for the rapid adoption of the Internet of Things (IoT) and connected devices worldwide. At the core of IoT and edge computing technologies is mobile networking, which has seen big data, analytics, and cloud computing take over what were once dumb, on-premise enterprise systems and networks, ranging from security solutions to customer assistance platforms.

As 5G networking looms on the horizon, telecommunications firms are poised to take advantage of what IoT and edge computing can offer. Data demands and the thirst for spectrum have forced a rapid re-think in how telecoms firms manage their resources, prompting a shift towards cloud-driven architectures. Any technology which can lessen the burden of data consumption is worthy of note, and edge computing has the potential to drastically reduce the transfer and management costs of data.

Edge and fog computing both bring data analysis and collection closer to the source, giving companies the option to reduce the bandwidth required to manage information, as well as reduce latency. Mobile operators can choose to store content which demands high bandwidth at network peripheral nodes, rather than force transfers to core systems at the heart of a network. In turn, this can reduce the workload of mobile networks.
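As a rough sketch of why storing popular content at a peripheral node cuts backhaul load, consider a toy LRU cache standing in for that node. The capacity, content names, and fetch counter are all invented for the example; a production edge cache would be a CDN appliance, not twenty lines of Python.

```python
from collections import OrderedDict

class EdgeCache:
    """Tiny LRU cache standing in for a peripheral node that serves
    high-bandwidth content locally instead of fetching it from the core."""

    def __init__(self, capacity=3):
        self.capacity = capacity
        self.store = OrderedDict()
        self.core_fetches = 0  # rough proxy for backhaul load

    def get(self, key, fetch_from_core):
        if key in self.store:
            self.store.move_to_end(key)  # cache hit: no backhaul traffic
            return self.store[key]
        self.core_fetches += 1           # cache miss: one trip to the core
        value = fetch_from_core(key)
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict least recently used
        return value

cache = EdgeCache(capacity=2)
for title in ["clip-a", "clip-a", "clip-b", "clip-a"]:
    cache.get(title, lambda k: f"bytes-of-{k}")
# Four user requests, but only two trips across the core network.
```

Every repeat request served from the node is bandwidth the core network never has to carry, which is the workload reduction the paragraph above describes.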

Fog computing is poised to become a necessary element of future mobile networks, which will not only be faced with the challenge of catering for data-hungry consumers but will also need to support the ever-growing array of IoT devices and deployments in our cities through networks such as mobile LPWA, NB-IoT, and LTE-M. The local data storage and analysis provided by edge computing have a vast array of applications in the smart city arena, from which telecoms and mobile operators can benefit.

Cities worldwide are expected to house an additional 2.5 billion new residents by 2050, and as populations grow, smarter solutions are required for sustainable urban living. Research suggests that the smart city market will be worth $2.4 trillion in 2025, but the core of this growth is dependent upon telecoms firms providing the support required for smart city IoT deployments through the adoption of fog and edge computing.

Smart city use cases

Transport: Intelligent transport systems, ranging from smarter traffic lights to autonomous vehicles, have a place in smart cities.

Not only can IoT provide a way to improve traffic flows, but intelligent systems can improve road safety and reduce pollution levels. Vehicle-to-vehicle (V2V) communications, IoT sensors, GPS platforms, and RFID technologies all produce data and require rapid analysis to be fed back to connected devices in real-time. Fog computing and IoT, supported by mobile networks, provide the backbone to manage this information while reducing the risk of bottlenecks. In turn, telcos have the opportunity to secure fresh revenue streams in the automotive industry, an area of business once completely separate.

Smart energy grids: In order to replace aging water, gas, and electricity meters with more intelligent alternatives, smart cities require networks able to cater to low-power sensors spread over a vast area. Fog computing can act as a bridge between smart grids and cloud platforms for the purpose of collecting, computing, and storing smart meter data before transmitting it to the cloud, improving the privacy and security of such sensitive data across a range of geographical locations. LoRa networks, which are able to support massive IoT deployments, are currently being rolled out to support smart energy grids and platforms by a number of telecommunications providers and ISPs.

Resident services: The concept of the smart city also includes ways to improve the lives of residents. These can include sensors for monitoring the environment, local services applications, and intelligent services for caring for the elderly or those with medical conditions.

As fog computing is able to cater for data processing across a variety of nodes in different areas in a secure fashion, this kind of architecture should be a top choice to support resident services, especially when sensitive data is involved. Many of these services focus on mobile devices as a point of entry, which, in turn, can be harnessed by telcos to create fresh revenue streams.

As our thirst for bandwidth deepens and our cities become smarter, mobile operators must have the capacity and infrastructure in place to support us. When our roads are managed through IoT and AI, bottlenecks can have a severe impact on safety and emergency services, and critical utilities cannot be interrupted, lest cities face the economic consequences.

As a result, telecoms firms will be held to a high standard in terms of speed and reliability. However, the smart city will also provide a wealth of opportunities for mobile operators to turn a profit.

 


Ericsson upgrades Radio System, partners with Juniper on backhaul and buys CENX

Ahead of MWC Americas Ericsson has embarked on a frenzy of announcements around its core product offering.

The headline news is a significant upgrade to the Ericsson Radio System, its signature RAN product suite that has been a major part of its apparent recovery. Specifically Ericsson is launching something called the RAN Compute portfolio, which consists of a couple of baseband processors and a couple of radio processing units designed to be positioned wherever in the network you want your processing to be done. In other words this is a mobile edge computing play.

The other big thing in the new, improved ERS is some new software called Ericsson Spectrum Sharing. This is designed to support both 4G and 5G dynamically on the same spectrum, so long as you’re using ERS hardware shipped since 2015, and can be installed remotely. While some of 5G will take place on higher frequencies, the spectrum currently used by 4G has the best propagation characteristics and will therefore remain valuable. This is the kind of 5G software upgrade Ericsson has been promoting as a key feature of ERS from the start.

“The hardware and software that we are launching today continues to address the flexibility needed for the next-generation networks,” said Ericsson EVP of Networks Fredrik Jejdling. “They offer our customers an expanded and adaptable 5G platform, making it easier for them to deploy 5G.”

We had a chat with Nishant Batra, Head of Product Area Networks at Ericsson, ahead of the announcement and he stressed this is all about ramping ERS’s 5G capability. Initially the propaganda was all about it being 5G upgradable, then about being ready for the 5G launch. Now the narrative revolves around this kit being positioned for the mass deployment of 5G.

Ericsson wants the world to see a picture of growing positive momentum and trying to be the perceived leader in 5G kit is a key part of that. “The momentum has never been better and we want to keep accelerating,” said Batra.

All this RAN shininess isn’t much good without some top-notch backhaul, however, and nobody is claiming that as an Ericsson strength. 5G is set to massively increase the volume of data passing across networks so, while being sure to big-up its own Router 6000 backhaul product and microwave tech, Ericsson has announced the extension of its partnership with Juniper to augment its transport efforts, as well as a new partnership with ECI on the optical side. So much for the big Ericsson Cisco partnership eh?

“Our radio expertise and knowledge in network architecture, end-user applications and standardization work put us in an excellent position to understand the requirements 5G places on transport,” said Jejdling. “By combining our leading transport portfolio with best-in-class partners, we will boost our transport offering and create the critical building blocks of next-generation transport networks that benefit our customers.”

“Commercial 5G is expected to represent close to a quarter of all global network traffic in the next five years,” said Manoj Leelanivas, Chief Product Officer at Juniper Networks. “With both companies bringing together industry-leading network technology, Juniper and Ericsson will be able to more effectively capitalize on the immense global market opportunity in front of us and help our customers simplify their journey to fully operational 5G networks.”

In other Ericsson news it has indulged in a rare bit of M&A via the acquisition of US service assurance vendor CENX. This move is designed to augment Ericsson’s OSS and managed services offerings and CENX is all about cloud-native automation, so its technology and 185 staff should be especially helpful in the area of virtualization. They haven’t said what it cost.

“Dynamic orchestration is crucial in 5G-ready virtualized networks,” said Mats Karlsson, Head of Solution Area OSS at Ericsson. “By bringing CENX into Ericsson, we can continue to build upon the strong competitive advantage we have started as partners. I look forward to meeting and welcoming our new colleagues into Ericsson.”

“Ericsson has been a great partner and for us to take the step to fully join Ericsson gives us the best possible worldwide platform to realize CENX’s ultimate goal – autonomous networking for all,” said Ed Kennedy CEO of CENX. “Our closed-loop service assurance automation capability complements Ericsson’s existing portfolio very well.”

Lastly Ericsson has announced a new partnership with US operator Sprint to build a new virtualized core and operating system dedicated just to IoT. Network slicing will be a major feature of the 5G era and IoT has network requirements quite distinct from other usage models, so it makes sense to not just apportion a piece of the network to it, but customise all the other tech too.

“We are combining our IoT strategy with Ericsson’s expertise to build a platform primed for the most demanding applications like artificial intelligence, edge computing, robotics, autonomous vehicles and more with ultra-low-latency, the highest availability and an unmatched level of security at the chip level,” said Ivo Rook, SVP of IoT for Sprint. “This is a network built for software and it’s ready for 5G. Our IoT platform is for those companies, large and small, that are creating the immediate economy.”

“Sprint will be one of the first to market with a distributed core network and operating system built especially for IoT and powered by Ericsson’s IoT Accelerator platform,” said Asa Tamsons, Head of Business Area Technology & Emerging Business at Ericsson. “Our goal is to make it easy for Sprint and their customers to access and use connected intelligence, enabling instant and actionable insights for a better customer experience and maximum value.”

That Ericsson is making so many announcements ahead of MWC Americas would appear to be a major endorsement of the event and of the GSMA’s regional expansion of the MWC brand. The timing might also have been influenced by the staging of Huawei’s Operations Transformation Forum event and even IFA, and it’s clear there is room in the telecoms calendar for big Autumn trade fests.

Edge Computing: What industrial enterprises want from service providers

Telecoms.com periodically invites expert third parties to share their views on the industry’s most pressing issues. In this piece Charlie Osborne of Edge Computing Congress looks at the role edge computing has to play in Industry 4.0.

The adoption of edge computing is going to become key for industrial companies seeking to implement modern cost-saving measures, adopt 5G, and make the transition to Industry 4.0.

The manufacturing sector is no longer centred around isolated equipment and manual maintenance. Instead, businesses have the potential to thrive due to data.

Industrial companies are now operating on a global scale and have IIoT devices, software, and services at their disposal to transform data points into information which can improve both operational efficiency and the bottom line.

Industrial players and companies at large are no longer forced to send their data to one area in-house or to the cloud for analysis, which requires high levels of expenditure and can cause latency and security challenges.

Instead, Internet of Things (IoT) devices, sensors, and cloud services, empowered by edge computing, offer a scalable way to streamline traffic flows and process data in real-time close to the source, increasing the efficiency of data processing, reducing latency, initiating bandwidth savings, and more.

IoT micro data centres used by edge computing will become even more important in the future with the emergence of 5G networks.

Despite predictions that enterprises will spend over £1 trillion on IoT networks, devices, and management systems by 2020, the industrial sector has yet to realize the full potential of edge computing.

Edge computing has the potential to disrupt the industrial sector more than most. Companies are making the move towards Industry 4.0 but recent research suggests that only a quarter of industrial firms feel they have a sufficient understanding of how digital processes can improve their business.

Only three percent of the data points gathered by today’s industrial IoT systems are utilized, which leaves 97 percent of otherwise actionable data floating in the wind.

Service providers have a key role to play in the adoption of edge computing. As enterprise players lean towards this next-generation technology in order to improve operational efficiency and to tap into the benefits of big data and IoT, service providers cannot afford to be left behind.

The enterprise expects service providers to innovate in order to push intelligence and computing out to the edge, meeting both the technological and data challenges of modern manufacturing, as well as offer ways to future-proof industrial processes.

Whether or not industry players know it yet, edge computing will become a necessity in the coming years to facilitate the transfer and analysis of large volumes of data and to control industrial and operational processes making the shift from legacy systems to Industry 4.0.

Companies offering such services need to establish a strategic plan to take advantage of what edge computing offers. The deployment of edge computing will become a crucial component of future industries, but many companies need to understand where to start.

Service providers must understand what industrial players expect, the challenges ahead, and how edge computing can be implemented as a justifiable investment which benefits the enterprise.

For more information, and to learn what is wanted, needed, and expected from edge computing service providers, download our recent report.

 

Edge Computing Congress is returning to the German capital next month. Register to attend this event for the opportunity to meet the entire edge ecosystem and discover how cloud computing, 5G and IoT connected services can provide seamless connections at the network edge.

Orange points to privacy benefits through MEC

Mobile Edge Computing (MEC) is back on the buzzword agenda after spending a few years in the wilderness and Orange has pointed to an interesting privacy benefit to the technology.

After getting a technology tour at Roland Garros this week, one of the quick demos offered some insight into the world of video analytics and edge computing. Using several different wireless cameras scattered around the venue and various AI applications, Orange is able to keep track of the number of individuals in one particular area. This could be one of the entertainment areas or the courts themselves, but the algorithm is able to give an accurate estimate of how populated these areas are, which can help with crowd control or security.

The idea of using facial recognition through video surveillance has started to create some privacy concerns in recent months, as there is little awareness among the general public, who have not consented to being monitored, but this is where it gets interesting. Orange pointed out that the images are not detailed enough to identify specific individuals, just to count the number of individuals in an area, but even if they were, it wouldn’t matter because of edge computing.

With processing power located at the edge of the network, the data can be processed and insight captured before the raw data is deleted. Useless information can be sifted out at the edge, with only the relevant data or insight sent back to the core. By empowering the edge, privacy concerns are negated, as personal information is not actually being stored by Orange, simply the insight, which would not be considered sensitive.
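The pattern Orange describes, extract the insight at the edge and discard the raw footage, can be sketched as follows. This is a toy illustration: the frame format and zone names are invented, and a real deployment would run a vision model on actual video rather than counting pre-made detections.

```python
def count_people(frame):
    """Stand-in for an on-edge vision model; here we simply count
    the simulated detections attached to a frame."""
    return len(frame["detections"])

def process_at_edge(frames):
    """Process raw frames locally: extract only the headcount per zone,
    then drop the frames so no imagery ever leaves the edge node."""
    insight = {}
    for frame in frames:
        zone = frame["zone"]
        insight[zone] = insight.get(zone, 0) + count_people(frame)
    frames.clear()   # raw (potentially sensitive) data is discarded here
    return insight   # only this aggregate travels back to the core

frames = [
    {"zone": "court-1", "detections": ["p1", "p2"]},
    {"zone": "court-1", "detections": ["p3"]},
    {"zone": "fan-zone", "detections": ["p4", "p5", "p6"]},
]
summary = process_at_edge(frames)
# summary == {"court-1": 3, "fan-zone": 3}; the frames list is now empty
```

What reaches the core is a handful of integers per zone, which is exactly why the privacy argument holds: there is nothing personal left to store.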

This is not a revelation which is going to change the technology world, but it is an interesting little benefit which addresses a growing concern in the wider society.

Nokia takes 5G to the edge in Brooklyn

Networking vendor Nokia is using a 5G event in New York to show off some of its latest shiny things.

The Brooklyn 5G Summit describes itself as a ‘5G technology summit hosted in Brooklyn, NY’, which seems hard to argue with. The listed contact for the event is a Nokia email address so we’re going to assume Nokia runs the whole thing unless advised otherwise, and there don’t seem to be any other vendors involved.

The big thing Nokia is looking to bring attention to this year is its Edge Cloud datacenter solution, which is inevitably being positioned as 5G-ready. Nokia has been putting a lot of effort into the datacenter side of things in recent years via its AirFrame portfolio, which looks like an increasingly wise bet as edge computing becomes ever more prominent in the telecoms world.

This announcement concerns a server specifically designed for edge computing. It puts an emphasis on open architectures and software for fast deployment (it’s OPNFV compatible), and support for ultra-low latency to support things like automation and Cloud RAN. All this stuff plays a big role in 5G so that juxtaposition seems fair enough in this case.

“The edge cloud will play an essential role in delivering the compute power required for 5G,” said Marc Rouanne, President of Mobile Networks at Nokia. “By expanding our AirFrame and 5G Future X portfolio we can provide a network architecture that meets the needs of any operator and their customers.

“Used with the Nokia ReefShark chipset and our real-time cloud infrastructure software, the Nokia AirFrame open edge server will deliver the right decentralization of 4G and 5G networks. We can work with operators to ensure that data center capabilities are deployed exactly where they are needed to manage demands as they expand their service offering.”

“The edge cloud is an integral part of 5G network architecture, bringing more processing capabilities closer to where data is generated and consumed,” said Dan Rodriguez, GM of the Communications Infrastructure Division at Intel. “Nokia’s new AirFrame open edge solution is built on Intel Xeon Scalable processors, which offer the needed balance of compute, I/O and memory capacity for the edge cloud to work seamlessly across the wide range of workloads deployed on the edge.”

And that’s not the only piece of 5G-related goodness Nokia has bestowed on the grateful residents of Brooklyn this week. Nokia Bell Labs has persuaded NTT DOCOMO to get involved in a demo of some millimetre wave tech involving a phased-array chip solution for the 90 GHz band, designed to increase radio coverage in higher frequency bands and deliver multi-gigabit speeds at scale.

The main point of this demo seems to be to show the viability of 5G at these very high frequencies, including the use of a large number of antennas, and how dynamic offloading relocation in a 5G core will enable low-latency networks.

“At Bell Labs, we work with leading operators such as NTT DOCOMO to develop disruptive technologies that will redefine human existence,” said Bell Labs President Marcus Weldon. “At the Brooklyn 5G Summit, we will show the world’s first RF solution that addresses the challenge of delivering optimized coverage for future mmWave frequencies, using a pioneering RFIC design that can be scaled to any array dimension and deliver optimized connectivity to any set of devices.”

Multi-Access Edge Computing will transform network economics

Telecoms.com periodically invites expert third parties to share their views on the industry’s most pressing issues. In this piece Bejoy Pankajakshan, SVP, Technology and Strategy at Mavenir, takes a look at what Multi-access Edge Computing will bring to the table.

Rapid advances in mobile computing power are pushing machine intelligence to the very edge of the network. Known as Multi-access Edge Computing (MEC), this development is a critical success factor in the rise of the new 5G mobile networks that will deliver unheard of data speeds to an exponentially larger user base, and all at a fraction of the cost-per-bit expense.

The primary advantage of moving processing out to devices at the edge of the networks is that it delivers the ultra-low latency needed to communicate efficiently; essentially eliminating the need to constantly query centralized intelligence engines every time a device requests information. Instead, the device can perform the computation itself, extending applications and services to the very edge of the network, and in close proximity of the user.

When you combine that kind of decentralized computing power with the web-scale platforms that allow companies like Amazon, Facebook, and Google to efficiently manage billions of users with minimal resources, then you have a recipe for huge commercial success. A good thing too, because by some estimates, as many as one trillion devices will be connected to the internet by 2030.

To deliver connectivity to anything near that kind of user pool, CSPs will need to adopt a radically new operational approach towards service roll-out, maintenance, and support functions. Fortunately, with the promise of MEC and 5G, this approach will address both human and machine-to-machine (M2M) connections.

In this bright, new, hyper-connected world, applications such as enhanced personal assistants (e.g., Siri, Google Assistant) that perform truly complex “smart-home” tasks are moving into the mainstream. And this is where the exciting machine-to-machine part comes in. With MEC, your personal assistant will go even further – it will step out of the home with you and into your car and literally drive you to work in the morning, communicating with city sensors and other smart-devices along the way.

Anticipating the applications that MEC enables, pioneering companies have already built an end-to-end portfolio of software-based, open hardware, web-scale solutions that make the 5G vision a brilliant new reality. For example, enabling MEC with software solutions in the cloud, such as Cloud RAN and vEPC, enables the creation of a virtualized, cloud-centric mobile network. Now operators can leverage back-end infrastructure while placing intelligence and local storage out at the edge of the network. With technology like this, living on the edge is exciting.

 

Meet Mavenir and learn more about progress towards 5G in the Middle East and North Africa next month at 5G MENA 2018, the largest event in the region to focus on advancing and commercialising 4G and 5G networks.

The potential of Machine Learning to optimize content distribution

Telecoms.com periodically invites expert commentators to share their insights into the most pressing industry issues. In this piece network AI specialist B.Yond explains why we need intelligence at the network edge.

We live in a connected world that is constantly streaming video, games, and music; the demand for content is always on. And with emerging technologies presenting massive potential, including virtual reality, augmented reality, autonomous transportation, and mobile infotainment, there will be an unprecedented level of demand on networks. Internet traffic on content delivery networks (CDNs) will more than double from 73 exabytes to 166 exabytes in the next three years (Cisco VNI 2017).

In addition to the data demand, these applications will require lower latency, higher reliability and better fidelity than current networks deliver. This will require a significant change in network infrastructure from a centralized to a massively distributed architecture. To truly manage the volume and demand on the new network infrastructure and provide an optimized consumer experience with content at the edge, we need software-driven solutions that are focused on significantly reducing operational cost. We need intelligence at the edge.

Bringing Content to the Edge

To meet increasing content volumes, networks must effectively and intelligently manage massive and divergent amounts of data, be available anywhere with the capability to respond instantaneously, and have extensive security capabilities to address privacy concerns. However, the current industry-standard architecture, based on centralized network infrastructure, cannot meet these requirements.

A network with a massively distributed architecture, leveraging cloud, Network Function Virtualization (NFV) and Software-Defined Networking (SDN) technologies to employ edge computing, improves operational efficiency, reduces CAPEX and creates opportunities for new revenue streams. By significantly reducing the distance between the mobile user and content, edge computing enhances network security, improves scalability and responsiveness, and supports low-latency applications.

With enhanced opportunity for content delivery through edge computing, there are further opportunities for growth and revenue. In CDNs, we are seeing a trend of companies building their own private servers at the edge and moving away from distributing content through a shared CDN provider. For content providers, this shift to privatization is lowering the cost of handling increasingly high-definition video, improving the user experience and enhancing security.

There is a prime opportunity for operators and cable providers to capitalize on this by creating private or shared CDN servers. This can be achieved by repurposing central offices and adding nodes to cell sites and virtual Customer Premise Equipment (vCPE). Operators can enable new revenue streams by building private CDNs on their wireless and wireline networks. With 5G and network slicing, the costs can be reduced further.

An Intelligent Approach to Managing and Optimizing Content Delivery

As content is pushed to the edge, the automated, intelligent management and optimization of the network becomes essential. By applying Machine Learning (ML) and Artificial Intelligence (AI) to a distributed infrastructure, operators can proactively identify network traffic patterns and respond appropriately to communications traffic demand, with a focus on improved customer experience. The process works by gathering real-time performance data from the software-defined core and access networks, then using ML and AI algorithms to provide guidance instantaneously. Applied to video applications, this lets service providers optimize the end-to-end Quality of Experience (QoE) to reduce start-up delay, eliminate freezes and improve video quality.

Imagine a customer watching the latest episode of “Stranger Things”, streamed from the closest local server in “central office one”. As traffic on the network begins to increase, the ML platform proactively identifies the potential impact on content delivery and responds automatically, in this case by making a copy of “Stranger Things” in another central office that may not be as close physically but has more transport capacity available. For customers, it means never again having a Netflix binge disrupted.
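The proactive replication step described above can be sketched in a few lines. This is a minimal illustration only: the office names, the 80% utilization threshold, and the naive moving-average `predict_load` stand-in for a real ML forecast are all assumptions for the sake of the example, not B.Yond's actual platform or API.

```python
def predict_load(office, history):
    """Stand-in for an ML traffic forecast: here, a naive moving average
    over the last three utilization samples (0.0 to 1.0)."""
    recent = history[office][-3:]
    return sum(recent) / len(recent)

def plan_replication(content, serving_office, offices, history, threshold=0.8):
    """If the serving office is forecast to saturate, pick the office
    with the most forecast headroom as a replication target."""
    if predict_load(serving_office, history) < threshold:
        return None  # serving office has headroom; no action needed
    candidates = [o for o in offices if o != serving_office]
    target = min(candidates, key=lambda o: predict_load(o, history))
    return (content, target)

# Illustrative utilization history per central office
history = {
    "central-office-1": [0.70, 0.82, 0.91],  # load trending up
    "central-office-2": [0.40, 0.38, 0.42],
    "central-office-3": [0.60, 0.65, 0.63],
}
action = plan_replication("stranger-things-s03", "central-office-1",
                          list(history), history)
print(action)  # ('stranger-things-s03', 'central-office-2')
```

A production system would replace the moving average with a trained traffic model and trigger the actual content copy, but the control loop (measure, predict, act before the customer notices) is the same.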

Because of the scale and reach of their networks, and their ability to access full end-to-end infrastructure data, operators have an advantage over content providers distributing over the top today. To seize the opportunity, operators need to build a virtualized CDN infrastructure with a next-generation ML- and AI-based management solution. Beyond being necessary to effectively and dynamically manage an increasingly complex network, an intelligent management solution will also deliver an enhanced quality of experience and new revenue streams.

Intelligence is Necessary for Progress

With the explosion of content, there is no question that a move to the edge is required to support a new wave of increasingly demanding content-based applications. But the move to a distributed infrastructure is not enough. Without proactive intelligence, the complexity of a massive edge network and the demands of the content become unmanageable and turn into an operational nightmare.

Optimal customer QoE requires applying ML and AI to network performance data in order to guide the CDN infrastructure and video applications. Operators and content providers must work together to bring intelligence to the edge and advance the capabilities of content delivery.