Sharing, selling and standardizing – the great spectrum conundrum

With the World Radiocommunication Conference currently underway in Egypt, it’s timely to discuss some of the spectrum issues facing the industry today.

Spectrum is and will probably remain a hot topic for the industry due to its critical importance. The success of a telco is partly defined by the spectrum licences it is able to hoard, and depending on where you are in the world, the scarcity of these assets varies. That said, to describe anywhere as having an abundance would be foolish to say the least.

Starting with the idea of selling spectrum, this is a topic which is under constant debate, review and criticism.

“When you talk about spectrum, the price you have to pay always has an impact on the rollout strategy,” said Jasper Mikkelsen, Director of Public and Regulatory Affairs for the Telenor Group, during a panel session this week.

This is where the balancing act is at its finest. The regulators will argue they need to charge for access to spectrum for several reasons, but the price of spectrum is often the centre of criticism in various markets. Mikkelsen pointed out that during the auctions in Thailand earlier this year, many of the spectrum assets went unsold due to the reserve prices assigned to the lots.

However, according to Donald Stockdale of the FCC, the complaints from telcos might have some merit. If spectrum is unsold at the end of the auction, this is most likely due to the auction being badly designed. Perhaps not enough was released, the channels hadn’t been cleared effectively, the reserve price was too high, or the obligations attached to the winning assets were deemed unreasonable by the telcos.

Telcos will always complain and point to markets where spectrum is effectively given away for free; however, there are cases where they have a point. If such a valuable asset remains unsold, despite the pleas of telcos to free up more spectrum, there is perhaps something wrong with the product itself.

What is worth noting is that the auction process is not perfect. There will always be complaints and criticism, though it is currently the least worst option. It is certainly better than the ‘beauty contest’ concept, which leaves the door wide open to corruption.

Designing and managing spectrum auctions remains something of a 'trial and error' process, which will come as little comfort to those shelling out the investments; however, the idea of standardisation is one which should certainly be given more traction.

This will of course be a topic of conversation at the World Radiocommunication Conference, especially concerning the higher-frequency airwaves, though there is still a lot of work to do on the spectrum licences which are already a hotch-potch of complexity.

While there is work being done to standardise spectrum across various regions, this is a lot more complicated than simply creating new rules. Bureaucrats have to deal with the dreaded concept of legacy.

As Michael Sharpe, Director of Spectrum and Equipment Regulation at ETSI, pointed out, there are 48 countries in Europe, all of which have been assigning spectrum to different use cases, products and services over the last few decades. Harmonisation is a topic of conversation now, but unravelling the maze of red tape which already exists in each of these nations is a very complex task.

First and foremost, there are some very attractive benefits to standardisation. In a region like Europe the risk of interference is ever-present, driving the case, while there are also benefits to be gained through interoperability and economies of scale. However, there will always be a downside.

Looking at Europe once again, the congestion of certain bands will vary depending on the demands of each nation, while the cost of clearing those bands will certainly vary quite considerably. Then you have to look at the idea of flexibility.

Politicians generally don’t like being told what to do, and they like it even less when it comes from bureaucrats over which they have very little influence. In designing a harmonised approach to spectrum allocation and usage, flexibility will need to be built into the process to ensure each nation can address the specific needs of dominant industries and the nuances of societal variance.

It is of course very difficult to judge the right balance, but it is a critical element not only to ensure economic prosperity in each of the nations, but to make sure the rules are adopted by the governments in question. If it is too much of a hindrance or costs too much to clear the bands, who is to say these suggestions will not simply be ignored? These are sovereign states, after all.

The final area attracting some attention is a spectrum-sharing initiative from the FCC in the US.

Focusing specifically on the valuable 3.5 GHz spectrum band which is being championed in Europe to deliver the first 5G services, the FCC is trialling a dynamic spectrum sharing project. Known unofficially as the ‘Innovation Band’, it offers a palatable compromise between high-speed data transmission and extended coverage. However, the US has found itself in a bit of a pickle as the current incumbent on this spectrum band is the Navy.

The spectrum is currently utilised by the Navy in offshore radar operations, however it is not being used all the time, such is the nature of naval operations. For such valuable spectrum, this is largely viewed as a waste.

Stockdale highlighted that the team has created a three-tier, demand-orientated system, where spectrum is utilised depending on the presence of those in the tier above. The Navy has the right to use the band first and foremost, though when it becomes unutilised, mobile service providers can purchase licences to gain access in the second tier. Should the Navy or the telcos not be making use of the spectrum, it can be assigned for general use to those approved in the third tier.
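
To make the mechanics concrete, below is a minimal sketch of how such a tiered grant decision could work. It borrows CBRS-style tier names, and the function and identifiers are illustrative assumptions, not the FCC's actual spectrum access system.

```python
# A toy model of three-tier spectrum access: a request is granted only
# if nobody in a higher tier is currently using the band.
from enum import IntEnum

class Tier(IntEnum):
    INCUMBENT = 1   # e.g. Navy radar: absolute priority
    PRIORITY = 2    # mobile service providers with purchased licences
    GENERAL = 3     # approved general-access users

def grant_access(requested: Tier, active: set) -> bool:
    """Grant the band only if no higher tier is currently active."""
    return not any(t < requested for t in active)

# The Navy is not transmitting, so a telco gets the band...
print(grant_access(Tier.PRIORITY, set()))              # True
# ...but must vacate as soon as incumbent radar activity appears.
print(grant_access(Tier.PRIORITY, {Tier.INCUMBENT}))   # False
print(grant_access(Tier.GENERAL, {Tier.PRIORITY}))     # False
```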

Although this is only a trial for the moment, it demonstrates the point made above. Flexibility needs to be built into spectrum harmonisation initiatives, as it is unrealistic to repurpose this band in the US. The cost and effort are unlikely to be justifiable when you consider the size of the US Navy.

This is an excellent example of innovation when looking at spectrum, and regulators around the world should be paying attention to the lessons learned through this experiment. The idea of dynamic spectrum sharing could be huge if the fundamentals are validated here, such is the demand for this valuable and increasingly scarce resource.

This is of course not the only example of spectrum being repurposed in regions where it is not being utilised. In the UK, Ofcom has introduced rules which dictate that unused spectrum must be released, assuming there is demand.

Vodafone recently announced it has entered into a three-year agreement with StrattoOpencell to share the use of its 2.6 GHz spectrum assets to deliver connectivity in Devon. The spectrum licences are being used in highly-urbanised areas but not in the countryside, so without rules like these from the regulator the asset would be put to inefficient use.

Although spectrum is a topic which has been the centre of many debates, it does appear it will be an everlasting ebb and flow. The World Radiocommunication Conference will likely free up some more spectrum, but the TMT industry is very good at finding ways to use it. Scarcity is most likely going to persist, though there are some interesting conversations evolving to improve this niche of the mobile segment.

Europe postures with standards leadership still on the line

European standards organization ETSI has released a report demanding the continent take a leadership role for standards and regulation in the global digital economy.

While some might question whether the sluggish Brussels bureaucrats can get up to speed quickly enough, there is hope; regulators around the world all share the same track record when it comes to the painfully slow progress of creating regulatory and legal frameworks.

The report, commissioned at the request of ETSI, was authored over the first half of 2019 and demands Europe take the lead in creating the standards necessary for a healthy and progressive digital economy.

“Our competitors are very serious about taking the lead in digital transformation,” said Carl Bildt, Co-Chair of the European Council on Foreign Relations.

“It is important that EU lawmakers put standardization at the centre of EU digital and industrial strategy. Otherwise Europe will become a rule taker, forever playing catch-up in the innovation, production and delivery of new digital products and services.”

Although many would want to see a collaborative, geographically-neutral approach to standardising the digital economy, this is unlikely to happen. As Bildt highlights above, someone will take a leadership position, their standards will gain acceptance, and other regions will then have to adopt the rules.

The question which remains is whether Europe, the US or China will have the greatest influence on global standards. In fact, ETSI questions whether Europe is keeping pace with leaders today or whether its influence is waning already. Unfortunately, with the platform economy gaining more traction each day, this is one area which should not be considered a strength of the bloc.

Some 46% of platforms with revenue above $1 billion are based in the US and 35% in Asia, while Europe accounts for only 18%. These platforms often drive their own ecosystems and have largely been self-regulating to date. This is going to change in the future, though to give European organizations a chance at capturing growth, the European Commission might have to lead the charge to create open standards. The contrary approach might only offer the established players greater momentum and influence.

This is perhaps the risk which is emerging currently. The ideas of globalisation and open standards are not new, though there is evidence certain markets are heading towards a more isolationist mindset and regime.

Although it is easy to point the finger at aggressive political leaders elsewhere, the report demands Europe look inwards as well. The European Commission has to take a strong leadership position across the bloc, as with 28 member states there is a risk of fragmentation. It only takes slight variances to start, but these could snowball into greater complications. The digital tax conundrum is an example of what can go wrong over an extended period of time.

This report might be more of a generalist statement to encourage a proactive mindset from European bureaucrats, though there are plenty of examples of governments, public sector administrations and private industry trying to control the tone.

Looking at the ever more influential world of artificial intelligence, the number of feasibility, standards and ethics boards is quite staggering. All of these initiatives will want to create rules and frameworks to govern the operation and progression of AI, though only one can be adopted as the global standard. Regional variances are of course feasible, but this should not be deemed healthy.

In the UK, the Government created its AI Council. The European Commission has released various white papers exploring how AI should be governed. The White House’s National Science and Technology Council Committee on Technology is also exploring the world of AI. Facebook has even created its own independent advisory panel to aid the creation of standards and regulation.

Should Europe want to control the global standards process, it will come up against some stiff competition. The power and influence of the US should not be underestimated, as it is home to some of the world's most recognisable and profitable brands, while China has a track record of flooding working groups at standards organizations. This will have a noticeable impact on the final outcome.

That said, the success of GDPR offers hope.

Europe’s General Data Protection Regulation might have caused headaches all around the world, but it has set the tone on the approach to privacy, management of data and the influence of the consumer. Since its introduction, several other countries, India and Japan being two examples, have been inspired by GDPR to introduce similar regulation, while there have been calls in the US to do the same.

This piece of regulation was critical to ensure the European principles of privacy are maintained moving forward. This is a win, but there are still battles to be had when it comes to AI, security, encryption, cross-border data flow and access to data.

Standardisation might not be the most exciting topic to discuss in the TMT world, though taking a leadership position can offer advantages to the companies who call that region home. A thorough and innovative regulatory regime can open up new markets, ensure competition is healthy throughout the ecosystem and drive national economies at scale.

The regulatory landscape is set to undergo somewhat of a shift over the coming months and years, though which region will take the lead is still hanging in the balance.

Evolving ETSI engulfs enterprise

The connectivity landscape is evolving as quickly as the desire of telcos to diversify revenues, so it makes perfect sense that the world's standards authority does as well.

Speaking to Luis Jorge Romero, Director General of the European Telecommunications Standards Institute, or ETSI as it's more commonly known, the mission statement is shifting. Traditionally, ETSI has focused on taking care of the telcos, but Romero is broadening his embrace to bring enterprise customers and technology leaders into the equation.

“They need connectivity, so they have to be brought into the equation for the standards,” said Romero. “They are the ones who know the issues and want to have a solution. They can translate the problem into what we understand.”

As it stands, any telco which exclusively focuses on traditional connectivity services and products will struggle to survive in tomorrow's digital world. Such are the financial demands of 5G that telcos will have to source new revenues to build the ROI and provide fuel for future expansion and upgrades.

Romero highlighted that it is critical not only to ensure the telco industry is supported in this time of rapid change, but also that the voices of enterprise organizations and more specialist technology providers are heard. If the telcos are going to work more closely with industry, vertical-specific applications will need to be developed. The Asian telcos have been incredibly proactive in developing these use cases, though Europe and the US have been sluggish.

And of course, every step forward has to be standardized to ensure a healthy and sustainable industry.

At ETSI, this translates into two different types of working group: those at a high level, designed for the telcos themselves, and more drilled-down groups for vertical-specific applications. Romero pointed towards the creation of an open data platform to help the marine industry track assets throughout the world, as well as the participation of agricultural giant John Deere in IoT working groups, as two excellent examples of this evolution.

To bridge the gap between connectivity and the verticals, both segments need to be sitting down in the same room. Everyone realises this, and ETSI is taking an important step forward to facilitate progress.

ETSI publishes new spec and reports on 5G tech

The European Telecommunications Standards Institute, ETSI, has released new specifications on packet formatting and forwarding, as well as two reports on transport and network slicing respectively.

The new specification, called Flexilink, focuses on packet formats and forwarding mechanisms to allow core and access networks to support the new services proposed for 5G. The objective of the new specification is to achieve efficient deterministic packet forwarding in the user plane for next generation protocols (NGP). In conventional IP networks, built on the Internet Protocols defined in the 1980s, every packet carries all the information needed to route it to its destination. This is undergoing fundamental changes with new technologies like Software Defined Networking (SDN) and Control and User Plane Separation (CUPS), where most packets are part of a “flow” such as a TCP session or a video stream. As a result, there is increasingly a separation between the process of deciding the route packets will follow and that of forwarding the packets.
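
To illustrate the route-once, forward-many idea described above, here is a minimal sketch of a flow-based user plane. It is a toy under stated assumptions, not the actual Flexilink format: the flow_table, install_flow and forward names are invented for illustration.

```python
# Control plane decides a route once per flow; the user plane then
# forwards every packet in that flow with a single table lookup,
# rather than routing each packet from its full header.

flow_table: dict = {}   # flow label -> next hop, installed by the control plane

def install_flow(label: int, next_hop: str) -> None:
    """Control plane: compute and pin the route once for the whole flow."""
    flow_table[label] = next_hop

def forward(packet: dict) -> str:
    """User plane: constant-time lookup on a short label, no route computation."""
    return flow_table[packet["label"]]

install_flow(0x2A, "node-7")                           # e.g. a video stream's path
print(forward({"label": 0x2A, "payload": b"frame"}))   # -> node-7
```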

“Current IP protocols for core and access networks need to evolve and offer a much better service to mobile traffic than the current TCP/IP-based technology,” said John Grant, chairman of the ETSI Next Generation Protocol Industry Specification Group (ISG). “Our specifications offer solutions that are compatible with both IPv4 and IPv6, providing an upgrade path to the more efficient and responsive system that is needed to support 5G.”

The new specification defines two separate services, a “basic” service suitable for traditional statistically multiplexed packet data, and a “guaranteed” service providing the lowest possible latency for continuous media, such as audio, video, tactile internet, or vehicle position. It is worth noting that Flexilink only specifies user plane packet formats and routing mechanisms. Specifications for the control plane to manage flows have already been defined in an earlier NGP document “Packet Routing Technologies” published in 2017.
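
As a rough illustration of how a link scheduler might honour the two service classes, consider the toy priority scheme below. Strict priority is a simplification of the guaranteed-slot approach, and the queue names are assumptions rather than anything taken from the specification.

```python
# Guaranteed traffic (continuous media) is always served before basic
# (statistically multiplexed) traffic, bounding its latency.
from collections import deque

guaranteed_queue = deque()   # audio, video, tactile internet, vehicle position
basic_queue = deque()        # traditional packet data

def next_packet():
    """Serve guaranteed flows first; basic traffic uses remaining capacity."""
    if guaranteed_queue:
        return guaranteed_queue.popleft()
    if basic_queue:
        return basic_queue.popleft()
    return None

basic_queue.append("bulk-data")
guaranteed_queue.append("audio-sample")
print(next_packet())   # -> audio-sample: lowest possible latency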

The report “Recommendation for New Transport Technologies” analyses current transport technologies such as TCP and their limitations, whilst also providing high-level guidance on the architectural features required in a transport technology to support the new applications proposed for 5G. The report also includes a framework in which there is a clear separation between control and data planes. A proof-of-concept implementation was conducted to experiment with the recommended technologies and to demonstrate that each TCP session can obtain a bandwidth-guaranteed service or a minimum-latency-guaranteed service. The report states:

“With traditional transport technology, for all TCP traffic passes through DIP router, each TCP session can only obtain a fraction of bandwidth. It is related to the total number of TCP sessions and the egress bandwidth (100 M).

“With new transport technology, new TCP session (DIP flows) could obtain its expected bandwidth or the minimum latency. And most [sic.] important thing is that the new service is not impacted by the state that router is congested, and this can prove that new service by new transport technology is guaranteed.”

Importantly, the PoC experiment showed that the current hardware technology is able to support the proposed new transport technology and provide satisfactory scalability and performance.

The report “E2E Network Slicing Reference Framework and Information Model” looks into the design principles behind network slicing. The topic of network slicing encompasses the combination of virtualisation, cloud-centric and SDN technologies, but there is a gap in normalized resource information flow across a plurality of provider administration planes (or domains). The report aims to “provide a simple manageable and operable network through a common interface while hiding infrastructure complexities. The present document defines how several of those technologies may be used in coordination to offer description and monitoring of services in a network slice.” It describes the high-level functions and mechanisms for implementing network slicing, and also addresses security considerations.
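
As a loose illustration of what an end-to-end slice information model might look like, here is a sketch of a slice stitched together from resources in several provider domains behind one common interface. The field names and the latency arithmetic are illustrative assumptions, not the report's normative model.

```python
# One slice spans several administrative domains; a common model lets
# an operator describe and monitor it end-to-end.
from dataclasses import dataclass, field

@dataclass
class DomainResources:
    domain: str            # e.g. "RAN", "transport", "core-cloud"
    bandwidth_mbps: int
    max_latency_ms: float

@dataclass
class NetworkSlice:
    slice_id: str
    tenant: str
    domains: list = field(default_factory=list)

    def e2e_latency_ms(self) -> float:
        """End-to-end budget is the sum across provider domains."""
        return sum(d.max_latency_ms for d in self.domains)

slice_ = NetworkSlice("slice-001", "automotive-oem", [
    DomainResources("RAN", 200, 4.0),
    DomainResources("transport", 1000, 2.0),
    DomainResources("core-cloud", 500, 3.0),
])
print(slice_.e2e_latency_ms())   # -> 9.0
```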

ETSI releases security standards for distributed systems

The distributed, cloud-based technological environment required by 5G and IoT will present a novel set of security challenges.

In anticipation of this, the ETSI (European Telecommunications Standards Institute) technical committee on cybersecurity has released two specifications focused on attribute-based encryption (ABE). This is a more flexible, bespoke form of encryption that can be tailored to specific scenarios.

Here’s the ETSI explanation: “ABE is an asymmetric, multi-party cryptographic scheme that bundles access control with data encryption. In such a system, data can only be decrypted if the set of attributes of the user key matches the attributes of the encryption. For instance, access to employee pay data will only be granted to the role of Human Resources Employee working in the payroll department of a company, who has been there for one year or more.”
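
A toy sketch of that access model is below: decryption succeeds only when the user's attributes satisfy the policy bound to the ciphertext. Real ABE enforces this cryptographically, so a mismatched key mathematically cannot decrypt; this plain check only illustrates the access logic, and the attribute strings are invented for the example.

```python
# Ciphertext-policy style check: every attribute the policy requires
# must be present in the user's key for decryption to be possible.

def can_decrypt(user_attributes: set, policy: set) -> bool:
    """True only if the key's attributes satisfy the ciphertext's policy."""
    return policy.issubset(user_attributes)

payroll_policy = {"role:hr_employee", "dept:payroll", "tenure:1y+"}

alice = {"role:hr_employee", "dept:payroll", "tenure:1y+"}
bob = {"role:engineer", "dept:networks"}

print(can_decrypt(alice, payroll_policy))  # True: attributes match the policy
print(can_decrypt(bob, payroll_policy))    # False: decryption is impossible
```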

And here are the two specifications:

  • ETSI TS 103 458, which describes high-level requirements for Attribute-Based Encryption. One objective is to provide user identity protection, preventing disclosure to an unauthorized entity. It defines personal data protection on IoT devices, WLAN, cloud and mobile services, where secure access to data has to be given to multiple parties, according to who that party is.
  • ETSI TS 103 532, which specifies trust models, functions and protocols using Attribute-Based Encryption to control access to data, thus increasing data security and privacy. It provides a cryptographic layer that supports both variants of ABE – Ciphertext Policy and Key Policy – in various levels of security assurance. This flexibility in performance suits various forms of deployments, whether in the cloud, on a mobile network or in an IoT environment. The cryptographic layer is extensible and new schemes can be integrated in the standard to support future industry requirements and address data protection challenges in the post-quantum era.

Another point in favour of these specifications is that they claim to allow secure exchange of personal data among data controllers and data processors, which is apparently a precondition for GDPR compliance. For emerging distributed core network technologies, they offer security standards with the flexibility and scalability required.

We’re more than networks now – ETSI

Transformation is one of the most common buzzwords in the telecoms world and it seems not even standards bodies can stand against the tides of change.

The world is changing, and changing very quickly. Operators are being pitted against new and unknown competitors, while profits are being sucked out of the telecoms sector. This change means companies have to play in new ballparks, to different rules, and the same can be said for ETSI.

“I don’t think ETSI will be doing the same thing in five years what it was doing five years ago,” said David Boswarthick, Director of Committee Support Center at ETSI.

ETSI’s bread-and-butter work to date has naturally been focused on the network. And while work here will never be complete, it is becoming less stressful: projects are completed and new focus areas arise, like augmented reality for instance.

Eventually operators will start making money out of next generation technologies like AR, but for the moment the foundations are being laid. And what is crucial to these foundations is bringing new stakeholders into the equation. ETSI’s AR working group is one of those which operates further up the value chain. Yes, there are networking questions to be asked, but the technology is much more consumer orientated. The purpose of this group is to assess the landscape, before moving onto standardization projects for the interfaces between devices and an industry accepted framework.

The problem with technologies like AR is that they tend to fall between the cracks. The technology traverses so many different sectors that it is difficult for any one party to take control. Unfortunately this can lead to some disappointing results. Right now there are three companies (who shall remain nameless) dominating the AR space, and their technology is proprietary and siloed, which is a problem.

While some people would consider standards a limitation for technologists and blue-sky thinkers, Boswarthick highlighted that they are crucial for success in the long run. AR has been walking down the proprietary path unchecked for some time, but to make sure the consumer and the wider ecosystem benefit, there has to be a process of checks and balances. This is what ETSI plans to oversee: the process of creating interoperability and a sustainable ecosystem.

But this is where the complications lie; ETSI has little or no experience in dealing with industry verticals. There are a few industry members in the groups right now, Siemens and Bosch being two examples, but more are needed. “ETSI getting close to the vertical domains is a tough nut to crack,” said Boswarthick, but considering industry players will influence and define applications on the network, they are needed in the conversation from the beginning.

This is one of the first examples of ETSI expanding into new areas, but there will be more. Autonomous vehicles for instance will muddy the waters with new players in the ecosystem, as will smart cities. ETSI certainly isn’t forgetting about its tried and tested playground, but this organization is going to be much more than networking before too long.

ETSI gives TLC to MEC – aging buzzword to get a facelift

Multi-access Edge Computing (MEC) might have been given a bit of attention in months gone by, but with the 5G dawn about to break, a resurgence for MEC could be on the cards.

While it does not sound like the sexiest part of the mobile industry, MEC is crucially important. If we are to live the 5G dream of 8K video or instant access to insight, the ability to store and cache data at the edge of the network is critical. This is an old story for the industry, but it is a narrative which has been neglected in recent months. ETSI is one organization which seems to be trying to gather some extra steam for the forgotten buzzword.

“As the first Standards Developing Organization to address the challenges of MEC, ETSI brings the world’s leading experts on MEC to the table,” said Alex Reznik, Chair of ETSI MEC Industry Specification Group. “The ETSI ISG MEC can make a significant impact in the effort to make 5G a reality and we invite the industry to take advantage of everything we have to offer.”

MEC is of course only one piece of the 5G puzzle and a step in the complicated journey of virtualization, but one which is very important. Will virtual assistants be able to perform adequately without it, and will latency be low enough for autonomous vehicles or remote surgery? Not only would we be unable to realise some of these glorious use cases, ignoring MEC could potentially undermine the whole premise of the 5G system architecture, which is supposed to be a distributed network. With the 5G light breaking over the horizon, ETSI is shifting the focus back to MEC.
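
For a rough sense of why edge caching cuts latency, the sketch below compares serving content from a nearby MEC node with fetching it from a distant core. The latency figures and identifiers are illustrative assumptions, not measurements.

```python
# First request pays the round trip to the core; subsequent requests
# are served from the cache at the network edge, close to the user.
CORE_RTT_MS, EDGE_RTT_MS = 50.0, 5.0
edge_cache: dict = {}

def fetch(content_id: str):
    """Return the content and the round-trip latency paid to get it."""
    if content_id in edge_cache:
        return edge_cache[content_id], EDGE_RTT_MS
    data = b"..."                      # fetched from the distant core network
    edge_cache[content_id] = data      # cached at the edge for the next request
    return data, CORE_RTT_MS

print(fetch("8k-video-segment")[1])    # 50.0 ms: first request goes to the core
print(fetch("8k-video-segment")[1])    # 5.0 ms: served from the MEC edge
```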

As part of the push, ETSI has released two white papers and created a Hackathon framework to accelerate multi-access edge computing adoption and interoperability, encouraging all stakeholders to use the group's specifications to develop edge applications. Collaboration between the various parties will be critical here, and considering some of the parties involved, there is a risk of a few disagreements.

“While MEC is central to enabling the world of 5G applications over both 4G and 5G networks, it is only part of a solution to a bigger puzzle,” Reznik had previously said. “Increasingly, the industry is looking for guidance on how to put the overall solution together. By providing end-to-end solution guidance, encouraging and promoting the market through events like Hackathons and other related activities, our group is stepping up to this challenge.”

ETSI is kick-starting the refocus onto MEC, but we expect this to be a much more prominent talking point (once again) over the next couple of months.

ETSI plots the end of mankind with new Working Group

ETSI has unveiled a new Industry Specification Group: Zero Touch Network and Service Management, which will aim to accelerate network automation; humans beware.

The idea here is relatively simple. With the introduction of new technologies such as SDN, NFV and MEC, as well as network slicing just around the corner, the network is becoming increasingly complex. As human error is the most common root cause of disasters in any business, not just telecoms, higher levels of automation are critical to making sure the network meets the high demands of the consumer.

40 organizations have already joined the group, which will be led by Deutsche Telekom’s Klaus Martiny. Nurit Sprecher of Nokia and Christian Toche of Huawei will act as Vice Chairs.

Thankfully robotics is an area of the technology world which is light years away from perfection, otherwise what would be the need for humans? Once the physical network is present in the real world, if Klaus and his cronies have their way, all execution on the network could be handled by artificial intelligence. Human redundancy is near, and it's because we can't be trusted with the complicated stuff.

“While 5G and its building blocks are being developed, it’s time to offer an end-to-end view focusing on automated end-to-end network and service management,” said Martiny.

“We want to offer the market open and simple solutions. A continuous feedback from all stakeholders will lead to the first implementations of the specifications which will be tested through Proofs of Concepts, the outcome being fed back to improve existing specifications. A strong collaboration and cooperation with other standards bodies and Open Source projects is important for the ISG.”

The goal of the ISG, which will use the ZSM acronym, is to create a framework where humans are essentially made redundant. From delivery and deployment through configuration, assurance and optimization, all operational processes and tasks could be handled with 100% automation. Of course, there will be a time limit.
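
As a rough sketch of what such a zero-touch closed loop could look like, consider the toy assurance function below: observed network state feeds an automated decision that is executed with no operator in the loop. The metrics, thresholds and actions are illustrative assumptions, not anything from the ZSM specifications.

```python
# Monitor -> decide -> act, with no human intervention.

def assurance_loop(metrics: dict) -> str:
    """Map observed network state to an automated remediation action."""
    if metrics["cell_load"] > 0.9:
        return "scale-out: instantiate an additional virtual network function"
    if metrics["error_rate"] > 0.01:
        return "heal: restart the degraded function and reroute traffic"
    return "no-op: service within targets"

print(assurance_loop({"cell_load": 0.95, "error_rate": 0.001}))
# -> scale-out: instantiate an additional virtual network function
```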

The group cites the emergence of 5G as a key driver to standardize this area of the industry, as 5G will ‘trigger the need to accelerate radical change in the way networks and services are managed and orchestrated’. This might seem close, but if the delivery of 5G is anything like the road to ubiquitous 4G, we have a while to wait.

ETSI has AR in its standardization sights

ETSI has waded into the murky waters of AR, creating a new Industry Specification Group called Augmented Reality Framework (ISG ARF).

As with most other ETSI working groups, the aim here will be to synchronize efforts and identify key use cases and scenarios for developing an AR framework. While work at standards bodies is not the most exciting aspect of the industry, it is a crucial one. The group will work to create AR specifications to ensure interoperable implementations that benefit both technology providers and end-users.

“There are huge differences in AR applications but mapping digital information with the real world implies the use of a set of common components offering functionalities such as tracking, registration, pose estimation, localization, 3D reconstruction or data injection,” said b<>com’s Muriel Deschanel, who will act as chair of the group.

“The development of such a framework will allow components from different providers to interoperate through the defined interfaces. This will in turn avoid the creation of vertical siloes and market fragmentation and enable players in the eco-system to offer parts of an overall AR solution.”

Although the first meeting of the group has not taken place yet, Industry 4.0, smart cities and smart homes are three areas which have been prioritized, while an eye will also be cast over applications for mobility, retail, healthcare, education and public safety.

These are all plausible ideas, but for any new technology to become a reality, there needs to be a solid business case for the guys at the top of the value chain. And to achieve that, transparent and reliable interworking between different AR components is key; in short, interoperability is good. ETSI is the enemy of vendor lock-in, and this is just the first step to bringing the technology under its protective wing.