ETSI gets to work on new contact tracing app standard

With countries across Europe all trying to reinvent the wheel with their own contact tracing apps, standardization is long overdue.

The responsibility for this has been taken by the European Telecommunications Standards Institute, which has created a special group dedicated to developing a ‘standardization framework for secure smartphone-based proximity tracing systems’. It’s called the Industry Specification Group “Europe for Privacy-Preserving Pandemic Protection,” which is mercifully abbreviated to ISG E4P.

“By their nature smartphones are highly personal devices, carrying large amounts of data about individuals,” said ETSI Director-General Luis Jorge Romero. “In ETSI we are committed to support an international development community with a robust standardization framework that allows rapid, accurate and reliable solutions while winning the trust of the population at large.”

Point well made about trust, Luis. The UK, for example, currently seems determined to give its National Health Service access to the data created by the national contact tracing app. Not only would this alienate Google and Apple, thus making the app a lot less effective, but it would almost certainly lead to far fewer people using it.

“A primary challenge is collecting, processing and acting on information about citizens’ proximity at scale, potentially representing tens or hundreds of millions of people,” says the ETSI announcement. “This must also be achieved without compromising users’ anonymity and privacy, and while safeguarding them against exposure to potential cyber-attacks.”

Again, Google and Apple seem to have this more or less covered, but there’s no way a mega public bureaucracy like the EU would ever concede the private sector might have the answer to a public problem. So ETSI will probably take weeks to come up with something very similar, at which time the EC will order all its members to use it regardless of any progress they’ve made independently.

Artificial Intelligence for Networks: Understanding It Through ETSI ENI Use Cases and Architecture

On 17 April, ETSI officials from the Experiential Network Intelligence group (ISG ENI) gave a webinar entitled Artificial Intelligence for networks: understanding it through ETSI ENI use cases. This webinar attracted more than 150 online attendees including operators, vendors, research institutions, and international standards development organizations.

The first speaker, Dr. Luca Pesando of TIM, Vice Chair of the ETSI ENI ISG, introduced the scope of the group, its membership and architecture, and Dr. Yue Wang of Samsung, Secretary of the group, gave some insight into selected ENI use cases. They highlighted that ENI is meant to be a flexible, general-purpose AI engine able to interface with multiple types of Assisted System by means of open interfaces and APIs. Assisted Systems defined by multiple standards bodies (e.g. 3GPP, IETF, MEF, ITU, Broadband Forum) can be interfaced, allowing ENI to control access, transport and core technologies, from the infrastructure to the service layer of network operation and management, creating AI-based automation loops.

This webinar is available on the Brighttalk website.

This webinar will be followed on 6 May at 5pm CEST by a second webinar entitled ETSI ENI Architecture: AI for robust and manageable systems and applications.

You can register via the Brighttalk website.

The ETSI Industry Specification Group Experiential Network Intelligence, created in February 2017, focuses on network intelligence and now comprises 60 organizations. ENI identified viable use cases and from them derived the main functionalities the ENI Engine has to provide. Five categories of use cases have been identified: Infrastructure Management; Network Assurance; Network Operation; Service Orchestration and Management; and Network Security.

The ENI Engine aims to provide an easy means of user interaction, using human-like language to express the intent of what the user wants, and leaving the network with the task of translating that intent into policies and deciding how to realize it. Evolution of the architecture is broadening the range of use cases ENI can be applied to, as well as increasing security by design. ISG ENI is working closely with the technologies defined by other ETSI groups, including Fifth Generation Fixed Network (F5G), IPv6 integration (IP6), Multi-access Edge Computing (MEC), Network Function Virtualization (NFV), Secure AI (SAI) and Zero touch network and Service Management (ZSM). More information on ENI can be found on the ETSI website.
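The intent-to-policy idea can be illustrated with a toy translation step. All of the phrases, rule names and policy fields below are hypothetical examples, not taken from the ENI specifications; the point is only the separation between intent (what the user wants) and policy (how the network realizes it).

```python
# Toy illustration of intent-driven management: the user states *what* they
# want in near-natural language, and the engine derives *how* as policies.
# All phrases and policy fields here are hypothetical examples.

INTENT_RULES = {
    "low latency": {"scheduler": "priority", "max_latency_ms": 10},
    "high throughput": {"scheduler": "weighted-fair", "min_bandwidth_mbps": 500},
    "energy saving": {"sleep_idle_cells": True},
}

def translate_intent(intent):
    """Map a human-like intent phrase to a set of network policies."""
    policies = {}
    for phrase, policy in INTENT_RULES.items():
        if phrase in intent.lower():
            policies.update(policy)
    return policies

policies = translate_intent("I want low latency for my gaming service")
# -> {'scheduler': 'priority', 'max_latency_ms': 10}
```

A real ENI engine would of course use far richer models than substring matching, but the closed loop it creates follows the same shape: intent in, policies out, network actuation after.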


ETSI sets out to develop non-IP networking standards for 5G services

The European Telecommunications Standards Institute (ETSI) has set up a new Industry Specification Group to address issues of age-old networking protocols faced by new services, in particular 5G.

The Industry Specification Group for Non-IP Networking (ISG NIN) had its kick-off meeting at the end of March, but the announcement from the standardisation body only emerged this week. The new group will supersede an existing ETSI group for Next Generation Protocols (ISG NGP), created in 2015 to look at networking technology needs in the upcoming 5G era.

The industry has recognised for some time the problems between new services and current networking technology, including ‘the complex and inefficient use of spectrum resulting from adding mobility, security, quality-of-service, and other features to a protocol that was never designed for them. The subsequent fixes and workarounds designed to overcome these problems themselves incur increased cost, latency, and greater power-consumption’.

ETSI set up the ISG NGP in 2015, still in the 4G era, to address these issues, with the work now being carried forward by the new ISG NIN. The group’s stated mission is ‘to develop standards that define technologies to make more efficient use of capacity, have security by design, and provide lower latency for live media’, all of which are key 5G promises.

“I’m really happy to have been entrusted with the Chairmanship of this group,” said John Grant of BSI, who was elected as ISG NIN’s Chair. “Finding new protocols for internet more suitable to the 5G era was essential. Big data and mission-critical systems such as industrial control, intelligent vehicles and remote medicine cannot be addressed the best way with current TCP/IP-based networking.”

Known collectively as TCP/IP, the internet protocol suite, comprising the packet addressing standard (Internet Protocol, or IP) and the transmission standard (Transmission Control Protocol, or TCP), was developed inside the US defence department in the 1970s for transmitting text-based data between fixed computers.

When the internet started to be widely adopted for civil use, there were discussions about building new networking technologies to handle the increased usage. In the end, however, engineers decided to overlay new technologies on top of the existing TCP/IP infrastructure.

“A lot of design and development happened to kind of rescue [it],” said Bilel Jamoussi, head of the ITU’s study groups responsible for ratifying technical standards during an FT interview. “We are now, I think, at another turning point, of saying, is that enough, or do we need something new?”

Built on a similar premise, that the current TCP/IP infrastructure is no longer able to support the demands of new technologies and new services, the Chinese proposal, titled ‘New IP’, was presented behind closed doors over a year ago at the ITU’s Geneva head office by delegates from Huawei, China Mobile, China Unicom, and the China Academy of Information and Communications Technology (CAICT). It was only reported to the outside world when the FT got hold of the files.

Though scant in detail, the fundamental scheme of the Chinese proposal is to replace the current open, flat, global, almost ‘wild wide web’ with a top-down approach to internet management in which ISPs, operators, and ultimately sovereign states would have overall control. The story was covered in detail by Light Reading, our sister publication.

Whether the announcement of ETSI’s new ISG NIN, two weeks after the meeting took place, is a response to the media coverage of the China plan is anybody’s guess. The focus of the new group differs from the grandstanding of the Chinese proposal: it is specific (targeted at supporting 5G services), phased (starting with private networks before being applied to public networks, and with core networks before extending to the access), and it addresses the underlying technology instead of the control of the internet, or the lack of it.

To look at the situation from a positive perspective, at least the Chinese parties, ETSI, and the ITU-T for that matter, agree that the TCP/IP infrastructure is becoming obsolete.

“The IP stack and OSI layer model have undeniably enabled global connectivity – but since they originated in the 1970s, their design reflects the demands and capabilities of that era,” said Kevin Smith of Vodafone, who had been the chair of ISG NGP and was elected Vice Chair of ISG NIN. “Reassessing the fundamental design principles of network protocols offers the opportunity to deliver performance, security and efficiency gains for 2020 access networks and use cases, and may be achieved with simplification rather than expensive add-ons. The work of ETSI ISG NIN, in co-operation with industry organizations, can provide operators with a cutting-edge protocol suite to add to their service portfolio.”

Sharing, selling and standardizing – the great spectrum conundrum

With the World Radiocommunication Conference currently underway in Egypt, it’s timely to discuss some of the spectrum issues facing the industry today.

Spectrum is, and will probably remain, a hot topic for the industry due to its critical importance. The success of a telco is partly defined by the spectrum licences it is able to hoard, and depending on where you are in the world, the scarcity of these assets varies. That said, to describe anywhere as having an abundance would be foolish to say the least.

Starting with the idea of selling spectrum, this is a topic which is under constant debate, review and criticism.

“When you talk about spectrum, the price you have to pay always has an impact on the rollout strategy,” said Jasper Mikkelsen, Director of Public and Regulatory Affairs for the Telenor Group, during a panel session this week.

This is where the balancing act is at its finest. The regulators will argue they need to charge for access to spectrum for several reasons, but the price of spectrum is often the centre of criticism in various markets. Mikkelsen pointed out during the auctions in Thailand earlier this year, many of the spectrum assets went unsold due to the reserve prices assigned to the lots.

However, according to Donald Stockdale of the FCC, the complaints from telcos might have some merit. If spectrum is unsold at the end of the auction, this is most likely due to the auction being badly designed. Perhaps not enough was released, the channels hadn’t been cleared effectively, the reserve price was too high, or the obligations attached to the winning assets were deemed unreasonable by the telcos.

Telcos will always complain and point to markets where spectrum is effectively given away for free; however, there are cases where they have a point. If such a valuable asset remains unsold, despite the pleas of telcos to free up more spectrum, there is perhaps something wrong with the product itself.

What is worth noting is that the auction process is not perfect. There will always be complaints and criticism, though it is currently the least worst option. It is certainly better than the ‘beauty contest’ concept, which leaves the door wide open to corruption.

How to design and manage spectrum auctions remains something of a trial-and-error process, which will come as little comfort to those shelling out the investment; however, the idea of standardisation is something which should certainly be given more attention.

This will of course be a topic of conversation at the World Radiocommunication Conference, especially concerning the higher-frequency airwaves, though there is still a lot of work to do on the spectrum licences which are already a hotch-potch of complexity.

While there is work being done to standardise spectrum across various regions, this is a lot more complicated than simply creating new rules. Bureaucrats have to deal with the dreaded concept of legacy.

As Michael Sharpe, Director of Spectrum and Equipment Regulation at ETSI, pointed out, there are 48 countries in Europe, all of which have been assigning spectrum to different use cases, products and services over the last few decades. Harmonisation is a topic of conversation now, but unravelling the maze of red tape which already exists in each of these nations is a very complex task.

First and foremost, there are some very attractive benefits from standardisation. In a region like Europe, the risk of interference is ever-present, driving the case, while there are also benefits from interoperability and economies of scale. However, there will always be a downside.

Looking at Europe once again, the congestion of certain bands will vary depending on the demands of each nation, while the cost of clearing these bands will certainly vary quite considerably. Then you have to look at the idea of flexibility.

Politicians generally don’t like being told what to do, and they like it even less when it comes from bureaucrats over which they have very little influence. In designing a harmonised approach to spectrum allocation and usage, flexibility will need to be built into the process to ensure each nation can address the specific needs of dominant industries and the nuances of societal variance.

It is of course very difficult to judge the right balance, but it is a critical element, not only to ensure economic prosperity in each of the nations, but to make sure the rules are adopted by the governments in question. If compliance is too much of a hindrance or it costs too much to clear the bands, who is to say these suggestions will not simply be ignored? These are sovereign states, after all.

The final area which is attracting some attention is a spectrum-sharing initiative from the FCC in the US.

Focusing specifically on the valuable 3.5 GHz spectrum band which is being championed in Europe to deliver the first 5G services, the FCC is trialling a dynamic spectrum sharing project. Known unofficially as the ‘Innovation Band’, it offers a palatable compromise between high-speed data transmission and extended coverage. However, the US has found itself in a bit of a pickle as the current incumbent on this spectrum band is the Navy.

The spectrum is currently utilised by the Navy in offshore radar operations, however it is not being used all the time, such is the nature of naval operations. For such valuable spectrum, this is largely viewed as a waste.

Stockdale highlighted that the team has created a three-tier, demand-orientated system, where spectrum is utilised depending on the presence of those in the tier above. The Navy has the right to use the band first and foremost, though when it goes unutilised, mobile service providers can purchase licences to gain access in the second tier. Should neither the Navy nor the telcos be making use of the spectrum, it can be assigned for general use to those approved in the third tier.
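The tiering logic Stockdale describes can be sketched as a simple priority check. This is a heavy simplification of what the FCC's actual spectrum access coordination does; the tier names follow the article's description.

```python
# Three-tier, demand-orientated sharing: a lower tier may use the band
# only when every tier above it is idle. Highly simplified sketch.

TIERS = ["incumbent", "priority_access", "general_authorized"]  # Navy, licensed telcos, general use

def who_may_transmit(active_requests):
    """Return the highest-priority tier currently requesting the band,
    or None if nobody wants it."""
    for tier in TIERS:
        if tier in active_requests:
            return tier
    return None

# Navy present: everyone else must yield.
assert who_may_transmit({"incumbent", "priority_access"}) == "incumbent"
# Navy idle: licensed telcos get access ahead of general users.
assert who_may_transmit({"priority_access", "general_authorized"}) == "priority_access"
# Only general users requesting: the band is opened up to them.
assert who_may_transmit({"general_authorized"}) == "general_authorized"
```

In practice the coordination happens dynamically and geographically, but the ordering principle is exactly this: each tier transmits only in the absence of the tiers above it.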

Although this is only a trial for the moment, it demonstrates the point made above. Flexibility needs to be built into spectrum harmonisation initiatives, as it is unrealistic to repurpose this band in the US. The cost and effort are unlikely to be justifiable when you consider the size of the US Navy.

This is an excellent example of innovation when looking at spectrum, and regulators around the world should be paying attention to the lessons learned through this experiment. The idea of dynamic spectrum sharing could be huge if the fundamentals are validated here, such is the demand for this valuable and increasingly scarce resource.

This is of course not the only example of spectrum being repurposed in regions where it is not being utilised. In the UK, Ofcom has introduced rules which dictate unused spectrum must be released, assuming there is demand.

Vodafone recently announced it has entered into a three-year agreement with StrattoOpencell to share the use of its 2.6 GHz spectrum assets to deliver connectivity in Devon. The spectrum licences are being used in highly urbanised areas, but not in the countryside; without such rules from the regulator, that would be an inefficient use of the asset.

Although spectrum is a topic which has been the centre of many debates, it does appear it will be an everlasting ebb and flow. The World Radiocommunication Conference will likely free up some more spectrum, but the TMT industry is very good at finding ways to use it. Scarcity is most likely going to persist, though there are some interesting conversations evolving to improve this niche of the mobile segment.

Europe postures with standards leadership still on the line

European standards organization ETSI has released a report demanding the continent take a leadership role for standards and regulation in the global digital economy.

While some might question whether the sluggish Brussels bureaucrats can get up to speed quickly enough, there is hope; regulators around the world all share the same track record when it comes to the painfully slow progress of creating regulatory and legal frameworks.

The report, commissioned by ETSI, was authored over the first half of 2019 and demands Europe take the lead on creating the standards necessary for a healthy and progressive digital economy.

“Our competitors are very serious about taking the lead in digital transformation,” said Carl Bildt, Co-Chair of the European Council on Foreign Relations.

“It is important that EU lawmakers put standardization at the centre of EU digital and industrial strategy. Otherwise Europe will become a rule taker, forever playing catch-up in the innovation, production and delivery of new digital products and services.”

Although many would want to see a collaborative, geographically neutral approach to standardising the digital economy, this is unlikely to happen. As Bildt highlights above, someone will take a leadership position, those standards will gain acceptance, and other regions will then have to adopt the rules.

The question which remains is whether Europe, the US or China will have the greatest influence on global standards. In fact, ETSI questions whether Europe is keeping pace with leaders today or whether its influence is waning already. Unfortunately, with the platform economy gaining more traction each day, this is one area which should not be considered a strength of the bloc.

Some 46% of platforms with a revenue above $1 billion are based in the US and 35% in Asia, while Europe accounts for only 18%. These platforms often drive their own ecosystems and have largely been self-regulating to date. This is going to change in the future, though to give European organizations a chance at capturing growth, the European Commission might have to lead the charge to create open standards. The contrary approach might only offer the established players greater momentum and influence.

This is perhaps the risk which is emerging currently. The ideas of globalisation and open standards are not new, though there is evidence certain markets are heading towards a more isolationist mindset and regime.

Although it is easy to point the finger at aggressive political leaders elsewhere, the report demands Europe look inward as well. The European Commission has to take a strong leadership position across the bloc, as with 28 member states there is a risk of fragmentation. It only takes slight variances to start, but these could snowball into greater complications. The digital tax conundrum is an example of what can go wrong over an extended period of time.

This report might be more of a generalist statement to encourage a proactive mindset from European bureaucrats, though there are plenty of examples of governments, public sector administrations and private industry trying to control the tone.

Looking at the ever more influential world of artificial intelligence, the number of feasibility, standards and ethics boards is quite staggering. All of these initiatives will want to create rules and frameworks to govern the operation and progression of AI, though only one can be adopted as the global standard. Regional variances are of course feasible, but this should not be deemed healthy.

In the UK, the Government created its AI Council. The European Commission has released various white papers exploring how AI should be governed. The White House’s National Science and Technology Council Committee on Technology is also exploring the world of AI. Facebook has even created its own independent advisory panel to aid the creation of standards and regulation.

Should Europe want to control the global standards process, it will come up against some stiff competition. The power and influence of the US should not be underestimated, it is home to some of the world’s most recognisable and profitable brands after all, while China has a track record of flooding working groups at standards organizations. This will have a noticeable impact on the final outcome.

That said, the success of GDPR will offer hope.

Europe’s General Data Protection Regulation might have caused headaches all around the world, but it has set the tone on the approach to privacy, the management of data and the influence of the consumer. Since its introduction, several other countries, India and Japan being two examples, have been inspired by GDPR to introduce similar regulation, while there have been calls in the US to do the same.

This piece of regulation was critical to ensure the European principles of privacy are maintained moving forward. This is a win, but there are still battles to be had when it comes to AI, security, encryption, cross-border data flow and access to data.

Standardisation might not be the most exciting topic to discuss in the TMT world, though taking a leadership position can offer advantages to the companies who call that region home. A thorough and innovative regulatory regime can open up new markets, ensure competition is healthy throughout the ecosystem and drive national economies at scale.

The regulatory landscape is set to undergo somewhat of a shift over the coming months and years, though which region will take the lead is still hanging in the balance.

Evolving ETSI engulfs enterprise

The connectivity landscape is evolving as quickly as the telcos’ desire to diversify revenues, so it makes perfect sense that the world’s standards bodies evolve as well.

Speaking to Luis Jorge Romero, Director General of the European Telecommunications Standards Institute, or ETSI as it’s more commonly known, the mission statement is shifting. Traditionally, ETSI has focused on taking care of the telcos, but Romero is broadening his embrace to bring enterprise customers and technology leaders into the equation.

“They need connectivity, so they have to [be] brought into the equation for the standards,” said Romero. “They are the ones who know the issues and want to have a solution. They can translate the problem into what we understand.”

As it stands, any telco which exclusively focuses on traditional connectivity services and products will struggle to survive in tomorrow’s digital world. Such are the financial demands of 5G that telcos will have to source new revenues to build the ROI and provide fuel for future expansion and upgrades.

Romero highlighted that it was critical not only to ensure the telco industry is supported in this time of rapid change, but that the voices of enterprise organizations and more specialist technology providers are heard as well. If the telcos are going to work more closely with industry, vertical-specific applications will need to be developed. The Asian telcos have been incredibly proactive in developing these use cases, though Europe and the US have been sluggish.

And of course, every step forward has to be standardized to ensure a healthy and sustainable industry.

At ETSI, this translates into two different types of working groups: those at a high level, designed for the telcos themselves, and those drilled down into vertical-specific applications. Romero pointed to the creation of an open data platform to help the marine industry track assets throughout the world, as well as the participation of agricultural giant John Deere in IoT working groups, as two excellent examples of this evolution.

To bridge the gap between connectivity and the verticals, both segments need to be sitting down in the same room. Everyone realises this, and ETSI is taking an important step forward to facilitate progress.

ETSI publishes new spec and reports on 5G tech

The European Telecommunications Standards Institute, ETSI, has released new specifications on packet formatting and forwarding, as well as two reports on transport and network slicing respectively.

The new specification, called Flexilink, focuses on packet formats and forwarding mechanisms to allow core and access networks to support the new services proposed for 5G. The objective of the new specification is to achieve efficient, deterministic packet forwarding in the user plane for next generation protocols (NGP). In conventional IP networks, built on the Internet Protocols defined in the 1980s, every packet carries all the information needed to route it to its destination. This is undergoing fundamental change with new technologies like Software Defined Networking (SDN) and Control and User Plane Separation (CUPS), where most packets are part of a “flow” such as a TCP session or a video stream. As a result, there is increasingly a separation between the process of deciding the route packets will follow and that of forwarding the packets.
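That separation can be pictured as a flow table: the control plane installs a route once per flow, and the data plane then forwards each packet on a short label lookup rather than re-parsing full headers. This is a conceptual sketch only; the field names below are illustrative and not taken from the Flexilink specification.

```python
# Conceptual sketch of flow-based forwarding. The control plane decides a
# route once when a flow starts; the data plane then does a constant-work
# label lookup per packet instead of examining full IP headers each time.

flow_table = {}  # flow label -> egress port (installed by the control plane)

def install_flow(label, egress_port):
    """Control-plane action: performed once per flow (e.g. per video stream)."""
    flow_table[label] = egress_port

def forward(packet):
    """Data-plane action: cheap per-packet lookup on the flow label."""
    return flow_table[packet["label"]]

install_flow(label=7, egress_port=3)        # e.g. a video stream out of port 3
assert forward({"label": 7, "payload": b"frame-data"}) == 3
```

Because the per-packet work no longer depends on routing logic, latency for established flows can be made deterministic, which is the property the specification is after.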

“Current IP protocols for core and access networks need to evolve and offer a much better service to mobile traffic than the current TCP/IP-based technology,” said John Grant, chairman of the ETSI Next Generation Protocol Industry Specification Group (ISG). “Our specifications offer solutions that are compatible with both IPv4 and IPv6, providing an upgrade path to the more efficient and responsive system that is needed to support 5G.”

The new specification defines two separate services, a “basic” service suitable for traditional statistically multiplexed packet data, and a “guaranteed” service providing the lowest possible latency for continuous media, such as audio, video, tactile internet, or vehicle position. It is worth noting that Flexilink only specifies user plane packet formats and routing mechanisms. Specifications for the control plane to manage flows have already been defined in an earlier NGP document “Packet Routing Technologies” published in 2017.

The report “Recommendation for New Transport Technologies” analyses current transport technologies such as TCP and their limitations, whilst also providing high-level guidance on the architectural features required in a transport technology to support the new applications proposed for 5G. The report also includes a framework in which there is a clear separation between control and data planes. A proof-of-concept implementation was conducted to experiment with the recommended technologies and to demonstrate that each TCP session can obtain a bandwidth-guaranteed service or a minimum-latency-guaranteed service. The report states:

“With traditional transport technology, for all TCP traffic passes through DIP router, each TCP session can only obtain a fraction of bandwidth. It is related to the total number of TCP sessions and the egress bandwidth (100 M).

“With new transport technology, new TCP session (DIP flows) could obtain its expected bandwidth or the minimum latency. And most [sic.] important thing is that the new service is not impacted by the state that router is congested, and this can prove that new service by new transport technology is guaranteed.”

Importantly, the PoC experiment showed that the current hardware technology is able to support the proposed new transport technology and provide satisfactory scalability and performance.
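The bandwidth arithmetic in the quoted PoC description is easy to sketch: over a fixed egress (100 Mbit/s in the quote), traditional TCP sessions split the link roughly equally, while a "guaranteed" flow keeps its reservation regardless of how many other sessions are competing. The numbers below are illustrative only.

```python
# Sketch of the bandwidth arithmetic from the PoC description. With
# traditional transport, each TCP session gets a fraction tied to the
# session count; with the proposed transport, a reserved flow is
# unaffected by congestion and the rest share the leftover.

EGRESS_MBPS = 100  # egress bandwidth from the quoted experiment

def fair_share(n_sessions):
    """Approximate per-session bandwidth with traditional transport."""
    return EGRESS_MBPS / n_sessions

def guaranteed_share(reserved_mbps, n_other_sessions):
    """Reserved flow keeps its bandwidth; others split what remains."""
    leftover = EGRESS_MBPS - reserved_mbps
    return reserved_mbps, leftover / n_other_sessions

assert fair_share(4) == 25.0                 # 4 sessions -> 25 Mbit/s each
reserved, others = guaranteed_share(20, 8)
assert reserved == 20                        # reservation holds under load
assert others == 10.0                        # (100 - 20) / 8 for the rest
```

The point of the PoC was that this guarantee holds even when the router is congested, which best-effort fair sharing cannot promise.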

The report “E2E Network Slicing Reference Framework and Information Model” looks into the design principles behind network slicing. The topic of network slices encompasses the combination of virtualisation, cloud-centric and SDN technologies. But there is a gap in normalized resource information flow over a plurality of provider administration planes (or domains). The report aims to “provide a simple manageable and operable network through a common interface while hiding infrastructure complexities. The present document defines how several of those technologies may be used in coordination to offer description and monitoring of services in a network slice.” It describes the high-level functions and mechanisms for implementing network slicing, and addresses security considerations.

ETSI releases security standards for distributed systems

The distributed, cloud-based technological environment required by 5G and IoT will present a novel set of security challenges.

In anticipation of this, the ETSI (European Telecommunications Standards Institute) technical committee on cybersecurity has released two specifications focused on attribute-based encryption (ABE). This seems to be a more flexible, tailored form of encryption that can be applied to specific scenarios.

Here’s the ETSI explanation: “ABE is an asymmetric, multi-party cryptographic scheme that bundles access control with data encryption. In such a system, data can only be decrypted if the set of attributes of the user key matches the attributes of the encryption. For instance, access to employee pay data will only be granted to the role of Human Resources Employee working in the payroll department of a company, who has been there for one year or more.”

And here are the two specifications:

  • ETSI TS 103 458, which describes high-level requirements for Attribute-Based Encryption. One objective is to provide user identity protection, preventing disclosure to an unauthorized entity. It defines personal data protection on IoT devices, WLAN, cloud and mobile services, where secure access to data has to be given to multiple parties, according to who that party is.
  • ETSI TS 103 532, which specifies trust models, functions and protocols using Attribute-Based Encryption to control access to data, thus increasing data security and privacy. It provides a cryptographic layer that supports both variants of ABE – Ciphertext Policy and Key Policy – in various levels of security assurance. This flexibility in performance suits various forms of deployments, whether in the cloud, on a mobile network or in an IoT environment. The cryptographic layer is extensible and new schemes can be integrated in the standard to support future industry requirements and address data protection challenges in the post-quantum era.
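ETSI's HR example can be expressed as an attribute match. Note that this sketch only models the policy logic; in a real ABE scheme the check is enforced cryptographically (decryption simply fails unless the key's attributes satisfy the ciphertext's policy), not by an if-statement a user could bypass. The attribute names are illustrative.

```python
# The access rule from ETSI's HR example, expressed as an attribute match.
# In real ABE the ciphertext itself encodes the policy and a key with
# non-matching attributes mathematically cannot decrypt; this sketch only
# shows the access-control logic being bundled with the data.

def can_decrypt(key_attrs, policy):
    """Decryption succeeds only if the user's key attributes satisfy
    every attribute the data was encrypted under."""
    return (key_attrs.get("role") == policy["role"]
            and key_attrs.get("department") == policy["department"]
            and key_attrs.get("tenure_years", 0) >= policy["min_tenure_years"])

payroll_policy = {"role": "HR Employee", "department": "payroll",
                  "min_tenure_years": 1}

alice = {"role": "HR Employee", "department": "payroll", "tenure_years": 2}
bob = {"role": "Engineer", "department": "payroll", "tenure_years": 5}

assert can_decrypt(alice, payroll_policy)       # matches role, dept, tenure
assert not can_decrypt(bob, payroll_policy)     # wrong role -> no access
```

Bundling the policy into the encryption itself is what makes ABE attractive for distributed systems: the data stays protected wherever it travels, without a central access-control server in the decryption path.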

Another point in favour of these specifications is that they claim to allow the secure exchange of personal data among data controllers and data processors, which is apparently a precondition for GDPR compliance. For emerging distributed core network technology, they offer security standards with the flexibility and scalability it needs.

We’re more than networks now – ETSI

Transformation is one of the most common buzzwords in the telecoms world and it seems not even standards bodies can stand against the tides of change.

The world is changing, and changing very quickly. Operators are being pitted against new and unknown competitors, while profits are being sucked out of the telecoms sector. This change means companies have to play in new ballparks, to different rules, and the same can be said for ETSI.

“I don’t think ETSI will be doing the same thing in five years what it was doing five years ago,” said David Boswarthick, Director of Committee Support Center at ETSI.

ETSI’s bread and butter work to date has naturally been focused on the network. And while work here will never be complete, it is becoming less stressful: projects are completed and new focus areas arise, augmented reality being one of them.

Eventually operators will start making money out of next generation technologies like AR, but for the moment the foundations are being laid. And what is crucial to these foundations is bringing new stakeholders into the equation. ETSI’s AR working group is one of those which operates further up the value chain. Yes, there are networking questions to be asked, but the technology is much more consumer orientated. The purpose of this group is to assess the landscape, before moving on to standardization projects for the interfaces between devices and an industry accepted framework.

The problem with technologies like AR is that they tend to fall between the cracks. The technology traverses so many different sectors that it is difficult for any one party to take control, and unfortunately this can lead to some disappointing results. Right now three companies (who shall remain nameless) dominate the AR space, and the technology remains proprietary and siloed, which is a problem.

While some people would consider standards a limitation for technologists and blue-sky thinkers, Boswarthick highlighted that they are crucial for success in the long run. AR has been walking down the proprietary path unchecked for some time, but to make sure the consumer and the wider ecosystem benefit, there has to be a process of checks and balances. This is what ETSI plans to oversee: the process of creating interoperability and a sustainable ecosystem.

But this is where the complications lie: ETSI has little or no experience in dealing with industry verticals. There are a few industry members in the groups right now, Siemens and Bosch being two examples, but more are needed. “ETSI getting close to the vertical domains is a tough nut to crack,” said Boswarthick, but considering industry players will influence and define applications on the network, they need to be in the conversation from the beginning.

This is one of the first examples of ETSI expanding into new areas, but there will be more. Autonomous vehicles for instance will muddy the waters with new players in the ecosystem, as will smart cities. ETSI certainly isn’t forgetting about its tried and tested playground, but this organization is going to be much more than networking before too long.

ETSI gives TLC to MEC – aging buzzword to get a facelift

Multi-access Edge Computing (MEC) might have been given only a bit of attention in months gone by, but with the 5G dawn about to break a resurgence for MEC could be on the cards.

While it does not sound like the sexiest part of the mobile industry, MEC is crucially important. If we are to live the 5G dream of 8K videos or instant access to insight, the ability to store and cache data on the edge of the network is critical. This is an old story for the industry, but it is a narrative which has been neglected in recent months. ETSI is one organization which seems to be trying to gather some extra steam for the forgotten buzzword.
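The edge-caching idea at the heart of MEC can be sketched simply: serve content from storage co-located with the radio network when possible, and fall back to the higher-latency origin only on a miss. The sketch below is illustrative only; the class name and behaviour are assumptions for the example, not anything from the ETSI MEC specifications:

```python
# Minimal sketch of edge caching: a small LRU cache that sits "at the
# edge" and only travels to the origin server on a cache miss.
from collections import OrderedDict

class EdgeCache:
    def __init__(self, capacity, origin_fetch):
        self.capacity = capacity
        self.origin_fetch = origin_fetch   # callable simulating the distant origin
        self.store = OrderedDict()         # LRU order: least recently used first

    def get(self, key):
        if key in self.store:
            self.store.move_to_end(key)    # refresh LRU position
            return self.store[key], "edge hit"
        value = self.origin_fetch(key)     # expensive round trip to the core
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False) # evict the least recently used item
        return value, "origin fetch"

# Usage: the first request for an item crosses the network; repeats are
# served locally, which is where the latency savings come from.
cache = EdgeCache(capacity=2, origin_fetch=lambda k: f"content:{k}")
print(cache.get("video-8k"))   # ('content:video-8k', 'origin fetch')
print(cache.get("video-8k"))   # ('content:video-8k', 'edge hit')
```

Real MEC platforms layer application hosting, traffic steering and APIs on top of this basic idea, but the latency argument is the same: the fewer round trips to the core, the better.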

“As the first Standards Developing Organization to address the challenges of MEC, ETSI brings the world’s leading experts on MEC to the table,” said Alex Reznik, Chair of ETSI MEC Industry Specification Group. “The ETSI ISG MEC can make a significant impact in the effort to make 5G a reality and we invite the industry to take advantage of everything we have to offer.”

MEC is of course only one piece of the 5G puzzle and a step in the complicated journey of virtualization, but a very important one. Will virtual assistants be able to perform adequately without it? Will latency be low enough for autonomous vehicles or remote surgery? Not only would we be unable to realise some of these glorious use cases, but ignoring MEC could also undermine the whole premise of the 5G system architecture, which is supposed to be a distributed network. With the 5G light breaking over the horizon, ETSI is shifting the focus back to MEC.

As part of the push, ETSI has released two white papers and created a Hackathon framework to accelerate multi-access edge computing adoption and interoperability, encouraging all stakeholders to use the group’s specifications to develop edge applications. Collaboration between the various parties will be critical here, and considering some of the parties involved, there is a risk of a few disagreements.

“While MEC is central to enabling the world of 5G applications over both 4G and 5G networks, it is only part of a solution to a bigger puzzle,” Reznik had previously said. “Increasingly, the industry is looking for guidance on how to put the overall solution together. By providing end-to-end solution guidance, encouraging and promoting the market through events like Hackathons and other related activities, our group is stepping up to this challenge.”

ETSI is kick-starting the refocus on MEC, but we expect this to become a much more prominent talking point (once again) over the next couple of months.