3GPP’s Release 16 timeline hangs in the balance thanks to coronavirus

Much has been made of the impact of the coronavirus outbreak on the telecoms industry, though it now appears the critically important Release 16 timeline could be under threat.

Now that the initial shock of the Mobile World Congress cancellation has settled, business is seemingly back to normal, though the coronavirus outbreak has not been contained. The Barcelona trip was cancelled, but so were a host of other ventures into foreign lands. The 3GPP 5G standards meetings also fell onto the chopping block.

Interestingly enough, 3GPP's Release 16, the set of industry specifications covering 5G standalone RAN, virtualisation, the 5G core, network slicing and various other topics, could also feel the impact of the virus. The consequence could be a further delay to the release of the specifications.

“When experiments with e-meetings have happened in the past it is very easy for discussion to spiral and go quite tangential,” one insider told Telecoms.com. “90% of the negotiation to reach agreements on contentious topics happen over coffee, lunch and dinner… you don’t get that opportunity in an e-meeting.”

As a result of the coronavirus outbreak, all Technical Specification Group (TSG) and Working Group (WG) meetings have now been replaced with electronic meetings for the first quarter. 3GPP has said this is only applicable where practical, and when not, presumably the meeting is cancelled.

For Q2, where the meetings were set to take place in China, activities have been cancelled for the moment. Replacement venues are being sought, but it is by no means guaranteed the outbreak will be contained by that point.

3GPP already announced a delay to the release of the standards back in December 2018. It might be stubbornly sticking to the existing timelines right now, but if it was not able to stay on schedule in 2018, what chance does it have while the coronavirus outbreak is still at large?

“We are obliged to suspend disbelief, but I suspect it won’t take much to blow past March plenaries without agreement on some key topics,” our source continued.

“The problem is it will only take one little thing to hold up the whole lot. The release has to be an independently implementable document set. If anything that is mandatory to support is not included, they have to make it optional, move it completely to the next release or delay. The first two options would require a lot of work in themselves.

“e-meetings do not make agreement easy, and it would only take one topic to get political to have someone attempt to throw everything into question as a tactic.”

As it stands, the group is attempting to negotiate the specifications for the RAN aspect of Release 16. The virtualisation components are all largely finalised, while the core aspect is not due until June, when Release 16 would be theoretically frozen. That said, any delay would push these timelines back once again.

According to Dario Talmesio, Principal Analyst & Practice Leader at Omdia, further delay to Release 16 could cascade and possibly have an echo effect throughout the industry. Time to ROI would be increased, which would not be considered welcome news for the financially strained telcos or the vendors which have promised shareholders 5G fortunes.

With RAN being the topic of debate currently, the standards hang in a precarious position. RAN is the aspect of network infrastructure which attracts the most attention because of the scale of deployment. While other standards missions can run in parallel, RAN is critical to product development timelines. Delays here would be heart-breaking for telcos and vendors, but there could also be a knock-on effect for the International Mobile Telecommunications-2020 requirements issued by the ITU.

Although our source was not necessarily the most confident, there are some who are a bit more optimistic.

“The next set of 3GPP meetings have been turned ‘virtual’ and in 3GPP SA2 at least all Release 17 proposals are postponed until the April meeting,” said Alan Carlton, VP of Wireless and Internet Technologies at InterDigital.

“In the upcoming meeting companies will only be allowed to submit corrections to Release 16. This will obviously cause some delays. New ad hoc meetings to catch up will have to be approved and it is not certain all companies will agree to that. Hopefully this is just a blip and things will get caught up post this somewhat unusual ‘flu’ season.”

The 3GPP is maintaining an optimistic position on the meetings for the next six weeks and will not make a call on the timelines until TSG Chairs have a chance to discuss progress.

A bunch of operators get together to push 5G and MEC interoperability

América Móvil, KT Corp., Rogers, Telstra, Verizon and Vodafone have formed a new gang called the 5G Future Forum.

The stated aim of the gang is to ‘accelerate the delivery of 5G and mobile-edge computing-enabled solutions around the world.’ It apparently thinks that there are issues around the interoperability of 5G specifications that need sorting out. This doesn’t seem to refer to the 5G standard itself, but rather 5G-enabled solutions like autonomous vehicles, smart factories and so on.

“This forum of global leaders in 5G marks an important step in ensuring edge computing works seamlessly for our customers,” said Vinod Kumar, CEO of Vodafone Business. “These new specifications will allow us to offer services that work consistently across the globe and support devices moving between countries. 5G opens up a wealth of opportunities for new solutions and business models and we’re excited to play a role in bringing them to life.”

“5G is a key enabler of the next global industrial revolution, where technology will transform how we live and work. It’s critical that technology partners around the world unite to create the most seamless global experience for our customers,” said Hans Vestberg, CEO of Verizon. “We are proud to join with our fellow 5G leaders to unlock the full potential of applications and solutions that will transform with 5G’s fast speeds, high reliability, improved security and single-digit latency.”

All the other founding members got a canned quote too, but you get the gist. Other than a press release there doesn’t seem to be much else to the forum yet, not even a website. Presumably other operators will be brought into the fold in due course, but the absence of any telecoms or technology specification organisations looks like a potential issue.

Big Tech signs up to make smart home standards

Apple, Amazon and Google are joining forces with the Zigbee Alliance to form the Connected Home over IP project to create universal standards for the smart home ecosystem.

The aim of the project is simple: get ahead of the game and reduce the potential for ecosystem fragmentation in the smart home. With Apple, Amazon and Google on board, the working group has access to the world’s most popular virtual assistants and can drive towards creating a framework which encourages interoperability and compatibility.

“Our goal is to bring together market-tested technologies to develop a new, open smart home connectivity standard based on Internet Protocol (IP),” Google’s Nik Sathe and Grant Erikson wrote on the company’s blog.

“Google’s use of IP in home products dates back to the launch of Nest Learning Thermostat in 2011. IP also enables end-to-end, private and secure communication among smart devices, mobile apps, and cloud services.”

While it might seem slightly unusual that the internet giants are attempting to collaborate without being forced to, the bigger picture makes it a bit more logical.

The likes of Google, Apple and Amazon are looking to make more money from the software and services elements of the smart home ecosystem. This is an admirable quest, though for money to be made there needs to be mass adoption of smart home products.

As it stands, smart home device manufacturers are facing a conundrum. Either spend a lot of money to make sure devices are compatible with all the different smart home ecosystems which are developing, or pick a winner and risk losing out on customers who will exist elsewhere. By creating universal standards for the smart home ecosystem, manufacturers will theoretically be more encouraged to engage with this emerging segment.

What is always worth remembering is that while the likes of Google and Amazon currently sell smart home devices, there will be a lot more money for these companies on the software side when smart home products are adopted at scale. This is their bread and butter after all, with a plethora of existing relationships already in place. Looking at Apple, this is a company which manufactures premium devices, but has some very aggressive ambitions in the software and services world. This is where CEO Tim Cook envisions growth for the company in the future.

Ultimately this is a good sign for the industry. Collaboration is a word which is thrown around so much nowadays it is almost meaningless, but when it results in universally accepted standards to drive interoperability and compatibility, there is something genuinely exciting to look forward to.

Europe postures with standards leadership still on the line

European standards organization ETSI has released a report demanding the continent take a leadership role for standards and regulation in the global digital economy.

While some might question whether the sluggish Brussels bureaucrats can get up to speed quickly enough, there is hope; regulators around the world all share the same track record when it comes to the painfully slow progress of creating regulatory and legal frameworks.

The report, commissioned at the request of ETSI, was authored over the first-half of 2019 and demands Europe take the lead on creating the standards necessary for a healthy and progressive digital economy.

“Our competitors are very serious about taking the lead in digital transformation,” said Carl Bildt, Co-Chair European Council on Foreign Relations.

“It is important that EU lawmakers put standardization at the centre of EU digital and industrial strategy. Otherwise Europe will become a rule taker, forever playing catch-up in the innovation, production and delivery of new digital products and services.”

Although many would want to see a collaborative, geographically neutral approach to standardising the digital economy, this is unlikely to happen. As Bildt highlights above, someone will take a leadership position, those standards will gain acceptance, and other regions will then have to adopt the rules.

The question which remains is whether Europe, the US or China will have the greatest influence on global standards. In fact, ETSI questions whether Europe is keeping pace with leaders today or whether its influence is waning already. Unfortunately, with the platform economy gaining more traction each day, this is one area which should not be considered a strength of the bloc.

46% of platforms with a revenue above $1 billion are based in the US and 35% in Asia, while Europe only accounts for 18%. These platforms often drive their own ecosystems and have largely been self-regulating to date. This is going to change in the future, though to give European organizations a chance at capturing growth, the European Commission might have to lead the charge to create open-standards. The contrary approach might only offer the established players greater momentum and influence.

This is perhaps the risk which is emerging currently. The idea of globalisation and open-standards are not new, though there is evidence certain markets are heading towards a more isolationist mindset and regime.

Although it is easy to point the finger at aggressive political leaders elsewhere, the report demands Europe look inwards too. The European Commission has to take a strong leadership position across the bloc, as with 28 member states there is a risk of fragmentation. It only takes slight variances to start, but these could snowball into greater complications. The digital tax conundrum is an example of what can go wrong over an extended period of time.

This report might be more of a generalist statement to encourage a proactive mindset from European bureaucrats, though there are plenty of examples of governments, public sector administrations and private industry trying to control the tone.

Looking at the ever more influential world of artificial intelligence, the number of feasibility, standards and ethics boards is quite staggering. All of these initiatives will want to create rules and frameworks to govern the operation and progression of AI, though only one can be adopted as the global standard. Regional variances are of course feasible, but this should not be deemed healthy.

In the UK, the Government created its AI Council. The European Commission has released various white papers exploring how AI should be governed. The White House’s National Science and Technology Council Committee on Technology is also exploring the world of AI. Facebook has even created its own independent advisory panel to aid the creation of standards and regulation.

Should Europe want to control the global standards process, it will come up against some stiff competition. The power and influence of the US should not be underestimated, it is home to some of the world’s most recognisable and profitable brands after all, while China has a track record of flooding working groups at standards organizations. This will have a noticeable impact on the final outcome.

That said, the success of GDPR will offer hope.

Europe’s General Data Protection Regulation might have caused headaches all around the world, but it has set the tone on the approach to privacy, management of data and the influence of the consumer. Since its introduction, several other countries, India and Japan being two examples, have been inspired by GDPR to introduce similar regulation, while there have been calls in the US to do the same also.

This piece of regulation was critical to ensure the European principles of privacy are maintained moving forward. This is a win, but there are still battles to be had when it comes to AI, security, encryption, cross-border data flow and access to data.

Standardisation might not be the most exciting topic to discuss in the TMT world, though taking a leadership position can offer advantages to the companies who call that region home. A thorough and innovative regulatory regime can open up new markets, ensure competition is healthy throughout the ecosystem and drive national economies at scale.

The regulatory landscape is set to undergo somewhat of a shift over the coming months and years, though which region will take the lead is still hanging in the balance.

IBC 2019: Linear TV isn’t dead just yet

This might sound like a very bold and short-sighted statement, but thanks to the development of IP-based standards, traditional broadcasters might just be able to survive in the digital economy.

This is of course not to suggest business as usual; there are major restructures and realignments which need to occur to future-proof the business, but linear TV and traditional broadcasters can survive in the cut-throat world of tomorrow.

The changes being forced onto the world are HbbTV and ATSC 3.0, two new standards for traditional broadcasters to get behind, which offer the opportunity to create the experiences consumers desire and the business model which advertisers demand.

HbbTV, Hybrid Broadcast Broadband TV, and ATSC 3.0 are both standards which aim to take the broadcasting industry into the digital world. Although these standards are not necessarily harmonised, the IP approach effectively forces manufacturers and broadcasters into an era of on-demand content, interactive experiences and hyper-targeted advertising.

Over the last few years, many in the TMT world have been quick to write the obituaries for linear programming, but this is not an area which should be written off so abruptly. There is still a niche for the idea of linear TV, and if executed competently, there will be an audience of Generation Z sitting on the sofa next to the Baby Boomers.

Oliver Botti of the Fincons Group pointed to two areas where linear TV currently thrives, and will continue to: firstly, live sports, and secondly, reality TV programming such as Celebrity Big Brother. With both of these standards, new content, experiences and advertising business models can be enabled to ensure continued relevance.

For sports, additional content can be offered to the consumer alongside the action to offer the viewer more control of their experience. This is something which is becoming increasingly common in the OTT world, though it is yet to genuinely penetrate traditional broadcasting in any meaningful way. The second example Botti highlighted is a very interesting one.

The concept of Celebrity Big Brother is not new to most: dozens of cameras in a closed environment, following the lives of prima donnas, at least one of whom will probably make some sort of racist gaffe at some point. However, with the new standards, Botti highlighted, users can choose which camera is live on their own TV, creating a personalised content experience.

It does sound very creepy, but this is the sort of thing which is likely to appeal to some audiences…

Both of these examples are live content. For some, this experience cannot be replicated in an on-demand environment, driving the continued relevance of linear TV. It is a niche, but one which will sustain the relevance of traditional broadcasters and linear programming for years to come.

Vincent Grivet, Chairman of the HbbTV Association, also highlighted that the standards allow for personalised advertising. This is just as important, perhaps more so, to the survival of traditional broadcasters, as without the advertising dollars these businesses will not survive. Advertisers know what they want nowadays, mainly because Silicon Valley can offer it. If hyper-targeted advertising is not an option, advertisers will not part with their valuable budgets.

What is worth noting is that both of these standards rely on the TV manufacturers creating products which allow this success to continue. This is where an issue might arise: currently there is no global harmonisation.

HbbTV has been adopted in Europe, while ATSC 3.0 has been championed in the US and South Korea. China is doing what China does and going down its own separate path, creating a notable amount of fragmentation. This might be a challenge.

Richard Friedel, Executive VP of Technology & Broadcast Strategy of 21st Century Fox, told us that as an engineer he would like to see more harmonisation, but as a pragmatist, he doesn’t see it happening any time soon. All the standards are IP-based, therefore there will be a natural alignment as the industry evolves over the next couple of years, but this does not necessarily mean genuine harmonisation.

This presents a complication for the industry, but let’s not forget that this is a positive step in the right direction. Linear TV might not be attracting the headlines, but if you listen to the right people, it is certainly not dead.

Interdigital sues Motorola-owner Lenovo over 4G patents

Mobile and video tech developer Interdigital has filed patent infringement action against Lenovo in the UK because they can’t agree a price for use of its 4G patents.

Perhaps wary of being labelled a patent troll, Interdigital is keen to stress that this is the first patent infringement litigation it has initiated for six years. It claims its hand has been forced after the failure of almost a decade of negotiation with Lenovo, which makes Motorola phones as well as its own-branded devices.

Interdigital reckons it owns around 10% of the standards-essential patents in both 3G and 4G technology, which means it gets a piece of the action whenever someone sells a device that uses them. How much users of these patents have to pay is usually determined on a FRAND (fair, reasonable and non-discriminatory) basis, but apparently Lenovo won’t even accept third party FRAND arbitration.

Patent litigation canned comments are among the most formulaic, but let’s have a look anyway. “Having product companies take fair licenses to patented technologies flowing out of fundamental research is absolutely essential for the long-term success of worldwide standards like 4G and 5G,” said William Merritt, CEO of Interdigital.

“InterDigital has a long history of valuable technology innovation and patient, good faith negotiation and fair licensing practices, including our willingness to allow the economic terms of a FRAND license to be determined via binding neutral arbitration. We also have longstanding licensing relationships with many of the top companies in the mobile space, including successful license arrangements with Samsung, Apple, LG and Sony, among others.

“For our company, we turn to litigation only when we feel that negotiations are not being carried out in good faith. In bringing this claim in the UK High Court of Justice, which has a history of examining standards-essential patent issues, we are hopeful for a speedy resolution and a fair license.”

Here are the patents in question:

  • European Patent (UK) 2 363 008 – Enables the efficient control of carrier aggregation in 4G (LTE). In advanced mobile phones, carrier aggregation is key to achieving high data rates.
  • European Patent (UK) 2 557 714 – Supports the use of multiple antennae transmissions in 4G (LTE). The patent enables the use of flexible levels of error protection for reporting by the handset, increasing the reliability of the signaling.
  • European Patent (UK) 2 485 558 – Allows mobile phone users quick and efficient access to 4G (LTE) networks. One of the main technological challenges of developing LTE networks was efficient bandwidth usage for various traffic types such as VoIP, FTP and HTTP. This patent relates to inventions for quickly and efficiently requesting shared uplink resources — for example, reducing lag when requesting a webpage on a smartphone on LTE networks.
  • European Patent (UK) 2 421 318 – Decreases latency during HSUPA transmission by eliminating certain scenarios in HSUPA where scheduling requests may be blocked. A blocked scheduling request may prevent a smartphone from sending data.

Interdigital presumably has others that Lenovo is using in its devices, so either there’s no dispute over them or Interdigital is focusing on the four juiciest ones; who knows? Patent litigation is pretty arcane stuff at the best of times, but it seems like Lenovo must have really pushed its luck for its relationship with Interdigital to come to this. It’s hard to see how it can justify refusing to go to FRAND arbitration, but there could well be extenuating circumstances that will come to light in due course.

Google prefers cookies to fingerprints

Internet giant Google has announced some measures designed to better protect the privacy of users of its Chrome browser.

Under the heading of ‘Privacy Sandbox’ Google wants to develop a set of open privacy standards. At the core of this initiative is the use of cookies, small data files stored by the browser that track people’s online activity and, so the theory goes, allow them to be served more relevant advertising. Google concedes that some use of cookies doesn’t meet acceptable data privacy standards, but argues that blocking them isn’t the answer.

A major reason for this is that it encourages the use of another tracking technique called fingerprinting. This aggregates a bunch of other user preferences and behaviours to generate a unique identifier that performs a similar function to cookies. The problem with fingerprints, however, is that there’s no user control over them and hence they’re bad for data privacy.
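The mechanics behind that identifier are straightforward enough to sketch. The snippet below is a minimal illustration of the idea, not any real tracker's code: a handful of browser attributes are combined into a canonical string and hashed into a stable ID. The attribute names are purely illustrative, and the point of the sketch is the one the article makes: the same configuration always yields the same identifier, with no stored cookie for the user to inspect or delete.

```python
import hashlib

def fingerprint(attributes: dict) -> str:
    """Combine browser/device attributes into a single stable identifier.

    Nothing is stored on the device: identical attribute values always
    hash to the same ID, which is why users have no control over it.
    """
    # Sort the keys so the same attributes always produce the same string
    canonical = "|".join(f"{key}={attributes[key]}" for key in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# Two visits reporting identical attributes produce identical identifiers
visit = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64)",
    "screen": "1920x1080",
    "timezone": "Europe/London",
    "language": "en-GB",
    "fonts": "Arial,Helvetica,Times",
}
print(fingerprint(visit))
```

Real fingerprinting libraries draw on far more signals (canvas rendering, audio stack, installed plugins), but the principle is the same, which is why clearing cookies does nothing to shake one off.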

Since the digital ad market now expects a considerable degree of targeting, but fingerprinting is considered an unacceptable solution to the blocking of cookies, Google wants to come up with a better one that will be implemented across all browsers, hence this initiative. The Privacy Sandbox is a secure environment designed to enable safe experimentation with other personalization technologies.

“We are following the web standards process and seeking industry feedback on our initial ideas for the Privacy Sandbox,” blogged Justin Schuh, Director of Chrome Engineering at Google. “While Chrome can take action quickly in some areas (for instance, restrictions on fingerprinting) developing web standards is a complex process, and we know from experience that ecosystem changes of this scope take time. They require significant thought, debate, and input from many stakeholders, and generally take multiple years.”

While this is all laudable it should be noted that Google has possibly the greatest vested interest in optimising targeted advertising online. While that makes it perfectly understandable that it would want to take the initiative in standardizing the way it’s done, other big advertisers and browser providers may have reservations about surrendering much control of the process to Google.

ZTE moves to prove its own security credentials

Taking a page from the Huawei playbook, ZTE is opening its own European cybersecurity lab to demonstrate its own security credentials and appeal to customers.

Although Huawei is taking a battering on the US side of the Atlantic, European nations have stubbornly stood by the side of reason and reasonable behaviour, asking for evidence before signing an execution order. One of the reasons for this will be Huawei’s apparent transparency on security through its cybersecurity centres in the UK and Belgium, and it seems ZTE is following suit.

“The security lab is an open and cooperative platform for the industry,” said Zhong Hong, ZTE Chief Security Officer.

“ZTE plans to gradually achieve the cybersecurity goals through three steps: first, meeting the requirements of cybersecurity laws, regulations and industry standards as well as certification schemes; second, conducting an open dialogue to enhance transparency and establishing cooperation with customers as well as regulatory agencies; and third, sustaining the open cooperation mechanism to contribute to cybersecurity standardization.”

Opening in Rome, the cybersecurity lab will enable telcos to contribute ideas to improve the security credentials of ZTE products, while customers will also be able to conduct audits of all products and services in the labs. This approach is seemingly working for Huawei, and ZTE is recognising the opportunity to get in on the action as 5G ramps up across the continent.

For ZTE this is a perfectly sensible move to mitigate future risks. As Huawei is largely treated as a proxy for Chinese aggression, it would be reasonable to assume any action taken against Huawei would be replicated against ZTE. Anything which can be done to get into the good graces of potential European customers should be seen as a priority.

Although it is for selfish reasons, the cybersecurity centre also adds more credibility to the standardisation approach which seems to be forming across the European continent. The more vendors who agree to the higher barriers to entry, the closer the continent comes to standardising security credentials. This approach to risk mitigation, an acceptance that 100% secure is an impossible objective, manages threats while also preserving competition.

Until there is concrete proof of collusion with the Chinese government for nefarious aims, this is the most sensible approach, taking the argument out of the political arena.

Evolving ETSI engulfs enterprise

The connectivity landscape is evolving as quickly as the telcos’ desire to diversify revenues, so it makes perfect sense the world’s standards authority does as well.

Speaking to Luis Jorge Romero, Director General of the European Telecommunications Standards Institute, or ETSI as it’s more commonly known, it is clear the mission statement is shifting. Traditionally, ETSI has focused on taking care of the telcos, but Romero is broadening his embrace to bring enterprise customers and technology leaders into the equation.

“They need connectivity, so they have to be brought into the equation for the standards,” said Romero. “They are the ones who know the issues and want to have a solution. They can translate the problem into what we understand.”

As it stands, any telco which exclusively focuses on traditional connectivity services and products will struggle to survive in tomorrow’s digital world. Such are the financial demands of 5G that telcos will have to source new revenues to build the ROI and provide fuel for future expansion and upgrades.

Romero highlighted that it was critical not only to ensure the telco industry is supported in this time of rapid change, but that the voices of enterprise organizations and more specialist technology providers are heard as well. If the telcos are going to work more closely with industry, vertical-specific applications will need to be developed. The Asian telcos have been incredibly proactive in developing these use cases, though Europe and the US have been sluggish.

And of course, every step forward has to be standardized to ensure a healthy and sustainable industry.

At ETSI, this translates into two different types of working groups: those at a high level, designed for the telcos themselves, and those drilled down into vertical-specific applications. Romero pointed to the creation of an open data platform to help the marine industry track assets throughout the world, and the participation of agricultural giant John Deere in IoT working groups, as two excellent examples of this evolution.

To bridge the gap between connectivity and the verticals, both segments need to be sitting down in the same room. Everyone realises this, and ETSI is taking an important step forward to facilitate progress.

BBWF 2018: Telcos are starting to find their voice through openness – TIM

For years the CSPs have been a fading voice in the telco ecosystem, but control is being wrestled back through the open communities.

The challenge over the last couple of years has been a lack of control. Standards organizations and technological developments are controlled by the vendors, which in turn results in control of the industry’s landscape. The CSPs are no longer masters of their own fate, which is primarily their own fault, but according to Mauro Tilocca of Telecom Italia the open source communities are giving the CSPs a voice back.

“We need to blend the strengths of standards organizations and open source communities,” said Tilocca at Broadband World Forum in Berlin. “This is the only way to get to carrier grade solutions.”

It’s amazing to think that in years gone by CSPs used to be technologically innovative organizations. But tough market conditions and stress on profitability have seen a trend of outsourcing responsibility; in other words, outsourcing the risk element of innovation.

Speaking to other attendees at this year’s event, there is a feeling the CSPs of yesteryear wanted to be financing organizations. This might explain why there are so many accountants in leadership positions, and such a distaste for risk. Allowing others to innovate and then leaning on the findings is certainly a safer way to conduct business, sitting on the top of the stack realising the advances of others; standing on the shoulders of giants is a common phrase which can be applied here.

But the downside is a loss of influence from a technological perspective and future developments in the industry. This might have allowed the accountants to manipulate spreadsheets to make the financials of an organization healthier, but the standards working groups and research projects are dominated by vendors. The CSPs are having their roadmaps dictated to them because they have lost their voice in the ecosystem.

With open source and white box groups becoming more prominent in the ecosystem, the CSPs are starting to find their feet. Open source and open standards are becoming more regular fixtures of the telco world, and these are the groups which are designed to be led by the CSPs. Of course, these groups alone cannot dictate the terms of the industry, there is still too much knowledge and talent hiding away in the vendor-influenced standards groups, but the balance of power might be shifting towards a healthier position.

The outsourcing trend which handed control of the ecosystem to the vendors was a massive over-reaction from the telcos. Fear took over and too much risk was outsourced. Openness is leading a CSP renaissance, but it is still a bit early to call CSPs innovative.