Huawei says US is only hurting itself with sanctions

Speaking at this year’s virtual Huawei Analyst Summit, Rotating Chairman Guo Ping hit back at the US, suggesting it will only do more damage to itself by pursuing its current course.

The keynote session from the newly rotated executive was one of defiance, putting a confident face on the renewed aggression from the US. The latest actions to inhibit Huawei’s supply chain will almost certainly have an impact, but the company will remain a very prominent player in the telco industry.

“We have survived and forged ahead despite all the odds,” Ping said, while also boasting of the $120 billion in revenues achieved in 2019. “The US insists on persistently attacking Huawei, but what will that achieve for the world?”

Ping was referring to additional sanctions placed on Huawei at the end of last week. Announced by Commerce Secretary Wilbur Ross, the US will prevent any company around the world from using US equipment, IP or software to work with Huawei. The aim is to choke the vendor’s supply of components and semiconductors, a critical element of smartphones and telecoms base stations.

To mitigate these actions, Ping said R&D investment is increasing to reduce dependence on US suppliers, while the business has been stockpiling components. But eventually there will be a material impact on operations.

This is a mitigation strategy, softening the blow, but it is not a concrete solution. The US semiconductor industry can do what few others can, cultivating specialisms which have taken years to fine-tune. This cannot be replicated by China overnight.

“Our business will be inevitably impacted,” said Ping. “But we are confident in finding a solution soon.”

Huawei has consistently stated such actions from the US would be a net loss for the industry, but what is the risk? Ping pointed towards industry fragmentation.

This is of course a dirty word in the telecoms industry, but Huawei’s warnings should be taken with a pinch of salt. Ping warned of standards fragmentation, which is a long-term risk, but not one which is emerging now. The immediate risk is two independent ecosystems, the creation of two distinct markets. Suppliers would hate this, and there is a chance competition (and therefore prices) would be impacted for telcos.

However, there is not really a risk of standards divergence in the short term. Like the US, Huawei seems to be playing a bit fast and loose with rhetoric and muddied statements.

Ping also suggested this would have severe consequences for US telcos, a lesson they should have already learned.

According to the executive, during the 2G era US telcos did not align on standards whereas European counterparts did. This offered a scaled business opportunity to European suppliers, while the US vendors had to deal with fragmentation. Ultimately, the US has no remaining vendors because of this, while the likes of Ericsson and Nokia have thrived.

This is a mishmash of the truth, half-correct statements, half-informed assumptions and missing information.

Firstly, European telcos backed the GSM standard. The fragmentation of standards was not US in-fighting, but a Europe versus North America situation, with Europe winning out. Secondly, yes, US vendors were swallowed up by bigger and more successful rivals, but so were European ones. The likes of Siemens and Alcatel were acquired during the same period.

The reason there are so few suppliers is because previous generations of bureaucrats embraced market consolidation in a way which would have turned stomachs today.

Should the US continue to pursue Huawei in this manner, it will hurt everyone. It could lead to industry fragmentation, the separation of the East and West into two separate markets and much more isolationist policy making. This will hurt Huawei, it will hurt market competition, it will hurt the telcos, it will hurt US suppliers and it will hurt the industry as a whole.

There might be some inaccuracies here, but the overall message is very relevant; isolationist policy is not the friend of the telecommunications industry.

5G patent chest-beating is an unhelpful distraction

The constant propaganda sent out by the big kit vendors has recently moved onto 5G patents and the latest claims are more unhelpful than ever.

A couple of weeks ago Huawei flagged up a report from the European Patent Office entitled ‘Digital technologies take top spot in European patent applications’. The reason it did so was to bring attention to the table at the end of it that had Huawei as the clear leader among all companies when it came to patent applications to the EPO last year.

Then this morning Nokia sent out a press release headed ‘Nokia announces over 3,000 5G patent declarations’. Specifically the declaration was of 3,000 patent families to ETSI (European Telecommunications Standards Institute) that are essential for 5G and this milestone was positioned as a major step forward from the previous one six months ago concerning 2,000 patent declarations.

There followed a bunch of self-promotion and generic quotes from Nokia bigwigs about how into R&D the company is and, by extension, what a great company it is. The clear intention was to create the impression that Nokia’s 5G R&D efforts have improved by 50% in the last six months, but that is very difficult to verify since there are so many unknowns in the claim.

Were the 2,000 patent declarations also ETSI 5G essential families or something else? What even is a patent declaration – is it an application or a full patent? If we are comparing apples with apples and the two milestones concern exactly the same type of claim, how has Nokia suddenly managed such a dramatic uptick in its 5G R&D efforts?

At least Nokia’s claims concern the 5G standard. Huawei’s big achievement was merely to file more applications for all kinds of technology with the EPO than any other company. Anyone can file a claim but that says nothing about the utility or viability of the patent in question and, if a company was determined to win a given patent application race, it could just order loads of its employees to file applications for any old rubbish.

Ericsson has yet to join in this patent pissing competition via press release and when we asked it for comment we were directed to this blog post from October last year, entitled ‘Why you shouldn’t believe everything you read about 5G patents’. In it Christina Petersson, CIPO and Head of IPR & Licensing at Ericsson, argues that when you apply certain essentiality filters, Ericsson comes out on top when it comes to 5G patents.
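The essentiality-filter argument is easy to illustrate with a toy calculation. The sketch below uses entirely invented vendor names and figures (nothing here reflects real patent counts); it simply shows how a leaderboard built on raw declarations can flip once an independent essentiality estimate is applied:

```python
# Illustrative sketch only: all vendors and numbers below are
# hypothetical, invented for this example. It shows why applying an
# "essentiality" filter can reorder a patent leaderboard.

# Hypothetical declared 5G patent families per vendor, with the share
# a (hypothetical) independent review judged truly standard-essential.
declared = {"VendorA": 3000, "VendorB": 2500, "VendorC": 1800}
essential_share = {"VendorA": 0.30, "VendorB": 0.35, "VendorC": 0.60}

# Ranking by raw declarations, as in the press releases.
raw_ranking = sorted(declared, key=declared.get, reverse=True)

# Ranking after discounting non-essential declarations.
filtered = {v: round(declared[v] * essential_share[v]) for v in declared}
filtered_ranking = sorted(filtered, key=filtered.get, reverse=True)

print(raw_ranking)       # VendorA leads on raw declarations
print(filtered_ranking)  # VendorC leads once essentiality is applied
```

The point is not the specific numbers but that both rankings are defensible depending on which methodology you pick, which is exactly why every vendor can claim to be "the 5G patent leader" at once.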

We might be a bit thick here at Telecoms.com, but we find all these claims and counterclaims totally confusing and impossible to derive any useful conclusions from. So we asked around the industry and came to the conclusion that we’re not alone, even among people whose job it is to understand this stuff.

Our instincts about the Nokia announcement were supported: it may well be comparing apples with oranges, disguising the fact with slippery PR language. For those claims to have substance they needed to be a lot more specific. On top of the vagueness surrounding the 2,000 milestone, we don’t know if those 3,000 patent families are unique to 5G or just recycled legacy patents.

This apparently happens a lot, you see. 5G wasn’t created in a vacuum, it stands on the shoulders of 4G and many of the patents that concern the underlying physics of lobbing voice and data from a transmitter to a phone go back to even earlier generations. So many of the technologies required to make 5G work were actually invented decades ago, but still apply today. That’s fine but if participants in the patent Olympics are counting old patents among their big 5G achievements that’s cheating, surely.

Then you have the matter of where the patent applications are filed. It’s easy to file a patent but a lot harder to have it granted. That involves getting a lot of things right and jumping through a lot of bureaucratic hoops. In principle you could literally write a claim on the back of a fag packet and hand it in, and that would count as an application. ETSI seems to be the gold standard when it comes to patent rigour.

Around half of Huawei’s 5G patent applications seem to have been made in China, however, and they account for half of all such applications made in China. While there’s nothing intrinsically wrong with that, it’s worth noting that Samsung and LG, which are in the top three 5G patent applicants alongside Qualcomm, have hardly filed any applications in Korea. It’s almost as if the barrier to entry for patent filing in China is somehow lower.

Apparently it didn’t use to be like this. There were no press releases talking up how many patents a company had filed and such things weren’t used as a proxy for general R&D competence. The impression we get is that it was kicked off by Huawei, which is showing an increasing fondness for chucking thinly supported claims around, and the likes of Nokia feel compelled to return fire.

So why would Huawei, which still seems to be on top despite the best efforts of the US government, feel the need to resort to such questionable tactics to inflate its public image? The answer probably lies in its increasing belligerence in the face of President Trump’s provocations, as illustrated by its recent decision to file a lawsuit against Verizon. Again, this is unprecedented, as companies tend not to sue potential customers.

There has been a steady drip of propaganda positioning Huawei as the clear 5G technological leader. The message seems to be that if countries allied to the US decide to ban Huawei from their 5G networks, that will put them at a significant disadvantage against those who don’t. Additionally it tells the US that Huawei doesn’t need it anyway and strikes a general tone of defiance.

The fact that this patent war is being waged in Europe probably isn’t a coincidence either, as that is the primary battleground in the geopolitical battle of wills between the US and China. Every time a European country refuses to ban Huawei that represents a win for China and its belt-and-road strategy of economic imperialism.

The fact remains, however, that nearly all of the patent announcements being chucked out there are largely meaningless given the lack of qualification and context attached to them. Most patent applications made now won’t be processed for around four years, and it’s only then that we’ll know who the 5G technology leader is. Until then the industry would be well advised to take any claims with a big pinch of salt. We certainly will.

Four operators take the lead on GSMA edge initiative

Last week, the GSMA announced an initiative to standardise the edge, with Telefónica, KT, China Unicom and Telstra the first to step up to lead the way.

In signing a Memorandum of Understanding (MoU), the four telcos will aim to test Edge Computing functionality and interconnection capability, as well as verifying the ease and simplicity of a MEC platform for application developers to leverage.

“Together with these Tier 1 operators, we are making available to the industry the means to build and deliver a global telco-based Edge Cloud service, providing the necessary mechanisms that complement current MEC standards to enable the federation of operator’s edge computing platforms,” said Juan Carlos García, SVP Technology and Ecosystem at Telefónica.

“With this, telcos will be able to deliver a universal Edge Computing service that will facilitate application developers and Enterprises the deployment of their services globally through a simple and single interface.”

The aim of the GSMA initiative is to standardise platforms for edge computing, ultimately driving towards interoperability in the telco community. Although standards might not be the most exciting part of the industry, they are critical to ensure smooth progress and to secure the telcos’ rank in the pecking order.

The collaboration will take place over four phases:

  • Phase One: development of basic Edge Computing capabilities such as interconnection of MEC platforms, smart edge discovery and smart resource allocation
  • Phase Two: enabling mobility features
  • Phase Three: service availability to roamers, enabling the use of edge computing when customers move from their home network and visit a different network
  • Phase Four: federation capabilities
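The "smart edge discovery" capability in Phase One (and its extension to roamers in Phase Three) can be pictured as a simple selection problem: given a federated catalogue of operator MEC platforms, route the user to the best one serving their region. The sketch below is purely illustrative; every class, field and value is an assumption for the example and not part of any GSMA specification:

```python
# Hypothetical sketch of "smart edge discovery" across federated
# operator MEC platforms. All names and figures are illustrative
# assumptions, not drawn from the GSMA initiative itself.
from dataclasses import dataclass


@dataclass
class EdgePlatform:
    operator: str       # owning operator, e.g. "Telefonica"
    region: str         # region the platform serves
    latency_ms: float   # estimated latency to the user
    free_capacity: int  # available compute units


def discover_edge(platforms, user_region, min_capacity=1):
    """Return the lowest-latency platform serving the user's region.

    Because the platforms are federated, a roaming user can land on a
    visited operator's edge without any custom integration.
    """
    candidates = [p for p in platforms
                  if p.region == user_region and p.free_capacity >= min_capacity]
    if not candidates:
        return None  # no edge coverage in this region
    return min(candidates, key=lambda p: p.latency_ms)


platforms = [
    EdgePlatform("Telefonica", "ES", 8.0, 4),
    EdgePlatform("KT", "KR", 5.0, 2),
    EdgePlatform("Telstra", "AU", 12.0, 0),   # full, filtered out
    EdgePlatform("China Unicom", "CN", 6.0, 3),
]

# A subscriber roaming in Korea is served by KT's edge platform.
print(discover_edge(platforms, "KR").operator)
```

The value of federation is that the discovery step looks the same whichever operator owns the chosen platform, which is the "single interface" García describes in the quote above.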

Ultimately the aim is to create global consistency, a telco platform without the need to develop custom integrations for each and every market. Such interoperability and consistency is critical to ensure the effective development of a sustainable edge ecosystem. It also provides confidence to customers to deploy applications in any data centre, with policies designed for privacy, security and enhanced performance.

“Through our partnership with Telefonica, Telstra and China Unicom, all from different regions across the world, we set out to explore the most effective way to build a globally federated edge platform and tap into the full potential of telco-based Edge Computing,” said Jongsik Lee, SVP & Head of Infra R&D at KT.

“Leveraging MEC standards and key technologies, we aim to provide a reference model the industry can build on and developers and enterprises can take advantage of.”

GSMA takes standardisation to the edge

The GSMA has announced a new working group to develop an Edge Compute architectural framework and reference platform.

With China Unicom, Deutsche Telekom, EE, KDDI, Orange, Singtel, SK Telecom, Telefonica and TIM joining forces, the aim will be to develop an interoperable platform to make edge compute capabilities widely and easily available. The edge has long been championed as a means to drive additional revenues and differentiation, though if the telco industry is not sharp enough it will lose the initiative to the internet giants.

“Operators are very well placed to provide capabilities such as low latency through their network assets,” said Alex Sinclair, CTO at GSMA. “It is essential for enterprises to be able to reach all of their customers from the edge of any network. Based on the GSMA Operator Platform Specification, Telco Edge Cloud will provide enterprise developers and aggregators with a consistent way to reach connected customers.”

One of the issues facing progress in this emerging segment is interoperability. Fragmentation is the enemy of the telco world and the GSMA is hoping these specifications will address these challenges.

“Edge Cloud is a promising opportunity to enable the development of services that need low latency connection and to meet various service demands from enterprise customers,” said Yoshiaki Uchida, Executive Director, Technology Sector at KDDI. “The innovation of telecommunication services will be accelerated by the enhancement of service quality and the customer experience in real-time applications such as cloud XR and cloud gaming.”

Edge cloud and computing is an opportunity to deliver new services based on the low-latency advantage which the 5G era can offer. While this is a clear opportunity to add additional value and work with more enterprise customers for mission critical services, the profits are being threatened by the internet giants.

The likes of AWS, Microsoft and Google have been beefing up their cloud services teams with new hires and the creation of new services to make good on this promise too. This is a promising new segment for the connectivity world, a chance to offer genuinely new services, though the telcos will have to duke it out with the internet giants.

3GPP’s Release 16 timeline hangs in the balance thanks to coronavirus

Much has been made of the impact of the coronavirus outbreak on the telecoms industry, though it now appears the critically important Release 16 timeline could be under threat.

Now that the initial shock of the Mobile World Congress cancellation has settled, business is seemingly back to normal, though the coronavirus outbreak has not been contained. The Barcelona trip was cancelled, but so were a horde of other ventures into foreign lands. The 3GPP 5G standards meetings also fell onto the chopping block.

Interestingly enough, Release 16 from the 3GPP, industry specifications to deal with 5G standalone RAN, virtualisation, the 5G core, network slicing and various other topics, could also feel the impact of the virus. The consequence could be further delay on the release of the industry specifications.

“When experiments with e-meetings have happened in the past it is very easy for discussion to spiral and go quite tangential,” one insider told Telecoms.com. “90% of the negotiation to reach agreements on contentious topics happen over coffee, lunch and dinner… you don’t get that opportunity in an e-meeting.”

As a result of the coronavirus outbreak, all Technical Specification Group (TSG) and Working Group (WG) meetings have now been replaced with electronic meetings for the first quarter. 3GPP has said this is only applicable where practical, and when not, presumably the meeting is cancelled.

For Q2, where the meetings were set to take place in China, activities have been cancelled for the moment. Replacement venues are being sought, but it is by no means guaranteed the outbreak will be contained by that point.

3GPP already announced a delay to the release of the standards in December 2018. It might be stubbornly sticking to the existing timelines right now, but if it was not able to stay on schedule in 2018, what chance does it have while the coronavirus outbreak is still at large?

“We are obliged to suspend belief, but I suspect it won’t take much to blow past March plenaries without agreement on some key topics,” our source continued.

“The problem is it will only take one little thing to hold up the whole lot. The release has to be an independently implementable document set. If anything that is mandatory to support is not included, they have to make it optional, move it completely to the next release or delay. The first two options would require a lot of work in themselves.

“e-meetings do not make agreement easy, and it would only take one topic to get political to have someone attempt to throw everything into question as a tactic.”

As it stands, the group is attempting to negotiate the specifications for the RAN aspect of Release 16. The virtualisation components are all largely finalised, while the core aspect is not due until June, when Release 16 would be theoretically frozen. That said, any delay would push these timelines back once again.

According to Dario Talmesio, Principal Analyst & Practice Leader for Omdia, further delay to Release 16 could cascade and possibly have an echo effect throughout the industry. Time to ROI would be increased, which would not be considered welcome news for the financially strained telcos or vendors which have promised shareholders 5G fortunes.

With RAN being the topic of debate currently, the standards hang in a precarious position. RAN is the aspect of network infrastructure which attracts the most attention because of the scale of deployment. While other standards missions can run in parallel, RAN is critical to product development timelines. Delays here would be heart-breaking for telcos and vendors, but there could also be a knock-on effect for the International Mobile Telecommunications-2020 requirements issued by the ITU.

Although our source was not necessarily the most confident, there are some who are a bit more optimistic.

“The next set of 3GPP meetings have been turned ‘virtual’ and in 3GPP SA2 at least all Release 17 proposals are postponed until the April meeting,” said Alan Carlton, VP of Wireless and Internet Technologies at InterDigital.

“In the upcoming meeting companies will only be allowed to submit corrections to Release 16. This will obviously cause some delays. New ad hoc meetings to catch up will have to be approved and it is not certain all companies will agree to that. Hopefully this is just a blip and things will get caught up post this somewhat unusual ‘flu’ season.”

The 3GPP is maintaining an optimistic position on the meetings for the next six weeks and will not make a call on the timelines until TSG Chairs have a chance to discuss progress.

A bunch of operators get together to push 5G and MEC interoperability

América Móvil, KT Corp., Rogers, Telstra, Verizon and Vodafone have formed a new gang called the 5G Future Forum.

The stated aim of the gang is to ‘accelerate the delivery of 5G and mobile-edge computing-enabled solutions around the world.’ It apparently thinks that there are issues around the interoperability of 5G specifications that need sorting out. This doesn’t seem to refer to the 5G standard itself, but rather 5G-enabled solutions like autonomous vehicles, smart factories and so on.

“This forum of global leaders in 5G marks an important step in ensuring edge computing works seamlessly for our customers,” said Vinod Kumar, CEO of Vodafone Business. “These new specifications will allow us to offer services that work consistently across the globe and support devices moving between countries. 5G opens up a wealth of opportunities for new solutions and business models and we’re excited to play a role in bringing them to life.”

“5G is a key enabler of the next global industrial revolution, where technology will transform how we live and work. It’s critical that technology partners around the world unite to create the most seamless global experience for our customers,” said Hans Vestberg, CEO of Verizon. “We are proud to join with our fellow 5G leaders to unlock the full potential of applications and solutions that will transform with 5G’s fast speeds, high reliability, improved security and single-digit latency.”

All the other founding members got a canned quote too but you get the gist. Other than a press release there doesn’t seem to be much else to the forum yet, not even a website. Presumably other operators will be brought into the fold in due course, but the absence of any telecoms or technology specification organisations looks like a potential issue.

Big Tech signs up to make smart home standards

Apple, Amazon and Google are joining forces with the Zigbee Alliance to form the Connected Home over IP project to create universal standards for the smart home ecosystem.

The aim of the project is simple; get ahead of the game and reduce the potential for ecosystem fragmentation in the smart home. With Apple, Amazon and Google on board, the working group has access to the world’s most popular virtual assistants and can drive towards creating a framework which encourages interoperability and compatibility.

“Our goal is to bring together market-tested technologies to develop a new, open smart home connectivity standard based on Internet Protocol (IP),” Google’s Nik Sathe and Grant Erikson wrote on the company’s blog.

“Google’s use of IP in home products dates back to the launch of Nest Learning Thermostat in 2011. IP also enables end-to-end, private and secure communication among smart devices, mobile apps, and cloud services.”

While it might seem slightly unusual that the internet giants are attempting to collaborate without being forced to, the bigger picture makes it a bit more logical.

The likes of Google, Apple and Amazon are looking to make more money from the software and services elements of the smart home ecosystem. This is an admirable quest, though for money to be made there needs to be mass adoption of smart home products.

As it stands, smart home device manufacturers are facing a conundrum. Either spend a lot of money to make sure devices are compatible with all the different smart home ecosystems which are developing, or pick a winner and risk losing out on customers who will exist elsewhere. By creating universal standards for the smart home ecosystem, manufacturers will theoretically be more encouraged to engage in this emerging segment.

What is always worth remembering is that while the likes of Google and Amazon currently sell smart home devices, there will be a lot more money for these companies on the software side when smart home products are adopted at scale. This is their bread and butter after all, with a plethora of existing relationships already in place. Looking at Apple, this is a company which manufactures premium devices, but has some very aggressive ambitions in the software and services world. This is where CEO Tim Cook envisions growth for the company in the future.

Ultimately this is a good sign for the industry. Collaboration is a word which is thrown around so much nowadays it is almost meaningless, but when it results in universally accepted standards to drive interoperability and compatibility, there is something genuinely exciting to look forward to.

Europe postures with standards leadership still on the line

European standards organization ETSI has released a report demanding the continent take a leadership role for standards and regulation in the global digital economy.

While some might question whether the sluggish Brussels bureaucrats can get up to speed quickly enough, there is hope; regulators around the world all share the same track-record when it comes to the painfully slow progress of creating regulatory and legal frameworks.

The report, commissioned at the request of ETSI, was authored over the first-half of 2019 and demands Europe take the lead on creating the standards necessary for a healthy and progressive digital economy.

“Our competitors are very serious about taking the lead in digital transformation,” said Carl Bildt, Co-Chair European Council on Foreign Relations.

“It is important that EU lawmakers put standardization at the centre of EU digital and industrial strategy. Otherwise Europe will become a rule taker, forever playing catch-up in the innovation, production and delivery of new digital products and services.”

Although many would want to see a collaborative, geographically-neutral approach to standardising the digital economy, this is unlikely to happen. As Bildt highlights above, someone will take a leadership position, standards will gain acceptance, and other regions will then have to adopt the rules.

The question which remains is whether Europe, the US or China will have the greatest influence on global standards. In fact, ETSI questions whether Europe is keeping pace with leaders today or whether its influence is waning already. Unfortunately, with the platform economy gaining more traction each day, this is one area which should not be considered a strength of the bloc.

Some 46% of platforms with revenue above $1 billion are based in the US and 35% in Asia, while Europe accounts for only 18%. These platforms often drive their own ecosystems and have largely been self-regulating to date. This is going to change in the future, though to give European organizations a chance at capturing growth, the European Commission might have to lead the charge to create open standards. The contrary approach might only offer the established players greater momentum and influence.

This is perhaps the risk which is emerging currently. The idea of globalisation and open-standards are not new, though there is evidence certain markets are heading towards a more isolationist mindset and regime.

Although it is easy to point the finger at aggressive political leaders elsewhere, the report demands Europe look inwards too. The European Commission has to take a strong leadership position across the bloc, as with 28 member states there is a risk of fragmentation. It only takes slight variances to start, but these could snowball into greater complications. The digital tax conundrum is an example of what can go wrong over an extended period of time.

This report might be more of a generalist statement to encourage a proactive mindset from European bureaucrats, though there are plenty of examples of governments, public sector administrations and private industry trying to control the tone.

Looking at the ever more influential world of artificial intelligence, the number of feasibility, standards and ethics boards is quite staggering. All of these initiatives will want to create rules and frameworks to govern the operation and progression of AI, though only one can be adopted as the global standard. Regional variances are of course feasible, but this should not be deemed healthy.

In the UK, the Government created its AI Council. The European Commission has released various white papers exploring how AI should be governed. The White House’s National Science and Technology Council Committee on Technology is also exploring the world of AI. Facebook has even created its own independent advisory panel to aid the creation of standards and regulation.

Should Europe want to control the global standards process, it will come up against some stiff competition. The power and influence of the US should not be underestimated, it is home to some of the world’s most recognisable and profitable brands after all, while China has a track-record of flooding working groups at standards organizations. This will have a noticeable impact on the final outcome.

That said, the success of GDPR will offer hope.

Europe’s General Data Protection Regulation might have caused headaches all around the world, but it has set the tone on the approach to privacy, management of data and the influence of the consumer. Since its introduction, several other countries, India and Japan being two examples, have been inspired by GDPR to introduce similar regulation, while there have been calls in the US to do the same also.

This piece of regulation was critical to ensure the European principles of privacy are maintained moving forward. This is a win, but there are still battles to be had when it comes to AI, security, encryption, cross-border data flow and access to data.

Standardisation might not be the most exciting topic to discuss in the TMT world, though taking a leadership position can offer advantages to the companies who call that region home. A thorough and innovative regulatory regime can open-up new markets, ensure competition is healthy throughout the ecosystem and drive national economies at scale.

The regulatory landscape is set to undergo something of a shift over the coming months and years, though which region will take the lead is still hanging in the balance.

IBC 2019: Linear TV isn’t dead just yet

This might sound like a very bold and short-sighted statement, but thanks to the development of IP-based standards, traditional broadcasters might just be able to survive in the digital economy.

This is of course not to suggest business as usual; there are major restructures and realignments which need to occur to future-proof the business, but linear TV and traditional broadcasters can survive in the cut-throat world of tomorrow.

The change being forced onto the world comes via HbbTV and ATSC 3.0, two new standards for the traditional broadcasters to get behind, which offer the opportunity to create the experiences consumers desire and the business model which advertisers demand.

HbbTV, Hybrid Broadcast Broadband TV, and ATSC 3.0 are both standards which aim to take the broadcasting industry into the digital world. Although these standards are not necessarily harmonised, the IP approach effectively forces manufacturers and broadcasters into an era of on-demand content, interactive experiences and hyper-targeted advertising.

Over the last few years, many in the TMT world have been quick to write the obituaries for linear programming, but this is not an area which should be written off so abruptly. There is still a niche for the idea of linear TV, and if executed competently, there will be an audience of Generation Z sitting on the sofa next to the Baby Boomers.

Oliver Botti of the Fincons Group pointed to two areas where linear TV currently thrives, and will continue to: firstly, live sports, and secondly, reality TV programming such as Celebrity Big Brother. With both of these standards, new content, experiences and advertising business models can be enabled to ensure continued relevance.

For sports, additional content can be offered to the consumer alongside the action to offer the viewer more control of their experience. This is something which is becoming increasingly common in the OTT world, though it is yet to genuinely penetrate traditional broadcasting in any meaningful way. The second example Botti highlighted is a very interesting one.

The concept of Celebrity Big Brother is not new to most. Dozens of cameras in a closed environment, following the lives of prima donnas, at least one of whom will probably make some sort of racist gaffe at some point. However, with the new standards, Botti highlighted, users can choose which camera is live on their own TV, creating a personalised content experience.

It does sound very creepy, but this is the sort of thing which is likely to appeal to some audiences…

Both of these examples are live content. For some, this experience cannot be replicated in an on-demand environment, which is what keeps linear TV relevant. It is a niche, but one which will sustain traditional broadcasters and linear programming for years to come.

Vincent Grivet, Chairman of the HbbTV Association, highlighted that the standards also allow for personalised advertising. This is just as important, perhaps more so, to the survival of traditional broadcasters, as without the advertising dollars these businesses will not survive. Advertisers know what they want nowadays, mainly because Silicon Valley can offer it. If hyper-targeted advertising is not an option, advertisers will not part with their valuable budgets.

What is worth noting is that both of these standards rely on TV manufacturers creating products which allow for success to continue. This is where an issue might arise; currently there is no global harmonisation.

HbbTV has been adopted in Europe, while ATSC 3.0 has been championed in the US and South Korea. China is doing what China does and going down its own separate path, creating a notable amount of fragmentation. This might be a challenge.

Richard Friedel, Executive VP of Technology & Broadcast Strategy of 21st Century Fox, told us that as an engineer he would like to see more harmonisation, but as a pragmatist, he doesn’t see it happening any time soon. All the standards are IP-based, therefore there will be a natural alignment as the industry evolves over the next couple of years, but this does not necessarily mean genuine harmonisation.

This presents a complication for the industry, but let’s not forget that this is a positive step in the right direction. Linear TV might not be attracting the headlines, but if you listen to the right people, it is certainly not dead.

Interdigital sues Motorola-owner Lenovo over 4G patents

Mobile and video tech developer Interdigital has filed patent infringement action against Lenovo in the UK because the two sides can’t agree a price for the use of its 4G patents.

Perhaps wary of being labelled a patent troll, Interdigital is keen to stress that this is the first patent infringement litigation it has initiated for six years. It claims its hand has been forced after the failure of almost a decade of negotiation with Lenovo, which makes Motorola phones as well as its own-branded devices.

Interdigital reckons it owns around 10% of the standards-essential patents in both 3G and 4G technology, which means it gets a piece of the action whenever someone sells a device that uses them. How much users of these patents have to pay is usually determined on a FRAND (fair, reasonable and non-discriminatory) basis, but apparently Lenovo won’t even accept third party FRAND arbitration.

Patent litigation canned comments are among the most formulaic, but let’s have a look anyway. “Having product companies take fair licenses to patented technologies flowing out of fundamental research is absolutely essential for the long-term success of worldwide standards like 4G and 5G,” said William Merritt, CEO of Interdigital.

“InterDigital has a long history of valuable technology innovation and patient, good faith negotiation and fair licensing practices, including our willingness to allow the economic terms of a FRAND license to be determined via binding neutral arbitration. We also have longstanding licensing relationships with many of the top companies in the mobile space, including successful license arrangements with Samsung, Apple, LG and Sony, among others.

“For our company, we turn to litigation only when we feel that negotiations are not being carried out in good faith. In bringing this claim in the UK High Court of Justice, which has a history of examining standards-essential patent issues, we are hopeful for a speedy resolution and a fair license.”

Here are the patents in question:

  • European Patent (UK) 2 363 008 – Enables the efficient control of carrier aggregation in 4G (LTE). In advanced mobile phones, carrier aggregation is key to achieving high data rates.
  • European Patent (UK) 2 557 714 – Supports the use of multiple antennae transmissions in 4G (LTE). The patent enables the use of flexible levels of error protection for reporting by the handset, increasing the reliability of the signaling.
  • European Patent (UK) 2 485 558 – Allows mobile phone users quick and efficient access to 4G (LTE) networks. One of the main technological challenges of developing LTE networks was efficient bandwidth usage for various traffic types such as VoIP, FTP and HTTP. This patent relates to inventions for quickly and efficiently requesting shared uplink resources — for example, reducing lag when requesting a webpage on a smartphone on LTE networks.
  • European Patent (UK) 2 421 318 – Decreases latency during HSUPA transmission by eliminating certain scenarios in HSUPA where scheduling requests may be blocked. A blocked scheduling request may prevent a smartphone from sending data.

Interdigital presumably has others that Lenovo is using in its devices, so either there’s no dispute over them or Interdigital is focusing on the four juiciest ones, who knows? Patent litigation is pretty arcane stuff at the best of times, but it seems like Lenovo must have really pushed its luck for its relationship with Interdigital to come to this. It’s hard to see how it can justify refusing to go to FRAND arbitration, but there could well be extenuating circumstances that will come to light in due course.