The private power of the edge

One of the conundrums that has quietly emerged over the last couple of months is how to maintain privacy while attempting to improve customer experience, but the power of the edge might save the day.

If telcos want to improve customer experience, data needs to be collected and analysed. This might sound like a very obvious statement to make, but the growing privacy movement across the world, and the potential for new regulatory restraints, might make this more difficult.

This is where the edge could play a more significant role. One of the more prominent discussions from Mobile World Congress in Barcelona this year was the role of the edge, and it does appear this conversation has continued through to Light Reading’s Big 5G Event in Denver.

Some might say artificial intelligence and data analytics are solutions looking for a problem, but in this instance there is a very real issue to address. Improving customer experience through analytics will only be successful if implemented quickly, some might suggest in real time, therefore the models used to improve performance should be hosted at the edge. This is an example of where the latency business model can directly impact operations.

The edge also addresses a couple of other issues. Firstly, there is the cost of sending data back to a central data centre. As was pointed out today, telcos cannot afford to send all customer data back to be analysed; it is simply an unreasonable quantity. The more insight that can be actioned at the edge, with only the genuinely important insight sent back to train models, the more palatable customer experience management becomes.

Secondly, the privacy issue is partly addressed. The more that is actioned at the edge, as close to the customer as possible, the smaller the concerns of the privacy advocates. Yes, data is still being collected, analysed and (potentially) actioned upon, but the sooner the insight is realised, the sooner the underlying data can be deleted.
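As a rough sketch of that edge-first loop (the names, data shape and threshold here are entirely hypothetical), the logic is simply: act on every sample locally, forward only the genuinely important insight for model training, and discard the rest:

```python
from dataclasses import dataclass

@dataclass
class Sample:
    subscriber: str
    latency_ms: float

# Hypothetical service-quality target; anything above it is worth reporting
LATENCY_THRESHOLD_MS = 50.0

def process_at_edge(samples):
    """Act on every sample locally; forward only the anomalies to the core."""
    to_core = []
    for s in samples:
        if s.latency_ms > LATENCY_THRESHOLD_MS:
            # Insight worth keeping: queue a summary for the central models
            to_core.append(s)
        # Either way, the raw sample can be deleted once actioned,
        # easing both the backhaul cost and the privacy concern
    return to_core

samples = [Sample("a", 12.0), Sample("b", 85.0), Sample("c", 30.0)]
print([s.subscriber for s in process_at_edge(samples)])  # ['b']
```

The point of the sketch is the ratio: only one of three samples ever leaves the edge, which is where both the cost and privacy arguments come from.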

There are still sceptics when it comes to the edge, the latency business case, artificial intelligence and data analytics, but slowly more cases are starting to emerge to add credibility.

Europe unveils its own attempt to address ethical AI

Addressing the ethical implications of artificial intelligence has become very fashionable in recent months, and right on cue, the European Commission has produced seven guidelines for ethical AI.

The guidelines themselves are not much more than a theoretical playbook for companies to build products and services around for the moment. However, any future legislation which is developed to guide the development of AI in the European Union will likely use these guidelines as the foundation blocks. It might not seem critical for the moment, but it could offer some insight into future regulation and legislation.

“The ethical dimension of AI is not a luxury feature or an add-on,” said Vice-President for the Digital Single Market Andrus Ansip. “It is only with trust that our society can fully benefit from technologies. Ethical AI is a win-win proposition that can become a competitive advantage for Europe: being a leader of human-centric AI that people can trust.”

“We now have a solid foundation based on EU values and following an extensive and constructive engagement from many stakeholders including businesses, academia and civil society,” said Commissioner for Digital Economy and Society Mariya Gabriel. “We will now put these requirements to practice and at the same time foster an international discussion on human-centric AI.”

The seven guidelines are as follows:

  1. Human agency and oversight: AI systems should enable equitable societies by supporting human agency and fundamental rights, and not decrease, limit or misguide human autonomy.
  2. Robustness and safety: Trustworthy AI requires algorithms to be secure, reliable and robust enough to deal with errors or inconsistencies during all life cycle phases of AI systems.
  3. Privacy and data governance: Citizens should have full control over their own data, while data concerning them will not be used to harm or discriminate against them.
  4. Transparency: The traceability of AI systems should be ensured.
  5. Diversity, non-discrimination and fairness: AI systems should consider the whole range of human abilities, skills and requirements, and ensure accessibility.
  6. Societal and environmental well-being: AI systems should be used to enhance positive social change and enhance sustainability and ecological responsibility.
  7. Accountability: Mechanisms should be put in place to ensure responsibility and accountability for AI systems and their outcomes.

The Commission will now launch a pilot phase with industry and academia to make sure the guidelines are realistic to implement in real-world cases. The results of this pilot will inform any measures taken by the Commission or national governments moving forward.

This is one of the first official documents produced to support the development of AI, though many parties around the world are attempting to weigh in on the debate. It is critically important for governments and regulators to take a stance, such is the profound impact AI will have on society, though private industry is attempting to make itself heard as well.

From private industry’s perspective, the mission statement is relatively simple: ensure any bureaucratic processes don’t interfere too much with the ability to make money. Google was the latest to attempt to create its own advisory board to hype the lobby game, but this was nothing short of a disaster.

Having set up the board with eight ‘independent’ experts, the plan was scrapped almost immediately after employees criticised one of the board members for not falling on the right side of the political divide. This might have been an embarrassing incident, though the advisory board was hardly going to achieve much.

Google suggested the board would meet four times a year to review the firm’s approach to AI. Considering AI is effectively embedded, or will be, in everything Google does, a quarterly assessment was hardly going to provide any actionable insight; there would simply be too much to cover in a short period of time. This was nothing more than a PR plug by an internet giant obsessed with appearing to be on the side of the consumer.

AI will have a significant impact on the world and almost everyone’s livelihood. For some, jobs will be enhanced, but there will always be pain. Some will find their jobs redundant, some will find their careers extinguished. Creating ethical guidelines for AI development and deployment will be critical and Europe is leading the charge.

Orange Spain and ZTE complete Europe’s first standalone 5G call

The mobile operator claimed that the voice and data call over an end-to-end 5G network in Valencia was the first of its kind in Spain, as well as in Europe. All other trials have been conducted over non-standalone networks.

The Spanish branch of Orange successfully trialled a voice and data call on a “100% 5G” network with standalone architecture, the company announced. The end-to-end solution was provided by ZTE, one of Orange’s suppliers. The test achieved a peak downlink data rate of 876 Mbps on one test terminal, and 3.2 Gbps with 12 test terminals working simultaneously in the same cell.

“It is critical to understand this new and disruptive technology, with which we could close the gap from our 4G networks to offer our customers the best possible 5G network in the world when the time is right,” said Mónica Sala, Director of Networks at Orange (translated from Spanish). “The know-how of ZTE is evident in achieving this milestone and we are very proud of the results.”

The live 5G networks today, in South Korea and the US for example, are primarily providing enhanced mobile broadband services, which can be achieved with non-standalone mode, i.e. overlaying 5G radio networks on top of a 4G core. This was the architecture Huawei used when demonstrating 5G at MWC on Vodafone’s network. On the other hand, to achieve 5G’s full capabilities, including virtualised networks (e.g. network slicing for a particular client) and extreme low-latency applications (e.g. autonomous cars), operators would need end-to-end 5G networks, i.e. 5G radio and 5G core.

ZTE was also obviously happy with the success of the trial. “It is a great pleasure for us to work hand in hand with Orange for technological innovation and 5G leadership,” Xiao Ming, President of Global Sales at ZTE stressed. Orange is one of ZTE’s two biggest accounts in Europe (the other being the Three group), so holding on to and expanding the partnership is critical for a company that has been struggling in mature markets.

Orange Spain plans to extend 5G trials to other industries including construction, energy, health, automotive, and tourism, to test out the use cases. The company also said that it is going to test 5G in a handful of cities with the support provided by Red.es, the country’s digital transformation programmes, operated under the direction of the Secretary of State for Information Society and Digital Agenda.

Taking advantage of intent-based networking automation

Telecoms.com periodically invites third parties to share their views on the industry’s most pressing issues. In this piece David Erickson, co-founder and CEO of Forward Networks, makes the case for intent-based networking and its use in existing networks.

Perhaps the most exciting network evolution in recent years is intent-based networking (IBN). IBN is a visionary approach to network automation and management that aligns the intended high-level behavior and policies of the network with the actual network design and configurations.

Implicit in this definition is that there is a great deal of automated intelligence that understands how to design efficient, error-free networks, as well as how to reason about the resulting behavior from an extremely large number of various design and configuration details. Although still in its infancy, IBN systems have started to automate error detection and analysis, change window verification and network upgrades and roll-outs.

Looking to take advantage of this kind of automated intelligence, carrier-grade providers frequently ask whether IBN technology can be deployed on existing networks, or whether the integration issues and technology upgrades lend themselves only to greenfield deployments. To answer that question, let’s break down IBN into two functional areas: design and verification.

If you start with your network policy objectives and want to drive the configurations and management of network devices, that is design-oriented IBN. Going the other way, if you want to analyze your current network design and verify that it conforms to your stated objectives, that is intent-based verification. The good news is that verification of existing network designs is a much more thoroughly solved problem and completely lends itself to any existing network today (assuming the IBN system has the capability to model and analyze the various vendors, devices and protocols in the underlying network).

Verifying network behavior requires reasoning and intelligence

IBN verification allows service providers to automate the analysis of existing network paths end-to-end, based on the collected information from every network device and modeling the behavior for all possible traffic at each hop. The end-to-end path behavior can then be verified against similar end-to-end network requirements and policy statements. Some examples of end-to-end behavior that IBN can verify easily:

  • Are there at least 3 redundant paths from a particular POP’s access layer router to another POP through the MPLS Core?
  • Are there any single points of failure along the entire path?
  • Are all BGP advertised routes correctly configured on upstream routers across my entire network?
  • Are there no viable routes from one customer’s POP to a different customer’s POP, so that they are effectively isolated for specific types of traffic?
  • Is a particular multi-tier, multi-service POP configured for only the defined set of services using expected protocols, like Layer 2 Metro Ethernet or L3VPN?
  • Is traffic coming in to the provider edge gateway from a customer properly restricted to only specific destination POPs and services?

IBN systems have the capability of understanding such high-level, generalized requirements and verifying them in the context of the current network design. IBN effectively bridges the intent and the individual device configurations to reason through and automate the verification process. From an IT perspective, this can proactively identify any latent errors in the network which could eventually lead to outages, while avoiding tedious manual searches to isolate issues or perform root-cause analysis. For example, if a set of configuration changes is proposed or a new service is deployed, IBN can help verify the impact to existing policies in the software model before deploying to the live network, averting possible roll-backs and helping to accelerate change windows.

Verification is a distinctly different methodology from traditional testing environments, because verification is reasoning based on an analysis of the network design, configurations and current network state. It does not look at live traffic flows to determine network activity. Verification can thus do something traditional testing can rarely do: “prove a negative”, by confirming that something can’t happen, such as two networks being unreachable through any path. IBN verification can also identify configuration errors like MTU mismatches, forwarding loops, or IP address duplication anywhere in the network without reviewing devices one by one.
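To illustrate the kind of reasoning involved (a toy model, not any vendor’s implementation, with an invented topology), a verifier can enumerate loop-free paths through a software model of the network and assert both a positive requirement (redundancy) and a negative one (isolation):

```python
from collections import defaultdict

# Hypothetical topology, as an IBN system might derive it from
# collected device configurations (each link is a possible forwarding hop)
links = [
    ("pop-a", "core-1"), ("pop-a", "core-2"), ("pop-a", "core-3"),
    ("core-1", "pop-b"), ("core-2", "pop-b"), ("core-3", "pop-b"),
    ("cust-x", "pop-a"),
]
graph = defaultdict(set)
for u, v in links:
    graph[u].add(v)
    graph[v].add(u)

def simple_paths(src, dst, seen=None):
    """Enumerate loop-free paths in the model (feasible on a small sketch)."""
    seen = (seen or set()) | {src}
    if src == dst:
        yield [src]
        return
    for nxt in graph[src] - seen:
        for rest in simple_paths(nxt, dst, seen):
            yield [src] + rest

def reachable(src, dst):
    return any(True for _ in simple_paths(src, dst))

# Positive check: at least 3 redundant paths between two POPs
assert sum(1 for _ in simple_paths("pop-a", "pop-b")) >= 3

# "Proving a negative": a POP with no configured route has no viable path
assert not reachable("cust-y", "pop-b")
```

Real IBN engines model per-hop forwarding behaviour for all possible packets rather than a bare adjacency graph, but the shape of the check — exhaustive analysis of the model, not observation of live traffic — is the same.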

Requirements that allow IBN to be applied to existing networks

IBN systems create a software model of the provider network and can model all possible behaviors the design allows to verify compliance with the intended policies and service descriptions. For an IBN system to work on an existing network, the only conditions that need to be met are: 1) Read-only access must be available to each device to pull configuration files and state information, 2) the IBN software model must accurately model the behavior of each network device (switch, router, firewall and load balancer) for all possible packet flows, and 3) the IBN model must accommodate all carrier class protocols such as EVPN, BGP, MPLS, virtual networking, etc.
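Condition 2 is where the heavy lifting sits: each read-only snapshot must be turned into a behavioural model of the device. A deliberately simplified sketch of the idea (invented names; real models must cover ACLs, NAT, ECMP and far more) might look like:

```python
from dataclasses import dataclass, field

@dataclass
class DeviceModel:
    """A device snapshot pulled over read-only access, plus its modelled behaviour."""
    name: str
    routes: dict = field(default_factory=dict)  # prefix -> next hop (simplified)

    def forward(self, dst_prefix):
        """Model the forwarding decision for a destination prefix; None = dropped."""
        return self.routes.get(dst_prefix)

# Two hypothetical snapshots, as collected from the live network
edge = DeviceModel("edge-1", {"10.0.0.0/8": "core-1"})
core = DeviceModel("core-1", {"10.0.0.0/8": "cust-gw"})

# Trace a flow hop by hop through the model, never touching the live network
print(edge.forward("10.0.0.0/8"), core.forward("10.0.0.0/8"))  # core-1 cust-gw
```

Chaining these per-device decisions is what lets the system compute end-to-end behaviour for all possible packet flows, which is precisely why the model must be accurate for every vendor and protocol in the underlying network.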

Like a secure sandbox or test environment, IBN systems are a separate software model that is non-intrusive to the network, requires no points of integration besides read-only access, and requires no software agents or packet sniffers. For that reason, service providers are finding that they can generally be up and running on existing networks in only a few hours without risk or impact to existing services. Even for legacy architectures and extremely large carrier-grade networks, IBN verification technology can quickly deliver tangible results.

 

David Erickson is the co-founder and CEO at Forward Networks. David holds a PhD in Computer Science from Stanford. He is a contributor to the OpenFlow spec and the author of Beacon, the OpenFlow controller at the core of commercial products from Big Switch Networks, Cisco, and others, and open source controllers such as Floodlight and OpenDaylight. His thesis used SDN to improve virtualized data center performance.

Verizon hits reset button with 2.0 launch

Verizon has announced it is now a new business, one which is customer centric and ready for the digital world of tomorrow. Smells like a polite way of announcing a restructure.

It might sound like a PR plug to stay relevant, heavily relying on friendly buzzwords such as ‘customer centric’ and ‘corporate social responsibility’, but there is some pragmatism behind the fluff. Like many telcos around the world, Verizon appears to be prepping for a restructure to refocus the business on tomorrow’s digital bonanza.

“It’s not only that we have a new operational structure from today, but it is also about the way we are thinking about our customers, the way we are thinking about our culture and leadership and society,” said Hans Vestberg, CEO of Verizon Communications. “We have a strategy that we are going to execute on.”

The plug itself seems to be focused on five areas. Firstly, corporate social responsibility. This will now be one of the promoted corporate values of the business, will factor into procurement decisions, and will also likely be included in various marketing campaigns.

While this sort of announcement might get some excited, Verizon is late to the show and, quite frankly, we’re surprised it has taken this long to include CSR in the corporate values. This is PR 101 and is a play which almost every other company on the planet is taking advantage of. Verizon might plug this as ‘innovation’, but the tiresome beast is catching up on a trend which ran wild years ago.

Secondly, the business will split into two business groups, Consumer and Business. Again, this seems like a move which should have been made some time ago.

Thirdly, Verizon 2.0 isn’t just a PR play but also symbolises progress which has been made on the network. Network virtualisation and softwarisation are key here, and critical components in ensuring Verizon is a competitive force in the digital economy of tomorrow.

“We’ll also be working in new ways,” said Verizon employee Sravya Gajjala. “2.0 is our opportunity to take a look at what’s in front of us, at our existing processes and make fundamental changes across the business.”

This is the fourth point, which to us sounds like corporate slang for a restructure.

It might sound like a dirty word, perhaps because pain is a natural accompaniment to a restructure, but it is critical. If Verizon is to maintain its lofty position of influence, it needs to be a business which is ready for the digital economy. This might mean redundancies, but it will certainly mean evolving from a Communications Service Provider (CSP) to a Digital Services Provider (DSP).

The final plug is innovation, the most overused and meaningless buzzword in the technology industry. Innovation means very little when everyone claims to be innovative because, quite frankly, only a small percentage actually are. For Verizon, this means pushing into new segments and offering new services. The imagery in the promotional video, which you can see at the foot of the article, suggests data is going to be a key aspect.

This might not sound revolutionary or new, but it is critical. The data intensive industries of tomorrow are going to rule the economy, but the telcos are not sitting in a strong position to capitalise on the gains. Trends are leading the telcos towards the role of utility, though there is still an opportunity to play a valuable role in the blossoming and disruptive segments.

This is the crux of the message; Verizon is attempting to re-model itself as a business which is relevant for the digital economy. It wants to be a partner of these innovative companies, offering services which go above and beyond the connectivity utility.


Microsoft and BMW pair up for IoT Open Manufacturing Platform

Microsoft has partnered up with the BMW Group to launch a new initiative aimed at stimulating growth for IoT in the smart factory segment.

The Open Manufacturing Platform (OMP) will be built on the Microsoft Azure cloud platform, aiming to have four to six partners by the end of the year, to help grow an ecosystem and build future Industry 4.0 solutions. The smart factory segment is promising much with the emergence of 5G, but with every new concept there is scepticism; someone always needs to drag it towards the finish line.

“Microsoft is joining forces with the BMW Group to transform digital production efficiency across the industry,” said Scott Guthrie, EVP of the Microsoft Cloud and AI Group. “Our commitment to building an open community will create new opportunities for collaboration across the entire manufacturing value chain.”

“We have been relying on the cloud since 2016 and are consistently developing new approaches,” said Oliver Zipse, a board member at BMW. “With the Open Manufacturing Platform as the next step, we want to make our solutions available to other companies and jointly leverage potential in order to secure our strong position in the market in the long term.”

BMW is already a significant customer of Microsoft Azure, with over 3,000 machines, robots and autonomous transport systems connected through the BMW Group IoT platform, which is built on the Microsoft Azure cloud.

Openness is one of the key messages here as the pair bemoan data silos and slow productivity created by complex, proprietary systems. The OMP aims to break down these barriers through the creation of an open technology framework and cross-industry community.

For both, the objective of this group is relatively simple. At BMW, the team wants to improve operational efficiencies and reduce costs, partly by taking back control of the supply chain, while Microsoft just wants more people, processes and data on Azure. The more accessible the smart factory is, the more companies will become cloud-first, and the more successful the OMP becomes, the more customers Azure gains.

The OMP will provide community members with a reference architecture with open source components based on open industrial standards and an open data model. Through openness, the pair claim data models will be standardised to enable more data analytics and machine learning scenarios and use cases. For Microsoft and the manufacturers it’s great news; for the suppliers, not so much.

Openness sounds like a great idea, but with any fundamental change comes consequence. There will be numerous companies who benefit considerably from proprietary technologies and processes, especially in traditional industries like manufacturing, though those who resist change will be the losers in the long run. The world is evolving towards a new dynamic where openness rules the roost; resistance only means future redundancy.

Intel VC arm plugs its disruptive vision

Intel has seemingly learned a lesson from the woes of stumbling giants, announcing it has invested $117 million in ‘disruptive’ start-ups at its annual VC conference.

There is a very good reason investors are so keen to pump cash into the likes of Google and Amazon, despite recent criticism and the threat of regulatory reform; these are companies which never sit still. The likes of Jeff Bezos and Sundar Pichai are constantly pushing the boundaries, expanding the business into new segments. It should be viewed as a lesson for every CEO around the world.

However, this is seemingly a lesson which has only recently been added to the management curriculum. In generations gone by, some of the world’s leading technology companies climbed further than any before them, and then stopped exploring. IBM, Oracle and Microsoft are three examples of companies which sat still for years while the industry moved on without them. They have since recovered, but it took a lot to bridge the chasm.

“Intel Capital is continuing that legacy of disruption with these investments,” said Wendell Brooks, President of the VC arm, Intel Capital.

“These companies are shifting the way we think about artificial intelligence, communications, manufacturing and health care – areas that will become increasingly essential in coming years as the linchpins of a smarter, more connected society.”

One of the oldest phrases in the technology industry is often forgotten, but it seems Intel is attempting to resurrect it; disrupt or be disrupted.

Google and Amazon are the perfect embodiment of this statement. If you look at the acquisitions made over the years, they are incredibly intelligent bets. Google bought YouTube, Android and DeepMind for huge sums at the time, but now they look like bargains. Amazon didn’t make a profit for years, instead re-investing, and now has AWS as a profit machine. These companies could have collected profits, paid more dividends and rewarded management with more bonuses, but look at the end result.

As it stands, Intel is in a relatively healthy position. Looking at the financials for 2017, revenue was $17.1 billion for the fourth quarter and $62.8 billion for the 12 months. These figures are 8% and 9% up year-on-year respectively, with data-centric revenue up 21% compared to Q4 2016. Share price declined on the news, as investors were concerned over a conservative forecast, but the warning shot has seemingly been heeded.

If growth is not satisfying investors, something needs to change. The status quo is unlikely to reap more rewards tomorrow than today, therefore investment is required. Some of this will be directed inwards, though through the investments in Intel Capital the firm is welcoming disruption; it wants to be in on the ground floor of these potential booming enterprises.

“Our continued goal is to leverage the global resources and expertise of the world’s greatest engineering company, and its ecosystem of customers and partners, to help these founders accelerate growth and innovation,” said Brooks.

Looking at the investments, AI features heavily. Cloudpick is a smart retail technology provider with proprietary computer vision, deep learning, sensor fusion and edge computing technologies to enable cashier-free stores. SambaNova Systems is building an advanced systems platform to run AI applications. Zhuhai EEasy Technology is an AI system-on-chip (SoC) design house and total solution provider.

The team is also investing in the edge computing hype with Pixeom, mobile content streaming with Polystream, digital healthcare with Medical Informatics and Reveal Biosciences and also smart manufacturing.

The lessons of sitting still are incredibly obvious. Oracle founder Larry Ellison dismissed the cloud, and look where that has landed the firm. IBM refused to respond to the evolving PC market, and it resulted in a colossal overhaul. Microsoft was another which ignored market trends, with former CEO Steve Ballmer making some very off-target predictions in 2006. All of these companies have learned a lesson on disruption, but it came at a cost which took years to fix.

With its VC arm, Intel is promising to invest $300 to $500 million a year in disruptive technologies. It is taking a page out of the Amazon and Google playbook; if you want to remain on top, you can never sit still.

BT mulls redundancies for a quarter of staff – report

BT is considering further redundancies to increase profitability at the firm, with 25,000 jobs, a quarter of its employees, reportedly under threat.

According to Bloomberg, the battle-weary telco is mulling over the plans as new CEO Philip Jansen prepares for his first meaningful earnings call in May. The telco has not confirmed or denied the reports thus far.

Having taken over the business from Gavin Patterson in February, Jansen is currently in the unenviable position of turning the supertanker. BT has been under pressure in recent years, not only due to an industry where profitability is decreasing rapidly, but also increased competition and CAPEX demands from the government; none of the trends seem to be favouring BT.

Sources claim internal discussions have been taking place, setting a target headcount of 75,000. The redundancies would allow for greater automation of back-office roles, while the team is also considering business disposals and streamlining management functions. The plan is reportedly to trim headcount by 25,000 by 2023.

This is of course not the first time BT has discussed redundancies, with the team also announcing 13,000 cuts back in May. The first round of cuts accompanied efforts to overhaul BT’s supply chain as part of a wider restructuring process to make the business more agile and fit for the digital era. These cuts, should the rumours turn out to be true, would be seen as a continuation of this strategy.

As you can see from the graph below, BT is not in the healthiest position and could be viewed as bloated in comparison to other telcos throughout Europe.

Source: Bloomberg Intelligence

Jansen now has the unenviable role of ensuring BT is fit for purpose, both from a business and a network perspective, as the demands of the digital era start to weigh heavy. Not only has BT got to fuel 5G deployment, but fibre connectivity is also being demanded by customers and the government. BT has seemingly been able to ignore some of these demands in bygone years, but the emergence, and initial success, of alt-nets is providing stern competition.

Although investors are seemingly happy with the rumours (share price increased by 1.6% following the report), we’ll have to wait until May to get the full details. It is believed Jansen will unveil the next stage of BT’s transformation during the earnings call.