Vodafone Australia admits to misleading carrier billing service

After an Australian Competition and Consumer Commission (ACCC) investigation, Vodafone Australia has admitted misleading consumers through its third-party Direct Carrier Billing (DCB) service.

The investigation looked into transactions made between 1 January 2013 and 1 March 2018, though Vodafone most likely only broke the rules following the introduction of the relevant Australian Securities and Investments Commission Act provisions in 2015.

“Through this service, thousands of Vodafone customers ended up being charged for content that they did not want or need, and were completely unaware that they had purchased,” said ACCC Chair Rod Sims. “Other companies should note, money made by misleading consumers will need to be repaid.”

The service, first introduced in January 2013, allowed customers to purchase digital content such as games, ringtones and apps from third-party developers, with charges applied to both pre-paid and post-paid accounts.

The issue Vodafone faces is that the service was automatically applied to customer accounts, with purchases made in as little as one or two clicks. As customers were not suitably informed, the service has been deemed misleading.

Vodafone has already begun the process of contacting impacted customers and will be offering refunds where appropriate. The telco has phased out the majority of the service already, owing to an increasing number of complaints during 2014 and 2015.

While a final judgment has not been released just yet, a confirmation and fine will likely follow in the next couple of weeks; other Australian telcos have already been found guilty of the same offence. Both Telstra and Optus were fined AUS$10 million for their own misleading carrier billing services.

Although it is hardly rare for a telco to be found in the wrong, especially in Australia where the ACCC is incredibly proactive, such instances create a negative perception at the worst possible time for the telcos.

In an era when the telcos are searching for additional revenues, carrier billing initiatives are an excellent option. Assuming of course the telcos don’t mess it up.

The digital economy is becoming increasingly embedded in today’s society, though there are still many consumers who will only begrudgingly hand over credit card details to companies with whom they are not familiar. This mistrust of digital transactions could harm SMEs while channelling more profit towards the larger players who have established reputations on the web.

In this void of trust and credibility, the telcos have an opportunity to step in and play the intermediary as a trusted organization; how many people have an issue with handing credit card information over to a telco?

There are plenty of examples of this theory in practice; Amazon and eBay are the most obvious and most successful. These are online marketplaces which allow the flow of goods and cash between two parties who may not have had a prior relationship. The consumer might have an issue paying Joe Bloggs Ltd. directly, as there is little established credibility, but many trust the likes of Amazon and eBay to manage the transaction and take a small slice of the pie.

Carrier billing can be an excellent opportunity to add value to a growing digital ecosystem, using consumer trust in the telcos to drive opportunities for those businesses which want to grow online. However, should there be a perception that the telcos do not act responsibly with a customer’s bill, this opportunity will dry up very quickly.

Aside from costing Vodafone a couple of million dollars, this also dents the credibility of the telco (and the overall industry by association). This example suggests it is just as risky purchasing goods through the telco as it is from an unknown supplier online.

IBM and Red Hat seal the deal

The $34 billion acquisition of open-source enterprise software vendor Red Hat by venerable tech giant IBM has finally been completed.

The mega M&A was first announced last October and, given the size of it, seems to have gone through relatively quickly. Now begins the significant undertaking of integrating two such massive organisations that may well have quite distinct cultures.

IBM was founded in 1911 and has undergone several transformations to become the enterprise software and services company it is today. Red Hat only came into existence in 1993 and has always focused on the decidedly un-corporate open-source software community. IBM will be hoping some of Red Hat’s youthful vigour and flexibility rubs off, but that remains to be seen.

The official line is that the acquisition makes IBM one of the leading hybrid cloud providers as well as augmenting its software offering. There’s much talk of Red Hat’s independence being preserved but, of course, it will now be taking orders from IBM.

“Businesses are starting the next chapter of their digital reinventions, modernizing infrastructure and moving mission-critical workloads across private clouds and multiple clouds from multiple vendors,” said Ginni Rometty, IBM chairman, president and CEO. “They need open, flexible technology to manage these hybrid multicloud environments. And they need partners they can trust to manage and secure these systems.”

“When we talk to customers, their challenges are clear: They need to move faster and differentiate through technology,” said Jim Whitehurst, president and CEO of Red Hat (what’s the difference?). “They want to build more collaborative cultures, and they need solutions that give them the flexibility to build and deploy any app or workload, anywhere.

“We think open source has become the de facto standard in technology because it enables these solutions. Joining forces with IBM gives Red Hat the opportunity to bring more open source innovation to an even broader range of organizations and will enable us to scale to meet the need for hybrid cloud solutions that deliver true choice and agility.”

That’s it really. There’s lots of aspirational talk and general banging on in the press release, but you get the gist of it. Whitehurst will join the senior management team and report to Rometty, who seems to possess every senior management position worth having. IBM has been steadily increasing cloud as a proportion of total revenues and the pressure is now on to take that growth to the next level.

Q&A with Rupesh Chokshi – Assistant Vice President, Edge Solutions Product Marketing Management, AT&T Business

Telecoms.com periodically invites third parties to share their views on the industry’s most pressing issues. Rupesh Chokshi is a leader in technology with a strategic focus on growth in global technology and telecommunications. He currently leads the product marketing team within Edge Solutions for AT&T Business, which focuses on product management, strategy and business development, and is transforming services and networks using software-defined networking (SDN), network function virtualization (NFV) and SD-WAN technologies.

To help determine the state of virtualization today, Light Reading spoke with Rupesh Chokshi – Assistant Vice President, Edge Solutions Product Marketing Management at AT&T Business – and one of the industry-leading experts presenting at this year’s Network Virtualization & SDN Americas event in September.

Light Reading (LR): How has network virtualization evolved in the last three years?

Rupesh Chokshi (RC): AT&T has been in the business of delivering software-centric services for several years, and we’ve seen adoption from businesses looking to update their infrastructures, increase their agility and transform their businesses. Networks are almost unrecognizable from what they used to be – data traffic on our mobile network grew more than 470,000% since 2007, and video traffic increased over 75% in the last year. Given the new network demands, companies need to adapt by changing the way they manage networks.

We took a unique approach with our infrastructure agility by using software-defined networking (SDN) and network function virtualization (NFV) in our own network, meeting our goal of a 65% virtualized network by 2018 and setting us up to achieve our goal of 75% virtualization by 2020. At the same time we started using SDN and NFV in our own network, we utilized SDN to deliver AT&T’s first SDN service, AT&T Switched Ethernet Service with Network on Demand (ASEoD). This allowed thousands of customers to provision and manage their network in a fraction of the time it took in the past, and now enables them to scale bandwidth on demand to meet their business’ seasonality.

ASEoD was only the first of a series of solutions we are creating to address shifting network needs. Three years ago, we introduced our first global software-based solution, AT&T FlexWare, which uses both SDN and NFV to increase business agility by simplifying the process for dynamically adding and managing network functions.

LR: What technology developments are you most excited about for the future?

RC: The work we did up to this point to deliver SDN within our network and for our customers set us up for the next generation of wireless technology, 5G. As the first SDN-enabled wireless technology, and the first wireless network born in the cloud, 5G will ultimately enable new use cases that take advantage of network slicing, the ability to support a high number of IoT devices and greater low-latency edge compute capabilities.

In addition, we are collaborating with VMware SD-WAN by VeloCloud to implement 5G capabilities into our software-defined wide area networking (SD-WAN). This will give businesses new levels of control over their networks and is key for companies looking to use SD-WAN with a high-speed, low-latency 5G network as their primary or secondary WAN connection type.

LR: How can businesses move forward with virtualization today?

RC: Today, businesses need to make sense of data faster and more efficiently than ever before, which is driving them to evaluate how they use their network for all applications, and to find ways to maximize their resources. One way companies can do this and move forward with virtualization is through AT&T’s comprehensive SD-WAN portfolio. AT&T’s SD-WAN technology supports this new way of working by letting companies define policies based on individual business needs using centralized software-based control.
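
To make the idea of centralized, application-level policy control more concrete, here is a minimal Python sketch of how an SD-WAN policy might be modelled and evaluated. The field names, applications and thresholds are hypothetical illustrations, not AT&T’s actual interface.

    # Hypothetical SD-WAN policy: steer latency-sensitive traffic onto
    # the preferred transport while it meets its SLA, else fail over.
    voice_policy = {
        "name": "voice-first",
        "match": {"application": "voip"},
        "action": {
            "preferred_path": "mpls",
            "failover_path": "broadband",
            "sla": {"max_latency_ms": 150, "max_loss_pct": 1.0},
        },
    }

    def select_path(policy, path_metrics):
        """Return the path a matching flow should take right now."""
        action = policy["action"]
        sla = action.get("sla")
        preferred = action["preferred_path"]
        if sla is None:
            return preferred
        m = path_metrics[preferred]
        if (m["latency_ms"] <= sla["max_latency_ms"]
                and m["loss_pct"] <= sla["max_loss_pct"]):
            return preferred
        return action["failover_path"]

    # Example: MPLS is degraded, so voice traffic fails over.
    metrics = {
        "mpls": {"latency_ms": 180, "loss_pct": 0.2},
        "broadband": {"latency_ms": 40, "loss_pct": 0.1},
    }
    print(select_path(voice_policy, metrics))  # -> broadband

The point of the centralized model is that a policy like this is defined once and pushed to every site, rather than configured box by box.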

LR: How can businesses determine the business benefits and ROI of virtualization today?

RC: Businesses can determine the business benefits of virtualization through cost savings, application-level visibility and near real-time analytics.

Potential cost savings is one of the key benefits of SD-WAN that is touted by technology suppliers and service providers alike. In our experience, it is during the process of fleshing out the technical details of the solution and how to best integrate it into their network that enterprises begin to fully appreciate where those cost benefits may come from, as well as understanding other benefits or features that may also be important to them. Keep in mind the importance of considering potential cost savings in the context of total cost of ownership, not just looking at the relative cost of the CPE vs. the cost of the network access.

Additionally, SD-WAN technology can provide more application-level visibility and control on a per site basis, and these capabilities go far to help customers assess and experience the benefits of the performance of their network access and transport.

SD-WAN also enables customers to access analytics in near real time or on a historical basis for bandwidth consumption and application visibility. This is instrumental in setting KPIs, measuring ROI and planning for future network growth.

LR: What virtualization strategies should businesses be focusing on?

RC: Businesses need to adopt efficient, high-performing networks to take advantage of the newest technology and bandwidth needs. Automation is a great example of this. As businesses require more bandwidth, we need to provide more elegant solutions in order for them to take full advantage of more ubiquitous, high-speed broadband.

Additionally, while digital transformation is top of mind for businesses of all sizes and in every industry, dynamic SD-WAN is still at a relatively early stage of growth and adoption for some, and for others MPLS and IPsec remain important options. Hybrid WAN designs will continue to be popular as customers utilize multiple technologies (MPLS, IPsec, SD-WAN) for optimal results.

LR: How can businesses build these technologies into their long-term business models?

RC: We live in a digital economy, and AT&T provides fundamental platforms for businesses to grow, differentiate and innovate. We work with businesses of all sizes to help transform their long-term business models through technology solutions delivered in the form of a managed service. Customers come to us because of our expertise, breadth and depth of capabilities, global scale and innovation in areas such as software defined networking, network function virtualization, mobility, IoT and SD-WAN.

As businesses grow, they need to think about their overall networking health, and how they can use their networks to meet all their business objectives. Key considerations in bringing that to life include:

  • Holistic solutions that can combine SD-WAN functionality with network services from AT&T or other providers, virtualized CPE, wired and wireless LAN, security, voice over IP and much more;
  • Reduced operational expense and less need for in-house expertise with a managed solution that handles all aspects of the end-to-end solution design and setup;
  • Global deployment options that remove the headaches of onsite installation and support in countries around the world; and
  • Flexible SD-WAN policy management where the customer can choose to set and update application level policies themselves or rely on AT&T experts to manage this for them.

Want to deep dive into real-world issues and virtualization deployment challenges with Rupesh and other industry leaders?

Join Light Reading at the annual Network Virtualization & SDN Americas event in Dallas, September 17-19. Register now for this exclusive opportunity to learn from and network with industry experts. Communications service providers get in free!

Intelligent Transport Networks for the 5G and Cloud Era Catalyze 5G Business Success

At the Huawei User Group Meeting 2019, Mr. Tang Hong, Director of Data Communication Research Institute, Intelligent Network & Devices Research Institute, China Telecom, delivered a keynote speech entitled “Intelligent Transport Networks in the 5G and Cloud Era.”

Tang Hong, China Telecom

In his insightful speech, Mr. Tang Hong highlighted the unprecedented transport network challenges arising from the rapid development of emerging services, including 5G, cloud VR/AR, and site-to-cloud private lines for enterprises. Specifically, transport networks will need to provide larger capacity and higher interface bandwidths to meet the predicted 10-fold traffic growth over the next five years. They will also need to provide the more flexible and elastic connectivity required by a 1,000-fold increase in connections due to Internet of Things (IoT) communication and trillions of apps, increasingly meshed traffic, and uncertain service connections. Driven by one-click-to-cloud and inter-domain connections, network-wide automation and proactive O&M will be must-haves for transport networks as well.

With the advent of the 5G and cloud era, the network has become the digital foundation for 5G infrastructure. Typical network trends include providing ultra-high bandwidth and application-level Service-Level Agreements (SLAs) to support access for a mass of vertical industries; adding intelligence to networks; and comprehensively transporting diverse services.

To keep up with these inevitable trends, intelligent transport networks are emerging for the 5G and cloud era, taking on three distinct characteristics: ultra-broadband, intelligent connections, and committable high availability. Specifically, “ultra-broadband” redefines cost-effective network infrastructure by making the most of 50GE, 100GE, and 400GE interfaces. “Intelligence” reshapes connections with SRv6 for flexible and reliable connectivity. “Simplicity” reinvents O&M by introducing a Software-Defined Networking (SDN)-capable intelligent management and control system for simplified O&M and improved network availability.

Ultra-broadband network: Leveraging the mature industry chain to build a cost-effective ultra-broadband network

In the 4G era, GE rings were used at the access layer of a transport network during the early stages, evolving to 10GE rings as traffic grew rapidly. In the 5G era, the access ring bandwidth will exceed 10 Gbit/s. Currently, the mainstream interfaces beyond 10GE are the extremely costly 100GE interfaces, so the alternative, 50GE, an innovative Ethernet interface technology, will be the best choice for operators to meet access ring bandwidth requirements. This is because 50GE can use the same optical-layer industry chain as 25GE, whose optical modules are already in wide commercial use in data centers (DCs).

The combined use of single-channel 25GE optical components and PAM4 achieves high-speed 50GE transmission. In this regard, the concerted efforts made by upstream and downstream players in the industry chain will help operators reduce the network construction costs. In addition, PAM4 technology can also be used on 400GE optical modules and will eventually help operators build end-to-end, cost-effective, and high-bandwidth networks.
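
A quick back-of-the-envelope calculation shows why 25G-class optics suffice. The figures below follow the publicly documented 50GBASE-R parameters (256b/257b transcoding and RS(544,514) FEC); treat the script as an illustrative sketch rather than a vendor specification.

    # How 25G-class optics carry 50GE with PAM4 (50GBASE-R figures).
    mac_rate_gbps = 50.0        # 50GE payload rate
    transcoding = 257 / 256     # 256b/257b transcoding overhead
    fec = 544 / 514             # RS(544,514) FEC overhead

    line_rate_gbps = mac_rate_gbps * transcoding * fec
    print(f"Serial line rate: {line_rate_gbps:.3f} Gb/s")  # 53.125 Gb/s

    # PAM4 encodes 2 bits per symbol, so the symbol (baud) rate stays
    # in the ~26.6 GBd range that 25GE-class optics are built for.
    baud_rate_gbd = line_rate_gbps / 2
    print(f"Symbol rate: {baud_rate_gbd:.4f} GBd")         # 26.5625 GBd

In other words, doubling the bits per symbol, rather than the symbol rate, is what lets the cheaper 25GE optical ecosystem be reused.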

Intelligent connections: Using SRv6 to cope with uncertainties of service connections

Nowadays, the traffic overflow brought about by cloud computing is more apparent than ever. Traffic is distributed throughout the network and demand for east-west connections keeps increasing. This requires flexible and elastic connectivity, and SRv6 meets this need well.

SRv6 combines segment routing (SR) with IPv6. The vast IPv6 address space meets massive application and service access requirements, while SRv6’s native IP nature and powerful programmability enable automatic inter-domain connections and one-click-to-cloud service configuration. SRv6 can also provide service awareness and SLA guarantees at the tenant and app levels.
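
As a toy illustration of that programmability, the Scapy snippet below builds an IPv6 packet whose Segment Routing Header carries an explicit list of waypoints the packet should visit. The addresses are from the IPv6 documentation range and the path is invented for the example.

    # Build an SRv6 packet with an explicit segment list using Scapy.
    from scapy.layers.inet6 import IPv6, IPv6ExtHdrSegmentRouting
    from scapy.layers.inet import UDP

    # In an SRH the list is stored in reverse: index 0 is the final
    # segment, the last entry is the first waypoint (active segment).
    segments = [
        "2001:db8:a::1",   # final service endpoint
        "2001:db8:b::1",   # intermediate SRv6 node
        "2001:db8:c::1",   # first waypoint (active segment)
    ]

    pkt = (
        IPv6(src="2001:db8::1", dst=segments[-1])  # dst = active segment
        / IPv6ExtHdrSegmentRouting(addresses=segments,
                                   segleft=len(segments) - 1)
        / UDP(dport=4000)
    )
    pkt.show()

Each SRv6-capable node decrements Segments Left and rewrites the destination address to the next segment, which is how a path is effectively programmed into the packet itself.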

In January 2019, the Intelligent Network & Devices Research Institute of China Telecom joined forces with Huawei to complete verification of the first commercial SRv6 use in China. In February 2019, China Telecom partnered with Huawei on the first commercial use of SRv6 globally, reducing inter-domain service provisioning time from months to days.

Committable high availability: Using SDN technology to simplify O&M and improve network availability

As a multi-service transmission pipe, the transport network has inherent characteristics such as a large number of nodes, complex configurations, an inability to detect end-user service quality, and difficulty in self-diagnosis. These factors pose great challenges for network O&M in the 5G era.

Additionally, traditional O&M of operators’ networks is driven by user complaints. The network management system (NMS) is highly dependent on device alarms and cannot diagnose or predict potential faults, while router O&M is difficult, error-prone and heavily reliant on personnel skills.

Against this backdrop, there is an urgent need to implement proactive O&M and improve O&M efficiency with a focus on user experience. The SDN-capable intelligent management and control system can make a difference. This new O&M system features intelligent connections and high availability, achieving automated O&M across the entire lifecycle of IP transport networks.

A combination of the SDN controller and SRv6 helps implement 50 ms protection switching for any topology, traffic optimization, and fast fault location to achieve network-wide load balancing and predictive maintenance, thereby improving network availability. These benefits help China Telecom with business expansion in vertical industries in the 5G and cloud era.

4G changes people’s lives, while 5G changes society. Mr. Tang Hong said that 5G transport networks not only support people-to-people connections, but also enable rapid development across vertical industries, such as intelligent manufacturing, intelligent transportation (autonomous driving), smart cities, and intelligent health. China Telecom will continue to carry forward its openness philosophy and work with industry partners to build 5G transport capabilities and enable 5G services for new win-win outcomes in the future.

US rumoured to begin posturing at India

The Trump administration is certainly an ‘enthusiastic’ one, and now it appears it might be turning its attention to India.

According to Reuters, the US is set to start beating its chest in front of the Indian Government, suggesting it will limit H-1B work visas for nations that force foreign firms to store data locally. US Secretary of State Mike Pompeo is currently readying his forces for an Indian excursion, and it would appear he is not in a friendly mood.

The H-1B work visas are popular with many US businesses and individuals around the world. The visa allows an individual to enter the US to temporarily work at an employer in a specialty occupation. Considering the foundations of Silicon Valley were built on its ability to attract the best talent from around the world, this has been a very beneficial programme for the US economy and those specialised workers who want to maximise their earnings.

Although there are no official quotas, it is believed that Indian citizens account for 70% of the H-1B visas issued each year. One suggested outcome is that the country and its citizens would be limited to 15% of the annual total.

However, should the rumours prove to be true, it could further strain an already difficult relationship between the US and India. The US might be the most powerful and influential nation worldwide, but it is proving to be more of a bully than an ally at the moment.

Interestingly enough, this move could also put the US at odds with numerous other nations; India is not the only one which places data localisation requirements on foreign companies.

Russia and China are two countries which have very strict data localisation laws, though it will surprise few that the US has no issue upsetting these governments. The European Union is another, however. There are currently laws in place in Europe which make it difficult to transfer large volumes of data outside of the bloc, effectively acting as localisation requirements.

One of the best examples of how difficult the European laws can be to navigate is a battle between the US Government and Microsoft which ended last year. In this saga, the Department of Justice wanted access to emails stored on one of Microsoft’s Irish servers. Due to localisation complications, Microsoft argued the US had no jurisdiction and should not be granted access to the content. This battle continued for five years, with the passing of the CLOUD Act effectively ending the debate. It is still controversial, but this chapter has been closed.

Protection of a nation’s citizens and jurisdiction are two of the reasons sovereign nations are keen to implement data localisation laws, as well as the fact that they provide a handy boost to local economies. We suspect there will be objections if the US attempts to bully its way to eradicating these requirements.

Should the reports be true, another data privacy and protection debate could be on the horizon.

AT&T + HPE = edgy TLC

AT&T has announced a new partnership with HPE to drive the benefits of edge computing in enterprise services.

The duo has agreed a go-to-market strategy to accelerate business adoption of edge connections and edge computing, seen by some as one of the most interesting areas of the upcoming 5G economy.

“AT&T’s software-defined network, including our 5G network, combined with HPE’s intelligent edge infrastructure can give businesses a flexible tool to better analyze data and process low-latency, high-bandwidth applications,” said Mo Katibeh, CMO of AT&T Business. “Bringing compute power closer to our network helps businesses push the boundaries of what is possible and create innovative new solutions.”

“HPE believes that the enterprise of the future will need to be edge-centric, cloud-enabled and data-driven to turn all of its data into action and value,” said Jim Jackson, CMO of HPE. “Our go-to-market alliance with AT&T, using HPE Edgeline Converged Edge Systems, will help deliver AT&T MEC services at scale to help our customers more quickly convert data into actionable intelligence, enabling unique digital experiences and smarter operations.”

There are of course many benefits to edge computing, though one of the areas AT&T will be hoping to address through this tie-up is the security concerns that will emerge. This looks like it could be one of the key marketing plugs of the AT&T proposition, as its Multi-access Edge Compute (MEC) services aim to bring the benefits of mobility to enterprise customers.

From HPE’s perspective, the team will be contributing on the low-latency side of the 5G euphoria. HPE suggests its Edgeline Converged Edge Systems could help create use cases where applications can reside on premises for lower latency processing.

It might not be as ‘sexy’ as plugging ridiculous download speeds, but the greatest benefits of 5G to the telcos would appear to be diversification as opposed to increased squeeze on the wallets of consumers. With more data being created each day, the edge will become increasingly important to activate products, services and business models in a faster and more operationally efficient manner. Enterprise organizations will largely be unaware of how to reap the greatest benefits, a pleasant niche the telcos could certainly profit from.

HMD moves Nokia phone user data storage to Finland

HMD Global, the maker of Nokia-branded smartphones, announced that it is moving the storage of user data to Google Cloud servers located in Finland, to ease concerns about data security.

The phone maker announced the move in the context of its new partnership with CGI, a consulting firm that specialises in data collection and analytics, and Google Cloud, which will provide HMD Global with its machine learning technologies. The new models, the Nokia 4.2, Nokia 3.2 and Nokia 2.2, will be the first to have their user data stored in the Google Cloud servers in Hamina, southern Finland. Older models eligible for upgrading to Android Q will move their storage to Finland at the upgrade, expected to take place from late 2019 to early 2020. HMD Global commits to two years’ OS upgrades and three years’ security upgrades for its products.

HMD Global claims the move will support its target to be the first Android OEM to bring OS updates to its users, and to improve its compliance with European security measures and legislation, including GDPR. “We want to remain open and transparent about how we collect and store device activation data and want to ensure people understand why and how it improves their phone experience,” said Juho Sarvikas, HMD Global’s Chief Product Officer. “This change aims to further reinforce our promise to our fans for a pure, secure and up to date Android, with an emphasis on security and privacy through our data servers in Finland.”

Sarvikas denied to the Finnish news outlet Ilta-Sanomat that the move was a direct response to privacy concerns triggered by the controversy earlier this year when Nokia-branded phones sold in Norway were sending activation data to servers in China. At that time HMD Global told Telecoms.com that user data of phones purchased outside of China is stored in AWS servers in Singapore, which, the company said, “follows very strict privacy laws.” However, according to GDPR, to take user data outside of the EU, the company would have had to obtain explicit consent from its EU-based users.

Sarvikas claimed that the latest decision to move storage to Finland has been a year in the making and is part of the company’s overall cloud service vendor swap from Amazon to Google. “Staying true to our Finnish heritage, we’ve decided to partner with CGI and Google Cloud platform for our growing data storage needs and increasing investment in our European home,” Sarvikas added in the press release.

Francisco Jeronimo, Associate VP at IDC, saw this as a positive action by HMD Global, calling it a good move “to address concerns about data privacy” on Twitter.

Microsoft wastes no time in countering Google’s cloud gaming ambitions

Cloud gaming is emerging as a major new front in the competitive war between tech giants, who are using the E3 gaming trade show to flex their muscles.

A few days ago Google added some substance to its cloud gaming ambitions by announcing more details of its Stadia platform, including a list of games and pricing. Shortly after, Microsoft had its big E3 reveal, which of course focused on plans for its Xbox gaming console. These included the ability to stream games from a console to a mobile device and the opportunity for attendees to stream games from the cloud via the Project xCloud streaming service.

The difference between the two is that the latter doesn’t require you to own an Xbox console and so is directly equivalent to Stadia. Microsoft first announced Project xCloud last October, so it’s clearly a tricky proposition if this is the progress it has managed in the intervening 8 months. Having said that, at least one gaming hack seemed impressed with the streaming performance, noting that the latency was acceptably low.

Low latency will be the killer feature of 5G if this sort of thing is to drive truly mobile gaming, as we noted from MWC earlier this year. Not only is it a prerequisite for fast-paced first-person shooter games, in which any delay presents a massive competitive disadvantage, but it’s widely recognised as essential for the virtual reality user experience.
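
To see why latency dominates the conversation, consider a rough motion-to-photon budget for cloud gaming. Every component figure below, apart from the frame time, is an assumption for illustration rather than a measured value.

    # Rough cloud-gaming latency budget (illustrative numbers only).
    frame_time_ms = 1000 / 60          # one frame at 60 fps ≈ 16.7 ms

    budget_ms = {
        "input capture + server encode": 10,   # assumed
        "network round trip (5G target)": 10,  # assumed; 4G often 40-60
        "client decode + display": 15,         # assumed
    }

    total_ms = frame_time_ms + sum(budget_ms.values())
    print(f"Frame time:   {frame_time_ms:.1f} ms")
    print(f"End-to-end:  ~{total_ms:.1f} ms")

With a mobile network adding 40-60 ms rather than 10 ms, the total quickly drifts past the point competitive players will tolerate, which is exactly the gap 5G is being pitched to close.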

You can see a bit more from the Microsoft E3 announcement below, but the company still seems to be keeping its cards fairly close to its chest on this topic. Maybe one reason is that it’s trying to work out what this means for its cloud gaming partnership with Sony, which just happens to make the main competitor to the Xbox. The market is clearly receptive to cloud-based subscription services for its entertainment such as Netflix and Spotify. We will soon find out if this applies to gaming too.

Google cloud dives deeper into the data dreamland

Google’s cloud business unit has announced the acquisition of data analytics firm Looker for $2.6 billion, further expanding products available in the ever-growing world of cloud.

While another acquisition at Google is nothing out of the ordinary, this happens to be the first under the tenure of Thomas Kurian, the newest CEO of the cloud business. Kurian took the reins from Diane Greene at the beginning of this year, after Greene failed to deliver on the hype which surrounded her appointment.

“A fundamental requirement for organizations wanting to transform themselves digitally is the need to store, manage, and analyse large quantities of data from a variety of sources,” said Kurian in a blog announcement. “Google Cloud offers customers a broad and integrated suite of cloud services to ingest data in real time, cleanse it, process it, aggregate it in a highly scalable data warehouse and analyse it.

“Looker extends our business analytics offering with two important capabilities—first, the ability to define business metrics once in a consistent way across data sources. This makes it easy for anyone to query data while maintaining consistent definitions in their calculations, ensuring teams get accurate results.

“Second, Looker also provides users with a powerful analytics platform that delivers applications for business intelligence and use-case specific solutions such as Sales Analytics, as well as a flexible, embedded analytics product to collaborate on business decisions.”

With Looker being integrated into the Google proposition, the cloud team will have something more interesting to talk about. Kurian has discussed a more complete analytics solution, including visualisation of results and integration into daily workflows, as well as the ability to make more customisable solutions for the verticals.

Another interesting benefit of this acquisition is that it builds Google’s ability to work in a multi-cloud landscape. Although any cloud company would prefer to pigeonhole enterprises into its own products, since locking customers in is of course more profitable, that is not realistic in today’s world. If you do not have a proposition which is interoperable with other cloud providers, you are not going to be attractive to customers.

There are numerous examples of this being an important factor of the cloud world of tomorrow. The Data Transfer Project is an initiative to build a common framework with open-source code that can connect any two online service providers, while Vodafone Business and IBM came together to create a joint-venture aiming to solve the problem presented by multi-cloud interoperability.

As part of this acquisition, Google also inherits the ability to play in this world, bumping up its ability to bring together data from SaaS applications like Salesforce, Marketo, and Zendesk, as well as traditional data sources.

Google Cloud has seemingly been losing out to the likes of Microsoft Azure and AWS in recent years, a factor which reportedly contributed to Greene’s downfall. This is not to say the cloud business is not successful, but it is not tearing up trees at the same rate as the two market leaders.

Perhaps this is only one of the first announcements we should expect from Kurian over the next couple of months. This is a man who needs to make his mark on the business, but also close the gap Microsoft and Amazon have created at the top of the cloud rankings.

Google fleshes out its Stadia cloud gaming platform

Having teased a new cloud gaming platform earlier this year, Google has finally got around to launching it properly.

Stadia offers games that are 100% hosted in the cloud, which means you don’t need a console, don’t need to install any software and can game on any screen with an adequate internet connection. Right now Google is only launching the premium tier, which offers 4K gaming but requires a £9 per month subscription and a 35 Mbps connection.

A freemium tier will follow in due course that won’t charge a subscription fee but will offer reduced performance. It looks like both tiers will charge full whack for individual games, although the premium one will chuck in a few freebies to sweeten the pot. Among the games announced by Google is a third version of the popular RPG Baldur’s Gate.

To seed the market Google is urging early adopters to buy a Founder’s Edition bundle that includes a controller, a Chromecast Ultra dongle and three months’ subscription to the ‘Pro’ premium tier for £119. Here’s what you get for Pro versus the basic package.

[Image: Stadia pricing, comparing the Pro and Base tiers]

The main telecoms angle here is bandwidth. Google reckons you still need a 20 Mbps connection even for 1080p gaming, which a lot of people, even in the UK, still struggle to reach. But the real strain on networks will come if people start using Stadia via mobile devices. This is unlikely to really take off until you get games developed specifically for mobile, probably with a location and/or AR element to them, but when they do we might finally see a killer consumer app for 5G.
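
As a rough sense-check of what those stream rates mean for data allowances, here is the arithmetic, assuming a constant bitrate (a simplification, since real streams vary with the content):

    # Approximate data consumption at Google's quoted Stadia bitrates.
    def gb_per_hour(mbps):
        bits_per_hour = mbps * 1e6 * 3600
        return bits_per_hour / 8 / 1e9   # decimal gigabytes

    for label, mbps in [("4K tier (35 Mbps)", 35), ("1080p (20 Mbps)", 20)]:
        print(f"{label}: ~{gb_per_hour(mbps):.1f} GB per hour")
    # 4K tier (35 Mbps): ~15.8 GB per hour
    # 1080p (20 Mbps): ~9.0 GB per hour

At roughly 9-16 GB per hour of play, mobile Stadia sessions would burn through a typical monthly data allowance in an evening, which underlines why unlimited 5G tariffs and network capacity are the real telecoms story here.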