Google Cloud dives deeper into the data dreamland

Google’s cloud business unit has announced the acquisition of data analytics firm Looker for $2.6 billion, further expanding products available in the ever-growing world of cloud.

While another acquisition at Google is nothing out of the ordinary, this happens to be the first under the tenure of Thomas Kurian, the newest CEO of the cloud business. Kurian took the reins from Diane Greene at the beginning of this year, after Greene failed to deliver on the hype which surrounded her appointment.

“A fundamental requirement for organizations wanting to transform themselves digitally is the need to store, manage, and analyse large quantities of data from a variety of sources,” said Kurian in a blog announcement. “Google Cloud offers customers a broad and integrated suite of cloud services to ingest data in real time, cleanse it, process it, aggregate it in a highly scalable data warehouse and analyse it.

“Looker extends our business analytics offering with two important capabilities—first, the ability to define business metrics once in a consistent way across data sources. This makes it easy for anyone to query data while maintaining consistent definitions in their calculations, ensuring teams get accurate results.

“Second, Looker also provides users with a powerful analytics platform that delivers applications for business intelligence and use-case specific solutions such as Sales Analytics, as well as a flexible, embedded analytics product to collaborate on business decisions.”
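Kurian’s first point, defining a business metric once and reusing it consistently across data sources, can be illustrated with a minimal sketch. This is plain Python invented for illustration, not Looker’s actual modelling language: the metric is declared a single time, and every source reuses the same calculation.

```python
# Illustrative sketch only: one shared metric definition applied to two
# hypothetical data sources, so both report revenue the same way.

METRICS = {
    # Defined once: "revenue" always means quantity * unit price.
    "revenue": lambda row: row["qty"] * row["unit_price"],
}

def total(metric, rows):
    """Apply a centrally defined metric to any source of rows."""
    calc = METRICS[metric]
    return sum(calc(row) for row in rows)

# Two hypothetical sources; both inherit the same definition of revenue.
crm_rows = [{"qty": 2, "unit_price": 10.0}, {"qty": 1, "unit_price": 5.0}]
warehouse_rows = [{"qty": 4, "unit_price": 2.5}]

print(total("revenue", crm_rows))        # 25.0
print(total("revenue", warehouse_rows))  # 10.0
```

However teams query the data, the calculation cannot drift, which is the consistency benefit Kurian describes.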

With Looker being integrated into the Google proposition, the cloud team will have something more interesting to talk about. Kurian has discussed a more complete analytics solution, including visualisation of results and integration into daily workflows, as well as the ability to build more customisable solutions for individual verticals.

Another interesting benefit of this acquisition is that it builds Google’s ability to work in a multi-cloud landscape. Although any cloud company would like to pigeonhole enterprises into its own products, as bleeding customers dry is of course more profitable, that is not realistic in today’s world. If you do not have a proposition which is interoperable with other cloud providers, you are not going to be attractive to customers.

There are numerous examples of this being an important factor of the cloud world of tomorrow. The Data Transfer Project is an initiative to build a common framework with open-source code that can connect any two online service providers, while Vodafone Business and IBM came together to create a joint-venture aiming to solve the problem presented by multi-cloud interoperability.

As part of this acquisition, Google also inherits the tools to play in this world, boosting its ability to bring together data from SaaS applications like Salesforce, Marketo, and Zendesk, as well as traditional data sources.

Google Cloud has seemingly been losing out to the likes of Microsoft Azure and AWS in recent years, a factor which reportedly contributed to Greene’s downfall. This is not to say the cloud business is not successful, but it is not tearing up trees at the same rate as the two market leaders.

Perhaps this is only the first of many announcements we should expect from Kurian over the next couple of months. This is a man who needs to make his mark on the business, but also close the gap Microsoft and Amazon have created at the top of the cloud rankings.

What to bin and what to keep; the big data conundrum

Figuring out what data is valuable and binning the rest has been a challenge for the telco industry, but here’s an interesting dilemma: how do you know the unknown value of data for the use cases of tomorrow?

This was broadly one of the topics of conversation at Light Reading’s Software Defined Operations & the Autonomous Network event in London. Everyone in the industry knows that data is going to be a big thing, but the influx of questions is almost as overwhelming as the number of data sets available.

“90% of the data we collect is useless 90% of the time,” said Tom Griffin of SevOne.

This opens the floodgates of questions. Why do you want to collect certain data sets? How frequently are you going to collect the data? Where is it going to be stored? What are the regulatory requirements? How in-depth does the data need to be for the desired use? What do you do with the redundant data? Will it still be redundant in the future? What is the consequence of binning data which might become valuable in the future? How long do you keep information in the hope it will one day become useful?

For all the promise of data analytics and artificial intelligence in the industry, the telcos have barely stepped off the starting block. For Griffin and John Clowers of Cisco, identifying the specific use case is key. While this might sound very obvious, it’s amazing how many people are still floundering; once the use case has been identified, machine learning and artificial intelligence become critically important.

As Clowers pointed out, with ML and AI, data can be analysed in near real-time as it is collected, assigned to the right storage environment (public, private or traditional, depending on regulatory requirements) and then routed to the right data lakes or ponds (depending on the purpose for collecting the data in the first place). With the right algorithms in place, the process of classifying and storing information can be automated, freeing up engineers’ time to add value while also keeping an eye on costs. With the sheer volume of information being collected increasing so quickly, storage costs could otherwise rise rapidly.
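The classify-then-route step Clowers describes can be sketched in a few lines. The tags, tier names and rules below are hypothetical, and a production system would use a trained classifier rather than fixed rules:

```python
# Hypothetical sketch of classify-then-route: rule-based here, where a
# real deployment would use a trained ML model on the incoming stream.

def pick_storage(record):
    """Choose a storage environment from regulatory sensitivity."""
    if record.get("contains_pii"):
        return "private"       # regulated data stays in a private environment
    if record.get("retention_years", 0) > 1:
        return "traditional"   # long-lived archives go to cheaper storage
    return "public"            # everything else goes to public cloud

def route(records):
    """Group incoming records into per-tier 'data ponds'."""
    ponds = {"public": [], "private": [], "traditional": []}
    for rec in records:
        ponds[pick_storage(rec)].append(rec)
    return ponds

stream = [
    {"id": 1, "contains_pii": True},
    {"id": 2, "retention_years": 7},
    {"id": 3},
]
ponds = route(stream)
print({tier: len(recs) for tier, recs in ponds.items()})
```

The point of automating this step is exactly the one made above: nobody has to triage records by hand, and the cheap tiers absorb the bulk of the volume.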

And this is before the 5G and IoT trends have really kicked in. If telcos are struggling with the data demands of today, how are they going to cope with the tsunami of information which is almost guaranteed in tomorrow’s digital economy?

Which brings us back to the original point. If you have to be selective with the information which you keep, how do you know what information will be valuable for the use cases of tomorrow? And what will be the cost of not having this data?

AI and security sets us apart from the crowd – Google Cloud CEO

Google Cloud is not winning the cloud battle with AWS and Microsoft Azure right now, but with security credentials and artificial intelligence smarts, CEO Diane Greene thinks the future looks profitable.

Speaking at the Google Next ’18 event in London, Greene claimed superior security and industry-leading AI are differentiators which will lead Google Cloud to the top of the rankings. This is a business which is not shy about spending its way to success; the last 18 months have seen huge amounts spent on infrastructure to fuel momentum in the cloud business. But it’s the value which Google Cloud can add to operations, not simply a pricing war and availability, which is the recipe for success.

“We’re still early in the cloud, only 10% of workloads are in the cloud, but this is where all the digital transformation is coming from,” said Greene. “There are bottom line and top line benefits. Cloud is becoming a working structure to how you can enforce change in companies.”

“It’s our AI, data analytics and security. AI is everyone’s biggest opportunity, and cybersecurity is everyone’s biggest threat, and Google has the best of these. AI is built into everything we do, completely infused into G-Suite. This is such a powerful technology and we’re so proud to have the world’s leading experts. It has such a power for good but we see the concerns in the world.”

This is of course not a new message. Artificial intelligence has been a talking point for Google Cloud for some time but why is it different today? Because the cloud is now normalised.

Considering how long the concept of cloud computing has been around, and the money which is spent each year, it is amazing to think only 10% of workloads have been moved to the cloud and there are still organizations out there who are resisting. However, most enterprise-scale or forward-looking businesses are taking a cloud-first approach to business. This is evidence the cloud has been normalised, and suggests the industry could become commoditized before too long.

Commoditization is not necessarily a bad thing, especially when you look at the room for growth. With the likes of Google, AWS and Microsoft Azure offering massive scale, the commoditization business model works, and it also makes the cloud more accessible for those who have not considered migration so far. Initiatives like the Data Transfer Project, which aims to standardize cloud environments to allow for easier migration of data from one provider to another, will also help roll out cloud everywhere else, which in turn helps Google.

With the idea commoditized, Google Cloud can start focusing further up the value chain, leveraging the capabilities which it has in-house. Security is an excellent differentiator to bring customers into the fray, though the value-added services in data analytics and artificial intelligence could set Google apart from the crowd.

AirAsia is a good example; Greene ushered CEO Tony Fernandes onto the stage during the keynote session. AirAsia has leveraged the power of the cloud to continually build the business, and the airline now has 250 planes, 200,000 staff and will serve 90 million customers this year, but the focus now is on enhancing the business with artificial intelligence. The team is working on several areas, from predictive maintenance of aircraft, through weather predictions to inform customers of potential delays, to using facial recognition to improve the customer experience. This is where Google Cloud can prove its worth: the value add for customers who have already embraced the concept of cloud computing.

Through DeepMind, Google has created an artificial intelligence foundation which can be rivalled by few in the technology world. Having acquired the company for roughly $500 million, a bargain in an era of multi-billion dollar acquisitions, the Oxford scientists are providing value throughout the Google universe. On top of this, Greene highlighted investments in Google Brain, a separate deep-learning research team, an Advanced Solutions Lab in Dublin, as well as AI training workshops in 20 European countries.

Google might be losing the battle to secure the commoditized cloud business, with AWS a clear and runaway winner here, though the value-add segment of cloud is starting to emerge. The opportunity to build more advanced solutions on top of a basic cloud-orientated business model is another segment which will certainly be attractive, and it is a segment which will be defined by artificial intelligence.

AWS and Microsoft Azure are both eyeing up the AI world with their own investments, though Google has arguably put itself in the lead through years of acquisition and investment in product areas with an eye on tomorrow’s profitability. The long-game might be starting to pay off.

Google poaches IoT exec, what does this mean for Samsung’s software ambitions?

Google has hired former Samsung mobile communications CTO Injong Rhee, who will now lead the internet giant’s move into IoT under the very glamorous job title of Entrepreneur In Residence.

Rhee, who left Samsung in December, was responsible for several heavyweight products at Samsung including Knox, its enterprise security platform, and mobile payment solution Samsung Pay. Specifics of Rhee’s role at Google are not entirely clear just yet, but he will lead the IoT business unit, reporting into Diane Greene, CEO of Google Cloud.

“IoT is a new and exciting space with tremendous potential to transform how we use and deploy technology in our everyday lives,” Rhee said in a LinkedIn post.

“Google and Alphabet have many IoT related products and assets. One of the first things I would like to do with my Google colleagues is to get these efforts coordinated and aligned toward a concerted IoT story of Google — in the process, create distinct consumer and enterprise product lines.”

While this is certainly a positive move for Google, a company seemingly on a never-ending quest to diversify, it is less promising for Samsung, a business which has been searching for future relevance; losing a senior executive is never a good sign.

While Samsung is still the leader in the global smartphone rankings, it hasn’t been able to emulate Apple in collecting the profits. This is not to say the mobile business is not making money, more that Apple has done an excellent job of monopolizing smartphone profits. Software and services have been among the areas Samsung has been looking to for differentiation and additional revenue streams.

Bixby, Samsung’s own digital assistant, is a good example of how the Koreans want to be more involved in consumers’ lives, while acquisitions such as cloud provider Joyent (which Rhee was responsible for) could be viewed as a strategy to be more than simply a smartphone brand. With the connected economy just around the corner there are profits to be made in the software game, where recurring revenues can be realised, but losing such a senior executive will surely dent Samsung’s ambitions.

Apple is perhaps the only brand which can survive in product mode. We’re not suggesting the iEngineers aren’t working on improving the software side of the business, but such is the level of brand loyalty that the company will be able to continue making money off products for at least the foreseeable future. Other brands, whose customers are a bit less sticky, will have to throw a horde of new features and software at the consumer to demonstrate the attractiveness of their devices. Such is the minimal differentiation we are seeing in the devices market.

The hire gives us insight into yet another disruptive play from Google, but also puts a bit of a dampener on Samsung’s connected dreams. Rhee has been credited as the driving force behind Samsung’s positive moves into the world of software and will be a notable loss for the firm.

The internet levelled the playing field, but data might be just for the big boys

Mass-market penetration of the internet may have levelled the playing field for SMEs, but will a shift to a data-driven economy tilt the balance back towards the heavy hitters?

The emergence of new ideas and new practices has two notable impacts on the business world. First, there is the element of surprise: entrepreneurs and innovative organizations create new products and services, generating hordes of cash thanks to their ability to perceive and adapt to the opportunities which are created. The second impact is the normalization of the technology, as it becomes accepted as general practice by the larger and more traditional organizations.

In terms of the digital economy, we’re probably just getting to the stage now where these ideas are being taken seriously by the larger organizations. They might tell you they have been digital organizations for some time, but this is not the case. They prodded and investigated digital with limited success, but legacy processes and technologies still held them back. As we move towards 2018, more organizations are starting to get digital, meaning the advantage of the agile and open-minded SME will soon be eroded.

This slow acceptance of new technology is nothing new. With every step forward, numerous organizations make themselves household names out of disruption. The internet saw Facebook and Amazon become giants, and mobility gave rise to Uber and apps like Tinder.

Soon it will become a rarity for non-entities to grow to such heights, as the fuelling technologies become normalized and a status quo is established at the top. The disruption has already been caused, see Blockbuster or Encyclopaedia Britannica, and the traditional players are creating their own digital offers to negate the threat of potential usurpers.

So what comes next? How about the data economy? But this might be one which changes the rules of disruption.

A common phrase when looking at the connected society is ‘data is the new oil’, which is correct, but data on its own is useless, just like oil is. The winners of the data-driven economy will be the ones who can best make use of the information which is out there, turn it into insight and monetize that insight. If data is oil, then algorithms are the refineries and customer touch-points are the retail petrol stations.

To be a player in the data-driven economy, you need all three. You have to be able to mine the data, then have the intelligence and technology to turn it into something useful, and finally the ability to present a service or product to the customer.

The second and third can be done by anyone; this is the glorious thing about the internet, it democratises access to customers. But the first is a bit more complicated. Getting hold of the data will be tricky, storing it in a compliant manner will be difficult, and adhering to the new data protection rules coming into force in May 2018 will be complex. The ones who have the know-how and the technical expertise to do this will be the players who are already established.

With new rules customers will have more control over what is deemed their personal data. Companies will have to be more transparent in the way they use data, meaning an element of trust will have to be created. Most customers are sceptical by nature, meaning a solid brand will be critical to capitalizing on the data economy. This might make it tough for a nobody to make cash.

There will of course be success stories, but perhaps the majority will be coming out of the already established players. Think of the smart home, an area which we think is going to make a huge amount of cash. The ones making the moves here are Google and Amazon. We can’t see any newcomers breaking the status quo. Or retail, autonomous vehicles, VR, digital healthcare, social media, finance. The new ideas are coming from the established players, or technology giants leaking into other verticals.

Moving into the data driven economy will start to make the big players bigger. They have the raw tools to make the information age work for them. Considering the normalization of digital engagement techniques, the online world is starting to become very noisy, just like the high street used to be. The ones with the financial might and scale won back then in the physical world, and we might be starting to see the same trend here.

To survive and thrive in the data driven world you need data, algorithms and access to the customer. Unfortunately, unless you have all three already, you’re probably going to start to fall behind, and it will become tougher to catch up.

Telia tries shocking new strategy: improving customer experience

Telcos in the UK might not understand what it means to be customer centric, but the Swedes seem to be having a solid crack at it. Well, Telia at least.

The point of attack here is an app called ‘Min Mobile’, developed by a company called eBuilder, which Telia has a non-controlling stake in. Essentially it is a platform which collects information about you and how you use your device, and then offers advice on how you can improve performance.

It’s an interesting strategy to re-engage customers, who are starting to forget about the operator. Telia’s Gustav Berghog highlighted to us that the user is now more concerned with the handset manufacturer and the flashy content providers/OTTs; there is a risk the operator will be thought of as nothing more than a commodity, and therefore traded out without much thought or emotional loss.

Ideas like ‘Min Mobile’ are designed to bring Telia back into the customer’s life. The team wants to show the operator is more than just a connectivity provider, and can offer an experience which adds value. This idea of ‘positive discounting’, as Berghog describes it, moves the operator relationship away from being purely transactional, and aims to create an element of loyalty. That’s ultimately what ‘Min Mobile’ is; a customer retention strategy.

So how does the app work? Once downloaded, the app monitors how you use your device, and aims to predict any flaws or errors on the device. The app currently monitors four areas: storage, battery, general performance and device condition/age. There are plans to extend into other areas, but these address the main consumer pain points for the moment.

After monitoring your device for a while, the app might figure out that your battery performance is 15% worse than that of other users on the same model. Using this information, Telia can make communication with you much more personalised. They might send you a message with tips focused on improving battery life, while your partner might get one on storage tricks, as that was an issue highlighted on their device. It is much more appealing, a step away from the generic ‘engagement’ messaging which most operators make use of, and it is pretty useful as well.
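The battery example boils down to a peer comparison. A rough sketch might look like the following, with the thresholds, field names and tip texts invented for illustration rather than taken from Telia’s actual logic:

```python
# Invented illustration of peer-relative tip selection: compare a user's
# metrics to the fleet average for the same handset model, then pick the
# tip for whichever metric lags furthest behind its peers.

PEER_AVG = {  # hypothetical fleet averages per (model, metric)
    ("modelX", "battery_hours"): 10.0,
    ("modelX", "free_storage_gb"): 12.0,
}

TIPS = {
    "battery_hours": "Tips to improve battery life",
    "free_storage_gb": "Tricks to free up storage",
}

def pick_tip(model, metrics, threshold=0.15):
    """Return the tip for the metric furthest below the peer average,
    or None if nothing lags by at least the threshold fraction."""
    worst, worst_gap = None, threshold
    for name, value in metrics.items():
        avg = PEER_AVG.get((model, name))
        if not avg:
            continue
        gap = (avg - value) / avg  # fraction below the peer average
        if gap >= worst_gap:
            worst, worst_gap = name, gap
    return TIPS.get(worst)

# A battery 15% below the model average triggers the battery tip;
# storage at the average triggers nothing.
print(pick_tip("modelX", {"battery_hours": 8.5, "free_storage_gb": 12.0}))
```

The personalisation described above is just this comparison run per customer, with the message picked by whichever metric is furthest off the pace.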

Data usage is another area which the team might investigate. By monitoring your geographical location and where you turn your wifi on, a pattern will soon emerge. For those who are data conscious, a small reminder to turn on your wifi would be a good little value add. These are not groundbreaking ideas, but tie enough of them together and they start to make a difference.

And it seems the Swedes like it as well. Since launching in January, the app has been downloaded 100,000 times, with 76% of those downloads retained, and it has a rating of 4.5/5 on Google Play. For those who have the app, the Net Promoter Score is 29, compared to a score of 1 for those who don’t. Berghog wasn’t able to say whether this has had a direct positive impact on churn rates, but the early signs are certainly good ones.

But it is worth noting that Telia also plans to make money off this data. By collecting information on storage, battery, general performance and device age/condition, and combining that with other data sets such as customer demographics, handset type and historical upgrade behaviour, Telia can start to develop a purchase pattern for each customer. This can be used to approach the customer at the right time to renew an agreement or upsell more premium products.
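Turning those combined data sets into a ‘right time to approach’ signal could be as simple as the following sketch; the weights and field names are made up for illustration and are not Telia’s model:

```python
# Made-up illustration: score how close a customer is to an upgrade by
# combining device age with their historical upgrade cycle and device
# condition. Higher scores suggest contacting the customer sooner.

def upgrade_score(device_age_months, avg_upgrade_cycle_months,
                  battery_health=1.0):
    """Return a 0..1 propensity score from age, cycle and condition."""
    # How far through their usual upgrade cycle is this customer?
    cycle_position = min(device_age_months / avg_upgrade_cycle_months, 1.0)
    # A worn battery nudges customers towards upgrading earlier.
    battery_penalty = 1.0 - battery_health
    return min(1.0, 0.8 * cycle_position + 0.2 * battery_penalty)

# Customer historically upgrades every 24 months; the device is 21
# months old with a battery at 70% health -> approaching renewal.
score = upgrade_score(21, 24, battery_health=0.7)
print(round(score, 2))  # 0.76
```

A real system would fit those weights from historical upgrade data rather than hard-coding them, but the shape of the signal is the same.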

Customer retention is clearly an objective for Telia, and creating these purchase patterns means the team can engage the customer earlier in the process. Potentially the team can start that conversation before the customer gets tempted by other deals.

Berghog thinks there is another way the team can provide value: by becoming a bit of a broker for the mobile industry. All the data collected so far not only allows Telia to increase engagement, avoid churn and upsell new products, it also tells the team specifically how you use your device. An ambition for Berghog is to become an independent advisor to the customer.

Imagine you currently have a Samsung handset. With all of this information, Telia might be able to say that, because of the way you use the device, a new Huawei model might be more suitable. It might also be able to suggest not updating to the newest version of an operating system, for example, because that would not suit the way you use your device. Helping the customer make more informed decisions is one way in which Berghog feels Telia can add value and create an emotional connection with the customer.

The best ideas are the ones where both sides of the equation feel they have gained something, and this is one of the instances where that could be true. The customer gets a better experience, and potentially a better deal, while Telia increases customer loyalty. It is still early days; Berghog highlighted that the team needs to validate the benefits to Telia while also scaling to the rest of the user base, but the early signs are certainly positive.