UK police are using AI to make precrime a reality

UK local councils and police forces are using personal data they own and algorithms they bought to pre-empt crimes against children, but there are many things that could go wrong with such a system.

New research by Cardiff University and Sky News shows that at least 53 UK local councils and 45 of the country’s police forces are relying heavily on computer algorithms to assess the risk level of crimes against children, as well as of people cheating on benefits. The findings have raised eyebrows over both the method’s ethical implications and its effectiveness, with references to Philip K Dick’s concept of precrime inevitable.

The algorithms the authorities sourced from IT companies use the personal data in their possession to train an AI system to predict how likely a child in a certain social environment is to be subjected to crime, giving each child a score between 1 and 100 and classifying each child’s risk level as high, medium, or low. The results are then used to flag cases to social workers for intervention before crimes are committed. This does not read too differently from the famous Social Credit system that China is building on a national scale, though without the benefits of faster housing loans or good schools for kids as a reward for good behaviour.
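The reports describe only the broad shape of these systems: a 1–100 score banded into high, medium, or low, with high-risk cases flagged for human follow-up. As a minimal sketch of that pipeline, assuming hypothetical band thresholds of 70 and 40 (the real systems are proprietary and their thresholds undisclosed):

```python
# Illustrative sketch only; the councils' actual scoring systems are
# proprietary. The band thresholds (70, 40) are assumptions for demonstration.

def classify_risk(score: int) -> str:
    """Map a 1-100 risk score to the high/medium/low bands described above."""
    if not 1 <= score <= 100:
        raise ValueError("score must be between 1 and 100")
    if score >= 70:
        return "high"
    if score >= 40:
        return "medium"
    return "low"

def flag_for_review(cases: dict) -> list:
    """Return the case IDs whose scores fall in the 'high' band,
    for referral to a social worker."""
    return [case_id for case_id, score in cases.items()
            if classify_risk(score) == "high"]
```

Even in this toy form, the design issue is visible: where the thresholds sit is itself a policy decision, yet it is buried inside the software rather than debated in the open.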

The Guardian reported last year that data from more than 377,000 people were used to train algorithms for similar purposes. That may have been a big underestimate of the scope. The Cardiff University research disclosed that in Bristol alone, data on 54,000 families, covering benefits, school attendance, crime, homelessness, teenage pregnancy, and mental health, is being fed into computer tools to predict which children are more susceptible to domestic violence, sexual abuse, or going missing.

On the benefits assessment side, the IT system supporting the Universal Credit scheme has failed to win much praise. A few days ago, computer-generated warning letters were sent to many residents in certain boroughs, telling them their benefits would be taken away because they had been found cheating. Almost all the warnings turned out to be wrong.

There are two issues here. One is administrative: how much human judgement can be used to overrule the algorithms. Local councils insisted that analytics results will not necessarily lead to action. Privacy activists disagreed. “Whilst it’s advertised as being able to help you make a decision, in reality it replaces the human decision. You have that faith in the computer that it will always be right,” one privacy advocacy group told Sky News. Researchers from Cardiff University also found that “there was hardly any oversight in this area.” Over-enthusiastic intervention, for example taking children away from their families when not absolutely necessary, can be traumatic to the children’s development. Controversies of this kind have long been fiercely debated in places like Norway, Sweden, and Finland.

The other is how accurate the output of the algorithms is. Kent Police believed that the cases flagged by its algorithm, over a third of all the cases on its hands, have been 98% accurate. If this is true, then either Kent Police has a rather relaxed definition of “accuracy”, or it knows something the technology world does not. IBM’s Watson, one of the world’s most advanced AI technologies, has been used by Vodafone to help provide digital customer service. It won Vodafone prizes and was hailed as a big AI success by IBM during MWC 2019. Watson’s success rate at Vodafone was 68%.
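One reason to be sceptical of a headline “98% accurate” figure: for rare events, raw accuracy is almost meaningless. A toy illustration with made-up numbers (a 2% base rate is an assumption, not a figure from the research) shows that a model which flags nobody at all still scores 98%:

```python
# Toy illustration (invented numbers): why raw "accuracy" misleads
# on rare-event prediction such as crimes against children.

def accuracy(preds, labels):
    """Fraction of predictions that match the true labels."""
    return sum(p == t for p, t in zip(preds, labels)) / len(labels)

# Assume 1,000 cases, of which only 20 (2%) are genuine positives.
labels = [1] * 20 + [0] * 980

# A useless model that never flags anyone still looks "98% accurate",
# while catching zero of the cases that actually matter.
never_flag = [0] * 1000
print(accuracy(never_flag, labels))  # 0.98
```

This is why metrics such as precision and recall, which ask how many flagged cases were real and how many real cases were caught, are the standard yardstick for this kind of system, and why an unqualified accuracy claim deserves scrutiny.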

Late last year the Financial Times reported that one of China’s most ambitious financial services firms, Ant Financial, an affiliate of Alibaba, has never used its credit scoring system to make lending decisions, despite the system being four years in the making and having access to billions of data points in the Alibaba ecosystem. “There was a difference between ‘big data’ and ‘strong data’, with big data not always providing the most relevant information for predicting behaviour,” an executive from Ant Financial told the FT. A think-tank analyst put it more succinctly: “Someone evading taxes might always pay back loans, someone who breaks traffic rules might not break other rules. So I don’t think there is a general concept of trustworthiness that is robust. Trustworthiness is very context specific.”

It is understandable that the UK police and local councils are increasingly relying on algorithms and machine learning, as they have been under severe spending cuts. The output of algorithms can be a helpful reference but should not be taken at face value. It is probably safer to admit that AI is simply not yet good enough to drive or guide decisions as important as policing, criminal investigation, or social worker intervention. Getting Vodafone’s customer service more accurate is a more realistic target. Even if the bot still failed to help you set up your new phone properly, you would not end up queuing at the foodbank, or have your children taken away for “crime prevention” purposes.

President Xi doesn’t love the private sector, but he needs friends with benefits

Beijing has issued a new policy to encourage entrepreneurship, but it comes with some pretty major strings attached.

Over the past few months there have been repeated cries in China, ostensibly from parts of academia and the media, to relegate the private sector further to a second-class role in the economy, playing only a supporting part to the state-owned enterprises (SOEs), despite the former contributing 60% of China’s GDP and creating 80% of its jobs.

But at the beginning of this month Chinese President Xi Jinping hosted a high-profile meeting with 50 leading private companies to reassure the private sector that “you belong to our family”. Promises were made to alleviate the private sector’s tax burdens, ease its access to the financial and capital markets, do away with more restrictions on competing with SOEs, and protect private property rights.

Since then we understand the state and local media have interviewed numerous cabinet as well as local officials who all vowed to carry out the presidential decree in earnest. Private sector sources in China have also told us that they have received many visits from local officials to hear their grievances and to promise support.

None of this is unheard of. Mr Xi, who has assumed supreme power and has steered the country more and more towards Mao-style governance, has never been the biggest fan of the market economy and has not shied away from saying so. But times have changed. Private sector confidence in the economy has hit a historic low, and many businesses have opted to sell up and pull out of the country. One of the highest-profile cases in recent years was the withdrawal of Li Ka-shing, the Hong Kong tycoon who owned Hutchison (and ultimately Three UK). None of this is helped by the general slowdown of the economy and the trade war with the US. The country desperately needs the private sector to shore up the economy and keep people employed.

As a follow-up action, Beijing’s municipal government issued a new policy to improve the business environment. Many concrete measures were put in place, including shortening the approval process for setting up a new business to three days by the end of 2018 and two days by the end of 2019, and the process for registering fixed assets to between one and four days. The new regulation also raised the revenue threshold below which businesses enjoy favourable tax rates, as well as the tax-deductible amount companies can spend on R&D equipment.

The measures may be favourable to private businesses but may not always benefit individuals. For example, business owners are given more flexibility over how much social security payment, especially the housing benefit, they choose to make for their employees. More importantly, buried in Article 15 of the 22 articles is the specification for implementing a “social credit system” in Beijing.

As has already been reported, Beijing vowed to implement the social credit system by 2020 as a model “city of integrity and honesty” for the whole country. Specifically, the authorities (15 municipal government departments and all 16 district councils) will compile three lists: a data list, a behaviour list, and a measure (reward and punishment) list, which are to be used to give every one of the city’s 22 million residents a “social credit score”.

Individuals and businesses that lose credit will be named and shamed publicly. As the regulation states in unambiguous terms, those with good scores will have “green channels” opened to them in areas like market access, public services, travel, employment, setting up their own business, etc. On the other hand, anyone who loses a credit point somewhere will be restricted everywhere.

This most likely will be the biggest artificial intelligence and smart city project (and not in the IoT sense) in the world. But, with the presidential backing, it may succeed.

China’s social credit system set to kick off in Beijing in 2020

The Chinese state wants to control its citizens via a system of social scoring that punishes behaviour it doesn’t approve of.

This initiative has been widely reported, including an excellent piece from ABC Australia, but this marks one of the first times a specific timescale has been attributed to it. Bloomberg reports that Beijing, China’s capital city, plans to implement the social credit system by the end of 2020, which will affect 22 million citizens.

The full plan has been published on a Chinese government website, and we currently have our Beijing bureau sifting through it to bring you our own take on the primary material. But for the time being we’re relying on Bloomberg’s account, which highlights just how sinister this sort of thing is.

People who accumulate higher social ‘scores’, the rules and algorithms for which are presumably opaque, subjective and nebulous, get access to special privileges, while those who fall foul of the system will apparently be unable to move even a single step. This is hopefully at least a bit hyperbolic, but it does indicate that a lot of the sanctions attached to a low score focus on the ability to travel.

Mobile technologies, including smartphones, social media, facial recognition, etc, will clearly play a big part in this Orwellian social manipulation strategy. The fact that our every action, or even inaction, now leaves a permanent digital fingerprint makes this sort of thing possible in a way it never has been before. If you want a further sense of quite how seamlessly it could metastasize beyond China, watch the episode of Black Mirror called Nosedive.