The top-line message from the secretary of state for digital, Oliver Dowden, is that government wants the “high watermark” of data sharing that’s been seen in the UK during the pandemic to be the new normal for the 2020s — greasing the pipe for “businesses, government and organisations to innovate, experiment and drive a new era of growth”, as he puts it.
“The new strategy will look at how the country can leverage existing UK strengths to boost use of data in business, government and civil society,” the government writes. “It proposes an overhaul in the use of data across the public sector and the government will launch a programme of work to transform the way data is managed, used and shared internally and with wider public sectors organisations, to create an ethical, joined up and interoperable data infrastructure.”
By “data” the policy paper makes it clear that the government means the whole ‘kit & caboodle’ — aka “information about people, things and systems” — though the focus of the strategy is purely on digital data, not information held on paper.
“Given the significant technological changes of the last five years, and the more significant changes we expect to see throughout the 2020s, we need a data strategy that reflects the opportunities and challenges of our new hyper-digital world, and ensures that the decisions, priorities and potential trade-offs that we face are considered in a deliberate and evidence-driven way,” it goes on.
In the early stages of the pandemic, the UK quickly inked a number of health data-sharing deals with tech giants including Google and Palantir — granting access to health information on millions of UK citizens to develop a data platform to coordinate its response to the COVID-19 public health crisis.
At the time it touted the power of “secure, reliable and timely data” to inform “effective” pandemic decisions, though the arrangements have attracted controversy over their scope and lack of transparency.
Now the government is saying it wants this ‘pandemic level’ of urgency to apply every day, accelerating data sharing across government and beyond regardless of whether there’s a burning health emergency.
To feed its grand ambition of data-fuelled ‘levelling up’ of the public sector, the policy paper sets out a major civil service upskilling plan — with the government saying it wants 500 data analysts across the public sector to be trained in data science by 2021. The Office for National Statistics will play a central role here, with the training being delivered by its Data Science Campus.
The government also plans to offer up to ten “innovation” fellowships per year — with the aim of attracting “world-class tech talent” to work with it to support digital transformation in the public sector. It says the fellowships are modelled on a similar US scheme which attracted the lead developer on Google Maps, former CEO of Symantec and co-founder of the Earth Genome Project to work on US government projects.
“Those fellows will sit within No 10, the Government Digital Service and a number of departments, and use their skills to contribute to the kind of fulfilling challenging projects that only the public sector can offer — ones that have a huge impact on society as a whole,” it writes.
A new government Chief Data Officer will also be appointed to lead a “whole-government” approach to transforming data use — with a focus on driving efficiency in public service delivery. This role is in addition to a new Chief Digital Officer post announced last month.
“To help arm the next generation with high quality data skills, the Government will explore new ways to teach undergraduate students data skills that complement the existing current maths and computing curriculums, as well as developing T-Levels which include qualifications on digital skills,” it adds in a press release.
There’s a strong, Brexit-fuelled, de-regulatory whiff to the strategy — with the paper containing lines like: “Having left the European Union, the UK will champion the benefits that data can deliver”; and: “We will promote domestic best practice and work with international partners to ensure data is not inappropriately constrained by national borders and fragmented regulatory regimes so that it can be used to its full potential.”
Yet the government also writes that it’s committed to seeking “positive adequacy decisions from the EU, under both the General Data Protection Regulation (GDPR) and the Law Enforcement Directive (LED), before the end of the [Brexit] transition period”. Although, of course, it’s hardly going to say it wouldn’t like a nice data deal with the EU.
Without an EU data adequacy decision, a post-Brexit UK will be treated by the bloc as what’s known as a ‘third country’ — piling legal risk and friction on data transfers from the EU, with huge implications for the UK’s digital services sector. Per the government’s press release, “data-enabled” UK service exports were valued at an estimated £243BN in 2019, or 75 per cent of total service exports — and a major chunk of that business involves EU citizens’ data. So any future barriers to EU to UK data transfers risk blowing a very sizeable hole in the economic component of the strategy.
The UK’s prospects of securing a data adequacy decision from the Commission will depend on how aligned it is with relevant EU regulations. And EU lawmakers confirmed this month that a recent ruling by Europe’s top court (aka Schrems II) — which struck down the EU-US Privacy Shield data transfer arrangement — has implications for a post-Brexit Britain (which has its own swingeing surveillance regime).
So the UK’s high level talk here of adopting ‘data maximizing’ domestic standards and blasting through regulatory constraints appears to deny the existence of international standards, international law and geopolitics. It may also be viewed dimly in Brussels if the comments are interpreted as they sound, i.e. like a sideswipe at current EU data standards.
On international data flows the UK strategy also targets what it dubs “unjustified barriers” to cross border data flows — such as localisation requirements to store/process data in a particular country.
“The UK will take a leading role in encouraging the removal of such barriers to unlock the growth potential of global digital trade,” it writes, adding that it will seek provisions with trade partners (including via its current negotiations with the EU) to prevent “the use of unjustified data localisation measures”.
The tension between the UK’s desire to slash barriers to data sharing as a strategy to drive economic growth and the parallel need to operate a “trusted” data regime to maintain public trust — and, indeed, access international data — is evident elsewhere in the policy paper, where the government writes: “We want our data protection laws to remain fit for purpose amid rapid technological change.” [emphasis ours]
“To build a world-leading data economy, we must maintain and bolster a data regime that is not too burdensome for the average company — one that helps innovators and entrepreneurs to use data legitimately to build and expand their businesses, without undue regulatory uncertainty or risk in the UK and globally,” the government goes on.
“Given the rapid innovation of data-intensive technologies, we also need a data regime that is neither unnecessarily complex nor vague. Businesses need certainty to thrive, and the government will work with regulators to prioritise timely, simple and practical guidance, especially for emerging technologies, and create more opportunities to experiment safely.”
On the latter, the strategy talks about testing the “possibilities” of sharing data between the public and private spheres.
Specifically, it has announced a plan to fire up a cottage industry for AI-powered content moderation tools — seemingly to underpin a wider plan to regulate online harms — via a new £2.6M project which it says will “model how improved systems for classification and sharing of data could support a competitive commercial market in tools able to detect online harms such as cyberbullying, harassment or suicide ideation”.
Here’s more from the press release:
The Online Harms Data Infrastructure project is a new £2.6m pilot project, funded through the HM Treasury Shared Outcomes Fund, to explore how improved systems for data sharing and data interoperability could support innovation and competition in the detection of online harms. This project will analyse the current data landscape and the economic and social benefits of opening up online harms data, and then test a number of potential practical solutions. It forms part of the wider programme of work led by DCMS and Home Office to make the UK the safest place in the world to go online, and the best place to grow and start a digital business.
Through this programme, the government says it will “review and upgrade the data standards and systems that underpin the monitoring and reporting of online harms such as child sexual abuse, hate speech and self harm and suicide ideation”, per the policy paper — which, again, is a reference to its ambitious plan to regulate online content by imposing a duty of care on platforms.
It’s not clear how the government proposes to enable the sharing of such sensitive user data with commercial entities via this programme without major data protection risks.
“Beyond the commitment to open data, the government has long recognised that new models and approaches are needed to drive value from data and data systems that span the private and public sector – this is particularly important in cases where the data itself is not appropriate to be shared as open data, be it for privacy, national security or commercial reasons,” is all the policy paper has to say on that.
It’s worth noting that the government has dipped its toe in the water on the public-private AI content moderation front before now. Back in 2018, the then Home Secretary announced a machine learning tool, developed with public money by a UK AI firm, which it claimed could automatically detect propaganda produced by the Islamic State terror group with “an extremely high degree of accuracy” — as it sought to amp up pressure on Internet giants to accelerate takedowns of terrorist content.
As an extended aside, the UK company that developed that tool was called ASI Data Science. The company has since rebranded to Faculty — a name that may be familiar as it’s one of the tech firms granted access to UK citizens’ health data as part of the government’s COVID-19 response data platform. (That Faculty contract — providing “strategic support to the NHSX AI Lab” — had a value in excess of £1M.)
The firm has in fact won a swathe of UK government contracts in recent years, since working on the pro-Brexit Vote Leave campaign with senior government advisor (and de facto data guru in chief) Dominic Cummings. So you can at least see one clear thread running right through this national data strategy.
In another component of the plan that could open up startup opportunities, the government says it wants to expand on the current “Smart Data” initiatives — such as the Open Banking scheme — to enable service switching and innovation across more sectors via regulated data sharing.
On this it says it will bring forward primary legislation to give people “the power to use their own data to find better tariffs in areas such as telecoms, energy and pensions, and open the doors to disruptors in every part of the marketplace”.
“The government is committed to an economy where consumers’ data works for them, and innovative businesses thrive. We expect that, in time, the extension of Smart Data will deliver new and innovative services, stronger competition in the affected markets, and better prices and choice for consumers and small businesses, including through reduced bureaucracy. Competitive data-driven markets can reduce friction for business and drive start-ups, investment and job creation,” it adds.
UK business groups have welcomed the government’s plan. In a response statement, Felicity Burch, director of digital and innovation for the CBI, said: “Data underpins the modern economy and is essential to businesses in every sector from logistics to retail. It’s at the heart of global trade, competition, and innovation in areas from health to climate change. We welcome the National Data Strategy as a vital step for the UK [to] be at the forefront of the data revolution. Lessons learnt in the coronavirus crisis must power our economic recovery – crucially, unleashing the power of data in a way that commands trust and empowers people.”
The government’s PR also includes a very supportive statement from Darren Hardman, general manager for Amazon Web Services (UK and Ireland) — with the tech giant eyeing massive upsides in any wholesale public sector shift to big, interoperable, cloud-hosted data. “Making more effective use of data and cloud computing is key to the UK’s long-term economic growth and the continual improvement of our public services,” he suggests. “We welcome the launch of this consultation on the new National Data Strategy, which will be instrumental in ensuring the UK remains one of the world’s leading digital nations.”
Other responses are more circumspect — with a warning of the risks of “over-collection” and “inappropriate use” of data coming from Dr Jeni Tennison, VP at the Open Data Institute.
“People and organisations of all kinds are facing big challenges over the next few years. Data can help us all to navigate them, increasing our understanding of our changing world and informing the decisions we make. Data can also cause harm, for example through over-collection and inappropriate use. At the ODI, we want data to work for everyone, which means ensuring it both gets to the people who need it, and that it is collected, used and shared in trustworthy ways,” she said in a statement.
“This National Data Strategy consultation is an important opportunity for us all to explore and influence how data should be used to support the UK’s economy, environment and communities, and we look forward to the debate.”
The consultation on the UK national data strategy is now open.
The EU recently announced its own strategy aimed at boosting data reuse to drive economic growth — although that focuses on industrial and non-personal data, with rules for personal data sharing continuing to be regulated via the GDPR.