A House of Lords committee has called for tougher rules for big technology firms.
Arguing that regulation of the digital world has failed to keep pace with its role in our lives, the House of Lords Communications Committee has recommended a new, overarching regulatory framework so that digital services are held accountable to an enforceable set of shared principles.
In its report Regulating in a Digital World, the committee noted that more than a dozen UK regulators have a remit covering the digital world, but no single body has complete oversight.
As a result, it says regulation of the digital environment is fragmented, with gaps and overlaps, while big tech companies have failed to adequately tackle online harms.
The committee recommended a new Digital Authority, guided by principles including accountability, transparency, respect for privacy and freedom of expression.
It said adoption of these principles would help the industry, regulators, the government and users work towards a common goal of making the internet a better, more respectful environment which is beneficial to all.
The committee’s chairman Lord Gilbert of Panteg said: “The government should not just be responding to news headlines but looking ahead so that the services that constitute the digital world can be held accountable to an agreed set of principles.
“Self-regulation by online platforms is clearly failing and the current regulatory framework is out of date. The evidence we heard made a compelling and urgent case for a new approach to regulation.
“Without intervention, the largest tech companies are likely to gain ever more control of technologies which extract personal data and make decisions affecting people’s lives.
“Our proposals will ensure that rights are protected online as they are offline while keeping the internet open to innovation and creativity, with a new culture of ethical behaviour embedded in the design of services.”
Among the report’s recommendations for specific action were:
- A duty of care should be imposed on online services which host and curate content that can openly be uploaded and accessed by the public. Given the urgent need to address online harms, Ofcom’s remit should expand to include responsibility for enforcing this duty of care
- Online platforms should make community standards clearer through a new classification framework akin to that of the British Board of Film Classification. Major platforms should invest in more effective moderation systems to uphold their community standards
- Users should have greater control over the collection of personal data. Maximum privacy and safety settings should be the default
- Data controllers and data processors should be required to publish an annual data transparency statement detailing which forms of behavioural data they generate or purchase from third parties, how they are stored, for how long, and how they are used and transferred
- The Government should empower the Information Commissioner’s Office to conduct impact-based audits where risks associated with using algorithms are greatest. Businesses should be required to explain how they use personal data and what their algorithms do
- The modern internet is characterised by the concentration of market power in a small number of companies which operate online platforms. Greater use of data portability might help to address this, but it will require more interoperability between services
- The Government should consider creating a public-interest test for data-driven mergers and acquisitions
- Regulation should recognise the inherent power of intermediaries