APGDA and ReEnTrust host webinar on rebuilding and enhancing trust in algorithms

On Wednesday 2nd December 2020, the All-Party Parliamentary Group on Data Analytics (APGDA) and the ReEnTrust project were delighted to host an online briefing on rebuilding and enhancing trust in algorithms.

ReEnTrust is a research project funded by the Engineering and Physical Sciences Research Council (EPSRC) under the Digital Economy programme, a collaborative effort bringing together academics from the University of Oxford, the University of Edinburgh and the University of Nottingham. Its current body of work informs policymakers, businesses and economists by suggesting new approaches and technologies that will enhance trust in algorithms. The research also seeks to develop ways in which online platforms can make the most of such technologies.

The event was chaired by APGDA Chair, Daniel Zeichner MP, and included contributions from a number of senior researchers from the ReEnTrust project following the publication of their recent policy paper.

Mr Zeichner opened the meeting at 11am and presented an overview of the work of the APGDA. He noted the growth in the use of artificial intelligence and developments within public policy ahead of the recently published National Data Strategy.

The first speaker was Dr Jun Zhao of the Department of Computer Science at the University of Oxford. Dr Zhao began by noting the long history of debate about trust in algorithms and how they have been applied to computer programming and wider society. She noted that although algorithms themselves are not “intelligent” in a conventional sense, they nevertheless are rapidly increasing in scope and size as part of the wider increase of data and digital technologies. 

The second speaker was Dr Philip Inglesant, also of the Department of Computer Science at the University of Oxford. Dr Inglesant spoke about the scope of the research and the way in which the findings of the report had been developed to influence recent policy developments, such as the Government’s Online Harms White Paper. He also discussed how the project had developed a range of workshops engaging participants from across different sectors, as well as more technical “sandbox” programmes, to explore in more detail how trust in algorithms can be developed.

Dr Inglesant then discussed some of the key findings of the report, including differing attitudes towards trust in online resources between younger and older internet users. These included the impact of social media, and debates about whether notions of “trust” may matter less to the societal impact of data-driven technologies than their overall reliability. In reference to views regarding an overall lack of transparency in many online resources, Dr Inglesant noted that people are far more likely to trust algorithms if they meet their needs.

The final speaker was Dr Ansgar Koene of the School of Computer Science at the University of Nottingham. Dr Koene provided an overview of some of the substantive policy issues on public awareness and overall confidence in digital platforms and online services. In particular, he referred to the foreign and domestic efforts required to improve public trust in algorithms and their accountability to the public. Dr Koene referred both to international efforts to arrive at policy consensus, such as the Internet Governance Forum, and to the scope for AI and digital technologies to influence the British Government’s wider Industrial Strategy.

From a policy perspective, Dr Koene spoke about the need to develop trust as part of the wider take-up of devices making use of the Internet of Things (IoT), as well as the importance of developing a multifaceted approach to algorithm policy. He reiterated the points raised by both Dr Zhao and Dr Inglesant regarding the basic assumption that different sections of the population have different needs and demands from data-driven technologies. He also referred to the Centre for Data Ethics and Innovation’s recent review into algorithmic bias and the challenges associated with translating such reports into new regulatory requirements.

Daniel Zeichner then led an open question and answer session from attendees. There were a number of contributions regarding the report and the wider challenges associated with trust in data ethics, including from Lord Wallace of Saltaire, who noted the divergence in approaches to trust between younger and older people with regard to the use of data-driven technologies. Participants also noted how data trust issues relate as much to the platform or institution as they do to the technology used. 

Other attendees noted the impact of trust on how the future regulatory framework would be developed, as well as the perception that public concern about government use of data was considerably less pronounced than perceptions of private sector data use. In particular, participants mentioned the importance of the NHS in conveying trust to the general public around AI and medical data, in addition to the challenges this poses if those professionals do not themselves understand the underlying principles.

Health data in particular was a theme explored by the APGDA in their recent event with the Wellcome Trust on Understanding Patient Data.

The roundtable is the latest in a series of events being conducted by the APGDA into building and developing policies to improve trust and transparency with data-driven technologies. The Group will be launching their latest report into place-based data ethics in early 2021 in collaboration with Manchester Metropolitan University.