Call for evidence: data and technology ethics in education, health, smart vehicles & policing

27th November 2018

Policy Connect and All-Party Parliamentary Group on Data Analytics launch landmark inquiry into data and technology ethics

Taming the new Wild West: Data and Technology Ethics fit for our time

Data’s role in our lives has increased but regulators are struggling to keep up with the pace of change. It’s clear that the UK, like many other countries, is facing a data and technology governance gap. This is why a cross-party inquiry from Policy Connect and the All-Party Parliamentary Group on Data Analytics is researching clearer principles and best practice standards for data use.

To remain at the forefront of the global technological race, the UK – through both public and private enterprise – must address ethical concerns now, rather than fix problems as they arise and be left on the back foot.

The Data and Technology Ethics Inquiry will concentrate on the areas of trust, ethics and good governance. This includes public trust, business confidence, and the trade-offs between privacy and progress that are inherent in technology developments and big data. The research will also examine the need for good governance in the tech sphere and accountability and redress when the first line of trust is broken.

With the backing of Jisc, Deloitte and the Association of Chartered Certified Accountants, the inquiry will focus on areas of society that feel the huge benefits of using data and technology, but also demonstrate unique ethical concerns around privacy, algorithmic bias and lack of transparency, surveillance, and the potential for fraud and exploitation.

Its recommendations will be targeted at the newly established Centre for Data Ethics and Innovation.

The call for evidence can be found here.

The areas that will be explored are:

  • Healthcare: New medical treatments and medicines may result from “mining” NHS data, yet personal medical data is highly sensitive – how should we balance the scientific benefits and personal privacy?
  • Automated vehicles: Self-driving vehicles would represent a leap forward in technology, with potential social and safety benefits, yet transparency, liability and accountability are hard to determine when something goes wrong in a system that uses algorithms that may not come from a single developer.
  • Education: Universities and colleges can learn a lot about students’ behaviour and use predictive analytics to target those with specific needs or who may be about to drop out of a course, significantly improving their life outcomes. However, monitoring student attendance, library use and internet browsing may be viewed as undue surveillance – is there concern over the legitimacy of university decisions on individual students based on potentially biased algorithms?
  • Policing: Predictive analytics using databases of crimes and individuals can prevent organised crime and terrorism, giving police tools to help distribute and plan resources. However, there are outstanding ethical questions about how data will be collected and processed, concerns about false positives, and where the limits of acceptable behaviour lie in this space.

These topics will be explored through roundtable discussions and a call for evidence which will run from 13 November until 31 January.

Professor Luciano Floridi, University of Oxford (Oxford Internet Institute), Academic Advisor to the inquiry, said: “Clarifying the ethical direction in which digital solutions should develop is essential to avoid preventable mistakes, minimise negative impact, enable fruitful innovation, and therefore ensure that the digital revolution will deliver its great positive value for all and for the environment we share.”

Paul Garel-Jones, AI partner at Deloitte UK, inquiry sponsors, said: “Understanding the risks and limitations of AI, data and technology is crucial to ensure its usage is to everyone’s benefit. Equally, we know that trust is built through ethical and well-governed use. We are delighted to be contributing our expertise to the debate, drawing from our own journey and that of our clients across multiple industries and the public sector.”

Phil Richards, Chief Innovation Officer, inquiry sponsors Jisc, said: “The technologies at the core of the fourth industrial revolution will also transform the student experience, creating a new ‘Education 4.0’ underpinning future teaching, learning and student wellbeing. Jisc’s national learning analytics service and associated code of practice has helped establish a UK consensus there; it is vital we build on that, in relation to appropriate ethical use of wider key data sets and AI, so the UK can operate at the global forefront of Education 4.0, and prepare our learners and citizens for the radically different world that lies ahead.”

Narayanan Vaidyanathan, Head of Business Insights, Association of Chartered Certified Accountants (associate member, APPG Data Analytics), said: “Digital ethics is an important area for professional accountants, who must act in the public interest as they face new types of dilemmas in a digital age. Being part of this fast-evolving discourse will enable them to build their knowledge, exercise judgement more effectively and ultimately fulfil their responsibilities to their organisations and to society.”