Algorithmic Credit Scoring and the Regulation of Consumer Credit Markets

Abstract

This presentation examines the growing use of alternative data and machine learning by credit providers to assess the creditworthiness of borrowers — a trend described as ‘algorithmic credit scoring’ — and the implications of this trend for the functioning and regulation of consumer credit markets. It finds that a key consequence of algorithmic credit scoring will be to reduce and ultimately reverse the asymmetry of information between lenders and borrowers, wherein lenders typically know less than borrowers about factors affecting the latter’s creditworthiness. Indeed, lenders could come to know more about borrowers than borrowers know about themselves. Whether this has a positive or negative impact on consumer credit markets depends on the relative value attributed to different, often competing, normative goals: in particular, efficiency, distributional fairness, and privacy. The presentation highlights key considerations relating to each of these goals arising from the proliferation of algorithmic credit scoring, and identifies how this socio-technical development challenges the traditional assumptions and modalities of consumer credit regulation.

Bios

Main speaker:

Nikita Aggarwal is a Research Associate at the Digital Ethics Lab, as well as a Research Fellow and doctoral candidate at the Faculty of Law. Her research examines changes to the regulatory landscape occasioned by the proliferation of data-driven technology, particularly due to advances in machine learning (‘artificial intelligence’). Her other areas of interest include internet policy and regulation more generally, as well as the ethics of data-driven technology.

As the Research and Course Design Fellow in Law and Technology under the ESRC-funded project ‘Unlocking the Potential of AI for English Law’, she is researching the educational skills gaps in legal education and training generated by recent technological development, and is helping to design and deliver a more interdisciplinary approach to law and technology education at the University.

Prior to entering academia, Nikita was an attorney in the legal department of the International Monetary Fund, where she advised on financial sector law reform in the Euro area and worked extensively on initiatives to reform the legal and policy frameworks for sovereign debt restructuring. She previously practiced as an associate with Clifford Chance LLP, where she specialized in EU financial regulation and sovereign debt restructuring.

She earned her law degree (LLB) from the London School of Economics and Political Science, and is a solicitor of England and Wales.

Respondent:

Ignacio Cofone is an Assistant Professor at McGill University’s Faculty of Law, where he teaches Privacy Law, Business Associations, and Artificial Intelligence Law. His research explores how the law should adapt to technological and social change with a focus on privacy and algorithmic decision-making. In his latest projects, he proposes how to evaluate harm in privacy class actions and how to prevent algorithmic discrimination.

Before joining McGill, Cofone was a research fellow at the NYU Information Law Institute, a resident fellow at the Yale Law School Information Society Project, and a legal advisor for the City of Buenos Aires. He obtained his law degree from Austral University, an MA from the University of Bologna, a joint PhD from Erasmus University Rotterdam and Hamburg University, where he was an Erasmus Mundus Fellow, and an LLM and JSD from Yale Law School.

This event is organized in collaboration with CIPP.

Registration is required. Please register below. Registration is limited. First come, first served.

A request for CLE accreditation has been made to the Quebec Bar Association.

Aggarwal / Cofone
Continuing education accreditation from the Barreau du Québec (1h30)
Continuing education accreditation from the Chambre des notaires (1h30)

This content was last updated on March 16, 2021 at 4:39 p.m.