The French authorities must immediately stop using a discriminatory risk-scoring algorithm deployed by the French Social Security Agency’s National Family Allowance Fund (CNAF) to detect overpayments and errors in benefit payments, Amnesty International said today. On 15 October, Amnesty International and fourteen other coalition partners led by La Quadrature du […]
Except a computer isn’t making decisions here? An investigator is making the decisions; the computer is just sorting the investigatees. Even if that weren’t the case, it wouldn’t be ambiguous who to blame: it’s clearly on whoever decided what goes into the algorithm and how it should work.
Edit: even beyond who would be legally responsible, as evidenced by this article and others like it, people are already holding policymakers and anti-welfare instigators responsible. The fall guy being one step removed from the crime doesn’t change who made it happen.
Well, it didn’t quite turn out that way, did it?
Sadly not :(