> We need legislation forcing companies to manually review algorithmic decisions that impact people's lives, and have a proper appeals process.
What you want is human judgement, but that will be hard to legislate. What can easily happen is the human simply parroting back the underlying reasons for the decision, along the lines of: "We have reviewed your case, and according to our records your account is associated with the problematic account."
I have had this happen to me in bureaucratic situations with no computers involved.
That seems like the same problem we have here, except now you get the headache of the courts on top of the decision. How many people already avoid court for far more impactful and legitimate disputes simply because they can't afford the time and money?
What we want isn't review of automated decisions; what we want is openness, transparency, and clarity in the process. The problem people have isn't so much the appeals process as the opaqueness and seemingly arbitrary nature of the whole thing.
There are problems with how transparent you make things, though (i.e., giving away the underlying signals). There's a constant arms race between fraudsters and companies' risk teams, where fraudsters try to run just up to the edge of the alerting systems without tipping over, then scale up and repeat.
If the signals used are made public, fraudsters will win every time. It's the same with search engines: if they published how a score is calculated, people would game it immediately.
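To make the edge-running concrete, here is a minimal toy sketch (all names and numbers are hypothetical, not any real company's rules): once a signal and its threshold are public, an attacker can operate exactly at the limit and simply scale across accounts.

```python
# Hypothetical published fraud signal: transactions per account per day.
ALERT_THRESHOLD = 100

def is_flagged(tx_count: int) -> bool:
    """Fraud check whose threshold has been made public."""
    return tx_count > ALERT_THRESHOLD

# A fraudster who knows the threshold runs exactly at the edge of it,
# then scales horizontally across many accounts instead of vertically.
per_account = ALERT_THRESHOLD            # 100 tx each: never flagged
accounts = 50
total_fraud_volume = per_account * accounts  # 5000 tx, zero alerts

assert not is_flagged(per_account)
```

The point isn't the arithmetic; it's that any disclosed, static rule becomes a ceiling the attacker can provably stay under, which is why risk teams keep signals secret and rotate them.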
Maybe the signals should be required to go through a review with authorities? Idk.
I came here to say this too. A lot of the anti-fraud stuff is a closely guarded secret that changes all the time. I don’t even think that legal would let us disclose it even if we wanted to!
That isn’t to say there shouldn’t be some kind of way to escalate an appeal to a real human.
Of course, keeping the fraudsters from DDoSing the crap out of the appeals process will be a challenge, because I could see them doing that…