
I have sat across from a call centre in a government office and listened to the workers running through the written version of algorithms. I've spoken to people who worked there and heard how crushing it was to know that someone was getting screwed but to be totally unable to do anything about it because the policy dictated their reaction. And I've worked with charities and listened to the other end of those phone calls; people screaming that their kids are going to be taken away because their benefits have been delayed and they can't afford to feed them.

The underlying assumption of this piece seems to be that turning decision making over to algorithms reduces positive discretion. But the humans in these situations frequently have no more discretion than the machine does, and inefficiency also has a human cost. It seems false to me to pretend that what these algorithms are doing, at least in terms of the majority of their immediate effect, is qualitatively different.

What you're losing when you encode something as an algorithm is the insight you get from having humans in the loop. Intuition; the things that people haven't thought to measure yet. That's the weakness in any statistical technique: you need a human to lend the numbers relevance, to decide which relationships are worth measuring; otherwise they're just a sequence of events.

But for that criticism to make sense, you need to start with a system that leverages human strengths. Human judgement only has an advantage in a system designed to use the kinds of value it offers. If your call centre worker is not genuinely responsible for the outcome of the call, and if you don't regularly solicit their feedback to inform policy decisions, then it makes no difference if they are replaced by a machine. They were being treated as one to begin with, and the value they added to the organisation by being human, by exercising professional judgement, was being thrown away anyway.

All this does, in a lot of cases, is make existing flaws more obvious.

The exception I can think of to this is the criminal justice system, where there are examples of positive discretion. However, there are also examples of negative discretion there. There are many stupid laws on the books, and selectively enforcing those laws allows you to screw, more or less, whoever you want. It's not surprising that a system which mechanically implements those laws would produce undesirable outputs; it's just that it's finally being applied to people who have the power to say something about it (and, perhaps, to have their concerns taken seriously enough to alter policy).

For all that there is a loss in the case of the criminal justice system, there is also a gain: Encoding something as an algorithm makes the flaws in the process more apparent.



Exactly this.

I often describe programming as creating tiny bureaucracies.

You put some information into a "form" (e.g. a search bar). The front desk bureaucrat (mouse, keyboard, screen, etc.) sends it off to other bureaucrats and they follow a bunch of rules to process it and give the front desk some new "paperwork" to give to you (e.g. the resulting web page).

What we are doing with automated algorithms is getting rid of the human bureaucrats and replacing them with "robotic" bureaucrats. That can be a really bad thing depending on the context, but in many cases even the human bureaucrats were already, effectively, robots.
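The analogy can be sketched in a few lines of code (a toy illustration; all the names and rules here are made up): a "form" goes in at the front desk, rule-following "bureaucrats" hand it along, and "paperwork" comes back out.

```python
# A toy "tiny bureaucracy": each bureaucrat is a rule-following
# function that transforms the paperwork and passes it along.

def intake_clerk(form):
    # The front desk: normalises and stamps the incoming form.
    return {"query": form.strip().lower(), "stamped": True}

def records_clerk(paperwork):
    # A back-office bureaucrat: looks things up by fixed rules,
    # with no discretion about the result.
    records = {"benefits": "Form B-12 required", "housing": "See office 4"}
    paperwork["result"] = records.get(paperwork["query"], "Request denied")
    return paperwork

def front_desk(paperwork):
    # Hands the final "paperwork" back to the citizen.
    return paperwork["result"]

def bureaucracy(form):
    return front_desk(records_clerk(intake_clerk(form)))

print(bureaucracy("Benefits"))  # "Form B-12 required"
print(bureaucracy("daycare"))   # "Request denied" -- no rule matched
```

Note that `records_clerk` has no way to say "this denial is obviously wrong"; whether the clerk is a person or a function, the policy table is the only thing that speaks.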


That said, once you can automate this stuff, there are new types of policies you can make which would be too complicated otherwise, and that is a place where we can unwittingly innovate into new ways of hurting people.

On the other hand, exposing the inherent inhumanity of strict bureaucracy via conversations about automation may actually be a force for awareness and change. An opportunity to explicitly create "human integrations" at key touch-points where people would otherwise fall through the cracks (think API hooks where you can integrate PagerDuty).
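One way to picture such a "human integration" (a hypothetical sketch, not any real PagerDuty or government API): the automated decision path has an explicit escalation hook, so ambiguous cases page a person instead of silently falling through.

```python
# Hypothetical sketch of a human touch-point in an automated pipeline:
# when the rules can't decide confidently, the case is escalated to a
# person rather than defaulting to a denial.

def page_human(case):
    # Stand-in for a real escalation (e.g. a PagerDuty-style alert).
    return f"ESCALATED to caseworker: {case['id']}"

def decide(case):
    if case["evidence_score"] >= 0.9:
        return "approved"
    if case["evidence_score"] <= 0.1:
        return "denied"
    # The gap in the rules is deliberate: it is the explicit point
    # where human judgement re-enters the loop.
    return page_human(case)

print(decide({"id": "A-17", "evidence_score": 0.95}))  # approved
print(decide({"id": "B-03", "evidence_score": 0.5}))   # ESCALATED ...
```

The design choice worth noticing is that the escalation is part of the encoded policy itself, rather than relying on a call centre worker quietly bending rules they aren't allowed to bend.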



