Life in a Quantified Society

I think of myself as a digital native, even though today's kids are the ones people usually mean by that term. But growing up in the '80s, Palo Alto, California, was ground zero for the computer revolution. It was exciting. Everyone around me, including my parents, worked in this field, and they all had great hopes for what computers were going to do for our lives. I don't know that any of them envisioned that we'd all be carrying them around, staring at them and walking into people all the time. But it was a really exciting and fun time.

Honestly, big data is just the fact that you can store a lot of data. The cost of storing data is so cheap that, basically, you can store everything. What happens with big data is what people decide to do next.

An algorithm is just a recipe. It tells you what ingredients you need, and then what steps to follow to complete the task. Computers are basically run by algorithms, but the way people mostly experience an algorithm in daily life is, for instance, the Facebook News Feed. It looks at what items you've clicked on, what you've read, and who your friends are, and it chooses among all their posts which ones to put at the top of the feed. So the algorithm is making the decisions that newspaper editors used to make about what to put on the front page. That's a fundamental difference in how you experience news: human decision making versus automated decision making. They both have their benefits and their downsides.

Well, the problem with computerized decision making is that it doesn't have any sort of moral compass. For instance, in one case I recently wrote about, people who are arrested are assigned a risk score for whether they might commit a future crime. At the time of arrest, and this is happening across the nation, the software asks a series of questions and assesses whether each person is likely to commit a crime in the future. But when you have a program like the one I looked at, which just assigns you a one-through-ten score for the risk of committing a future crime, it's very hard to dispute that. Because how do you say, "I'm not a seven, I'm a four," right? That's a hard argument to make in a courtroom, and judges use that score to inform decisions along the way: sometimes whether you're going to get out on bail, or at sentencing, what type of sentence you might get, probation versus drug treatment versus going to prison. So these scores really do have huge implications for people's lives. That type of algorithm is one, I think, that deserves more scrutiny and more public awareness.

If you say, "Oh, I just wrote the program, but I don't know why it generated that ten score for you," then we're taking accountability out of the system, and that's where I'm concerned. As a journalist, my job is really to hold people accountable for their decisions.

I think that, honestly, the world is getting to be a better place. There are fewer wars and there is less poverty; things are getting better. People are trying really hard to fix problems. But we're also far more aware of how bad things are because we have such great visibility through computers, through the internet. We can see everything terrible that's happening, and we're trying to take it on and fix it. And I hope that we continue to try. And I think algorithms are a big piece of trying.

It's great to try, but we just have to remember that computers are not magic, and that we have to make sure we don't embed our biases into them and give them full rein to make decisions without accountability.
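The editor-replacement idea described earlier, an algorithm choosing which posts to surface in a feed, can be sketched as a simple weighted scoring over engagement signals. The signal names and weights below are hypothetical illustrations for the sake of the sketch, not Facebook's actual algorithm:

```python
# A minimal sketch of feed ranking: score each post by weighted engagement
# signals, then sort highest-first. Signals and weights are hypothetical.

def rank_feed(posts, weights=None):
    """Return posts ordered by a weighted engagement score (highest first)."""
    if weights is None:
        weights = {
            "clicked_author_before": 2.0,  # what you've clicked on
            "is_close_friend": 3.0,        # who your friends are
            "topic_read_before": 1.5,      # what you've read
        }

    def score(post):
        # Sum the weights of whichever signals this post carries.
        return sum(weights[s] for s in post["signals"] if s in weights)

    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": 1, "signals": ["topic_read_before"]},
    {"id": 2, "signals": ["is_close_friend", "clicked_author_before"]},
    {"id": 3, "signals": []},
]
ranked = rank_feed(posts)
print([p["id"] for p in ranked])  # post 2 scores highest: [2, 1, 3]
```

The point of the sketch is that the ranking is entirely determined by which signals someone chose to measure and how they chose to weight them; the editorial judgment hasn't disappeared, it has just been moved into the weights.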

