Cathy O'Neil

Every system using data separates humanity into winners and losers.
- Cathy O'Neil
Collection: Data
I would argue that one of the major problems with our blind trust in algorithms is that we can propagate discriminatory patterns without acknowledging any kind of intent.
- Cathy O'Neil
Collection: Algorithms
We can't just throw something out there and assume it works just because it has math in it.
- Cathy O'Neil
Collection: Math
There's less of a connection for a lot of people between the technical decisions we make and the ethical ramifications we are responsible for.
- Cathy O'Neil
Collection: Decisions We Make
There are lots of different ways that algorithms can go wrong, and what we have now is a system in which we assume because it's shiny new technology with a mathematical aura that it's perfect and it doesn't require further vetting. Of course, we never have that assumption with other kinds of technology.
- Cathy O'Neil
Collection: Technology
I know how models are built, because I build them myself, so I know that I'm embedding my values into every single algorithm I create and I am projecting my agenda onto those algorithms.
- Cathy O'Neil
Collection: Agendas
Because of my experience in Occupy, instead of asking the question, "Who will benefit from this system I'm implementing with the data?" I started to ask the question, "What will happen to the most vulnerable?" Or "Who is going to lose under this system? How will this affect the worst-off person?" Which is a very different question from "How does this improve certain people's lives?"
- Cathy O'Neil
Collection: Data
The most important goal I had in mind was to convince people to stop blindly trusting algorithms and assuming that they are inherently fair and objective.
- Cathy O'Neil
Collection: People
The public trusts big data way too much.
- Cathy O'Neil
Collection: Data
It's a standard thing you hear from startup people - that their product is somehow improving the world. And if you follow the reasoning, you will get somewhere, and I'll tell you where you get: You'll get to the description of what happens to the winners under the system that they're building.
- Cathy O'Neil
Collection: People
Obviously the more transparency we have as auditors, the more we can get, but the main goal is to understand important characteristics about a black box algorithm without necessarily having to understand every single granular detail of the algorithm.
- Cathy O'Neil
Collection: Goal
For whatever reason, I have never separated the technical from the ethical.
- Cathy O'Neil
Collection: Reason
We don't let a car company just throw out a car and start driving it around without checking that the wheels are fastened on. We know that would result in death; but for some reason we have no hesitation at throwing out some algorithms untested and unmonitored even when they're making very important life-and-death decisions.
- Cathy O'Neil
Collection: Car
So much of our society as a whole is gearing us to maximize our salary or bonus. Basically, we just think in terms of money. Or, if not money, then, if you're in academia, it's prestige. It's a different kind of currency. And there's this unmeasured dimension of all jobs, which is whether it's improving the world.
- Cathy O'Neil
Collection: Jobs
By construction, the world of big data is siloed and segmented and segregated so that successful people, like myself - technologists, well-educated white people, for the most part - benefit from big data, and it's the people on the other side of the economic spectrum, especially people of color, who suffer from it. They suffer from it individually, at different times, at different moments. They never get a clear explanation of what actually happened to them because all these scores are secret and sometimes they don't even know they're being scored.
- Cathy O'Neil
Collection: Successful
Evidence of harm is hard to come by.
- Cathy O'Neil
Collection: Harm
When people are not given an option by some secret scoring system, it's very hard to complain, so they often don't even know that they've been victimized.
- Cathy O'Neil
Collection: People
My fantasy is that there is a new regulatory body that is in charge of algorithmic auditing.
- Cathy O'Neil
Collection: Body
When I think about whether I want to take a job, I don't just think about whether it's technically interesting, although I do consider that. I also consider the question of whether it's good for the world.
- Cathy O'Neil
Collection: Jobs
Micro-targeting is the ability for a campaign to profile you, to know much more about you than you know about it, and then to choose exactly what to show you.
- Cathy O'Neil
Collection: Campaigns
With recidivism algorithms, for example, I worry about racist outcomes. With personality tests [for hiring], I worry about filtering out people with mental health problems from jobs. And with a teacher value-added model algorithm [used in New York City to score teachers], I worry literally that it's not meaningful. That it's almost a random number generator.
- Cathy O'Neil
Collection: Meaningful
The training one receives when one becomes a technician, like a data scientist - we get trained in mathematics or computer science or statistics - is entirely separated from a discussion of ethics.
- Cathy O'Neil
Collection: Data
That's what we do when we work in Silicon Valley tech startups: We think about who's going to benefit from this. That's almost the only thing we think about.
- Cathy O'Neil
Collection: Thinking
Especially from my experience as a quant in a hedge fund - I naively went in there thinking that I would be making the market more efficient and then was like, oh my God, I'm part of this terrible system that is blowing up the world's economy, and I don't want to be a part of that.
- Cathy O'Neil
Collection: Thinking
I don't think anybody's ever notified that they were sentenced to an extra two years because their recidivism score had been high, or notified that this beat cop happened to be in their neighborhood checking people's pockets for pot because of a predictive policing algorithm. That's just not how it works.
- Cathy O'Neil
Collection: Thinking
Most people don't have any association in their minds with what they do and with ethics. They think they somehow moved past the questions of morality or values or ethics, and that's something that I've never imagined to be true.
- Cathy O'Neil
Collection: Thinking
I think what's happened is that the general public has become much more aware of the destructive power of Wall Street.
- Cathy O'Neil
Collection: Wall Street
The disconnect I was experiencing was that people hated Wall Street, but they loved tech.
- Cathy O'Neil
Collection: Wall Street
There might never be that moment when everyone says, "Oh my God, big data is awful."
- Cathy O'Neil
Collection: Data
You'll never be able to really measure anything, right? Including teachers.
- Cathy O'Neil
Collection: Teacher
I think big data companies only like good news. So I think they're just hoping that they don't get sued, essentially.
- Cathy O'Neil
Collection: Thinking
The Facebook algorithm designers chose to let us see what our friends are talking about. They chose to show us, in some sense, more of the same. And that is the design decision that they could have decided differently. They could have said, "We're going to show you stuff that you've probably never seen before." I think they probably optimized their algorithm to make the most amount of money, and that probably meant showing people stuff that they already sort of agreed with, or were more likely to agree with.
- Cathy O'Neil
Collection: Thinking
I set up a company, an algorithmic auditing company myself. I have no clients.
- Cathy O'Neil
Collection: Clients
Occupy provided me a lens through which to see systemic discrimination.
- Cathy O'Neil
Collection: Lenses
I think there's inherently an issue that models will literally never be able to handle, which is that when somebody comes along with a new way of doing something that's really excellent, the models will not recognize it. They only know how to recognize excellence when they can measure it somehow.
- Cathy O'Neil
Collection: Thinking
We've learned our lesson with finance because they made a huge goddamn explosion that almost shut down the world. But the thing I realized is that there might never be an explosion on the scale of the financial crisis happening with big data.
- Cathy O'Neil
Collection: Data
The national conversation around white entitlement, around institutionalized racism, the Black Lives Matter movement, I think, came about in large part because of the widening and broadening of our understanding of inequality. That conversation was begun by Occupy.
- Cathy O'Neil
Collection: Thinking
Google is so big you have no idea what a given person does.
- Cathy O'Neil
Collection: Ideas
An insurance company might say, "Tell us more about yourself so your premiums can go down." When they say that, they're addressing the winners, not the losers.
- Cathy O'Neil
Collection: Might
The NSA buys data from private companies, so the private companies are the source of all this stuff.
- Cathy O'Neil
Collection: NSA
People felt like they were friends with Google, and they believed in the "Do No Evil" thing that Google said. They trusted Google more than they trusted the government, and I never understood that.
- Cathy O'Neil
Collection: Government
People are starting to be very skeptical of the Facebook algorithm and all kinds of data surveillance.
- Cathy O'Neil
Collection: Data
I wanted to prevent people from giving them too much power. I see that as a pattern. I wanted that to come to an end as soon as possible.
- Cathy O'Neil
Collection: People