Artificial intelligence now plays a key role in deciding who gets hired, who gets a loan, who gets charged with a crime, and much more. But recent work by researchers shows that the algorithms driving AI can, in some cases, encode the very biases behind inequality and injustice in our society, particularly against women and people of color. Well, a new Netflix documentary called Coded Bias tackles this problem head on, and we have the director, Shalini Kantayya, here to talk about it. So Shalini, tell us what Coded Bias is about. >> Well, thank you so much for having me. Coded Bias follows MIT researcher Joy Buolamwini's startling discovery that commercially available facial recognition is not as accurate on darker faces or on women, and it takes us down a rabbit hole into a broader exploration of the algorithmic harms embedded in many of the technologies we interact with every day. Three years ago, when I started making this film, I didn't even really know what an algorithm was. Everything I knew about artificial intelligence came from the minds and imaginations of science fiction creators like Steven Spielberg, Stanley Kubrick, or Ridley Scott, and I was not fully aware of the extent to which algorithms, machine learning, AI are increasingly becoming the gatekeepers of opportunity. Deciding such important things as who gets hired, who gets what quality of health care, increasingly even who gets the vaccine, how long a prison sentence someone may serve. And as I started to understand the extent to which we are outsourcing our decision-making to machines, through Cathy O'Neil's groundbreaking book Weapons of Math Destruction, I simultaneously came across a TED talk that Joy gave and began to realize that these same systems we trust so implicitly, with decisions that are essentially changing human destiny, have not been vetted for racial bias or gender bias, or more broadly vetted to ensure they won't hurt people, have unintended consequences, and cause harm. In some cases they haven't even been vetted against some standard of accuracy that is shared outside of the company that stands to benefit economically. And that's when I really began to realize that everything we love as people in a democracy, access to fair and accurate information, fair elections, equal rights, fair housing, fair employment, all the gains we've made over 50 years in terms of civil rights, could be rolled back in the name of trusting these algorithms to be neutral when they're not. That set me on the journey to make the film and kept me engaged in the years making it. >> What's being put forward is this idea that because the algorithms, the computers, are doing the choosing, it's going to remove bias from the equation.
That's the assumption. But when you dig a little deeper, what your documentary has really done is expose that not only has it not made things less biased, in some cases it has actually made them more biased. So can you talk a little bit about that? >> We have some misconceptions about what AI can and can't do. And I think when it comes to artificial intelligence, all of the knowledge has been in the hands of the few, so all the power has been in the hands of the few. It wasn't really until making this film that I could even start to discern what is actual science, and when big corporations are selling us a bag of tricks, some bogus, baloney pseudoscience, in the name of profits. I think we tend to think of our technologies as our gods, when they're really more like our children: reflections of ourselves, even of the flaws in ourselves that we don't want to reproduce. And I really came to understand, first of all, that bias is an innate human condition. It's not just in some bad people; it's in all of us, and it's unconscious in all of us. Those biases can get encoded in the technology we're programming, sometimes even in spite of the best intentions of the programmers, and that's what's so scary about this stuff. Take, for instance, an Amazon hiring algorithm that was used, ideally, to make the world more fair: let's take human beings and human bias out of the equation and leave it up to the machines. That AI system looked at who got hired, who got retained over a number of years, and who got promoted. And lo and behold, the machine began to discriminate, essentially pulling out every applicant it could discern was a woman. That happened even in spite of the best intentions of the programmers, and it goes to show just how problematic it is to make these predictions based on data from a past that is written with systematic inequalities.
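The dynamic described in that hiring-algorithm example can be sketched in a few lines of Python. This is a toy illustration with invented data, not Amazon's actual system: a model that simply learns hiring rates from biased historical records will score otherwise identical candidates differently based on a gendered proxy feature in the resume.

```python
from collections import defaultdict

# Hypothetical historical hiring records: (resume keyword, was_hired).
# The past decisions disadvantaged resumes mentioning "women's chess club".
history = [
    ("robotics club", True), ("robotics club", True),
    ("robotics club", False),
    ("women's chess club", False), ("women's chess club", False),
    ("women's chess club", True),
]

def train(records):
    """Learn P(hired | keyword) by simple frequency counting."""
    counts = defaultdict(lambda: [0, 0])  # keyword -> [hired, total]
    for keyword, hired in records:
        counts[keyword][0] += int(hired)
        counts[keyword][1] += 1
    return {k: hired / total for k, (hired, total) in counts.items()}

model = train(history)

# Two equally qualified candidates get different scores purely because
# the historical data encoded a biased pattern.
print(model["robotics club"])        # ~0.67
print(model["women's chess club"])   # ~0.33
```

No programmer wrote a rule to penalize women; the bias arrives entirely through the training data, which is why good intentions alone don't prevent it.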
It doesn't take too much to understand why these things happen. In machine learning, and even in what's now called deep learning, which is a lot of what AI is about, programmers aren't programming in every instance; what they're doing is teaching algorithms how to learn and discern patterns and then replicate those patterns. So you can see that if you fed in a bunch of data, just as you're saying, about which applicants were successful, which ones got promotions, which ones maybe went on to earn more, all of these things would naturally encode some of the biases that we know have been part of the system for a number of reasons explored by social scientists and people working in corporate HR departments. The algorithms in that case just take that and accelerate it, saying, great, we want candidates who look just like this or are just like this. >> Absolutely. And I think what is so scary about this process is that sometimes we don't even know an automated decision maker has denied us an opportunity, and these systems are completely opaque to us. Just to give another example along the lines of what you're saying: Steve Wozniak, one of the founders of Apple, and his wife applied for the same Apple credit card. And Steve Wozniak was complaining, why did my wife get a lower credit limit than I did on the Apple credit card, when we have all the same assets and all the same income? It could be that the scoring system is picking up on the fact that women have had a shorter history of access to credit and a shorter history of access to mortgages in this country. And what is so frightening about that is, I assume Steve Wozniak's wife is going to be okay. But for the rest of us, we don't know why we are being denied credit, or have a smaller line of credit than a man who has the same background as we do.
And you can see the way in which this could infringe on civil rights advances that we've worked so hard for over 50 years. >> My other favorite line in the documentary is "data is destiny," and that gets to some of what you're talking about. There are the obvious economic parts of what we're talking about, but there are also some other, scarier societal ones. Maybe you could talk a little bit about the apartment complex in Brooklyn. >> What is so scary is when democracies start picking up the tools of authoritarian states with no democratic rules in place. And that's what happened in a housing complex not far from where I live in Brooklyn, where Tranae Moran and Icemae Downes, third-generation residents of that building, didn't even know what biometric data was when their landlord tried to install a facial recognition entry system in their building. This is a building that already had cameras, a security guard, and key fobs, and it was unknown to them why they needed this other level of security. And again, these people are not going into a maximum security prison. They're just trying to go home and live a dignified life. It's important to say that the same landlord actually owns property in upper-income communities in Manhattan, but that's not where he tried to put the facial recognition system in. And what is so hopeful is that not only did Tranae Moran and Icemae Downes organize their friends and their neighbors to fight against their landlord installing this kind of racially biased, invasive surveillance technology in the place they live, and win, keeping the landlord from doing it, but they also inspired the first legislation in the state of New York that would protect other residents from that same invasive surveillance. That goes to show why I make documentaries: it reminds me that everyday people can make a difference, and that not everybody who is a superhero wears a cape.
And one thing that I hope Coded Bias does is pull out a chair for all of us and give us all a place at the table, because these systems are deployed on all of us, and they are impacting more and more of our civil rights. So it's imperative that we all get literate about the systems that will define our future. >> The part of the documentary that explored this was that very famous idea in the tech industry that the future is here, it's just not evenly distributed. We often take that to mean that people who have money, or inside access, or who live on the cutting edge get early access to all of these technologies, and then eventually it trickles down to everybody else. But the documentary turned that on its head, saying that in this case, with AI, it's actually lower-income people, people who have less access to politicians and influence, on whom the technology is being deployed, whether it's law enforcement or corporations trying to profile people for commercial reasons, with the idea of eventually deploying it on everybody else. That was a really interesting and, I think, powerful insight from the film. >> Absolutely. You're citing the work of Virginia Eubanks, who's profiled in the film and wrote a groundbreaking book called Automating Inequality. It really talks about how communities of color and low-income communities are often the place where big tech experiments, much like that housing complex, where there's very little threat of people standing up for their rights. It just makes clear that we all need to understand how these systems work. >> One of the bedrocks of democracy is that we have those rights of privacy, because you don't know who is going to be in power and what they could use the data for. Is there anything more you want to say about that?
That seems to be one of the biggest underlying thrusts of the work. >> I did not realize until making this film what a complete psychological profile can be built about each one of us, by name, using data that's either brokered between private companies or usurped by nefarious third-party actors. It is power that I have never seen in any other context at any other time in society. I mean, states have wanted this kind of power for years. In the 2016 election, you had Cambridge Analytica create psychological profiles, by name, of the 100,000 people it believed could swing the election, and then start marketing misinformation to those 100,000 voters. And what Cambridge Analytica did as a leak is Facebook's business model every day of the week and twice on Sunday. In the film Coded Bias, I give an example from a study that was published in Nature by Facebook, where they made just a slight manipulation, an "I voted" message with your friends' faces, and it literally turned out tens of thousands more voters to the polls. It showed that Facebook could swing elections with very slight manipulations of its algorithm, without anyone even knowing. This is what is at stake here in our democracy. In the film, it was important for me to show a global context for data protection. So I show the Chinese example, where you see what happens when an authoritarian regime has unfettered access to your data through facial recognition software, combined with a social credit score that is based not just on your behavior but on your friends' behavior. Everyone in democracies sees that scenario and loses their mind. But at the same time, I feel that we in democracies are closer to that reality than we may think. We're not thinking about the way algorithms, what Cathy O'Neil calls algorithmic obedience training, are remaking our society and our behavior.
And I think there's a part of all of us that thinks, cool, I could buy a candy bar with my face, I could pay for dinner with my face, without really thinking about what we're losing in this race to efficiency. For me, it begs the question: is the goal of human civilization to be as efficient as possible, to go as fast as possible? Or is it to build a society based on the inherent value and dignity of every human being? And if it's the latter, then we need to think really radically differently about the way we're building our technology. In Coded Bias, there are two examples that take place on the streets of London, where the police are trialing facial recognition. At the time, the UK was part of Europe and protected by the GDPR, which is, I think, the only legislation in the world that really starts to put data rights in a civil rights and human rights framework. And I think it's really important to say that I had to go to Europe to get that footage, because here in the US these systems are being used by police in secret, and there's no way that, as a journalist, I would have the kind of access that would make that process transparent for me. What is so important here is that this intersects with every right and freedom that we enjoy as free people in a democracy. If you go to a protest, police can scan your face and know where you go. If you have probation status or immigration status, or you're on public assistance, or you're vulnerable in any way; if you post something on Facebook and the algorithm hides it at the bottom of the feed, like it did when Elizabeth Warren talked about regulating Facebook, do we have free speech? That is what is at stake when our public squares in democracies are moving into these private corporate spaces of technology, increasingly becoming technocracies, and we're expecting our democratic rights to translate, and they haven't.
And so we really need some policy changes to make sure that our democratic values are encoded in these technologies that will define the future. >> That brings me to this question. Last year we had on the program Dr. Ruha Benjamin, a professor at Princeton who wrote the book Race After Technology, and she talked about these competing narratives in society: that technology will save us, that it will remove bias, make things more equal, solve the problems of the world and of society. That's the Silicon Valley version. And then there's the Hollywood version, that technology will slay us. We have The Terminator, I, Robot, and so many other films that show that. These are competing narratives having a tug of war in our world right now, and it feels like Coded Bias is helping to deconstruct a little bit of this idea as well. >> I love the comparison. I have to say, I didn't realize until making Coded Bias that there's always been this dialogue between science fiction filmmakers, writers, and thinkers on one side and AI developers on the other. And I think both have been dominated by men >> [LAUGH] in a particular way. >> Yeah, a sort of limited, homogeneous kind of imagination. And when only 14% of AI researchers are women, literally half the genius has been missing from the conversation. In our thinking about what these technologies are capable of, we've had a stunning lack of imagination. You know, I was on a panel with Cathy O'Neil, and she says: it's data science, so let's treat it like a science. Let's not treat it like a faith-based religion.
And I think part of it is that we can't yet discern what is real science and what is pseudoscience, and part of it is what Meredith Broussard calls technochauvinism: this idea that technology is a white knight that's going to save us all, and that a technological solution is always the best solution. I, and I think much of the cast of Coded Bias, would challenge the notion that a technological solution is always the best one. Often when I speak to technology companies, there's this impetus to think: it was just bad data, garbage in, garbage out; we'll just fix the data set and come up with a perfect, superintelligent algorithm to run society. And I don't think that's what I'm saying at all. I think it is really about creating a more humane society and creating technologies that are in service of the inherent value of every human being. And in terms of technology, I love tech. In the end, these are really just tools, and I've seen sea change that I never thought was possible. In June 2020, I saw IBM say that they would get out of the facial recognition game. They won't sell it or deploy it; they're done; they disrupted their whole business model. Microsoft said they won't sell it to law enforcement, and Amazon said they would press pause. And I think that was sea change we never thought was possible, partly because of the brave scientists in Coded Bias, partly because of this wave of science communication and science literacy around these technologies and what they can do, and partly because in June 2020 we had the largest movement for civil rights and equality that we've seen in 50 years. People were starting to make the connections between racially biased, invasive surveillance technology in the hands of law enforcement, with nobody that we elected providing oversight, and the communities that are most brutalized and have the most to lose.
And I think that gives us a recipe for how change can happen. We need great science unencumbered by corporate interests. We need mass-scale literacy around how these systems that impact us all work; I mean, a ten-year-old is going to start using this stuff in the fifth grade, and that's when we need to start educating around tech. And we need to engage people in pushing our policymakers to protect our civil rights. And so to me, we're really in a moment where the cement is still wet, where we the people have a moonshot moment to really push for greater ethics, greater humanity, and greater fairness around these technologies that will shape the future. And so it's my hope that people will go to the codedbias.com take-action page and support great organizations like the ACLU, the Algorithmic Justice League, the Electronic Frontier Foundation, and so many others. I've seen time and time again how a small group of people can really turn the tide. And so it's my hope that we will all participate. Coded Bias is really that rallying call, that we all have a place at the table to shape how these technologies are used in the 21st century. [MUSIC] >> Shalini, thank you for the powerful documentary, for taking this important issue and really expanding and enriching the conversation on this topic. We recommend people go and take a look at Coded Bias; they can find it on Netflix. And thank you so much for being here to talk with us about it. >> Thanks so much. It was an honor.