Minisode #3 - Coded Bias Debrief, Uprooting Colonialism in Tech, Unpacking Objectivity, and How to Take Action!

In this Minisode, hosts Dylan and Jess reflect on current events, systemic racism in tech and beyond, how to stay connected to the deeply systemic work that is needed to uproot colonialism, and much more. Every month The Radical AI Podcast releases a Minisode reviewing the previous month's episodes and updating listeners on insider news from the Radical AI world. As always we invite you to please subscribe, rate, and leave a review to show your support!

Check out our fancy new YouTube channel: we’ve finally uploaded all of our full episodes!

15+ Books by Black Scholars the Tech Industry Needs to Read Now, from the Center for Critical Internet Inquiry

Coded Bias Recorded Q&A (must register for free to watch).

This transcript of minisode 3 was automatically generated by Sonix and may contain errors.

Welcome to Radical AI, a podcast about radical ideas, radical people, and radical stories at the intersection of ethics and artificial intelligence.

We are your hosts, Dylan and Jess. And welcome to our third minisode. If you haven't listened to a minisode before, you can expect to hear a few updates about what Dylan and I have been up to behind the scenes, some shout-outs to events and people and organizations, some current events, as well as a longer, extended debrief of the last month of episodes that we have released on the podcast.

And oh my God, Jess, we have so much to cover in such a short amount of time in this minisode.

So, first and foremost, we want to thank you all so much for continuing to download and listen. Over the short two months that we've been around since launch, we've had over sixty-five hundred unique downloads, and it's just been a wild ride. We continue to be incredibly humbled that you all keep tuning in, time and time again, to these conversations that we're having with these amazing and visionary leaders in our world right now, in the tech industry and in academia. Their voices are so important, and it's such a privilege for us to be able to share those voices with you.

And we also wanted to mention that it wouldn't be right for us to record and release this episode right now, in the context of the current events that we are living in, without recognizing what's going on in the world around us. We're talking about the murder of George Floyd, the riots and protests, and finally facing and acknowledging all of the systemic racism that is surrounding us, not only in the United States but in the world at large. And we are just really grateful to be able to be having these conversations with some of the people who are really great leaders in this space. And speaking of great leaders in the space, we actually had the opportunity last night, on Friday, June 12th, to attend a virtual film festival where we got to watch the film Coded Bias, which was created by Shalini Kantayya. This was part of the Human Rights Watch Film Festival, and it was followed by a live online Q&A session.

And you guys, this was such a cool event. We were so stoked to be a part of it. If you haven't seen the film Coded Bias, please go and watch it as soon as you can.

It's still available on the Human Rights Watch Film Festival website. It's just nine dollars to rent it, and I think it's available for rental for at least another few days, if not the next week or so. And it follows the journey of Joy Buolamwini and the fight for algorithmic justice.

There's not much I could say that would do it justice because it's just so moving.

But I really, really recommend, if you are at all interested in this space, that you go watch it. And even if you're not a researcher or an academic, it is relevant to everyone. Believe me. And this panel, which was just filled with, to me, academic celebrities, was so amazing. It featured Joy Buolamwini herself, Dr. Safiya Noble, Deb Raji, and Shalini Kantayya, with Deborah Brown moderating. And they were kind of just debriefing the film, and also some of their experiences with disseminating this research in the first place. And there's actually a recording of the Q&A session from the Human Rights Watch Film Festival, and I think that it's available for everyone for free. So if you weren't there for the live Q&A session, I really recommend you go and watch that, because it was really incredible. Some other really quick shout-outs and updates: if you hadn't noticed, we have new branding, and we are so excited about this. Dylan wrote a nice blog post on Medium about some of our struggles and the lessons that we learned in the process of creating our new logo. And we want to shout out Maria Deathless for working so hard and diligently with us for weeks to create it, iterating on the process. And we also want to shout out Show McLarens, who created our first logo, which was so helpful for us when we were launching the podcast.

And another shout-out that we want to do is to Jenny La Joya, who is actually the person who created our theme music that you hear in the intro and outro of all of our episodes. And we haven't shouted out her name yet because we keep forgetting. So now: thank you so much, Jenny. We are in love with your music, and we hope that our listeners are enjoying it too. And the last update is the YouTube channel that we finally launched. Or at least it was launched a little while ago, and we finally started putting videos on it. If you're looking for a different medium or platform to listen to some of our episodes or some other community updates, we are now finally active on our YouTube channel. So you can look us up at The Radical AI Podcast on YouTube, and you will see all of our full episodes and some excerpts, like Timnit's excerpt from her episode regarding the current events, which we will go into a little bit more in our debrief. And speaking of which, now that we have gone over all of the, at least, immediate updates, it's time to go into our monthly debrief of the last four, really five, episodes of the Radical AI Podcast.

The interviews that we will be debriefing in this minisode are our interview with Karen Hao, who is the AI reporter for the MIT Technology Review; our interview with Abeba Birhane, a cognitive science Ph.D. student at University College Dublin; our bonus episode with So Jo, who is a history Ph.D. candidate at Stanford; our interview with Timnit Gebru, who is the co-lead of the Ethical AI team at Google; and finally, our interview with Yeshimabeit Milner, the executive director of Data for Black Lives.

And let's kick it off with our first episode, which was "Tech Journalism and Ethics: Where Is the Truth Anyway?" with Karen Hao. So, in this episode:

I think one of my initial takeaways, which I actually really liked, is that she explained in the interview that because she understands what it's like to not understand things like machine learning and artificial intelligence, that's why she's so motivated, or at least one of the reasons why she's so motivated, to create tools and stories and activities and really great resources for other people who also don't come from computer science, so they can use these resources to understand things like algorithmic bias and AI ethics. Which I thought was just really great, because I've been in that situation before. I've also been the person in the computer science classroom who doesn't understand what's going on, at least when I was initially learning computer science, and honestly, all of the time, even as I'm getting my Ph.D. right now. And so I just love it when people try to take difficult technical concepts and turn them into really great resources that are understandable by anyone, whether they come from a computer science background or not. That was one of the first takeaways that I really enjoyed about Karen's interview. And what about you, Dylan? What were some of the takeaways for you?

You know, I just had a lot of fun with this interview. I had a good time talking about, you know, marriage, and commiserating with Karen about, you know, friends having kids and not having kids myself, and things like that. So I just had a good time with the interview. Also, I think I mentioned this in the interview:

I've always been interested in journalism. So to listen to some of the stories of someone who's really blazing trails right now in journalism, and in AI ethics journalism, was just really cool; to hear how she sees her job, and especially when she started talking about this concept of objectivity and about truth, and how there isn't necessarily a thing called truth, right? It's this idea that she was bringing up about truth being consensus information. So this interview was also a way for me to scratch some of my philosophy itch, because I do some moral philosophy, where, you know, this concept of truth and objectivity that we sometimes take for granted, especially in computer science spaces, the idea that there's something that actually exists out there that is more than just this intersubjective space, comes up a lot. It was just a good conversation for me, and it reminded me of some of the things I believe and also some of the things that I don't believe.

Yeah, that seems to be a recurring theme in our interviews: this idea of truth and objectivity, and whether or not it's actually something that exists or something that we can capture through data. And that was actually something that we talked quite a bit about with So Jo. And I know this is a little bit out of order, but So's episode was our bonus episode, which is the first time we've done that at all with our podcast. And that was just such a fun process, where we kind of went through an old interview that you had done before we met, actually, and we kind of just talked over and paused the interview and gave our running commentary, which was just a fun experience. But yeah, So's episode: we already talked way too much about it during the episode itself, so if you'd like to hear much more about our thoughts, we recommend that you go and listen to "The History That Defines Our Technological Future." But I'll just say one quick takeaway from that, for me at least: this idea that data is political. This is a phrase I feel like I've been repeating over and over again in my head, and also out loud to all my colleagues, who are probably sick of it, ever since we recorded this episode. I just keep saying it over and over again: data is political.

Yeah. So obviously, this was an interview that I did before we got together to do this podcast, Jess. And I mean, I loved the conversation with So at the time. And listening back on it, there are so many things that I wish that I had done or said differently. So it was a cool experience to hear my perspective.

And then also your helpful critique of different things that I could have done better or differently. Because I think that, just in the two months of work that we've done together, we've grown so much as interviewers, and in really centering our guests' stories. So while I'm still very proud of the interview, and I think the work that So is doing is amazing, especially bringing history and archives into this conversation about justice and algorithms, I learned a lot. And, you know, it was the most editing that we've ever done on an episode, putting everything together, and it was just a lot of fun. I don't know, I had a really good time listening back. It was a really good learning experience for me.

So I'm glad that you enjoyed the editing process, because as our resident editor, you're the person who spent the hours editing that episode. So I'm glad that it was enjoyable for you.

Well, yeah. I mean, enjoyable... yeah, it was fun. The recording was more fun than the editing.

But the one thing I did want to circle back to, and this is something that, as you said, is coming up a lot in our conversations with people, is that everyone just seems to be in dialogue with the Enlightenment right now, with this idea of objective science.

This idea that reason is the thing that has to prevail; these basic concepts that were, quote unquote, created, or at least, like, instituted in the 17th century in Europe, at the same time that colonialism was coming up. It's like we keep finding ourselves there, because I noticed the same thing that you did: we keep talking about objectivity. And I'm wondering to myself, like, what is that about? And it's like, well, we never really got over it. We never got over this colonialism, or the Enlightenment context that it came out of. It's still so much a specter in everything we do, including in the technology space. And in this time when we're grappling with white supremacy and trying to dismantle systemic racism, it's so important, as our guests continue to remind us, to look at the context.

And that's not just, you know, the context right here and right now, but the context of hundreds and hundreds of years of oppression and, again, of that systemic racism.

And this is something that Timnit brought up a lot in her interview, where we actually asked her to speak directly to you all, the listeners. We had recorded the episode, and then about a week later the world had changed considerably. And so we asked Timnit to give a message to everyone responding to what's going on, and she graciously accepted. And her message, in part, was about how it's not just the United States, it's not just here: this is a global context that we are interacting with.

And I want to circle back, actually, to when you were talking about the Enlightenment and colonization. I think that we can't even use the word colonization without bringing up, obviously, our interview with Abeba Birhane, which is titled "Robot Rights? Exploring Algorithmic Colonization." That actually draws on one of her papers, "The Algorithmic Colonization of Africa," which basically gets at the fact that data is not the only thing that is political here; algorithms are political as well. And when we're talking about this idea of objectivity, and whether there's even such a thing as objective truth when it comes to the world, when we take that idea and we put it into our algorithms, we run into even more problems. And this was actually my biggest takeaway from Abeba's interview: her comment on the fact that machine learning algorithms are increasingly trying to predict the inherently unpredictable. And this is when machine learning tries to predict some sort of social outcome for a human being. This could be something like recidivism risk, so their likelihood to recommit a crime; or this could be their worthiness of credit, based off of their trustworthiness when it comes to money. And this is all just to say that human beings are super, incredibly complex. I don't think that I could even predict what I am going to do today in a social context. And I am myself, and I know everything there is to know about me, arguably more than anyone else or anything else in the entire world, and I still couldn't predict exactly what I'm going to do in any given context. So how in the world can we expect an algorithm that is fed nothing but incomplete data to predict a social outcome or a social action of a human being when we can't even do it ourselves?

One of the quotes from a baby's interview that has really stuck with me, especially with the riots and the protests of the last few weeks, has been when she said that colonialism is not a bug, it's the feature.

And that although some people argue that, you know, we're in this post-colonial space, we really can't reasonably argue that when so many of our structures derive directly from a colonial history.

And so, you know, I really invite listeners to let that sink in for a second. Like, what would it mean if colonialism is not the bug, but the feature, of the systems that we're in? And I think this really does tie into a lot of what Timnit was talking about, which was more structural, and which we had a wonderful conversation with Dr. Gebru about.

Really, like, who is pulling the strings in all of this? And we talked about, you know, is there a difference between the academy and industry?

And Timnit essentially said, you know, sort of, in part there might be, but really it's the same people. It's the same group of people that run both industry and academia.

And Jess, you and I spend so much time, like, thinking about, you know, what guests we're going to have on the show and stuff like that. We say, OK, well, these people are representing academia and these people are representing industry, and we want to balance, you know, across the board of diversity, including that kind of diversity of different fields.

And it kind of took me aback when she said that, because I think she's right.

It's like, when I think about even Ivy League institutions, the people who are currently teaching in Ivy League institutions are also the people who are the primary consultants in the tech industry, which is nothing against any of those people.

And it's also like there is a particular pattern and there's a particular structure that that creates. And it makes you wonder, you know, does that have something to do with this colonial history that we're living out of right now? Yeah, that's a really good point, Dylan.

And it connects to something else that actually stood out to me about Timnit's interview, while we're talking about this idea of a select few people being in the room who are making these big decisions at large tech companies and who are sitting high up on the totem pole in academia: the topic of representation, which was really pervasive throughout the entire interview with Timnit.

And after that conversation with Timnit, I still wondered. I mean, there isn't any quick fix here, right? It's such a deep structural change that needs to happen. And that's what I see in the protests happening right now.

Hopefully, this is the shift towards trying to make lasting structural change in our systems, where racism and sexism and all sorts of other isms, I guess, are just so deeply ingrained. And I think Timnit got to the heart of this with some of the questions that she was asking, really asking these holistic questions.

She said, you know, we have to address some of these things specifically, but there are also, you know, some spiritual questions as well.

And so she asked, you know, how do we respond with love? Which is the same thing that we heard in our interview with Dr. Ruha Benjamin: this concept of love, and how do we bring love into it?

And she also asked, you know, how do we create community right now? How do we hold space for people?

And then another question that she asked is, you know: after all these protests are over, whatever that means, what happens when folks are out of jail, and what happens when the protests end? People need community spaces of healing and organizing, and justice and restorative justice spaces. So how do we create these spaces even within, you know, our workplaces, even within academia, even within all of these different spaces that we frequent? It's not just, OK, well, we do the diversity hire; representation isn't just this shallow thing. It's really asking us, and inviting us, I guess, to ask these really deep structural questions, like: how do we restructure these things that we take for granted, these systems that we take for granted, in order to build something that can be sustainable for everyone?

Because right now, I mean, what's obvious is that these systems out in the world are not sustainable for everyone and they're not looking out for everyone. They're looking out for a select group of people of a certain race, possibly a certain gender.

And we've seen that, you know, in the evidence of the past however many hundreds of years. So there's still that question. And there's definitely been some hope these past few weeks, in these conversations and these protests. But I think, in the tech industry, there's a lot of work still to be done.

Yeah. And speaking of breaking down systems, this is what Yeshimabeit Milner is all about as the executive director of Data for Black Lives. And one of the things that she mentioned she's helping to break down is this political data that we've been talking about, and how data has historically been weaponized, in things like predictive policing, risk assessments, facial recognition, and FICO credit scores. There are just so many examples of how data has been used for bad, and she's trying to change the narrative here and instead use data for social change, to help create an equitable world. And I just love the Data for Black Lives tagline, I guess it's on their website: data as protest, data as accountability, and data as collective action. That is just, I think, such a beautiful way to phrase some of the many ways that we can use data for positive social change. And I also wanted to take this opportunity to give a shout-out to some of the amazing organizations that have just been doing great work in this space, especially over the last few weeks, whose staff and volunteers have probably been working countless hours, as I'm sure there have been many requests for interviews and for other forms of media to share resources.

And so some of these organizations are Data for Black Lives, of course; Black in AI; and the Algorithmic Justice League. And another shout-out that I wanted to give was actually to the Center for Critical Internet Inquiry, which is co-directed by Safiya Noble and Sarah Roberts. They came out with a reading list of 15+ books on racial justice and technology by Black scholars that the tech industry needs to read right now. It is linked on their Twitter and on their website, and we'll also put a link to it in the show notes.

Yeah. And I think, to kind of close this conversation, one quote really stuck with me. It is from Joy, along with Aaina Agarwal and Sasha Costanza-Chock, from the Algorithmic Justice League. They wrote this amazing op-ed on Medium, which we will also link in our show notes, called "IBM Leads, More Should Follow: Racial Justice Requires Algorithmic Justice." And this is about IBM announcing that they're going to oppose and not condone any use of facial recognition technology for mass surveillance or racial profiling.

And the way that they end this piece is by simply saying racial justice requires algorithmic justice. Take a stand. Backed with action.

And so our hope for all of you who are listening out there is that, wherever you're at, you take a stand.

Backed with action.

And that might be reading. That might be marching. There are many different ways, from where we're each situated. You know, for Jess and I, it's hopefully doing this podcast, right, and lifting up some of these voices.

But wherever you're at, we do really invite you to please take a stand backed with action, and to remember these words from Joy and her co-authors:

Racial justice requires algorithmic justice. This is pivotal to the fight for racial justice right now.

And we will have many more conversations upcoming about some of the current events regarding companies like IBM and Amazon and their decisions to ban facial recognition technology, at least for now. And of course, we could talk about algorithmic justice and political data and political algorithms for hours, but as this is a minisode, we are going to try to keep it as short as humanly possible. So with that, this concludes the debrief of this month's episodes. And we just want to briefly mention that we welcome all kinds of feedback on the way that we interview, the way that we structure our episodes, and our minisodes as well. So if you want to provide any feedback for us, please don't hesitate to send us an e-mail or reach out on Twitter. And also, if you're looking to help show your support, something that is really beneficial for us is if you rate and review the podcast on whichever platform you're watching or listening to the podcast on, especially if it is iTunes, as that helps us get lifted up by the algorithm, ironically. And as well, if you can, subscribe and share with your friends. We just want to get these conversations with these amazing scholars and these amazing leaders out to as many people as we possibly can.

Well, Jess, now people can technically watch the podcast on YouTube. Although it's not video, you can definitely hit that play button and the little thing will scroll, so they can watch it, and they can watch the thumbnail. That's right. You can watch it; there are great thumbnails. And also, just really quickly, before we wrap this up:

We want to quickly plug our upcoming interviews that we will be releasing, including with Dr. Miriam Sweeney, Dr. Emily Bender, Dr. Anima Anandkumar, Deb Raji, and of course others. So please stay tuned. And thank you so much for joining us for this minisode. If you enjoyed this episode,

we invite you to subscribe, rate, and review the show on iTunes or your favorite pod catcher. Join our conversation on Twitter at @radicalaipod. And as always, stay radical.

Wait, you didn't say "stay radical"? No? So, seriously: stay radical.

