MINISODE #1 - Contact Tracing, Social Power, and a Thank You!


In this Minisode hosts Jess and Dylan debrief the breaking news of contact tracing apps, socio-political power structures, and reveal future guests for the show.

Every month The Radical AI Podcast releases a Minisode reviewing the previous month's episodes and updating listeners on insider news from the Radical AI world.

As always we invite you to please subscribe, rate, and leave a review to show your support!  

Minisode #1 was automatically transcribed by Sonix. This transcript may contain errors.

Welcome to Radical AI, a podcast about radical ideas, radical people and radical stories at the intersection of ethics and artificial intelligence. We are your hosts, Dylan and Jess. Welcome to our first Minisode.

What is a minisode, you might be asking yourself? Well, a minisode is something that we're going to be doing about once a month, or every four episodes. These are going to be about 15 minutes or so, 15 to 20 minutes. And it's really so that Jess and I can update you all as listeners about other things that might be going on in the Radical AI world, and also to debrief a little bit more about past episodes that have occurred since the last minisode.

So first for this minisode, we want to thank you all so, so much for your support.

When Jess and I dreamed of this and did our first interviews and prepared to launch, we had no idea that the support of the community was going to be this large.

And we are so, so grateful for all of the retweets, all of the followers, all of the listens. And it's just been incredible for us to see this idea and all of our hard work grow into something that people seem really interested in, which is really, really cool. So thank you so much for your support over this first week.

Really, as we're recording this, it's only been a week since we launched.

Yeah, seriously. Ditto to that. It is incredible to be a part of such a welcoming and accepting community, and maybe this just speaks to the values that everybody in the A.I. ethics community really embodies. But I definitely have been feeling very warm and welcomed in the past week or so, and Dylan and I both have felt that way. And for anyone who is new to the podcast and hasn't heard much of us, or maybe this is the first time you've heard us, we wanted to just take a moment to give you a little bit of an introduction and a clarification of who we are and what our mission is here. You can also hear us go much more in-depth about who we are and why we're here in our very first episode, which was a shorter welcome episode where Dylan and I did a short little interview of each other. But for those who just want a short summary: our mission is to lift up the voices and ideas in A.I. ethics and the tech industry that have historically been kept on the margins. Our vision is that these conversations that we have with our guests and interviewees, and with each other, will help create a future of A.I. ethics that's fundamentally representative of a diversity of stories, voices, and ideas. And we dream of a field in the future that is accessible, bold, transformative, some might even say radical, for all individuals and communities, so that they can use, design, and engage with A.I. technology.

And one thing that I've been reflecting on this week, since our launch and everyone's interest in these episodes, especially the episode we'll talk about later in this minisode about Apple and Google's partnership, is that people seem ready for these conversations.

If you're listening to this right now, you probably fall into this category.

There's a real hunger to talk about these radical issues, these issues of representation and gender and fairness and accountability and privacy, and to look at them in new and nuanced ways that really benefit everyone. So if you fall into that category, which again, I'm pretty sure you do if you're listening to this minisode, thank you for being part of that conversation with us, and for trusting us to bring you these conversations in a way that we hope can, together with you, transform the field. So for this episode, as you might have heard, as we've gotten into the swing of things, we actually already debriefed a lot of our thoughts inside the first three episodes that we launched.

So for this particular minisode, we will not be going into great depth on those episodes, and we invite you to listen to the fullness of those episodes, which will probably be longer than our episodes in the future, for more information.

But for this particular episode, we would like to talk about our episode with Dr. Seda Gürses, which was, again, all about that Apple and Google partnership, where they're going to be releasing APIs and then integrating contact tracing for the novel coronavirus directly into their operating systems. This is the virus which we are all facing down right now, trying to figure out, well, what are we going to do about it.

And this was breaking news, so we wanted to make sure to get an episode out as soon as possible. And so during the interview, we spent a lot of time listening to Seda's insight, which I found really powerful. But Jess and I didn't get a chance to digest a lot of the breaking news. And so we want to take a few minutes right now just to kind of say what stood out to us. So Jess, what are you sitting with right now, in particular about that episode and about this news?

Well, Dylan, I'm not going to lie, I'm sitting with a lot, and I'm not sitting very comfortably. There were a lot of scary takeaways from this interview, and not because of what Seda said. Actually, a lot of what she said put my mind at ease. I think it was just the reality of what's happening around us, and especially the reality of the actions that Google and Apple are taking together on these issues. I have a really hard time trusting big tech companies, especially when they band together, so I'm trying to look past that to see the potential benefits of these technologies. But something that's just really looming over me from our conversation with Seda was when she stated, and I have basically a quote from what she said: "They have gone around any democratic process," they in this case meaning Apple and Google, "and used their typical way of entering institutions and bigger structures, which is through the pockets of their users, and becoming the de facto standard." And when I heard her say this, it just reminded me of the fact that tech companies just do whatever they want, largely because there's so little regulation in place, and it's so hard to regulate tech when you don't know what it's capable of.

Have you ever heard of Lawrence Lessig before? Lawrence Lessig is this philosopher who wrote about this concept, this idea, and I think this might even be the name of the paper that he wrote: it's called "Code is Law." He was explaining how, when it comes to code, especially with technology, when we write code, it's so hard for us to know what it's capable of, and to know what the potential harms and unintended consequences and misuses of it might be, that we can't regulate it before it's created. So code ends up coming out before the law that regulates it comes out. So in effect, code is law, because whatever is written in code becomes the law, and it is the driving force for government regulation. And Seda's thoughts really reminded me of that in this case: Apple and Google, by taking the action to put this contact tracing into an OS, and making decisions like that, sort of bypassing government regulation and local authorities, their code is becoming law. They're doing the same thing that Lessig was warning against, and that is really hard for me to sit with.

Yeah, well, it's becoming law, and they're not, right? They're not becoming law in the classical sense. They're almost superseding that law, right? Which is what you're saying: there isn't time to get that regulation behind it.

Exactly. So at the end of the episode, I think I said I was feeling overwhelmed. And I think, more specifically, what I was feeling overwhelmed about is that there seem to be so few external checks and balances on this partnership.

And so no matter how much I trust Apple, no matter how much I trust, you know, Google engineers, no matter how much I trust, you know, the whistleblowers and the various think tanks and policymakers, this seemed to be a decision that was made between two different companies and that is now going to impact all of us. And I am not convinced that there are enough checks and balances, especially external ones, I guess I would say external institutions, checking in on how this is going to play out. Like, it's an exciting idea in theory, and I think I understand why even experts think this could be a good idea. But I think there are so many potential downstream harms and impacts that just have not been analyzed, and maybe can't be analyzed right now.

But it still feels like we're rushing into this new age without having all the stakeholders in the room.

So one of the pieces of feedback that we got about this topic and about the episode is that there was something we wanted to cover but didn't necessarily get a chance to: how is this, especially people opting in to contact tracing apps, going to affect marginalized folks, or folks who already have a history of being surveilled, whether by companies or by the state?

Like, if I were someone who was part of an identity group where I did not believe these companies were looking out for me, and actually believed that they were constantly surveilling me, I would not say, oh yeah, of course, iPhone, you can have all of my information. Or, of course, I'll opt into this app where you can trace me. Right? Like, I would have a lot of fear around that. And I'm just not seeing a lot of folks talking about that or publishing on that. And I think that's a key question that we need to be asking ourselves before we move forward.

Definitely. I think even taking that a step further, talking about the public as a whole and where our power lies when it comes to this kind of information and this kind of news: it is true that everything is happening very, very quickly. And it almost needs to be right now, because of the virus and how quickly this has panned out. So that part at least sort of makes sense, and it seems to be a bit unavoidable in this situation. But then we ask ourselves, OK, so what can we do? This is something you and I were sort of asking each other at the end of Seda's interview, when we were having our initial reactions. What can we do? Are we powerless? Is this all up to Google and Apple, or whatever centralized authority is able to do something in this space? And then we can ask, whether it's the general public, whether it's marginalized groups, no matter who we're talking about: what can you do if you choose to not be surveilled in these ways? And I was asking myself this earlier. Like, what can I physically do, as someone who even understands how to code, and could maybe make an app for my phone, or maybe hack into my phone and get some of this information out? I could choose to not use my phone. I could choose to not take my phone with me to the grocery store. I could choose to not opt in and not turn the Bluetooth functionality on. I could choose to not update my OS. Sure, I could choose to not do all that. But then we have this weird moral tradeoff: by choosing to not comply with these semi-surveillance measures, we might be perpetuating the spread of this virus, and we might be making quarantine last longer. So what is the choice here? What do we do?

And I think you used the million dollar word here, which is power. And I think sometimes those of us in the technology sphere like to think that code or technology doesn't interplay with our sociopolitical world out there.

And I think one of the things that we're claiming in this podcast, and that I heard Seda talking about, is that, no, actually, there are decisions about power relations being made every time we release code out into the world, every time we release a new technology. Like, there's an interplay with the pre-existing social world that's out there.

And if we're really going to create a world that is fully accessible and just, and that really supports the rights of everyone, we, I think, really have to lean into that political analysis, or at least a social analysis, or a sociological one, or what have you.

An analysis that says, no, actually, there are real consequences here out in the social world.

And if we begin with those analyses, as opposed to just, you know, coding for coding's sake, or even coding out of a sense of urgency... because I hear what you're saying, right? It's important that we do something about this virus that's rampaging across our world. But there is definitely a chance that we are going to cause more harm than good, or that there are going to be unintended consequences, unless we nuance our analysis to be grounded a bit more in the social world and its consequences. This is what I believe.

Yeah, definitely. And it speaks to a lot of the things that we talk about on this show, and a lot of the topics that come up. We have to really approach these issues with an interdisciplinary lens. We can't just think about them from a political perspective, or from a software engineering perspective, or from a designer's perspective. We have to keep in mind the interests of all kinds of communities and all kinds of domains.

So thankfully, the answers to what can we do, and what should we do, and what must we do to have agency and to create a world in which all of us can have our privacy and all of us can be protected, are not just up to us. And that's one of the reasons why we started the show: so that we can all, listeners, ourselves, and our guests, co-create answers to these ridiculously complex questions and problems. And so before we close today, we wanted to give a brief teaser and shout out to some of our upcoming experts and guests on the show, for episodes that we will be launching within the next few months. Just a few names to whet your appetite.

We have Shamika Goddard, Sarah Myers West, Lilly Irani, Karen Hao, Miriam Sweeney, Emily Bender, Sarah Porter, and many others.

And we are so excited that they chose to support our podcast by coming on as guests. And we are so excited for you to hear their insights on some of these topics that we've brought up today and much, much more.

And speaking of support, we can't thank all of you enough for all the support you have shown in this first week since our launch. And if you'd like to continue to show your support in a way that will really help us out, we, ironically, would love to get the support of the algorithms online with our podcast. To do so, it would be very helpful if all of you could rate and review the podcast, if you like what you hear: so not just listening to the podcast, but also providing a numerical rating of how much you enjoyed it. And also, if you can and are willing, we would love to give you a little bit of a homework assignment: maybe share this episode and others with anyone who you think might be interested in these topics, or anyone you know who engages with A.I. technology. If you enjoyed this episode, we invite you to subscribe, rate, and review the show on iTunes or your favorite podcatcher, and join our conversation on Twitter at @radicalaipod.

And as always, stay radical.

