Feminist AI 101 with Eleanor Drage and Kerry Mackereth



What is Feminist AI and how and why should we design and implement it?

To answer this question and more, in this episode we interview Eleanor Drage and Kerry Mackereth about the ins and outs of Feminist AI.

Eleanor and Kerry are both postdoctoral researchers working on the “Gender and Technology” research project at the University of Cambridge Centre for Gender Studies, in association with the Leverhulme Centre for the Future of Intelligence. In this project, they are working to provide the AI sector with practical tools to create more equitable AI informed by intersectional feminist knowledge.

Eleanor Drage’s work focuses on applying queer, anti-racist, and intersectional methodologies to technological processes and systems. Eleanor is a Research Associate at Darwin College, Cambridge, and at Cambridge Digital Humanities (CDH).

Kerry Mackereth’s work examines histories of gendered and racialized violence and considers how contemporary AI may reproduce or legitimise these histories of violence. Kerry is also a Gates Scholar, a research associate at St. John's College, Cambridge, and a research associate at Cambridge Digital Humanities.

Follow Kerry Mackereth on Twitter @KerryMackereth

Follow The Good Robot Podcast on Twitter @TheGoodRobot1

Check out The Good Robot’s Episodes Here

If you enjoy this episode, please make sure to subscribe, submit a rating and review, and connect with us on Twitter at @radicalaipod.



Transcript

E-K_mixdown.mp3: audio automatically transcribed by Sonix. This transcript may contain errors.

Speaker1:
Welcome to Radical AI, a podcast about technology, power, society, and what it means to be human in the age of information. We are your hosts, Dylan and Jess, and in this episode we interview Eleanor Drage and Kerry Mackereth about the ins and outs of feminist AI.

Speaker2:
Eleanor and Kerry are both postdoctoral researchers who are working on the Gender and Technology research project at the University of Cambridge Centre for Gender Studies, in association with the Leverhulme Centre for the Future of Intelligence. In this project, they are working to provide the AI sector with practical tools to create more equitable A.I. informed by intersectional feminist knowledge. Eleanor Drage's work focuses on the application of queer, anti-racist and intersectional methodologies to technological processes and systems. Eleanor is a research associate at Darwin College in Cambridge and at Cambridge Digital Humanities. Kerry Mackereth's work examines histories of gendered and racialized violence and considers how contemporary A.I. may reproduce or legitimize these histories of violence. Kerry is also a Gates scholar, a research associate at St. John's College in Cambridge and a research associate at Cambridge Digital Humanities. So we were originally connected to Eleanor and Kerry because they are also fellow podcasters. In fact, yesterday, June 1st, Eleanor and Kerry just launched their own podcast about technology, gender and feminism. It's called The Good Robot and already has an amazing lineup of guests. If you'd like to hear more details about their podcast, make sure you stick around for the outro of this episode. And now we are so excited to share this interview with Kerry and Eleanor,

Speaker3:
With all of you. Today, we are on the line with Kerry Mackereth and Eleanor Drage. Kerry and Eleanor, welcome to the show.

Speaker4:
Thank you so much for having us.

Speaker3:
Definitely. And today we are discussing the 101 of feminist AI. So before we get into the AI part of this, let's talk about feminism, because this is a bit of a sticky topic. I was wondering if maybe both of you could start off by defining, in your own words, what feminism is to you. And Eleanor, why don't we start with you?

Speaker5:
Thanks, Jess. It's a joy to be here. Thank you for having me. So for me, feminism is this common-sense response to the fact that people continue to be killed, bullied and otherwise mistreated on account of social power imbalances. So that's where we begin. Feminism is a complex genealogy of many, many movements that have emerged globally in response to this violence, abuse and degradation, which have been cultivated by different forms of patriarchal oppression over time. So it has this really long and varied history; it's not one story or one set of values. Of course, many feminist movements have been and continue to be shamefully exclusive, and many have been transformational sources of love and solidarity. I always like to think not just of what feminism is, but of what it does. And broadly speaking, I think feminist work combats structural injustice, which means that it includes, and must intersect with, critical masculinity studies, critical race studies, crip theory and many other areas which develop our understanding of why people suffer and why people inflict suffering on others. So at its best, feminism builds communities, solidarity, friendships and even romances.

Speaker3:
Wonderful. And Kerry, what is feminism to you?

Speaker4:
Absolutely. I guess for me, the kind of feminism that I subscribe to is about radically overhauling the sexist, racist, ableist, homophobic, capitalist and prison-centric world that we live in, in order to try and create another world where we can live fully and expansively. And I think one of the complicated things that it's really important to recognize is what a fundamentally contested ground feminism is, and the way in which many feminist movements have also been highly exclusive: have been racist, have been carceral, have been anti-trans, and sadly still are. So to me, feminism shouldn't be taken as a neutral good. It always has to be critically examined. And there's a really long history of this when we think about the genealogies of protest that are constructed within the British national consciousness, in terms of how we think about what it means to be feminist. Right. But there's also such extraordinary work, led particularly by feminists of color, particularly Black feminists, and by various kinds of solidarity collectives, that to me offers far more promising routes to a different world.

Speaker1:
So from one contested term to another, we're also bringing technology and A.I. into this conversation. How does A.I. fit into a feminist framework? And again, let's start with you, Eleanor.

Speaker5:
We want to define A.I. broadly. So let me start with technology, and maybe with how gender is also a technology. Gender scholars might note that in 1987, Teresa de Lauretis explored gender as a technology of difference, and what she meant by that is that it's a way of supporting human hierarchies that place certain bodies at the top of the ladder and others at various gradations further down. And people might question how you can possibly say that gender is a technology: isn't technology a phone or a hydraulic engine? So when I say technology, I mean something that's crafted in order to accomplish an objective. And that's a very difficult definition to use, and it's one that's often used in relation to A.I.: something that accomplishes a goal or objective. But I take it from Bernard Stiegler, who saw technology as an interface through which the human interacts with its surroundings. So for him, techne, and he uses the Greek derivation, which means making or doing, crafts the human body through its engagement with the world. And so it's so important to remember that we need to resist the temptation to see humanity and technology as two discrete units.

Speaker5:
They're mutually constituted; they shape one another. And we need to remember this in relation to A.I. So this means that A.I. as a technology is more than just a tool or an extension of human capacity, and humanity is more than flesh and blood. We have never really been just human in the sense that we have often understood humanity to be: our cognitive and social development is implicated in and indebted to technological systems. And that's where feminist posthumanist work comes in. So you have Rosi Braidotti and Jane Bennett, among others, exploring these under-acknowledged interactions between humans and their environments that support our existence. And scholars from critical race studies, like Paul Gilroy, Sylvia Wynter, Zakiyyah Iman Jackson and Alexander Weheliye, remind us of which groups and populations have historically had their humanity denied to them. So these populations, women and non-white people, have also, surprise surprise, been seen as technologically incompetent, and they're still strongly underrepresented in the tech industry. Maybe Kerry can say a little bit more, and then we can talk about bias.

Speaker4:
Yes, sure. I guess I'm always coming at my own scholarship, thinking about A.I. very much through the lens of Asian-American feminism, but also, you know, drawing on my own experiences as a mixed-race woman, always thinking about the ways in which gender is very much constructed with race. And the reason why I mention that is because for me, in my own feminist approaches, what really interests me about artificial intelligence is the concept of the artificial and the ways in which it relates to certain kinds of personhood. I'm drawing very much here from the phenomenal scholarship of Anne Anlin Cheng: for some people, the artificial has been really central to the construction of personhood. And so she draws this extraordinary sort of genealogy of the concept of the yellow woman and how the human is constructed in relation to the object. And so, of course, from a feminist perspective, I think there's a variety of ways in which this false binary between nature and culture has been such a central problematic in feminist analysis. But to me, the really exciting trajectory of that kind of analysis is bringing in critical race theory, which is in and of itself looking at how the nature-culture binary has been so central to constructing various kinds of racialized hierarchies and racialized narratives, and the ways in which certain racialized subjects, including the figure of the yellow woman that Cheng points to, have only been made legible through reference to the artificial. So that's a very abstract answer, sorry. I guess that's the particular trajectory that really interests me at the moment.

Speaker3:
Well, actually, let's try to move from these abstract concepts to something more specific and concrete. I'm wondering, because when I think about feminism, a very specific image comes to mind: it seems like a very activist space, and I imagine something like a feminist protest, but I don't necessarily imagine a specific technology. And so I'm wondering if either of you can give an example of a technology that you would consider to be feminist, or maybe even a technology that you would consider to definitely not be feminist.

Speaker5:
Maybe, Kerry, we can talk about Mumkin a little bit, because I would definitively consider that to be a feminist technology, because of the way that it looks not only at what it means to create good technology, but at how to maybe shift this idea of bias and how we look for it; it is about seeking justice. And I really like Pratyusha Kalluri's idea of asking not whether A.I. is fair, but how A.I. shifts power. And that means creating technologies that are built for justice, optimizing for justice. So an example of a technology that I think does this is Mumkin, and maybe you can say a little bit more about that, or is there another technology you think does the trick even better?

Speaker4:
Yeah, of course. There is an app, Mumkin; the CEO of the app is Priya Goswami, and there are a number of co-founders. The app explicitly attempts to facilitate conversations around female genital cutting, allowing people to practice having these really difficult conversations around FGC, around gender-based violence, and also around health and sexuality more broadly. And I think that these kinds of spaces being created through AI-enabled apps can be really generative, allowing people to develop and express feminist ideas in a way that can feel very intimate and very safe. And I say that as someone who, as Eleanor knows, is often very pessimistic, very sceptical about some of the feminist premises of these new technologies. So I guess it's helpful for me sometimes to see these sorts of more radical ideas and projects being deployed.

Speaker1:
And then to follow up, I really like, Jess, this question about the negative space of this stuff, about what feminist technology isn't, because I can imagine a world in which the answer is what technology has been for thousands of years. But I'm wondering if maybe there's more nuance that I'm missing there, especially right now, as we're talking about A.I. and these developing systems that are coming from very specific places such as Silicon Valley. Could you say more about the current landscape of technology as it relates to feminism?

Speaker4:
Yes, absolutely. I think for me, my personal area of concern around this idea of what kinds of technologies are really not feminist, what we should be deeply concerned with, would have to be predictive policing. And for me, the kind of feminism that I'm interested in has to be explicitly anti-carceral. But I think as well, something that I'm starting to pursue in my own work is thinking about the ways in which gender-based violence, sexual violence and domestic violence are coded into these technologies. Right. If you go to, for example, the FBI's communications about predictive policing in 2013, they explicitly said that these technologies will not try to predict or account for domestic violence, because these are, quote, crimes of passion. And so we have this very horrible, misogynistic, sexist idea that these crimes are somehow justifiable, that they don't count, because they're to do with sort of uncontrollable emotional excess. And so a delicate line that I'm always trying to walk in my work is saying, on the one hand, yes, we absolutely need to look at the ways in which these technologies are reproducing these very old sexist ideas about what kinds of violence count, while also making sure that these critiques do not get used to say, oh, the solution is to make these predictive policing technologies better at recognizing these forms of violence. Instead, this needs to be a broader critique of predictive policing technologies and other similar technologies, because of the ways in which they reproduce and entrench and optimize existing really, really damaging and violent relations of power.

Speaker5:
To bridge your two questions: what is a feminist technology? It's a technology that does good, not just a good technology, because I think a lot of practitioners still believe that you can build better technology and that better technology will be less harmful technology. We would say feminist technology is technology that actively does good. That can be, for example, the technologies listed in Data Feminism, Catherine D'Ignazio and Lauren Klein's book, which looks at feminist data collection methods and how they use what they use: for example, collecting information on femicide in Mexico. That kind of data collection effort is a feminist process. Equally, as Kerry just said, the technologies that are most worrying to feminists are ones that attempt not so much to identify a body or recognize a body, you know, this is a gendered person or this is an asylum seeker, but actually to create our idea of what it means to be a gendered body or an asylum seeker. A few examples that I find most pernicious at the moment: Germany's Federal Office for Migration and Refugees is using a voice biometric technology that either validates or rejects asylum claims based on whether the accent of the claimant matches their account of where they're from, and that's interpreted by the people who are listening. So this reflects national interests: this is border control and foreign policy coming together to produce an idea of what an asylum seeker sounds like. Pedro Oliveira is a sound artist who's doing some really beautiful work on this. And then closer to home, in the U.K., we have this photo checker that I used recently to submit my passport photo online, so I didn't have to go and get it done and pay my eight pounds or whatever. And it's half as good at identifying the faces of Black women as those of white men. Equally, recidivism risk scoring: we know this very well. These are things that we are particularly worried about and where feminism is really good at intervening.

Speaker3:
So in this conversation, we've talked a lot about things that I was not expecting to talk about, like race and cultural background and ethnicity alongside gender. And I think, from my background, especially since I don't really come from the social sciences, I've tended to equate feminism with just, like, women and women's empowerment. But it seems like it's actually more than that. And a word that I learned more recently than I'd like to admit is intersectionality, and I think that's embodying a lot of the stuff you're both talking about here. So I'm wondering if you could maybe give us a quick 101 of intersectionality and how it applies to feminism.

Speaker4:
Yes, of course. So intersectionality: this term was coined by the Black legal scholar and feminist Kimberlé Crenshaw, but it has a much longer genealogy and history as a way of understanding various kinds of social categories, characteristics and vectors of power as being fundamentally entwined with one another and illustrative of one another. Right. So that we can't understand gender without taking race, class, disability and other kinds of categorization into account. And I think what's really generative about this approach is that it doesn't say, you know, oppression can be understood in an additive way. It says, no, there are distinct kinds of oppression, there are distinct kinds of violence, that only make sense when we bring these things into conversation with one another. So, to give a personal example, there are certain kinds of actions I always understood as being about gender. Right. So, for example, I was taught never to toot my car horn for fear of experiencing violence on the roads. But it wasn't until I was first driven around by a wonderful white female friend of mine, someone who immediately reached to toot her horn, and my immediate thought was like, what are you doing? Like, what are you trying to get done to us? That I kind of realized, oh, actually, no, that isn't just about gender: there's very much a kind of gendered and racialized configuration there that is resulting in that fear of violence. Right. And so I think what to me is really valuable about intersectionality in the space of A.I. is that there is this tendency in data collection to collate and tag certain categories, like gender, like race, like age, in these very discrete ways.
And of course, you know, for me, I'm always understanding my feminist work as saying that for so many racialized people, normative genders and sexualities haven't even been accessible, or haven't had that same kind of resonance, because of the ways in which they're so bound into whiteness. Eleanor, do you want to expand a little bit on that, in terms of how intersectional approaches to A.I. inform your own work?

Speaker5:
Yeah, absolutely. And those technologies are really important. We can go back to the Combahee River Collective and what they said about how, if Black women are free, then everyone can be free, because that means the toppling of these hierarchies. It is, however, a genealogy that originates in the oppression of Black women, and we must remind ourselves of that in all the work that we do. That's what we understand by intersectionality: it's a tool that is useful. Examples of this are the ones I just gave. So why is the photo checker much, much less good at identifying Black women than just women? And by just women, I mean white women. Why is that oppression seen across the board? With COMPAS, it's not just race and gender; this is also age. COMPAS is far less accurate about how likely it is that someone will reoffend if they're a young Black male. So there are all these intersecting axes of oppression that come together to make A.I. particularly harmful for certain groups.

Speaker1:
I really appreciate, Kerry, one of your definitions or points about feminism: that it's self-critical almost by definition, it sounds like. And one of the spaces that we're talking about, again, is that technology design space in Silicon Valley, which is often critiqued for not being self-critical in the same way. So I'm wondering about folks who are listening to this podcast who might not be convinced: they're not convinced that there's an issue here, and they're not convinced that feminist A.I. could possibly help. What would you say to those people that might help them understand a little bit better?

Speaker5:
Well, maybe we can talk a bit about the kind of work that we're doing and what feminism does in that space, because we're working on de-biasing A.I., which is sort of a bleak term anyway. What does it mean to de-bias A.I.? How can A.I. be made less harmful? And we know that mathematical definitions of fairness vary. So can feminism help? Can feminism do work in that space? And again, I would direct people towards Data Feminism, because it has really nice, specific examples of how feminist theorists who are so precious to us, like bell hooks, can be mobilized to do work in data spaces, and they give really nice, specific examples. We are looking at data ethics, which is now a multi-million-pound space: everyone has a D&I team, everyone has a data ethics framework, and they're expensive; you can get consultancies. And so we're asking whether feminism can do something a little bit different in that space. Usually, AI ethics is grounded in this horribly Aristotelian understanding of what ethics means, but as we've just described, feminism works on much less wobbly ground, because it's invested in the lives and needs of real people and real communities.

Speaker5:
And we've detailed that a little bit. So it can help us understand that addressing harmful bias in A.I. has got to be a culturally situated and specific effort. In other words, it's unlikely that we're going to find an effective universal ethics framework, this pot of gold at the end of the rainbow; we don't think that's possible. Instead, it can help us understand that attempts to make A.I. less harmful often frame bias as something that can be located and extracted from a system to return it to a state of equality or neutrality. Now, we know that data is never raw and A.I. is never neutral; those are things that have also come out of feminist work. So it's both technically inaccurate and missing the point to attempt to do this. A.I. produces harmful outputs that disproportionately affect marginalized communities because A.I. is very good at exacerbating and accentuating existing power structures, and feminism is very good at understanding how those power structures operate. So this makes for some really delightful collaborations.

Speaker4:
Yes. I think for me, the way in which I would hope to get people really interested in this question of what feminism can actually do is that feminism really drives home two questions: who is this for, and what is at stake? And to me, you can't do feminist work without constantly drawing back to, you know, when you say that a technology works, for example: who does it work for, and how does it work, and why does it do that? And I think we see this time and time again in our own work, where a technology is deemed to be, for example, accurate enough or efficient enough that it can be sold, that it can be marketed. And then, of course, when you look at who it doesn't work for, when you look at who's been sacrificed in order to get a product to market, it's always the same people who are the most marginalized. It's always the people who have historically been pushed to the margins, ignored and excluded, and whose needs are not met. And so the way I'd hope to get someone who is a little bit sceptical interested in these feminist questions is to say that, yes, there is a process of labor, there's a process of learning that has to happen, but ultimately, you know, if you want to create technology that really, as Eleanor mentioned, does good, not just claims to be good, you have to be thinking about who is getting left behind, who is not getting involved in this process. And that goes beyond these very surface-level measures of inclusion, to a really transformative project of questioning the very foundations of what these technologies are for, who they serve and what they do.

Speaker5:
On that note, I would also add that feminism topples the hierarchies of knowledge in A.I. And by that I mean: to understand an algorithm is specialist knowledge, and Timnit Gebru calls the people that are in charge the gatekeepers of knowledge, and they are also the gatekeepers of power in A.I. And what feminist knowledge can do is say, well, hang on a minute, there are lots of other people that have a really good understanding of how an A.I. system works: for example, the participants, the people that are exposed to those systems, the people whose data is taken from them in order for the systems to work. And what we're trying to do is encourage people to take this much more holistic understanding of what A.I. is, and to look across the development and deployment pipeline at who knows what, so that those ideas can be shared a little bit. Because you can't have good A.I. if, as Kerry just said, the teams aren't diverse, and it's not doing something that's good if you're not involving the people that are affected; those people must be part of the process from the design phase. And also, and this is something that everyone can do, whether or not you understand what A.I. is: you can say no. If you don't think it's a good tool to use, we can all say no. And there are various ways we can do this. And as you said, Jess, feminism is an activist practice as well, so we can all participate in an activist movement, whether you're signing a petition or you're taking to the streets. These have been effective, and we can all resist.

Speaker3:
Let's actually follow this train of thought about action for a moment, and I want to talk about gatekeepers a little bit more, because this is something that I see as a pretty big hurdle and challenge in the instantiation of feminist AI globally. In my mind, I visualize this boardroom full of powerful gatekeepers, most likely a homogenous group in Silicon Valley, who hold a lot of power and might not be interested in losing a little bit of money over lowering some accuracy metrics on their AI in order to improve accuracy for marginalized groups: to help improve the experience of people in these intersectional spaces on these platforms, and also to help battle and combat some of the systematic oppression that is being encoded into these systems. And so I'm wondering, not from an individual perspective but maybe from a more corporate perspective, what do you see as some of the possible solutions for breaking up the gatekeepers in the room, or maybe making the gatekeepers look a little bit different and think differently, and hopefully helping combat some of these problems?

Speaker6:
Yeah, thanks for that question. I think it's key to remember that corporations are, of course, fundamentally driven by profit; they're accountable to their shareholders, and they will be for the foreseeable future. So some people think that this means that what motivates corporate activity and A.I. will always be fundamentally in tension with diversity and inclusion schemes, or with the drive to make technology completely safe for everyone. That said, there are lots of really well-intentioned, kind people working in the industry who recognize that they're either over- or underrepresented in that space and are keen to be part of this drive to increase diversity. There are ways that this can be done short term and long term. In the short term, women are reportedly less likely to apply for a job that they don't feel 100 percent qualified for, and masculine language on job applications actively deters women from applying. So when encouraging more women to apply and to be hired in the A.I. space, we can look at these kinds of statistics and work out different hiring strategies. We can also try and change our minds about what an engineer looks like, and think about how we identify the attributes that a good engineer should have in different kinds of people. To me, this is the kind of near-term work that needs to be done, and I strongly believe it is very important. This is the kind of incremental change that Laura Donaldson writes about; I quite like this metaphor of the daily act of brushing your teeth that needs to happen in order to have a kind of long-term effect. Feminism, though, asks a little bit more. It asks that we do this kind of work, but it is also a bit more ambitious in what it wants the world to look like. So it calls for fundamental shifts in favor of people who are currently disenfranchised by the status quo.
I think what this means at industry level is that feminism can ask the impossible, and that's why it's potentially more likely to change the state of the field than perhaps an ethics framework or a code.

Speaker4:
I think for me it comes down to the political will of the leadership. Despite the fact that I am a cosmic pessimist, I think there can be really transformative steps forward in how corporations engage seriously with the kinds of problems that activists in this area have been flagging now for really quite a long time. But it ultimately comes down to: do you think this is your top priority, and does this matter enough to you that you're willing to put in the work to make the changes, even if it seems like there are short-term costs to doing that? And I sometimes get frustrated in the context of institutions where people tend to blame the problems that they're experiencing on structural issues, or on systemic sexism and racism that lead to, for example, there just not being enough job candidates, or not enough people with this particular interest. And of course, the systemic problems are very real. But we know that very well-intentioned, powerful figures in these institutions and companies actually can make those changes, and they can start laying down groundwork that is not going to solve these systemic problems, but can make really serious headway into transforming the culture of a company or transforming what one institution is like.

Speaker5:
It's also worth saying that everyone across the board is saying the same thing: that regulation is the key to making A.I. better, i.e. less harmful. And I'd like to see regulation move in a slightly different direction. For example, people are still trying to make gender recognition software. Now, we know that gender is not something that can be recognized by a machine; it's mutually constructed with race. It isn't an internal attribute that can be drawn out by a technology and externalized. And yet there are so many weird ideas about what it means to de-bias an A.I. system that draw on these essentialist understandings of gender. Like Sentir, a recruitment app that is probably, in a very well-intentioned way, trying to, and I quote from the website, strip gender from the back and front end of its system. This is unlikely to be effective, as it doesn't reckon with gender as a socio-cultural system. And also that sentence sounds pretty bad, doesn't it? But these attempts to remove gender entirely from a system really misunderstand what gender is. And I think it's low-hanging fruit, some really easy work: we can go in and help people develop their understanding of what gender is in a system.

Speaker1:
It sounds, from what you both are saying, that there's also an invitation here from a feminist critique: it doesn't just have to be this top-down regulation, but there's also individual agency in these spaces as well; there can be a bottom-up approach too. And I guess one question that I have to close is about the future. I'm wondering, from both of you, and Kerry, we can start with you: what are some of the future directions that you can see feminist approaches taking to technology? Or perhaps, what do you hope the impact of the feminist critique can be on technology development into the future?

Speaker4:
That's really interesting. I guess instinctively I'm really interested in the radical potential of feminist activism and organizing to ask people to profoundly reject, or really to click pause on, technologies that are being developed that are going to be harmful to many, many people and communities. Right. And so for me, a lot of this has to do with border control. We just saw, for example, this week something hugely exciting in Glasgow: an immigration removal being stopped by a huge number of organizers who simply refused to let the van leave, refused to let them deport the two men who they were trying to take into a detention centre. And for me, it's a horrible reminder of the kinds of violence that the Home Office in the UK is committing very, very often. But it was also so heartening to see that that kind of community organizing, that on-the-ground mobilization, can bring about real changes. And we've also seen that in the tech sphere, right, with activist groups playing such a significant role in bringing to public attention the ways in which various forms of violence and injustice are happening right now with a lot of new technologies. And they're just going to keep happening. And so I think for me, particularly, again, in relation to the carceral space, in relation to border control, but also hopefully spreading out further as there is sort of growing awareness and publicity around the real risks of these technologies, I like to think about this kind of radically transformative project of feminism, which for me involves both fighting for survival on the ground, but also having the ability to say no completely and think really differently about the potential for technology and what it should be doing.

Speaker5:
I thought that example from this week, Kerry, was amazing. I cried watching the van surrounded by people in Glasgow; it was truly, truly moving. I am loving the conversations that we have with our industry partners. I think that, you know, obviously you rely on people to be incentivised, and if they are, you can help them explore it a bit more. It's such a privilege, really, to be able to talk about something which I find so central to my life: gender studies, thinking about sexism and racism, but also, you know, how interesting it really is, how power structures form. I wouldn't be here if we didn't also find it profoundly interesting, and I think a lot of people do, too. And we're very lucky to be part of a discipline that can and must move beyond the academe and reach the public and grow with the public. It stems from the public; it doesn't come from the academe. There's, well, there should be, a really good movement, this osmosis between the two. So I'm really enjoying that work. I love to battle with the question of how to incentivise people that really aren't incentivised and really don't want to hear and aren't interested. I'm sort of, almost psychopathically, not afraid of that, and I really enjoy the hard conversations as much as the easy ones. So I think that there is work to be done, and it's possible, and it's interesting. And I hope everyone will be motivated to want to look a little bit more into AI. I know that it's not just an algorithm; "the algorithm" has become shorthand for AI, but it's much more than that. It's about the people who are paid so little to do the data labeling work. It's about the way that it's deployed. All of us experience this at airports going through border control. We can all be involved, we can all be interested, and we will understand.

Speaker3:
This is usually the point in the conversation where we ask where people can connect further with both of you. But there's actually a very exciting way for people to continue to hear more about these topics, and it's actually the way that Dylan and I were connected with both of you: you have a podcast coming out on a lot of these topics. So could you just quickly tell us what this podcast is about and how people can find you?

Speaker4:
Yes, absolutely. So our podcast is called The Good Robot, which is very much a provocation rather than a statement, where we ask a whole lot of really fascinating people: what is good technology? Is that even possible? And then, what does feminism have to say about it? And again, we understand feminism pretty expansively. So we have a range of really exciting people on the podcast, some of whom are very much invested in feminism and gender studies, but others who are invoking a huge range of different approaches to technology. But I think what they're all doing pretty seriously is grappling with that question of what it means to have technology that is actually doing good in the world, that is trying to combat these various kinds of injustice that we sadly see happening so regularly, that we see happening daily. Eleanor, do you have anything to add about the podcast?

Speaker5:
Feminism means such different things to different people, as we've said today, and it's been great to hear some of those. We define feminism very broadly, as including lots of different methods and lots of different kinds of things, so that's been exciting to see. But thank you very much for mentioning it. It's very kind of you.

Speaker1:
Also, full disclosure: Jess and I are both guests on that podcast, so we have a kind of podcast exchange going here. There's no sponsorship involved or anything, but I wanted to name that. It's very exciting to be a part of this project with both of you, Eleanor and Kerry, and I'm really excited to listen to the other episodes as well; it feels like a great honor. As we do move towards closing, I just also want to say, yeah, we'll put links to your podcast in the show notes and definitely let people know when it's live. But for now, Kerry and Eleanor, thank you so much for joining us today and for a wonderful conversation.

Speaker5:
Thank you. We really enjoyed being on.

Speaker2:
We want to thank Kerry and Eleanor again for joining us today for this wonderful conversation. And instead of our usual outro, where we debrief some of our immediate takeaways from the conversation, Dylan and I thought it would be best to send you all over to the ultimate debrief: an entire podcast series. As we mentioned briefly before, Kerry and Eleanor's podcast, which is called The Good Robot, just launched yesterday. They already have four amazing episodes out, with Anita Williams, the Venerable Tenzin Priyadarshi, Priya Goswami, and N. Katherine Hayles. And you might hear some familiar voices in upcoming episodes. You can follow The Good Robot on Twitter @TheGoodRobot1, and you can also find links to their website in the show notes for this episode. So head on over there and give them a listen. As for this episode, that's it for this week. For more information on today's show, please visit the episode page at radicalai.org. If you enjoyed this episode, we invite you to subscribe, rate, and review the show on iTunes or your favorite podcast app, and catch our new episodes every other week on Wednesdays. Join our conversation on Twitter @radicalaipod. And as always, stay radical.
