Episode 18: Surveillance, Stigma & Sociotechnical Design for HIV in Dating and Hookup Platforms with Calvin Liang, Jevan Hutson, and Os Keyes



In this episode we interview the interdisciplinary research team of Calvin Liang, Jevan Hutson, and Os Keyes about the motivation and research behind their paper, "Surveillance, Stigma & Sociotechnical Design for HIV." The paper analyzes the approaches that 49 online dating and hookup platforms have taken when designing for HIV disclosure. Calvin, Jevan, and Os point to bottom-up, communal, and queer approaches to design as a way of potentially making the tension between disclosure and risk easier to safely navigate. Their paper will be published in First Monday's Special Issue on HIV/AIDS and Digital Media in the Fall.

Calvin and Os are PhD students in Human-Centered Design and Engineering at the University of Washington. Jevan is a data justice advocate, human-computer interaction researcher, and recent graduate of the University of Washington School of Law.

Follow The Research Team on Twitter:

Calvin Liang @cal_liang, Jevan Hutson @jevanhutson, and Os Keyes @farbandish.

If you enjoy this episode, please subscribe, submit a rating and review, and connect with us on Twitter at @radicalaipod.

Relevant Links from the Episode:

Surveillance, Stigma, and Sociotechnical Design for HIV on arXiv (link will be updated when the paper is published in First Monday in Fall 2020).


Transcript

This transcript of JCO_mixdown2.mp3 was generated automatically by Sonix and may contain errors.

Welcome to Radical AI, a podcast about radical ideas, radical people, and radical stories at the intersection of ethics and artificial intelligence. We are your hosts, Dylan and Jess. In this episode, we interview the interdisciplinary research team of Calvin Liang, Jevan Hutson, and Os Keyes. Calvin and Os are PhD students in Human-Centered Design and Engineering at the University of Washington. Jevan is a data justice advocate, human-computer interaction researcher, and recent graduate of the University of Washington School of Law.

In this panel, we discuss the motivation and research behind their paper, "Surveillance, Stigma & Sociotechnical Design for HIV." This paper analyzes the approaches that 49 online dating and hookup platforms have taken when designing for HIV disclosure. Calvin, Jevan, and Os point to bottom-up, communal, and queer approaches to design as a way of potentially making the tension between disclosure and risk easier to safely navigate. Their paper will be published in First Monday's Special Issue on HIV/AIDS and Digital Media in the fall. We are grateful to these scholars for their important research and are excited to share our conversation with Calvin, Jevan, and Os with all of you.

All right. We are here today with Jevan Hutson, Calvin Liang, and Os Keyes, the authors of the paper "Surveillance, Stigma & Sociotechnical Design for HIV." And Jevan's going to start us off by telling us what this paper is about.

Awesome. Thanks so much for having us. So this paper sort of starts from the really complicated position of living with HIV in the digital age.

HIV is not only bound up in incredible amounts of discrimination and stigma, from the personal to the structural, but also sits atop a really torrid history of violence, and particularly surveillance and criminalization. And as HCI researchers, some of us with prior experience working on dating platforms and platform design, we were curious about the ways in which dating platforms design for HIV. Right? HIV is a complicated experience in the outside world, and in the digital world that plays out particularly in the context of online dating and digital intimacy. And as researchers, we've seen efforts on dating platforms to, say, afford individuals the ability to disclose their HIV status, many of which were taken as an attempt to, you know, destigmatize, or otherwise begin a conversation and create opportunities for users to discuss their status openly and in a way that doesn't, you know, result in downstream discrimination. But in our approach to this, we were obviously informed by the critical history of HIV activism, but also the particularly pernicious role the state has played, right, in surveilling and incarcerating people with HIV. And what struck us in our initial look at how folks are designing for HIV is that there was not a ton of attention to the structural forces at play for folks with HIV.

It's not just that I'm on a dating platform, right; I exist in a network of other users who have my information, where that information can go, what it could do to me, how it might result in, say, downstream prosecution or other forms of issues. Right? What does privacy mean on these platforms? So that's the first word of how we got to this. And the goal for our study was to take a look at a broad base of popular platforms to get a sense of what this HIV design work looks like, and to try to think about, you know, how that impacts issues of HIV surveillance, stigma, and criminalization, and to reflect on how we do critical sociotechnical design. And from there, one of our biggest takeaways is: look, we're not designing in a way that's critical of the ways in which HIV exists in the world, particularly with respect to medical-legal infrastructures, HIV surveillance, and criminalization. And our goal is, you know, we don't necessarily offer the solution, right, to how we navigate disclosure, stigma, surveillance, and criminalization, but rather to point toward critical design practices. It's like: look, if we are going to design for HIV, if we're going to make dating platforms work, whether that's through destigmatization or through disclosure, we need to be mindful of the other structures, particularly the state and the ways in which private companies exist in operation with the state, with law enforcement, as well as public health authorities.

Right? So that we give greater autonomy and dignity to folks with HIV, so we're not just, you know, designing to destigmatize without recognizing that the state exists and that we need to be cognizant of its consequences. And I guess, as to how we personally relate to this work: I've been living with HIV for a number of years, and I think a lot of this comes out of my own personal experience. Right? With: do I disclose? Who sees whether I disclose? Who's going to use this information? If others have this information, is it going to other people? How is this going to impact me? Right? And it's just such a complicated balance in a space that's already complicated enough. Right? Love is hard. Love and sex are hard. Right? And platforms, I think, afford all of these great new opportunities. But I think we sort of forget these complex networks that people exist in when we're designing for, you know, pro-social ends.

And that's one of the reasons why we wanted to have you all on the show as well, in addition to the fact that this is, based on what I've seen, a fairly underrepresented academic pursuit.

I haven't seen a lot of papers in the AI space looking at HIV or even dating platforms.

And so I think that's just such an important place for us to go into. And also, some of the Twitter posts that you posted when you were so excited that this article was published were taking that more personal route, which is why we want to talk about it, and put a face to it, I guess: why is it important that we talk about this from your personal experience, or from the communities that you're interacting with?

Yeah, well, I mean, I guess bouncing off what I said before: we can sit as researchers, as academics, and think about these problems in the abstract, but they play out. Right? There are folks who have been incarcerated because of their status. Right? There are folks who have suffered various forms of violence and downstream privacy harms. Right? And, you know, I say it with privilege, right: I have not been incarcerated, and frankly, the state does not look for me for incarceration, right? I'm not a person of color living with HIV. Right? There are other communities in which these practices play out. But stepping back, whether it's in my role as a tech policy advocate or in other areas, we need to be critical of our interventions. Right? Like, if we're going to suggest that we need to take this route, or design needs to move this route, we need to be both in conversation with folks who are impacted by these issues, right, but also aware of the ways in which these interventions might, like, reify other, you know, problematic practices. Right? Like, if disclosing HIV on Grindr means the state is going to know about it, or they're selling this to third-party advertisers and I'm denied insurance or denied a particular rate.

Right. These are things people are thinking about. These are things that users are thinking about, or having to navigate, on a daily basis. And we sort of hope to recenter that and be critical in how we think about platforms, because some people might not think that dating apps are important things. But as researchers have written, there are a host of individual and structural outcomes that are shaped by how it is we interact in digital intimate spaces. I think it's important to be attuned to the fact that living with HIV is not fun in the dating world. Right? Like, that is not a fun experience. Right? It's an experience filled with stigma, with hatred, with discrimination, with other forms of rejection that, especially in the sphere of intimacy, like, deeply, you know, takes a toll on your own sense of self-respect and support. Right? But on top of all of that, you also have the state, right? You also have these other violent infrastructures, right, that you are now having to navigate on top of, you know, just having HIV while existing at all.

Right. And I think we have to think about each of those layers, right? Because even some of this work that might not necessarily attend to those infrastructures is trying to get toward: how can people with HIV live with dignity and exist in digital intimate spaces? And there's important work there. We are not the first ones to suggest, you know, designing for these sorts of social ends. Right? There are folks who have done really important work around HIV disclosure on dating platforms, and you have to credit them as well. We just want to continue to push that design space to think more critically, but also make sure that, as a community of social computing researchers, we listen to advocates and we listen to communities. You know, they want to decriminalize HIV, right? They want to remove these sorts of statutes and otherwise think more critically about HIV surveillance. And I think our community, as a group of social computing researchers, as folks who might work in tech policy, needs to be critical of the sort of infrastructures in place that shape not only our research but, you know, the platforms we're thinking about.

I think, I mean, I would echo much of that. I think what Jevan is also getting at, or at least alluding to indirectly, as I understand it, is the cool question that you tend to ask people, which is: how do you think about radicalism? And the way Jevan keeps talking about what it means to be radical or what it means to be critical. You know, one of the things that I think this paper tries to do, and one of the reasons that it's important, and also one of the things that makes it genuinely critical, is this: people think that being critical means showing up and saying, no, that thing's wrong, it's bad, go away.

And my joke a lot of the time is that practitioners dislike critical scholars for the same reason that surgeons dislike coroners: as far as they're concerned, we're the ones who show up explaining how they killed their patients. And that never makes you popular. And also, it's really tempting to blame you, because they don't want you to keep showing up with the death certificates.

They were perfectly fine until you showed up. But what it means to do critical work is not to say that, like, this thing is bad. Right? Maybe there is a good way of developing, like, disclosure-oriented efforts to address HIV and transmission on social media platforms. Like, maybe there are safe ways of doing it. But what we mean when we say critical is to sort of take a step back and examine the assumptions that are shaping, like, the choices we make, the options we see as available to us.

So in this case, going: hey, like, you're thinking about this as a two-actor problem, where there's the person you disclose to, and there's the person disclosing, and there's the interplay between them. But there are a lot of other factors which exist, which you're not thinking of, and you should really, you know, take them into account. And, you know, this aligns a lot with radicalism, where radicalism is in some ways built around, like, questioning the why. Not necessarily saying, like, we must do X, but asking why we think that X is what we should do, and trying to almost, like, check our working and make sure that, like, X is actually the best outcome and not just the best outcome we can think of if we leave all of our other assumptions intact.

Yeah. And so I very much came to this project as a kind of critical design researcher, but then also, I guess you could say, an ex-expert user of these apps.

I think, to your point about the community not necessarily thinking about these things or this space, I think, related to design and design choices, it's important to recognize that these choices have, you know, a range of consequences: real consequences for real people. I think Jevan just outlined a few of his personal experiences, but I'd really just emphasize that design choices affect people's lives in real ways, and sometimes in harmful and in difficult ways. And so to kind of highlight that for people who are making these decisions was, I think, also a really important aspect of this paper.

And so for folks who may not be very familiar with the current issues that are at stake here when it comes to HIV and stigmatization: could someone speak a little bit about the current power imbalances that exist when it comes to HIV outside of digital technology, and then how dating apps might exacerbate some of those power imbalances?

Yeah. I mean, you could start by thinking about the disproportionate balance of information and power between you and the state, right, when you're diagnosed with HIV.

In many jurisdictions, at least in my jurisdiction, immediately, that information is sent to the public health office. Immediately, you are then contacted by a contact tracing worker, which we've heard about in various contexts with COVID. But this contact tracing person from the public health office will quite literally come to you, and you have to document every last person that you slept with in a given period of time, say, a year, right, and describe in detail what those sexual encounters were like. Right? Where did you ejaculate? Like, how many people were there? What time of day? Like, these sorts of things. And then you collect all of that contact information, and they tell you: either you can call these folks and tell them they should get tested, or we will, anonymously. Right? And that's just one sort of, you know, imbalance. We have a person who's now been diagnosed with a disease that carries its own history. Right? Like, my mother lost most, if not all, of her gay friends to this when she was my age. Right? Like, you already have a person in a vulnerable position, and you then have the state extracting all this information. And as someone, you know, who's a soon-to-be privacy attorney, I'm reading all of these things. Right? Like, I'm able to go through and be like: look, I know when the state has this, what the state can use it for, what exemptions they could use. Right? Because the state can say, for national security reasons, take my HIV data and otherwise use it. Right?

And that's just at an individual micro level, where the state has all this power, all this information, because you are a public health threat. Right? And where that information sits, I'm not sure. Right? I cannot tell you where that list still sits; this was in New York, when I was diagnosed. I don't know who has access to that. Right? But then we think about, you know, platforms. Right? You have this sort of imbalance where it's like: look, if you disclose it, it's public. Right? It's effectively a public platform; we're not going to take any sort of responsibility for your information if you disclose it. Right? Which is itself a similar sort of imbalance, where I am this risky, vulnerable actor who is having to carry the weight. Not only carrying the weight of the condition and the stigma that comes with it, right, but having some degree of power, or attempting to understand where my information lives and what it could do to me. But most people living with HIV are not critical data scholars or privacy lawyers, right, who might have the tools to even have a sense of that. But even with those tools, you're still powerless. Right? Not only to the private organizations that might hold this information, but to the state that has this information and can use it for a variety of reasons. Right? And we're just talking about the context right now in the United States. HIV is a global issue, and there are global regimes of surveillance, global regimes of incarceration, that this plays into.

Right. So you already have a fairly powerless position, you know, at least in my individual experience, and then you have the state and private platforms having control of that information and where it could go. Right?

So that's maybe a little bit of the imbalances that are at play. I mean, we can think about particular imbalances, like racialized imbalances. Right? The ways in which HIV has particularly struck communities of color, and the ways in which the state's carceral arm has particularly targeted those communities for HIV surveillance and downstream prosecution and incarceration. Right? Like, it's not necessarily white gays in New York City who are being arrested by the police for HIV. That's not to discount that there are a host of ways in which the NYPD has played out against people in New York. But there are imbalances in the way in which HIV impacts people, particularly HIV surveillance and criminalization. I mean, a lot of folks don't even realize it, but Guantanamo Bay, right, its entire inception was for HIV-positive Haitian migrants. Right? Like, so many parts of our own history are built on the ways in which we have excluded people with HIV, particularly persons of color and foreign folks with HIV. And it's only recently, right, like, under the Obama administration, that some of these immigration-related HIV protocols were dismantled. Right? And that goes, again, to this power imbalance, thinking about how HIV is not only a viral vector, but a vector of state power. Right? A way in which the state can decide where you can go, who you are, who you can talk to, who you can engage with.

Right, where you can travel to. And in many ways, you know, folks with HIV don't necessarily have the autonomy or the agency to combat that without facing repercussions from the state or, you know, from regular people. Right?

The only two small things, I guess, I would add to that: first, I think it's worth highlighting that the way these inequalities can play out in practice is not just because of this sort of mandatory diagnosis reporting, but also because of how heavily that's wedded to access to treatment. Like, if you want access to antiretroviral meds, then you have to have a diagnosis; they don't just hand them out on the street. Correspondingly, that process of contact tracing is basically inescapable if you want to not just die. Foucault's concepts, like biopolitics, are very applicable here: we give you the right to live, assuming that you do everything exactly the way that we want. And this plays out particularly difficultly when, you know, other aspects of criminalization come into it. So, for example, we know that there is a disproportionate infection rate and epidemic in trans communities, particularly for trans women of color. And I often think, when we're talking about contact tracing and everything else: if you are someone who is, you know, structurally screwed with to the point where you are dependent on things like sex work for, like, life, or if you are someone who didn't pick it up through sex at all, who picked it up through, you know, intravenous drug use and so forth; like, if the way in which you came to be infected, or the way in which you might prove a vector for other people, is itself criminalized, irrespective of HIV, how do you disclose that? Like, how do you, as a sex worker, walk into a clinic with a random doctor, where the very premise of the interview is that they are legally obliged to pass on some of its contents to the government, and say: well, I'm a sex worker, and I'm doing this thing which is illegal, and here are all of the people I interacted with, all of the people who, you know, commissioned sex work and so also committed a crime? You know, this very much rolls downhill. And even this phenomenon of widespread, strict contact tracing, just to contextualize it:

The thing it always reminds me of is in the U.K., where there are different security clearances. I used to be friends with a couple of fancy civil servants, which is the only reason I know this.

And the highest one, the one you need to pass to get access to top secret material, is called developed vetting. And it involves, amongst other things, an interview tracing your previous sexual contacts to, well, work out if any of them are Russian spies. For someone to gain access to, like, antiretrovirals, we require them to undergo an equivalent amount of contact tracing and vetting and interrogation about their private life as we do to give someone access to top secret classified government material.

Just by default. And that's fundamentally ridiculous.

Yeah. And I think about power imbalances on kind of two levels. The first is the platform level. I think the core, kind of slick, easy counter to our argument is: just don't disclose your status on these platforms. And to that, I'd say that there's this actually really unfair pressure to disclose your status.

Because often you have to, like, offer up that information in order to kind of unlock, like, a wider range of sexual and romantic experiences and potential partners. There's this great kind of concept that I think about a lot called privacy unraveling; this is work done by Mark Warner and others. They talk about how, like, by not disclosing, you know, as an individual not disclosing in a sea of other people disclosing their status, there are assumptions made about you just by not saying anything. And so I think in this example we can see that, like, regardless of what you do, people are going to base judgments and actions on you. And so there is this really unfair pressure to just give this information to the platform. We can also think about it from an individual level, just thinking about how disclosing your HIV status on these apps means you're very much trusting another person, the person that you're sharing this information with, not to harm you or use this information against you in any way. And so there's also this imbalance, you know, when you're kind of giving this information out. I can also say a majority of dating apps incorporate location tracking, and, you know, Grindr will tell you the distance between you and another person down to, like, you know, feet, a really kind of small measurement. And so thinking about just, like, personal experiences of: do I feel safe right now, with this person knowing how far I am from them? There's also that kind of, like, power imbalance in terms of

individual to individual.

I don't know how many of my friends who are actively using dating apps actually know what data is being tracked and kept, and where it's going. So for folks who may be surprised at some of what you all are talking about, could you just give, like, a Dating Apps 101? Like, what is Grindr or Tinder? What are they looking at? What are they keeping? Where is that going? And again, that kind of "why does that matter?" element. Oh, also this: how does that connect to the state? The other part of this is, like:

How is that information, the data that they're collecting, then getting to the state or being sold off?

Yeah. So, I mean, I can't give you, like, the full broad brush; the dating apps have a variety of different policies. And as we sort of analyzed: of the number that allow HIV disclosure, only a few actually have HIV-specific policies, and most of them tend to treat your information as effectively public. So once it's disclosed on the platform, right, anyone who comes to your account and sees it, it's theirs. Right? We don't protect against that. Grindr recently released a statement on third-party data sharing with respect to HIV data; there were multiple controversies where Grindr's third-party data was being sold to advertisers, and there were some issues where there were HIV researchers who were granted access to that data. But generally speaking, anything that's in your profile, in terms of dating platforms, is considered public information. It's as if I wrote it on a sign and then walked out in public; that's how the dating app views the information you're disclosing. Right? They have the ability, right, to be more protective, to create other sorts of protective regimes. But generally speaking, there aren't massive provisions on sharing with third parties. I think we looked specifically at HIV-specific disclosures: not many apps that even allow HIV disclosure address HIV specifically, so it falls under their general, you know, platform policies.

And it relates to the state in a few different ways. Right? One, the state can get access to third-party data through private companies. Second, the state can directly partner with a platform, right, to attempt to get this information for research, which we've seen, you know, spur a controversy with Grindr, where we had HIV researchers from a state university wanting to access that information, obviously, you know, not super in accordance with user expectations around the use of that data. Right? So the state is able to use platforms, whether that's by contract, right, actually engaging the platforms themselves, or by accessing third-party data. And also, the state can use platforms in ways that are actually against the platform's interests. Right? We talk about the ways in which law enforcement can use platforms for investigations, whether that's talking to people on platforms, whether for drug busts or for HIV. There has been really great reporting on Egyptian state forces using Grindr to entrap and ensnare sexual minorities for breaking various laws around homosexuality and other issues. So there are ways in which the state can engage both directly with a platform, with third-party sorts of folks, as well as using the platform against, say, maybe its terms of service, even if that's not squarely prohibited.

But generally speaking, dating apps, as we sort of conclude as well, aren't doing enough. Right? Like, if we're going to think critically about designing for HIV, there has to be a component of how we protect HIV information, not only from other users on the platform, right, but from third parties as well as the state. Right? I mean, that's a political judgment on the part of the platform. But I think if platforms are serious about social justice and intimacy, which, according to some marketing materials, they are, right, it requires a critical relationship to these external forces, even if they are public health authorities. Right? And I think that's a part of our work as well: we can't assume that public health is just this overriding good in all instances that justifies, you know, the creation of these regimes. Like, well, let's just give the state the data, it's just public health. Right? Because public health surveillance has resulted in a variety of forms of harm. Right? But in too many ways we sort of, you know, give it a pass; scholars have criticized this notion that if it's public health surveillance, we don't hold it to the same standard we apply to other forms of invasive surveillance. Right?

Yeah. I mean, I guess I would just say, like, I agree with everything Jevan has said. And I also think that there's an interesting contrast here between this data and, like, the Grindr reaction, and things like the Snowden leaks a few years ago. You know, so: minor anecdote.

Wikipedia ended up suing the NSA for intercepting people's Wikipedia browsing.

And I was actually the researcher, like, the data analyst, who put together all the information for that brief, which was really interesting, because occasionally I wake up and remember that I sued the NSA once, and then promptly forget, because that's a big thing to think about. But I think it's interesting to compare that to this.

Like, look, it turns out that the US government is tracking what you look up on Wikipedia, and instantly a top-ten website is at the Seventh Circuit Court of Appeals, like, yelling at the U.S. government. There is legislation introduced, there are Senate hearings, there are New York Times front pages. It gets to the point where the US government has to pass a law retroactively saying that what the NSA did was legal, which is basically as close as the US government gets to fixing something.

Not saying, "OK, that's fucked up, so we're going to stop," but saying, "That's fucked up. Oh, wait. No, it's not. Takesies-backsies. No, it's fine."

Whereas when you look at, like, the grinder sort of sailed to China or look at like the general HIV, like surveillance infrastructure in the case of the greatness, I don't like.

Let's be honest, at least 50 percent of the reason that it got any media coverage whatsoever was, like, the massive Sinophobia that's currently going around, particularly in, like, US-China government relations. Oh, like, the Chinese government can't be trusted with this data, but we can.

But more generally, like, I don't see any laws being introduced about this data collection or setting minimum standards. I don't see any corporations going to court to push off the gag orders that sit on top of the user data requests for this data. Like, there are a lot fewer eyeballs on it, and there's a lot less effort. And I can't help but feel that a big part of it is because, first, it's seen as in some ways naturalized—like, the public health infrastructure and the idea of HIV surveillance are older than the Patriot Act. But second, because, like, HIV is that disease for queers. Like, that's how it's seen. That's how it's treated.

And Google—like, an organization that cares enough about, you know, placating right-wing grievances that it appoints the president of the Heritage Foundation to its ethics board to make, like, Newt Gingrich happy—has very little to zero incentive to, like, go toe to toe with the US government on the front page of The New York Times and in court for the sake of queer people.

And this is the thing that sort of gets me the most: this disparity, you know, also makes itself apparent in where we put attention—as researchers, and also as just, like, people who have the ability to phone up their congresscritters and yell at them for not doing things. And that is a thing I would like to see a hell of a lot more of—not just people looking at our paper and going, like, gee, tech companies should be more self-aware, but also being like:

You know, you found the time to yell at the government for these fifteen things, but you were nowhere to be seen on the gay blood ban, which persists, or on HIV-related restrictions in immigration, or on the constant naturalization of surveillance. And everyone should read Stephen Molldrem's next paper, which gets at some of the really, really terrifying ways that this is being expanded right now. It does make me, you know, want to not just yell at Grindr and the U.S. government, but also yell at my colleagues.

Yes. And I'd also say, I worry that a reaction to this paper, and our kind of concerns around state surveillance, is that people would say this is all conjecture—like, you're all just making up a problem that doesn't make sense. And I think it's important to just recognize that, you know, just a handful of companies own kind of all the dating apps you can think of. Thinking about how there's just this, I guess, database of really personalized, sensitive information, in many cases unprotected—and then thinking also about the history of criminalization of HIV, and of being gay or queer in the US and in the world—I think those things combined cause a lot of concern for these communities, and it should be a bigger concern for everyone overall. And so, yeah, I guess I just bring this up to say, like, we're not just blowing whistles for the sake of it. Right? We're trying to make everyone aware of this thing that is potentially very dangerous for everyone.

Well, let's talk about some of the specifics, then, so we can get to some action items here—not only for the listeners, but for people at tech companies and people in policy. So in your paper you analyzed 49 dating apps and platforms. That's a lot; you probably learned quite a bit, I'm sure. I'm curious if you can explain some of the questions that you asked about the design practices, in terms of prevention and stigma-reduction policies, and then also how those questions can help inform us to build better design practices—to help with destigmatization and HIV prevention in general.

Yeah, happy to chat about that. So we asked a variety of questions when we approached each of the platforms, squarely around, like: one, does the platform allow you to disclose HIV status, and what do those disclosure options look like? Were there disclosures around, quote-unquote, safer sex practices? Questions around whether or not you could search, sort, or filter based on HIV status or HIV-related information—like, on some platforms you can indicate that you're on PrEP, which is pre-exposure prophylaxis. We also asked questions around, like: is HIV expressly mentioned in your policy? What are the policies protecting this information? Is it explicit? Trying to take it from, roughly, a user perspective: cool, I'm looking at your privacy policy—is HIV in it? No, for the most part. And happy to hand it to Calvin as well on the other sorts of guiding questions.

We approached all these platforms sort of documenting, in large part—and I think part of it as well is that a lot of these interventions around HIV have been limited to queer and gay platforms. Right? There's the sort of overriding stereotype that, like, HIV is a gay men's problem. It's also a problem for people of color and women and other people. And I think part of our study as well was, like: cool, there's some intervention there—let's get a sense of what the other platforms are doing. Like, what's Tinder doing? What's Bumble doing? What are these other platforms, the ones we consider to be either generic or otherwise large-scale dating platforms, doing, for the most part? And—as you'll see in one of the charts we have in the paper—for the most part, platforms aren't designing for HIV. They're not designing for HIV disclosure. They're not designing their policies for it. There is little to no acknowledgment that HIV exists. Which gets at some of that reinforcement we think about: if all of this intervention is happening on queer platforms and nothing's happening on straight platforms—

How might this, like, reify certain assumptions around who HIV is a problem for? Right? Who the user with HIV is. Which in itself is a problem, right? These other platforms have ways in which they could engage or otherwise help users who have HIV. Like, there are women who live with HIV, and straight people who live with HIV. And so those are the sorts of overarching things we think about. And happy for Calvin and other folks to jump in on that as well.

I wouldn't add too much, other than saying that the two large categories we looked for were design features and policy.

And so, like, you can learn a lot from both of those things. And I wanted to maybe put some numbers to what Jevan was saying. So, like, you know, we looked at 49 apps, and only eight of them mentioned HIV in their policies. And five of those eight were queer-specific and, honestly, more specifically geared toward gay men—so only three more generalized apps were among the ones that mentioned HIV in their policies. That's really scary to me: the fact that the vast majority of apps aren't, I guess, confronting HIV in their policies and protecting their users in that way.

And on that as well: within the broader scope of platforms, we also tracked not only HIV disclosure but whether there are affirmative efforts to destigmatize. Right? Like, what other sort of information is provided around HIV—whether that's attempts to normalize or destigmatize the condition and disclosure around the condition? Was there information like sexual health info—like, where can I get tested? What does it mean to have HIV? What does it mean to be HIV undetectable? Information that could otherwise help folks navigate their own condition, or just navigate these intimate spaces on their own. And to Calvin's point as well: for the most part, it's not happening, and where it is, it's limited to platforms targeted toward gay men. There are some cool design features that we uncover and talk about a little bit. One of my favorites—I just love writing Daddyhunt in papers—is that they have a stigma-free pledge. Instead of asking users to disclose their condition—putting the onus on the user to put their information out into the world—they sort of reverse it and say, like: cool, are you comfortable interacting with, dating, or otherwise, you know, sexually engaging with someone who has HIV? And then you get, quite literally, a little badge on your profile—I forget if it's a star or not—it's your stigma-free pledge. And what we sort of like about the feature, and think is a positive development, is that instead of asking an individual with HIV to put that information out there—to render themselves vulnerable to the harms of all of the folks on the platform, or the state, or private parties—individuals can create a marker that's like: cool, I'm not going to, like, brutally attack you when you disclose your HIV status to me.

Right? Or, I'm not going to, like, reject you in horrible ways based on you disclosing in a chat. It allows you as a user—like, cool, I have to navigate the stigma, I have to navigate all of these things around my status—now I can be like: cool, I can readily point out at least five people within, say, a square mile of my house who, like, aren't going to hate me because I'm poz, or who, like, maybe might have the language or the tools to at least talk to me about it. And so we also note that, besides disclosure, we consider these other destigmatizing efforts, and think that's a part of it as well: information provision and other sorts of design markers can be a more creative way to allow folks with HIV to live with dignity or otherwise navigate more easily. It's not that we've solved HIV by having stigma-free pledges, right? But it still, I think, has a big impact for users—particularly in trying to find someone without having to do the legwork of: OK, I'm going to talk to this person; now I need to make sure they have some degree of, like, progressive understanding about how HIV works, and that it's not going to, you know, lead to a whole host of things.

So one of the things that we most appreciated about the paper is that it had this interdisciplinary focus—and you all, as a team, are coming from different backgrounds: from a law background, from a design background. And as we look toward closing, the last thing that I really appreciated about your paper was that it seemed like you had some clear design ideas, and possibly also leading ideas, about what to do about some of these issues and how to address them, both through an interdisciplinary lens and through a comprehensive lens. I'm thinking specifically of your abstract, where you point to bottom-up, communal, and queer approaches to design as a way of potentially making that tension easier to navigate—but also, it seems, to address some of these deeper concerns. And I'm wondering if you could say more about what those ways might be.

I mean, I guess the more tangible design recommendations that we put forth are kind of, you know: the platforms being more explicit about what they're doing with their users' data; giving the user a little bit more agency in what happens with it, and in the deletion process and, like, the storage of this information; kind of making it clear in their policies that they're not going to hold on to it forever.

And what they will do with that information. And to finish out this panel and this discussion, it would be wonderful if all of you could provide just one last comment, or maybe a piece of advice, or just something that embodies some of your biggest takeaways from this project.

Sure, I'll start. I think my two sentences, or sort of my summary of this: one, decriminalize HIV—like, we shouldn't be criminalizing our way out of a pandemic. I think we are learning this with COVID, and I think we've learned this time and time again: we're not going to solve the HIV crisis with cops. Then, on two other levels: public health surveillance is not an overriding good; we need to be more creative in the ways in which we build public health surveillance infrastructure. And then sort of my last note—more than two sentences, I know—design has to attend to law, policy, history. Like, we can't be out suggesting how we design the world if we don't attend to the ways in which the world has fundamentally oppressed, subjugated, and otherwise, you know, marginalized folks. Critical technical practice is good work. Interdisciplinarity is great. And Calvin and Os are awesome—it's been really, it's been a treat working with them. And I think also just unpacking a lot of the personal stuff in this—like, just to, you know, live this and then to write about it is cathartic in some ways, to be able to share that experience but also to, you know, attempt to unpack it further. This isn't the panacea; we have not solved everything. This is the beginning of a, you know, a conversation that we hope to support, and hope other folks can take and run with. And, you know, disarm informational power from the state when possible.

I would 100 percent agree that, like, Calvin and Jevan are awesome. This is one of those cases where, truly, like, the friends you make along the way are the ones you need. Well, like, I was already friends with both of them—although I still can't work out how I know Calvin, unless I just, like, ran into Calvin at visit days and was like, oh, you look super gay, let's be friends. I guess my other main takeaway, though, would be to echo what Jevan was saying about, like, factoring in the state and factoring in history, and to sort of, like, redouble and re-emphasize it: stop just working with the state and assuming that that is the same as it being good. And in fact, this is, like, a very specific, almost subtweet: if you do a paper about how you are collaborating with cops on developing new software for tracking sex trafficking victims, and you state that you personally refuse to take an opinion on whether sex work is moral or not, and also the cops pinky-promise that they won't use the software to track down voluntary sex workers, and so you believe them because they are cops—you are a bad person and also an idiot. You should not trust cops. If you are going to trust cops, you should not have a place in our discipline. And if this is bewildering to people who are listening, I would really, really recommend reading Our Enemies in Blue, which is a fantastic book—currently 50 percent off through AK Press—on the long history of why the police are not your friends. I'm going to take a different spin on this.

I think, speaking to designers and technologists and builders, right—I think it's important to just recognize that design has a lot of power, and it's really easy to, I guess, invalidate this work and the things that we're kind of challenging here.

But like, I just so strongly believe that design has the ability to reproduce social norms.

But then it also has the ability to challenge reality and, like, reshape how we think about HIV or dating or, you know, whatever. And so, I guess—yeah, like, tacked onto Os's point—designers just have this responsibility to factor in the law and, like, histories of oppression into these design decisions, because ultimately they have these real, lived consequences for people. And I think it's, I guess, easy to forget that design has individual effects on people. So don't forget that—that's, I guess, what I'd say.

We want to thank you all so much for the work that you're doing, and we'll make sure to link to the paper and also to your Twitters, or wherever people can follow your individual or collective research in the future. And again, just thank you so much for joining us today.

We want to thank Calvin, Jevan, and Os again for joining us today for this great conversation—our first panel—with a special thanks to First Monday for publishing their article, and to Kate and Mara for editing the HIV/AIDS edition of the publication. So, Dylan, what is your immediate reaction from this panel?

I was really grateful for the vulnerability and the openness and honesty that everyone on the panel showed up with. I thought it was a really down-to-earth conversation about why this matters, and why design choices matter in how we design our technology, even when it comes to something like dating apps. And—I don't know if you've been on dating apps recently; you don't have to disclose—but in my experience, I would never think about the massive amounts of data that are being collected and, like, where that's going. And then you take into account, you know, this topic of sexual health and HIV and AIDS, and there are just so many topics that I feel are not discussed in the way that they should be when we talk about technology design. Yeah, definitely.

And I mean, this is something that keeps coming up again and again in our interviews and on this show: the different systems of power, and how technology plays into those systems of power. So I'm going to add one more thing to that list of things that are synonymous with power, and that is design decisions. And just like you were saying before—I mean, the people who create these platforms probably aren't thinking through the possible unintended consequences, or the potential surveillance and incarceration and biopolitics, of something as simple as adding a checkbox to a dating app profile. But those decisions have important consequences, and those design decisions really matter.

Yes. Since this conversation, I've been reflecting on concepts of stigma and technology. And there was something from this conversation that really stuck with me about this idea of public health—like, the idea of the common good of public health, especially since, well, we're recording this during the pandemic—and how often that concept of "well, this is for the public health" gets used to design technology.

And I think it was Jevan who was talking about all the different ways in which that gets used against people with HIV: that you're constantly reminded that you're not part of the public good, that you're not part of the public health, and you get put in this box, and you have to just constantly disclose and constantly, you know, come out as being HIV positive. And just how complex all of that is in terms of stigma, and how you design to be liberatory for folks who are impacted by HIV and not just create more cycles of shame and oppression. Yeah.

This is even something that Charlton said in our most recent episode, with All Tech Is Human. He was talking about, you know, the different roles that computing technology can play in our lives, and how it's up to us to decide if we want that role to be harmful—if we want to use technology to contribute to, you know, systemic oppression, to help create things like a surveillance state, and to only benefit large tech companies and government bodies—or we can choose to flip the narrative and create technologies that empower and, like you were just saying, liberate and help and create positive benefits for society. And this is a perfect example of how to maybe go about doing that in a scenario we might not even think about normally.

So for more information on today's show, please visit the episode page at radical A.I. dot org.

If you've enjoyed this episode, we invite you to subscribe, rate, and review the show on iTunes or your favorite podcatcher. Join our conversation on Twitter at @radicalaipod. And, as always, stay radical.

