Anti-Trust: Congress and the Tech Lobby with Anna Lenhart



What should you know about antitrust regulation nationally and internationally? How does the tech sector drive policy? In this episode we interview Anna Lenhart.

Anna Lenhart is a researcher on technology policy and democracy at the University of Maryland's iSchool Ethics & Values in Design Lab. She recently served as a TechCongress Fellow with the House Judiciary Committee's Antitrust Subcommittee and supported the investigation into Facebook, Google, Amazon, and Apple.

Follow Anna Lenhart on Twitter @AnnaCLenhart

If you enjoy this episode, please make sure to subscribe, submit a rating and review, and connect with us on Twitter at @radicalaipod.



Transcript

Audio automatically transcribed by Sonix



Welcome to Radical AI, a podcast about technology, power, society, and what it means to be human in the age of information. We are your hosts, Dylan and Jess.

In this episode, we interview Anna Lenhart, a researcher on technology policy and democracy at the University of Maryland's iSchool Ethics and Values in Design Lab. She recently served as a TechCongress Fellow with the House Judiciary Committee's Antitrust Subcommittee and supported the investigation into Facebook, Google, Amazon, and Apple.

This seems like a particularly timely episode to air right now, considering the shifts that are going on behind the scenes at Amazon and some of the other major tech companies. And so it is our pleasure to share this interview with all of you today.

We are on the line today with Anna Lenhart. Anna, welcome to the show.

Hi, it's great to meet you.

Great to have you on. And today we are talking about antitrust, amongst many other things. So could you just start us off by telling us a little bit about what antitrust even is and how it relates to A.I. ethics?

Yeah, yeah, it's a good question.

Let me just start with: I am not an antitrust lawyer. I'm not a lawyer at all. I have a master's in public policy and an undergraduate degree in engineering, and overall I consider myself a data scientist and data ethicist. But I worked on antitrust this year because I received a fellowship through TechCongress, which is a program that puts skilled technologists in congressional offices to support policy work. So I had the amazing opportunity to work with Congressman Cicilline and his team on the Antitrust Subcommittee of the House Judiciary Committee, and specifically I was supporting their investigation into Apple, Amazon, Google, and Facebook. And I was asked to join not for antitrust scholarship, but because what the investigation was really trying to do was ask some hard questions about existing antitrust law.


So the purpose of the investigation was really to look at the state of existing antitrust law, and existing antitrust law, most of the precedent, was set well before these digital markets existed, well before people thought about selling things online through the Internet.

And so what the investigation was really trying to ask is: do we need to change these laws, or is there a way to use them more effectively in this new marketplace? And specifically, what makes the digital market unique is the way it uses algorithms and data.

So, for example, Amazon: when you go to the website and you search for a product, it is able to rank all of the sellers' products in its large repertoire of products that it sells. And not only does it rank them, it also chooses one item to put in the buy box, and about 80 percent of sales go through the buy box. So this is an incredible amount of power that Amazon has over which products get sold online. And not only that, Amazon also has their own lines of business that intersect with Amazon.com. First, they have Amazon Basics, which is sort of their own products that they can sell and potentially favor through this algorithm. And then additionally, they have their own delivery service, Fulfillment by Amazon. And what we found, and what a lot of researchers and market participants have found, is that sellers that use FBA, so Fulfillment by Amazon, have a better chance of getting that buy box. And so the question for us as policymakers and people studying antitrust is: is that anti-competitive? Does that hurt the market? And if so, do we want to do something about it from a policy standpoint? These are the types of questions we were thinking about, looking for evidence of, and really trying to start a conversation about. Algorithms are certainly one piece, and then there's also the use of data. So Android, for example, has the Play Store, and if you're an app developer and you want to get your app onto an Android phone, you're probably going through the Play Store. And what that means is that Google has an immense amount of data on your app. Now, what do they do with that data? Are they just collecting it to make sure that they're serving customers properly, or are they using it to see which apps are growing most quickly in certain regions so that they can create a competitor? These are really important questions if you're thinking about a fair and competitive market. So that's really what I've been working on and thinking about, those types of questions.

Yeah. So as I mentioned before we started this interview, I'm very ignorant about antitrust. Before I prepped for this interview, I was even thinking about the wrong type of trust. You've gone into what's relevant right now in this age of information in terms of antitrust and keeping markets competitive, and I'm wondering if we can take a step further back for folks who, like me, are ignorant about antitrust laws in general: why did they start, and what were they trying to regulate or change in the market?

Yeah. So antitrust laws were designed to protect consumers and ensure a competitive market where new ideas and innovations can flourish. More recently, this has meant protecting consumers from high prices. But there's a really important ongoing conversation right now about what high prices mean in a digital market where users are paying in time, attention, personal data, these types of things. For myself, as someone who's thinking about technology policy broadly and less about legal precedent or legal definitions of consumer welfare, I'm thinking about fundamental democratic principles. Monopoly power, especially in markets that have become necessary for daily life, and I would argue that the digital market is necessary to daily life, is a threat to democracy. For one, it means that a remarkable amount of decision-making power lies in the hands of leaders who were not elected. So today, the CEOs and other executives within Apple, Amazon, Google, and Facebook are determining what information we see and when. We certainly saw that play out over the course of the election. But they're also making decisions about how our data is used and stored and analyzed, and none of these decisions are transparent. Additionally, the longer this goes on without regulation, without oversight, users start to become desensitized and start to find these practices acceptable and normal. And a lot of what's happening with this concentrated power is that they're setting the standards for the Internet, for the digital economy. Additionally, this is all leading to a loss of choice.

So one feature of democracy that is captured in a functioning open-market society is this idea that consumers can, quote unquote, vote with their dollars. For example, say I have two convenience stores within walking distance of my house, and one offers a living wage and benefits to employees and the other doesn't. I can choose to shop at the corner store that is most aligned with my values. Monopoly power starts to take this choice away. And I actually think Facebook is a great example of this. They are the dominant social networking site; there were a lot of investigations that happened in 2020 that confirmed this. While you can turn to other places such as TikTok and YouTube for media consumption, in terms of your social graph, who you know from high school and who they know, your cousins and your cousins' cousins, you really can only get that in one place. And as an advertiser who's trying to reach friends of your target customers, Facebook is your only choice. So this summer, during the advertiser boycott against Facebook, Zuckerberg was actually able to tell his employees, and this is a quote, "We are not going to change our policies. All these advertisers will be back on the platform soon enough." And I just think this shows an inability of consumers to hold Facebook accountable. We see this democratic principle being lost in many of the digital markets today.

So what happens with this monopoly that these companies hold, then? Because if we're not able to leave and they have all this power, I'm assuming this is something you're working on with Congress: OK, there's antitrust work that needs to happen here, so what do we do?

So this is what the House Judiciary investigation was all about: why does monopoly power in the digital markets matter, and what do we do about it? It's important to mention that the House Antitrust Subcommittee did release a report. It's over 400 pages. I do not expect your listeners to read it all, but I do encourage them to check it out and look at the table of contents; there's probably some interesting stuff in there for this community, and we can certainly put it in the show notes. It covers the landscape of digital markets broadly and includes a range of recommendations for actions that Congress can take. I do want to clarify: this is different than the antitrust cases that we're seeing introduced by the state AGs, the FTC, and the DOJ. Those are cases under existing antitrust law. The recommendations in the report are really about new laws and new actions that Congress can take. There's a spectrum of recommendations, including just increasing capacity for antitrust oversight. There's also structural separation. But as data scientists and as people who design and build technology, the recommendations that I think are interesting for us to weigh in on are things like: how do we curb self-preferencing? For example, what defaults are important for accessibility and ease of use, and what defaults are anti-competitive? Research shows, and human behavior just indicates, that users do not change defaults.

And we know this has huge implications for privacy. But what does it mean for search engines and browsers and email applications that are not owned and operated by the dominant platforms? How do they get access to users? You've had several speakers on this podcast who discussed fairness in algorithms and even potential policy approaches for minimizing discrimination. Can we steal some of those ideas around audits and bias tests to make sure that Google does not preference YouTube videos over other video platforms in search? I think that's an exciting conversation. The other recommendation area discussed in the report is around interoperability and portability. So if a new social networking platform were to emerge, should Facebook be required to let me post to my grandmother's wall through my account on a new service? If so, what is needed to protect that data? Do there need to be any limits on its use, that type of thing? All of this is to say that there's an array of options for Congress to consider and a lot of space for data and democracy experts to be involved. So I think it's an exciting time.

Now, can you talk a little bit more about the report that you're referencing? Just a little bit more about what went into it, how it came to be, and what you hope the impact is going to be.

So when I first started the TechCongress fellowship, I thought of Congress the way I think most Americans do, which is that it's a government body that creates new laws and does appropriations. But really, another big part of Congress's work is oversight, both of the executive branch and of industry. When I started the fellowship, the House Antitrust Subcommittee was about halfway through their investigation. And this was a really thorough investigation with a lot of information; I can give some insight into that. First, the subcommittee sent requests for information to each company and received over a million documents in total. There were many nights in 2020 where I was up until one a.m., two a.m., three a.m. just reviewing documents. You can really get lost in it. I would even have dreams about documents. So it was an interesting year, for sure. Additionally, the subcommittee sent requests for information to 80 market participants, businesses that rely on or compete with the big four companies. But then, unofficially, many more companies came and spoke with us, along with former employees and startups. A lot of people really contributed to this investigation, and I'm really grateful for that. Additionally, there were seven public hearings. Your community is probably most familiar with the July CEO hearing, but the other six hearings were also quite interesting and are all public, so I do encourage people to check them out. And lastly, there's something I don't think gets talked about a lot: after the major hearings, there is an opportunity for the subcommittee to send what are called questions for the record. These are just written questions that are sent to the big four companies, and they can give a written response. Some of these questions are really wonky, so we have questions about browser engines, API changes, and algorithms. In some cases you get a very boilerplate non-answer, but I actually find that the questions themselves are quite interesting. And again, those are all PDFs available on the House Judiciary website, so it could be fun to check those out.

I'm curious about how you as a technologist got involved in this work, and what your unique perspective is on this, because it seems like we're seeing Congress more and more turning to folks who are coming out of the tech industry to either testify or to speak to what should be done about the tech industry. And you're also inviting other technologists to be part of that conversation. So I'm wondering about your own story and how you got involved as a technologist in this work.

Yeah, so I was really excited to join the antitrust work because I am very passionate about artificial intelligence and what the future holds with artificial intelligence.

So I think the ability to use data and to make predictions and build these models that can really make things more efficient has huge promise in areas like medicine, agriculture, potentially smart cities, and a host of other areas.

But the more I think about that, the more I also get worried about human rights concerns.

And my concern with having a few companies that control the space is that they will get to set the rules for how this technology is rolled out. And the more unchecked their power is, the more they're going to write those rules to favor their bottom line and their profits.

And so when I came onto the team, I was asked to look into three main areas, and they just happen to be three areas that are probably the most important to artificial intelligence and kind of where we're going market-wise.

So the first is cloud computing.

So I was looking at questions around the overall market, which is a very concentrated market, and this shouldn't be a surprise to anyone. When I say cloud computing, I'm thinking big data centers that are in charge of our storage and compute, often referred to as "the cloud," but very much physical data centers that we access through the Internet. It's a concentrated market because it's a market that benefits from economies of scale and economies of scope. Right now there are kind of three to five major domestic players. I was looking closely at AWS, which is about 50 percent of the infrastructure market, and then Google, which is growing quite quickly. And I looked at Microsoft as well, as a major player, just to understand the market broadly.

And it's a fascinating market to think about for a couple of reasons. One is the role that AWS plays in Amazon's overall business and just the way Amazon is able to leverage AWS. AWS is Amazon Web Services, which is Amazon's cloud computing service. Amazon as an overall company is able to use that infrastructure to then build their own products on top, so they essentially own the inputs into Amazon.com, into Twitch, and into their whole wide array of products. Whereas Walmart, for example, also runs their e-commerce site on the cloud.

But they don't own and operate their own huge infrastructure-as-a-service business, so they have to rely on someone else for that input. And so there's this interesting sort of tension, and I would argue conflict of interest, where you have companies that want to work in the online retail space, and they essentially have to support their biggest competitor's other major business line. And I think it's really important to remember, too, that AWS, while it's only about 10 percent of Amazon's overall annual revenues, is the majority of Amazon's operating profits. So when you are an AWS customer, you are certainly supporting Amazon and their overall business. That tension, I think, is really interesting to dive into. And then I think it's also critically important to think about cloud computing from the standpoint of critical infrastructure for our country, for national security. And getting back to what I was talking about a little bit earlier about choice: say I decide I don't want to use Amazon because I don't agree with their practices. I am not an Amazon Prime member, and I do try to avoid Amazon as much as possible. But the truth is, I use Amazon every single day, because government websites are hosted on AWS, and critical services and information I need access to are hosted on AWS. And this is a really important question for us to be thinking about as a country: are we OK with one company being responsible for that much storage of our personal data, but also just having the ability to turn off all operations? Not that AWS ever would, but I just think it's something we need to really think about.

So you mentioned there were two other areas that you were asked to look into. What are those? I'm just very curious about the other markets.

So I was also looking at browsers, and browsers are a very similar story, right? It's kind of critical infrastructure; it's the way we access the Web.

And we saw very similar themes there as well, with Chrome and the ability for Google to use Chrome to potentially favor their advertising business and, absolutely, their search business. And again, there are just these core conflicts of interest in that space.

So it's a very similar story as we think about critical infrastructure. And then the third market, which was the most exciting to look at, was the Internet of Things market.

So specifically voice assistants. I was looking at Alexa, Google Assistant, and then, of course, the whole array of hardware that is part of that ecosystem of voice assistants. I've been following that space for a while as someone who cares deeply about privacy and user rights, and I think it's a really interesting space from the privacy perspective. When you buy an Amazon Echo, for example, you as the user put in your username and you give Amazon permission to listen for the word "Alexa" and then do a short recording, but no one else is giving that Amazon Echo permission to make those recordings.

So when I go to my mom's house and she asks Alexa to play her favorite song while I'm speaking to my boss in the background, I did not give Amazon permission to collect that.

And so I think there's some really interesting privacy questions. And again, you only have a few players in the space.

And right now they're really setting a precedent, really setting the rules for what the opt-in looks like, which contractors are able to get access, all of that type of stuff.

So I was kind of following it from that perspective. And then from the ethics perspective, I got really fascinated by what it means for the user interface, like a screen, to no longer exist and to just be a voice. Essentially, what it means is that your rankings now disappear. Going back to the Amazon example from earlier: when you search for a product on Amazon.com, you see the buy box, but you also see listed underneath all the other products you could potentially purchase as well. Now that we switch to a voice assistant, all of a sudden that list no longer exists, and when you ask Alexa about a product, it gives you one answer. And so your choices have really, really shrunk. And again, when you think about how few companies are operating in the space, for them to have that much decision power over the answers we get

from an information platform like that, I think it's really concerning.

And so from an antitrust standpoint, no surprise, we saw a lot of the same themes that we've seen in the other markets. You see a lot of self-preferencing, the same thing.

There's a lot of initial analysis that shows that Alexa will favor Amazon's products if you ask Alexa to shop for you. And we know that there has been a lot of ongoing debate between Spotify and Apple over whether Spotify can be the default music player for Siri.

And then Google, same thing. If you ask a Google Assistant a question, it's using Google Search to run that query and adding that voice query to Google's overall database of queries, so it's getting more and more query data. So I think it's a really important market to keep an eye on, because that choice architecture is changing and we're seeing a lot of the same sort of anti-competitive behavior.

So we know that some of your Ph.D. work is on the democratization of technology.

But I guess my question is about trust in a different sense.

So if we're looking at the US, where we're both based, we have this government that is on Amazon, on AWS, right, on Amazon's services. And we have a Congress that is making laws about how to regulate those exact services.

It seems like there might be some level of a conflict of interest, where for me, as a layperson who's not in politics, I start thinking to myself, huh, what lobbying is going on behind the scenes here?

And so I guess my question for you is: how do we make sense of antitrust and anti-monopoly regulation when we know that the conversations and the tethers between industry and the people who are regulating industry are so tight?

Yeah, it's such a hard question. And I hate to give this answer, but I think it's really where we have to start, and it's transparency. One of the really big issues, especially with the cloud computing contracts, is that when I went to try to understand them, I worked with CRS, which is the Congressional Research Service, and I called them up and said, hey, I'd really love your help understanding just how big Amazon's market share is in the federal cloud computing space. And when we went in and looked at the data, it was just really, really hard to know, for two reasons. One, only contracts that are in what's called the public trust space, so non-intel agencies, think HUD or the Social Security Administration, not defense and intel, are the only ones that have their contracts publicly listed. But even in those contracts, the AWS business relationship is actually normally a subcontract, and that's not reportable or easy to find. So a lot of times you couldn't even really tell which of those public trust contracts are actually AWS. And then over in the defense space, you just really have nothing, with the exception of the few very high-profile contracts that get reported on. There's nothing in a public database. So that's the first thing we really need to do: we need to understand the scope of the relationship between our government and these technology companies.

And I do understand that there are security concerns; I think we can be mindful of that. But I think there is a balance of what the public can know, and I certainly think it's more than what the public does know.

And then from there, I think there needs to be a question of: is this what we want, and how are we going to address some of the conflicts of interest? And I think that's a harder question. What is the relationship with the private sector, which right now is where a lot of our top technology talent lives? I think that's something we could change. There are some discussions around publicly owned cloud computing collaborations, which could potentially not involve the private sector or could have a public-private relationship. We could certainly hire more talent within the government. So I think we could change some of these things. But without really knowing how deep some of it goes, it's hard. It's hard to even have those conversations.

So along the lines of hiring more of this kind of talent for the government, this is a conversation that happens a lot in the responsible tech community. People keep saying, hire computer scientists to work in policy so we can actually make change. And you are one of these people, so I'm wondering, what is it like on the inside? Are you met with warm, welcoming arms, or are you met with hostility? What has your experience been?

Yeah, yeah. I think people are really grateful to have expertise of any type, but certainly technical expertise. Congress is very underfunded and understaffed, in my opinion, in general, and that just makes things really, really hard.

It means that talented people have a really hard time working there with the salaries that are given. And normally, if you don't come from a privileged background, you're going to really struggle to work there for long. The staffs are also just very tiny, even if you have a very, very talented staff. We had about four lawyers, myself, and then some support staff investigating the four largest companies in the world, all of which have huge armies of lobbyists. Some recent reporting came out that Amazon has as many lobbyists as there are members of Congress. It's just a huge fight.

Right. So you can put a lot of talent on the Hill, but it's a big fight. That's the first thing I'll say. And the second thing I'll say, regarding tech talent specifically, is that I think it helps a lot. But where it hits its limits, in my experience, is that these companies, especially the four that I was focused on this year, are incredibly talented at not answering your questions, incredibly talented at keeping the way that their products work, the way that they use data, and the things that they're investing in secret. Sometimes it was just a master class watching the responses that these CEOs would give members of Congress and their ability to try to make policymakers feel stupid, to try to make policymakers feel like they don't understand or can't understand. And I think that's really, really dangerous. You don't need to be able to code to understand at a high level how products work, and you shouldn't have to. If you ask a company something and they know what you're asking, they should answer. They should not try to tell you that you didn't ask it the right way or nitpick the language you used. And so that's, to me, one of the things that frustrates me the most about these companies: they don't want policymakers to understand their products. And there's no amount of tech talent you can put on the Hill, I mean, unless they can read minds, right?

Yeah, well, you just answered the question I was going to ask, which is: OK, so why do that? Why would tech companies do that? All of this makes me really skeptical. That's the feeling that I have. Maybe it's just me, but, you know, I'm skeptical of Congress. I'm skeptical of the tech companies. I'm skeptical of just how much money is here.

And there's definitely a part of me, and maybe it's because it's getting to the end of the semester of the Ph.D. for this year, that's just asking: does the system work? And obviously you're someone who's there in the system fighting to make changes, but I'm wondering if you have thoughts on the degree to which we need to uproot some of these deeply entrenched issues.

Yeah, you know, it's not going to be easy, right?

The money-in-politics problem is so, so deep, and unfortunately it's going to be very, very hard to undo, because the reforms that are needed are going to require a lot of policymakers in the House, the Senate, and the executive branch who want to see those changes made, which we're just very far from right now.

With that said, I have a lot of hope.

There are some really incredible activists, and you all know this, you've had a lot of them on your show, who are bringing as much transparency to these issues as humanly possible. You have some incredible whistleblowers from inside these companies who do have the real answers and who aren't as skilled at not answering questions; they can come out and help us, and they are.

And then you have some really incredible antitrust scholars who are ready to really push the bounds of what these laws can do and potentially rewrite them. None of this is a silver bullet. I don't think antitrust reform is going to be the silver bullet for holding these companies accountable, which is why I'm also very passionate about algorithmic impact assessments and improving oversight of algorithms that are used in social services. All of that needs to be done as well. But more and more, we have people who care about this. And the other piece of hope that I can give on this issue particularly: prior to TechCongress I was working at IBM, in the federal government space, and part of my work, which was really fun, was that I got to train the new hires, the people fresh out of undergrad, in sort of responsible A.I. and the things they should be thinking about on their projects.

And they were all very, very fascinated and passionate about this work, which is very different from when I was an undergrad. I was not thinking about this stuff that much; people weren't really talking about it. It just wasn't part of the conversation. But now it very much is.

And these young people who are coming into the field, most of them have either taken some kind of ethics or bias-in-algorithms course, or have been following the topic and teaching themselves as much as possible. So that's exciting.

It's actually one of the things that Dylan and I really appreciate about some of our younger audience, and the audience that's coming straight out of undergrad: there is this passion and this fire that's lit underneath everyone right now to try to help with some of these problems. And I'm wondering, what do we do with the people who don't have that fire, the people who are the ones trying to confuse policymakers into thinking that they need to know how to code? How do we motivate them to be better?

Yeah, that's a fascinating and hard question, and something I ask myself. It actually blows my mind a lot of the time: why don't more people who work in tech care about living in a functioning democracy?

And I don't know if I have an answer. I mean, the easy, low-hanging-fruit answer is just the paycheck that you bring home. But I don't think that will work forever.

I really don't. And I think you're seeing that with some of the mass exodus that you're seeing at Facebook particularly, which has some of the highest salaries in the field. And I think you're going to see it more and more.

And at some of the other companies, too, the workers are going to start to say it. Especially now, with potentially a new sort of workforce where people don't have to live in Silicon Valley anymore, I think that's really going to open up the space here, because people are going to be able to say, I don't need to make four hundred thousand; I can make one twenty because I don't live in Silicon Valley.

So I think that's actually going to take away at least some of the hiring power that comes with these companies. So I'm hopeful about that, too.

Yeah, it's all a heavy lift.

Well, besides making you solve all of the world's democracy problems, we did want to learn a little bit more about your research, and if you could tell us a little bit more about the other stuff you're working on besides antitrust.

Yeah, so it actually dovetails quite nicely. My research is really about how we involve citizens in the technology policy process. Similar to the way that tech CEOs are very good at making policymakers feel like they need to know how to code to have any say or any ability to regulate big tech, among people who craft technology policy there's this narrative that I find very problematic: that the people who should be crafting tech policy should be experts in technology, either experts from an academic standpoint or experts from working at these tech companies, and that everyday citizens who don't know how to code or aren't really familiar with, say, a cloud computing stack shouldn't have any say in the technology. And I think that's really problematic, because everyday citizens do interact with the technology every single day and certainly have values, and not just opinions, but values and a voice to add to the conversation around how these technologies should be regulated. So there are two models that I use a lot. One is called a citizens' jury or citizens' panel; one is called a consensus conference. They blend together in some ways, but these are essentially models where you take a technology and you essentially put it on trial.

And instead of having a bunch of experts frame the work, you bring in laypeople who are then able to ask questions of the technology. For example, for my master's research, I did a lot of work on autonomous vehicles, and I organized a citizens' panel, a consensus conference, on autonomous vehicles in southeast Michigan. We brought in about 10 community members, all different backgrounds, no expertise in autonomous vehicles, and we gave them some high-level background on how autonomous vehicles work, what the current discussion about them is, what some of the safety concerns are, what some of the promises are, the wide range of views on the technology. Then we had a really thoughtful discussion and workshop, and from there they were able to ask experts their questions. So they were really able to frame this conversation about the technology and the way this technology would look in their community. And then we all came together and wrote a report, an official kind of citizen statement, that was then presented to policymakers. It was really meant to represent the views of citizens and the values citizens have and want to see emphasized as this technology is used more in society.

So I'm currently doing something similar with content moderation, and really this question of Section 230 of the Communications Decency Act, which is under a lot of scrutiny and debate right now, but essentially is the law that allows companies like Facebook to make content moderation decisions. It essentially tells them that they are not liable for the content that's on their platforms and, more importantly, that they are allowed to essentially censor certain pieces of content if they want to; they have the choice to do that. And so it's funny: almost everyone in our country uses some sort of platform that relies on Section 230, but very few people, up until maybe very recently, knew what Section 230 was, even though they're clearly very impacted by it. So my lab is designing a game in which everyday people first play the role of a startup social media company that sets and writes its own policies, and then they switch to the role of a content moderator that actually moderates borderline content, so really tricky content. Through this process, they really get this experience of what the challenges are and what things could be done to mitigate platform harms but also protect free expression. You get into these really interesting conversations around these values. And then from there, the hope is to bring in experts again, have the citizens use their experience from the game to ask questions of experts and frame the conversation in the way that they would like, and then to think about what recommendations they would make, whether to potentially reform Section 230 or not, or guidance that they would like to share about platform governance more generally. So it's pretty exciting. But really, overall, I see my work as trying to hit on these narratives of who has a voice in technology and the way it's used in our society.

And for people who want to follow your scholarship, where can they find you or connect with you?

So my website is codecolloquy.com, and I actually host a book club that I think your listeners may be interested in. Normally the book club is held at the local D.C. libraries, but right now, in pandemic times, we're on Zoom. We cover books that are about the intersection of social issues and technology. So we just discussed Ruha Benjamin's Race After Technology. We've also covered books like Hannah Fry's Hello World and Virginia Eubanks' Automating Inequality. And in a few weeks, we're discussing The Code by Margaret O'Mara, which is super interesting and I think very timely going into the next Congress. I'm also available on Twitter at @AnnaCLenhart. And one of my goals for 2021 is to tweet a bit more, but not too much.

And so, thanks. It was really great talking to all of you.

We will be sure to include that and all the other links that we mentioned and more in the show notes. But for now, Anna, thank you so much for coming on the show, and thank you for helping us understand what the heck antitrust is in the first place. We want to thank Anna Lenhart again for joining us today for this wonderful conversation. And Dylan, I'm going to throw it over to you first. Antitrust: what do we think about it?

Antitrust regulation: good. End of discussion. Perfect.

No, I thought that was a really great and illuminating conversation for me.

As I mentioned pretty directly in the interview, I don't know a lot about antitrust law. And honestly, like if I think about a lot of my colleagues in some of these spaces, it's not something that's mentioned that often.

And so first, it's just wonderful that people are having these conversations. I knew Congress was having these conversations, perhaps, in air quotes, but this conversation with Anna really made me see that these conversations are happening on the ground and are hopefully having real consequences.

I think the thing that stood out to me in this conversation, and something that I didn't really know, was just the level of lobbying that happens at the congressional level. Especially when Anna said that Amazon has pretty much one lobbyist for every member of Congress, the sheer size of that lobby is, I guess, not surprising, but it still is shocking to think about.

And that's something I don't entirely know what we can do about.

So I'm happy that Anna and folks like her are out there, especially when she's talking about underpaid congressional staff who are making these massive decisions or creating these investigations, which are necessary in order to try to figure out a better strategy around antitrust regulation.

It does seem to be a David and Goliath kind of topic, in that the people with a lot of money, namely folks like Amazon, AWS, et cetera, have a lot of leverage in these spaces that understaffed congressional investigations may not have.

What about you, Jess?

Yeah, I mean, I'll echo what you were saying at the beginning there. I think that on the show we've definitely heard a lot of people say, again and again, well, how do we fix tech ethics?

We make policy. And then the conversation just kind of ends right there. And I know that we've shared some mutual frustration over that, because it's really easy to just say, let's make policy and let's regulate things. But I don't think that we've really had the opportunity to dive into what exactly that can look like on the ground, and this was one of our first glimpses of what people are actually doing within Congress and within legislation to make some change happen. So I really appreciated Anna's insight into that, because there were a lot of things I learned in this interview that I've never even thought of before. And then, to echo the second half of what you said, I agree. I think power is really ridiculously distributed amongst the tech giants here, and the fact that they have lobbyists at all is something that is scary to me. But then I think about who those lobbyists are and what they're masterfully trained in, and the fact that they are basically there to make Congress feel like idiots, like they need to code to understand these things, like they need to be computer scientists or experts or professional technology engineers to even have a conversation about how to regulate this stuff. That's super scary to me. And I know that my subjectivity is showing, because I'm all about education, but it's very disheartening for me to know that there are a lot of people who want to be gatekeepers and want to build barriers, instead of breaking those down and wanting to encourage and foster this collaborative, communicative conversation to help democratize our technology together. So that discourages me a little bit and makes me kind of sad. But I understand that that's the reality that we live in.

So I watched some of those hearings, especially with Mark Zuckerberg when he was in front of Congress, and I think Anna's right on. I think that some of these companies and their representatives use the complexity of these systems, whether it's recommender systems, deep learning, machine learning, whatever; they use the jargon around these systems in order to make them obtuse on purpose, and then use that as kind of a bludgeon to get Congress to move in a certain direction. It's almost like: well, you don't understand this, but we understand this.

We've made a billion-dollar industry off of it, so let us handle ourselves on this; it'll be fine. Don't worry about it. You just have to worry about keeping your seat, so just make your constituents happy. We'll handle this tech thing, and let's keep it out of politics.

And what that's created is, again, exactly what Anna's describing, which is this division between Congress and policy on one side and the people who are working within the companies on the other. And honestly, I think we should interview someone on the policy side in the corporate world to see their perspective on this space. But my understanding is, and I guess it makes sense, that the corporate world wants to keep its sovereignty and doesn't want big government getting in the way of, basically, making money. But, God, that's hard, especially because, as you're saying, there's a real need for regulation, at least if we don't want this to continue to spin out of control, and for the government to take some ownership.

I mean, I think of the example of Amazon and AWS, and just how much data, how much individual data of citizens, even global citizens.

Right.

There's a lot of power just in having that. But then what if they choose to do something with it, and there's no regulation at all in place to hold them accountable? That's really scary to me.

I mean, speaking of scary, let's go back a few steps and talk about money, because that's what I was thinking about this entire episode. When I think about trust and antitrust, I think about money, because the only reason people want to monopolize a certain sector of the economy is to make as much money as possible, at least generally, or maybe, I don't know, for power and whatever else comes with money. But it's unfortunate, because I think that when a lot of these large tech organizations make it so that their only incentive to design and to deploy is to make more money, then the users get left out of the conversation. And I felt that a lot when Anna was explaining some of her citizens' panels and consensus conferences. Those are really interesting ideas that I would love to see implemented in action, and I think that whatever ideas the people in these conferences come up with to design better technologies are amazing. But the reality is that if those ideas aren't then taken to Congress to change the laws, or if those ideas aren't then taken to the engineers and developers and designers and program managers at these large corporations to deploy them into their platforms, then nothing really changes.

And so it's unfortunate, because I see so much potential here. But then I'm thinking, these people have to be motivated to change. And if the lobbyists are out there to try to just confuse Congress into thinking that they know nothing, in order to keep monopolizing the industry and making more money, something has got to shift in their money mindset, to make them not so crazy about making money, so that they can actually care more about the users and the people who are impacted by their decisions. And that's really hard, because I can't just walk up to Mark Zuckerberg and say, hey, why don't you just want to be less of a monopoly in the social media industry? Can you just do this for the good of the people? Can you just try to be less of a monopoly? Can you just help antitrust laws get passed to make Facebook a little bit less of a monopoly, or a little bit more decentralized? Obviously, Mark Zuckerberg isn't going to come back at me with a smile and say, sure, I'd love to do that, power to the people. And so I love this conversation, but it's leaving me wanting to know: what next, you know?

Yeah, well, and it goes right to how complex these systems are.

You know, we can draw this parallel, or this binary distinction, between the government over here and the tech industry over there, but there are people behind the scenes in both, right? They're not actually divided in terms of the individuals. And for me, it's like, where can you even start when the system is so complex and is just going to keep perpetuating the same system that we already have?

And so I'm left with a question. You know, we all heard about the markets a few years ago, about, you know, too big to fail.

And I feel kind of similar here, where it's not too big to fail, but are they too big to succeed? Is the scale just at such a size that there's no way to make inroads without getting down into the very root and pulling up some of the rot, I would say?

But, you know, time will tell. And again, I'm glad that Anna is working on it.

Congress, thanks for solving all of our issues with capitalism and trust and monopolization in America and globally.

But on this note, for anybody listening: if you are someone on the policy team within a large tech organization, or someone who might be willing to speak to these issues, even anonymously, please do let us know, because we would love to hear from the other side so we can have a little bit more of a holistic opinion about some of these issues. But for now, for more information on today's show, please visit the episode page at radicalai.org.

And if you enjoyed this episode, we invite you to subscribe, rate, and review the show on iTunes or on your favorite podcatcher. Catch our new episodes every week on Wednesdays. Join our conversation on Twitter at @radicalaipod. And, as always, stay radical.
