Logbuch:Netzpolitik
Insights into and perspectives on net-political developments
https://logbuch-netzpolitik.de


LNP504 The politics of intellectual shame

A discussion with Meredith Whittaker

For this episode we made an exception and decided to record an interview in English. We are talking to Meredith Whittaker, president of the Signal Foundation, the non-profit that develops the Signal app, the world's most popular multi-platform private encrypted messenger. We talk to Meredith about her way into the digital realm, how she shook up Google by organizing walkouts and more or less erasing all memories of its "don't be evil" motto, how Signal came to be and what its principles are, how she views Europe and the regulatory policies of the EU, and much, much more.

https://logbuch-netzpolitik.de/lnp504-the-politics-of-intellectual-shame
Published: 17 October 2024
Duration: 2:04:19


Chapters

  1. Intro 00:00:00.000
  2. Prolog 00:00:30.293
  3. The Signal Project 00:01:20.769
  4. Meredith Whittaker 00:03:32.608
  5. AI Institute 00:20:03.653
  6. Google Walkouts 00:33:01.977
  7. The Tech Industry 00:45:23.914
  8. The call for encryption 00:52:06.778
  9. The aims of Signal 01:00:13.924
  10. Messenger Interoperability 01:05:30.605
  11. An American view of Europe 01:12:15.132
  12. EU regulation 01:25:40.134
  13. Telegram and Transparency 01:39:50.066
  14. The Signal Company 01:47:05.400
  15. AI Dynamics 01:51:51.610
  16. Goodbye 02:02:32.624

Transcript

Tim Pritlove
0:00:00
Linus Neumann
0:00:01
Tim Pritlove
0:00:02
Linus Neumann
0:00:04
Tim Pritlove
0:00:05
Meredith Whittaker
0:00:06
Tim Pritlove
0:00:30
Linus Neumann
0:00:41
Tim Pritlove
0:00:43
Linus Neumann
0:00:52
Tim Pritlove
0:00:56
Linus Neumann
0:00:57
Tim Pritlove
0:00:58
Linus Neumann
0:01:03
Tim Pritlove
0:01:11
Linus Neumann
0:01:14
Tim Pritlove
0:01:16
Linus Neumann
0:01:17
Tim Pritlove
0:01:20
Meredith Whittaker
0:01:26
Linus Neumann
0:01:29
Meredith Whittaker
0:01:37
Linus Neumann
0:01:38
Meredith Whittaker
0:01:42
Linus Neumann
0:01:55
Meredith Whittaker
0:01:55
Linus Neumann
0:02:15
Meredith Whittaker
0:02:18
Linus Neumann
0:03:04
Meredith Whittaker
0:03:14
Linus Neumann
0:03:29
Meredith Whittaker
0:03:31
Tim Pritlove
0:03:32
Meredith Whittaker
0:03:53
Linus Neumann
0:04:13
Meredith Whittaker
0:04:14
Linus Neumann
0:04:50
Meredith Whittaker
0:04:54
Tim Pritlove
0:06:23
Meredith Whittaker
0:06:28

Well, they would email in. I only did this for a second, because I kind of figured out I wanted to do other things. And that's part of the story, actually. So I did answer inquiries sometimes. And there were auto-responses, like hotkeys you'd use in this really janky ticketing system. And this is back... we didn't have laptops. There's no such thing as a smartphone. Our desktops were chained to the desk. And when you went home, if you had to work, you had to take this clunky Lenovo or whatever, I don't know exactly, but it was... I was like, okay, I'm rewarded in my job based on how many tickets I close. But I don't know the engineers who have to close these tickets, because I'm reporting bugs and they have to fix them. And if they don't fix the bugs, I'm not rewarded. And I was like, that's really silly. So I went over to the Apps building, because I was in a different building. And I met with the engineering team and I was like, hey, I'm the person who sends you these emails you're getting. And I was able to convince one of them to go to Costco, which is this big warehouse store, a Walmart-ish store in the US, and put a giant couch on his own card and bring it into the Apps building, so I could sit with them. Because if I sat with them, I could talk with them, and then they'd be more likely to fix my bugs. And so all my bugs started getting fixed, but then my manager got really upset, because obviously I had breached the hierarchy. And so this meeting appeared on my calendar that was basically about me being insubordinate. But I'm also like, I didn't come from that class, I had no... I was like, what the fuck did I do wrong? I don't know. And I was like, oh god, I did something, and then...
The day before, I think, that meeting was going to happen, this email hit. We had this, like, some group, I don't know, it was like the group for the consumer operations team. And one of the engineering directors in Apps must have had, like, a beer or something during one of the many, many, many drinking parties that happened during the day at Google in that era. And sent some email that was like, Meredith, this couch is a model for collaboration.

Linus Neumann
0:08:40
Meredith Whittaker
0:08:47
Linus Neumann
0:09:59
Meredith Whittaker
0:10:02
Tim Pritlove
0:10:58
Meredith Whittaker
0:11:08
Tim Pritlove
0:11:09
Meredith Whittaker
0:11:13
Tim Pritlove
0:11:29
Meredith Whittaker
0:11:30
Linus Neumann
0:11:35
Tim Pritlove
0:11:38
Meredith Whittaker
0:11:41
Tim Pritlove
0:12:32
Meredith Whittaker
0:12:33
Tim Pritlove
0:13:31
Meredith Whittaker
0:13:36
Tim Pritlove
0:15:00
Meredith Whittaker
0:15:05
Tim Pritlove
0:15:24
Meredith Whittaker
0:15:26
Tim Pritlove
0:15:51
Meredith Whittaker
0:15:53
Linus Neumann
0:15:59
Meredith Whittaker
0:16:19
Linus Neumann
0:16:54
Meredith Whittaker
0:16:58
Linus Neumann
0:17:04
Meredith Whittaker
0:17:25
Linus Neumann
0:18:30
Meredith Whittaker
0:18:32
Linus Neumann
0:18:56
Meredith Whittaker
0:18:57

Netflix was shoving all its traffic through Cogent. What we had done is expose that, and exposing that exposed the principle that, if you wanted to ensure net neutrality, you had to take the interconnection points and the interconnection agreements into account. And that led Obama to add interconnection to the reclassification under Title II, which kind of moved toward net neutrality; that was later nullified. But, you know, that was kind of the swan song, let's say, for my MLab time, because, of course, that was a huge deal, right? That's where the business model rubber hits the road. And, you know, I'll just shorthand it: that bought me a lot of capital at Google. And with that, I was, you know, I was already interested in a lot of things. MLab was sort of humming along, and it had grown from a hypothesis project to a global thing that was working and doing stuff. And I started getting agitated by AI and a lot of these privacy and security concerns, being part of the community that you were part of, kind of thinking around tech alternatives and getting less comfortable with the business model. And from there, I always had like eight different projects going on, but I went on to co-found the AI Now Institute, which was really trying to bring the conversation on what we called machine learning back then, but AI was the flashier term, bring it down to the ground a little bit, and stop talking about superintelligence and start talking about, you know, political economy: what are these technologies, how are they being deployed, how do you oversee them, who uses them on whom, and what are the social and political dynamics of that.

Linus Neumann
0:20:36
Meredith Whittaker
0:20:43
Linus Neumann
0:20:48
Meredith Whittaker
0:20:49
Linus Neumann
0:20:53
Meredith Whittaker
0:20:54
Linus Neumann
0:20:59
Tim Pritlove
0:21:00
Meredith Whittaker
0:21:03
Linus Neumann
0:21:10
Meredith Whittaker
0:21:17
Tim Pritlove
0:21:46
Meredith Whittaker
0:21:51
Tim Pritlove
0:22:53
Linus Neumann
0:23:18
Tim Pritlove
0:23:21
Meredith Whittaker
0:23:49
Linus Neumann
0:24:30
Meredith Whittaker
0:24:37
Linus Neumann
0:24:40
Meredith Whittaker
0:24:40
Linus Neumann
0:24:46
Meredith Whittaker
0:24:48
Tim Pritlove
0:25:39
Linus Neumann
0:25:43
Tim Pritlove
0:25:45
Meredith Whittaker
0:25:48
Tim Pritlove
0:25:56
Meredith Whittaker
0:25:59
Tim Pritlove
0:26:31
Meredith Whittaker
0:26:34

I mean, yeah, that was Jeff Dean, I think. And then, you know, powerful compute, right? And they showed that, using gaming chips and a lot of data, you could beat the benchmark, so score much better against standard evaluations than past models, and thus sort of catalyzed industry interest in AI. And why were they interested? I think this is a key point, because these optimizing algorithms are really good at curating news feeds. They're really good at figuring out ad targeting, right? I don't think it's an accident that Geoff was immediately hired, that Yann LeCun, who developed the deep-learning algorithms that seeded this moment in AI back in the late '80s, went to Meta. And it was the platform companies with the real investment in squeezing more ad dollars out of their data, better ad serving, all of that. That's Google with DeepMind. You see Meta and Google as the leaders in this, as measured by different evaluation standards; the measurement question here is actually really interesting and troubling. Until this generative moment, where I think the ChatGPT Microsoft products shifted people's perception of AI and what it can do and just rearranged the leaderboard. But the paradigm is still the same. And the paradigm is still that AI is applying old algorithms on top of the sort of massive platform-monopoly business model, availing itself of huge amounts of data, which is produced via this surveillance business model, and really powerful compute that was designed, built up, I would say consolidated, in the hands of these platform companies via the imperatives of the surveillance business model, right?

Linus Neumann
0:28:43
Meredith Whittaker
0:29:29
Linus Neumann
0:29:34
Meredith Whittaker
0:29:37
Tim Pritlove
0:30:45
Meredith Whittaker
0:31:24
Linus Neumann
0:31:52
Meredith Whittaker
0:33:32
Linus Neumann
0:33:34
Meredith Whittaker
0:33:35

I mean, it's all interlinked, and kind of periodizing your own consciousness is hard. But I think I'm pretty earnest, and I also don't come from that world, I don't come from that class. So there were often places where I would take things sincerely or be really committed, and then only realize, like, two-thirds of the way through whatever it was, oh, no one else really cares about this, they're just networking, or whatever it was. So I think there was an element there where, when I was doing MLab, I was like, I really want to win net neutrality. And then we won net neutrality. But then I realized that wasn't really it: Google has a bigger network than Comcast; they're not the gatekeepers there. But that was a sincere thing. And then I was like, okay, I can funnel money to all these cool privacy hacker projects. That was sincere. And then I got into AI, and I was like, okay... and I think this is something that has changed for me. I think I used to have a lot, lot more faith in the power of ideas to influence real change, right? And I still think... I spend a lot of time thinking through discourses: how do we shape them? How do we kindly walk people into understanding things that they may have an interest in not understanding, or may have been misinformed about, or what have you. But I began, you know, around the time I was looking at AI and sort of making all these cases that everyone loved, right? Like, I was out there giving talks that were completely against the Google party line, and I was getting applause, I was getting promoted. I was like, this is a perfect job.
And then, you know, I envied you, not only once. I was the house troll, but then I was getting more influence, so I was becoming known outside and inside. I was the person you'd call into your team when it was like, oh, we want to implement this, is there an ethical way to do it? And you would say no. I would be like, my dear friends, let us sit down. And then I was getting... I don't know, that was sort of my life, and we kind of took the AI Now Institute and really did a lot to reshape the debate. I was very focused on that discursive intervention, and how do we begin to talk about AI in a more realistic way. And that was working outside of Google, but it wasn't really influencing core decisions at Google, and that was kind of the thing I kept hitting up against more and more strongly, until I got a Signal message in... late 2017.

Linus Neumann
0:36:34
Meredith Whittaker
0:36:36
Linus Neumann
0:37:32
Meredith Whittaker
0:37:38
Tim Pritlove
0:38:47
Meredith Whittaker
0:38:51
Tim Pritlove
0:38:54
Meredith Whittaker
0:38:58
Tim Pritlove
0:39:08
Meredith Whittaker
0:39:13
Tim Pritlove
0:39:18
Meredith Whittaker
0:39:20
Tim Pritlove
0:40:02
Meredith Whittaker
0:40:12

I think it was November 1st, 2018. And everyone walked out for 20 minutes at 11:10 a.m. in their local time, so we called it Rolling Thunder. And it started in the Singapore office as I was going to bed in New York, and I was seeing the photos. And this was chaos, I hadn't slept in days, there were so many meetings, there were so many tears, it's hard to organize something like that. And I remember going to bed and seeing the images from Singapore, with a couple hundred people in Singapore, and I was like, this is great. And then I woke up, New York time, at 5 a.m. to go to our location in New York and prep it, so that we'd be ready. And then I just remember seeing, like, there was this little park near the New York office, and then it just grew outside the park. And then no one could get into the park. And then I was looking at my phone, and there are live helicopter feeds. And we don't have bullhorns, and I'm one of the speakers. There are, like, speakers standing on chairs to address the crowd. And then this guy, you know, there's this sort of type, I don't know if in Germany you have this type, but they're, like, the leftist at every protest. And some guy had found out about it and came in, and I just remember this man I'd never seen handing me a bullhorn from below, and I picked it up, and we were like, a bullhorn... like a megaphone? Like a megaphone, yeah, yeah, yeah.
And then, you know, then we walked over to a Mexican restaurant and sat at the table and had a press operation that we were running. And what was cool was, everyone organizing that was kind of a professional. Most of them were femme, and most of them had jobs at Google that were, like, organizing the company: doing comms, being an administrator for, like, 13 different directors. This particular type of hyper-competence at coordinating activity across a number of people you may not have direct power over, and, like, so much more.

Tim Pritlove
0:42:27
Meredith Whittaker
0:42:29
Tim Pritlove
0:43:10
Meredith Whittaker
0:43:17

I don't know. We were really careful not to keep lists, yeah. I mean, you can dream up a number now, you know. 20,000 was the number estimated from the sky photos and the local reports. We had local leads at every office. We sent them zip files of the kit: the flyers for handing out, the talking points, how to treat media, all of that. They all had that, and then they organized locally. And then they had reporting back in from press, reporting back in to the central organizers around numbers. And then we were issuing the press releases. But then, you know, there are employees of Google, and then there are contractors of Google, which more than double the employee count, so that number is... And then there are people who couldn't participate but were really supportive, because they, you know, need their health insurance or they'll die, or they're on a visa. And so it wasn't... I think there was a huge amount of support. I know that we were able to get Google to drop their military contract because there was enough support, and there was, you know, a spreadsheet of people who were quitting, conscientious objectors. Every week at the town hall meeting there was a table we would set up, with banners and such, and we would ask questions. It was a very rigorous campaign, and it kind of laid the groundwork. And I think it built some muscle that people are probably, almost definitely, using now, even if they don't call it organizing. It's like, how do we marshal the resources from this?

Tim Pritlove
0:44:54
Meredith Whittaker
0:44:57
Tim Pritlove
0:45:10
Meredith Whittaker
0:45:16
Tim Pritlove
0:45:23
Meredith Whittaker
0:46:50
Tim Pritlove
0:47:11
Meredith Whittaker
0:47:12
Tim Pritlove
0:47:17
Meredith Whittaker
0:47:17
Tim Pritlove
0:47:21
Meredith Whittaker
0:47:21
Tim Pritlove
0:47:27

Yes.

Meredith Whittaker
0:47:28

I did the same thing. And not everyone, I believe, was equating the growth and monetary success of the US-based tech industry with, sort of, you know, values of social progress. I mean, I think we can, yeah, yes, it was liberal, and then we can get back to a critique of liberalism and what have you. But there were people who were looking at the infrastructure, who were looking at its capabilities, who were looking at the gap between the promises and the reality of what this tech did, and calling that out. And I feel like, you know, when I entered into this kind of privacy, security, hacker development scene, there was a lot of that skepticism there around Google. That educated me a lot. There was a lot of skepticism around surveillance. I immediately recognized: yes, we need privacy, because it doesn't matter if these people are good or benevolent. What we're doing is setting up an infrastructure that could be turned over at any moment to another regime. Logically, all of that made sense. But I don't feel that, until the Snowden revelations, any of that was anywhere near the nervous system of a kind of tech consciousness. And a lot of the work I've done, sifting through the crypto wars of the 1990s and through what happened with tech regulation to set up these surveillance giants and to permit this monopoly platform business model, has kind of looked at that gap between the rhetoric of liberal, rights-preserving, open, free tech, and what was actually being built, right? And one of the things, if you look at... there's a scholar named Katharina Ryder, who I would really suggest; for the show notes, I can send some of these links. She did her dissertation looking at some of the negotiations in the crypto wars. And what you begin to see is that, yeah, and this is a thesis I sort of build on top of in some of my work, yeah, we won liberalized encryption, right?
By 1999 in the US, it was finally legal to build, share, and implement strong cryptosystems without approval from the government, without some threshold that made them useless. But the agreement there was basically: yeah, you can have encryption, but we're going to permit mass surveillance by companies. And so you don't actually... like, you can just... you know, we're going to permit, we're going to endorse the advertising business model. We're going to endorse... and I can actually, I can start this point over, actually.

Linus Neumann
0:50:19
Meredith Whittaker
0:50:30

I can't say that there was a conspiracy. What I can say is that Katarina's work shows Microsoft saying: liberalize encryption, don't worry, we're not going to encrypt all the data, we need it. Just come to us quietly and we'll give it to you. Instead of fighting over a backdoor, instead of doing this in the public domain, where we're kind of losing the fight on technical and other grounds: allow companies free rein to surveil, because that allows us to implement this ad-supported business model. And then the data agreements can happen behind the scenes. Now, I've completely compressed a very complex history into basically a meme. But I think the point there is that there was always that gap between the rhetoric and what was actually going on. And, I don't know, there's like a kind of internet people, right? This type maybe misunderstood exactly who would have power over this technology, right? Like, encryption is liberalized, but it's not going to be applied to protect personal communication. It's applied to protect transactions, right? The people who get to choose whether or not it's used aren't us. In terms of, you know, this sort of mass infrastructure, in terms of the tech ecosystem that's being built by the actors who are calibrating their decisions based on the surveillance business model that has been instantiated, right, through regulatory decisions made by the Clinton administration.

Linus Neumann
0:52:07
Meredith Whittaker
0:52:40
Linus Neumann
0:53:01
Meredith Whittaker
0:53:17
Linus Neumann
0:53:28
Meredith Whittaker
0:54:06
Linus Neumann
0:54:11
Tim Pritlove
0:54:14
Meredith Whittaker
0:54:25
Linus Neumann
0:54:47
Meredith Whittaker
0:55:08
Linus Neumann
0:55:13
Meredith Whittaker
0:55:14
Linus Neumann
0:56:58
Meredith Whittaker
0:57:21
Linus Neumann
0:58:29
Meredith Whittaker
0:58:55
Linus Neumann
0:58:59
Meredith Whittaker
0:59:11
Linus Neumann
0:59:12
Tim Pritlove
0:59:15
Meredith Whittaker
1:00:02
Tim Pritlove
1:00:05
Meredith Whittaker
1:00:37
Tim Pritlove
1:01:38
Meredith Whittaker
1:01:42

Yeah, yeah. I mean, I remember, like... I'd send text messages sometimes, but they were expensive, on my BlackBerry maybe. But we're talking about a drastically different tech ecosystem. And this is particularly important in the context of messaging and communications apps, because, of course, you need a network effect for those to work. No one buys the first telephone, because you can't use one telephone. Your friend has to have a telephone. All your friends have to have an app if you're going to use it. Particularly, it takes two to encrypt; group encryption takes everyone in the group. And if everyone isn't using your app for communication... it's very different, in a saturated marketplace where you have WhatsApp, where you have iMessage, where you have these normative models that people go to just for their regular communication, to introduce a new platform for secure messaging or insecure messaging, right? Because people don't switch outside of the network unless their friends switch outside of the network, and there's a collective action problem there and an inertia problem there and all of that, right? So when I think about it, there are many things that are very precious about Signal, but the fact that Moxie carried it on his back for that decade and was actually able to keep it going and surviving without selling out, without selling data, and actually creating something that is now able to scale to hundreds of millions of people, means that Signal actually has a position in this ecosystem that makes it useful to people. That means it's actually providing encrypted communication to people all over the globe, because their friends are using it. And my contention here, and I'm willing to discuss this: I don't think we can recreate Signal, right? Because it has that user base, right?
You can introduce a new app, but how do you get people to use it without an OEM, without an existing installed user base, without some way of making it useful to people? Because again, it's one telephone, or, you know, a couple thousand hackers who all use PGP, but they can't talk to their dad on PGP, right? They can't talk to anyone outside of themselves on PGP. So Signal has kept this form that is very heterodox in tech. It's a nonprofit, so I couldn't really sell out profitably. I could try to sell out, but I'm not going to get any money from it; it has to be reinvested in the mission, and there are certain rules around mission alignment there. So we're not pressured to monetize surveillance or to collect any data. We're really able to stay focused on that mission, in an ecosystem where the diametric opposite is the norm. That's really, really important, irrespective of the flaws of the nonprofit model more generally. So it's achieved this pretty rare... it's the only thing like it in the ecosystem. And I think it also serves as a model for how we could think about building tech differently. Like, how do we disarm, deconstruct the massive centralized power of a handful of platform companies that basically control most of the infrastructure and information ecosystem in our world, in our jurisdiction in the US? And how do we build other models that may be interoperable, that are more open, that are more rights-preserving, and that aren't subject to the pressures and incentives of the surveillance business model?

Linus Neumann
1:05:30
Meredith Whittaker
1:07:05
Linus Neumann
1:08:38
Meredith Whittaker
1:08:39
Linus Neumann
1:09:59
Meredith Whittaker
1:10:03
Linus Neumann
1:10:04
Tim Pritlove
1:10:09
Linus Neumann
1:10:11
Meredith Whittaker
1:10:14
Linus Neumann
1:10:21
Meredith Whittaker
1:10:41
Tim Pritlove
1:11:17
Meredith Whittaker
1:11:28
Tim Pritlove
1:12:14
Meredith Whittaker
1:12:52
Linus Neumann
1:13:14
Meredith Whittaker
1:13:16
Tim Pritlove
1:13:19
Meredith Whittaker
1:13:33
Tim Pritlove
1:14:09
Meredith Whittaker
1:14:12
Tim Pritlove
1:14:58
Meredith Whittaker
1:14:59
Tim Pritlove
1:15:22
Meredith Whittaker
1:15:26
Linus Neumann
1:15:37
Meredith Whittaker
1:15:38
Tim Pritlove
1:16:17
Meredith Whittaker
1:16:19
Tim Pritlove
1:16:21
Meredith Whittaker
1:16:23
Tim Pritlove
1:16:42
Meredith Whittaker
1:16:45
Linus Neumann
1:17:14
Meredith Whittaker
1:17:15
Tim Pritlove
1:17:29
Meredith Whittaker
1:17:31
Tim Pritlove
1:17:41
Meredith Whittaker
1:17:43
Tim Pritlove
1:18:08
Meredith Whittaker
1:18:12
Tim Pritlove
1:18:16
Linus Neumann
1:18:19
Meredith Whittaker
1:18:23
Tim Pritlove
1:18:23
Meredith Whittaker
1:18:25
Tim Pritlove
1:19:49
Meredith Whittaker
1:19:54
Tim Pritlove
1:20:43
Meredith Whittaker
1:21:07

Yeah, I mean, there are both threads. You have two wolves inside European politics. One wants its own tech industry, and one doesn't want to submit to US tech colonialism. And I think you get some weird laws out of that. You have the AI Act, which kind of had this last-minute brinkmanship around whether foundation models, these big LLMs that are now the trendy kind, should be included or not. And you often see bold regulatory attempts that then get kind of shaped in odd ways, trying to have it both ways, right? Like, how do we regulate the Americans away and get our own, right? But how do we do that in a way that is reflected in principles, without actually declaring that as an intent? And I think you're seeing a huge amount of money being spent by the US companies in Brussels right now, which is also influencing things in interesting ways. And then, and this is something I'm theorizing a lot in my intellectual work, and I think it's really important: you also see what I'm calling the politics of intellectual shame be really pervasive in this conversation. And this is not just Europe, this is across the board. I mean that there is a real fear among a lot of people who are in decision-making positions, politicians or academics or whoever, and not even in decision-making positions, but it matters when it's them, of being stupid about tech, of being behind the ball on tech. And this plays right into patriarchal dynamics. Men hate when someone else knows something more than them, in particular if that's a small woman.

Linus Neumann
1:23:08
Meredith Whittaker
1:23:09
Linus Neumann
1:23:11
Meredith Whittaker
1:23:12

Yeah. Well, you all are, generally, and I don't want to gender this so schematically, but there is an ego that can be very, very fragile here. And the way I've put it before is, it kind of turns uncertain men into yes men. Like, they don't want to ask the dumb question. They don't want to be like, what's an LLM? What's a server? How does that work? And that type of insecurity, the fear of being behind, the fear of being called technically unsophisticated, or of hampering progress, or of putting your finger on the scales of science (look at the Nobels, how could you stand in the way of all this progress?), I think really gives the upper hand to the companies and those who have an interest in creating products and growth and domination via these technologies. Because people really don't want to challenge them, because challenging their dominance or their plans gets conflated with somehow being anti-science, or being stupid about tech, or not being smart enough to have a position on a topic. And I think that's something... because I kind of came up through Google asking every dumb question in the book, because I had no... I didn't come from that world, right? So I had to ask, like, how does a computer work? Right? I'm like, can someone diagram what a function is? I don't know any of this stuff, right? But I think I have a sensitivity to that, because I remember feeling it. I remember people being mean about it, back in the day when I was trying to learn this stuff. And I think that a discourse that collapses scientific progress into, kind of, the success of a handful of tech companies preys on that type of insecurity, and has created an environment in which people have no idea what AI is and are still professing boldly on how to regulate it.

Tim Pritlove
1:25:10
Meredith Whittaker
1:25:22
Tim Pritlove
1:26:20
Meredith Whittaker
1:26:38

Well, I don't, you know... I see this as an ongoing power struggle, right? Between who? Well, between... like, this is not a misunderstanding. I think a lot of the people pushing for this understand that backdoors are dangerous and understand that the pretext is flimsy, but, you know, asymmetric power constitutes itself in part through information asymmetry. And there's a deep discomfort that dates back to 1976, when Diffie and Hellman were trying to publish their paper introducing public-key cryptography and the US government was trying to suppress it, trying to say, don't publish this, right? And then, you know, databases weren't quite big enough, networks weren't quite big enough or ubiquitous enough for it to matter. But they were already looking at it like, oh, shit, we don't want this in the public, right? And then you go through the '90s, and there's the Clipper chip and key escrow, and you have Stewart Baker writing in Wired magazine, like, PGP is just for terrorists. We have proof. No, PGP is for pedophiles, right? Which really echoes what we're hearing now. Right? Like, who even has a computer in 1994, I believe, when this op-ed is written? And then we have post-9/11, and then it's like, actually, PGP is for terrorists, right? And encryption is for terrorists. All the while, our dependency on digital infrastructures for communications is growing and growing and growing. Our dependency on digital infrastructures generally is growing. And the need for encryption to protect commerce becomes existential to the internet industry. And then what do you do about communications, right? And I think this has been an anxiety that is pervasive among those, you know, law enforcement, governments, whoever, who feel that they need to constitute their power via information asymmetry. And any encryption that protects people, not just commerce, is a threat to that, right?
And so what I don't see is that we're going to win an argument, right, or that we're going to win this via strength of argument. I do think we can fight, and I think we're in a position now where we're seeing, with chat control... you know, I believe Hungary just tried to raise it and didn't get the support. There was the Belgian proposal a few months ago, which also didn't get the support, at the last minute. And we just had the Dutch law enforcement authorities writing a memo to the government there saying, yo, don't support this. You're talking about a very dangerous backdoor that would undermine Dutch cybersecurity, right? At the same time, we have reporting in the Wall Street Journal... There's a receipt for what all of us should have suspected all along, which is that the backdoors built into US telecommunications infrastructure for government intercept have been compromised by Chinese intelligence and maybe others. I think at this moment the facts are on our side, and that is permeating this discussion, and making it harder and harder for them to push it forward in the European Commission.

Linus Neumann
1:29:46
Meredith Whittaker
1:29:49
Tim Pritlove
1:30:10
Meredith Whittaker
1:30:15
Tim Pritlove
1:30:16
Meredith Whittaker
1:30:17
Tim Pritlove
1:30:18
Meredith Whittaker
1:30:24

In terms of political discussions, it helps that we're right. We bring in a huge amount of evidence that a lot of people haven't seen in these political discussions. I think our side has been on the back foot for a while. In civil society, there's been a cutting of funding to privacy advocacy; there's a sort of history here. There was a move toward tech accountability after the 2016 election. There's the Cambridge Analytica scandal. There's all of this. And it's like, okay, we need to hold tech accountable. And the way to hold tech accountable is to attack the business model, is my view. But there aren't that many pieces of legislation or proposals that actually do that. Many of them sort of use the wrapping and the language of accountability, but are actually just expanding surveillance, right? It's like, we're going to hold them accountable, so we need a database, so we need to know who's logging into websites so we can find the bad guy. We need to know what's in your messaging so that we can make sure that these tech companies aren't allowing crime on their platforms, et cetera, et cetera. So it was basically a hijacking of this, in many cases, kind of righteous moment, where people recognized that this business model was pretty harmful, to fulfill wishes that had been pervasive since well before then. At the same time, privacy advocacy, and a lot of the things, Linus, you and I had been doing for a long time, was receiving less and less support and sort of falling out of the limelight.
And so I think it was in that environment that things like chat control, things like the Online Safety Bill and other paradigmatic examples of this client-side-scanning-to-save-children meme, grew up. And one of the reasons, there are many, many reasons I decided to move from being on the board of Signal to full-time at Signal. One of them was that I saw this moment and I realized there weren't that many people fighting it, and that one of the things that I could bring was a staunch willingness to fight it.

Tim Pritlove
1:32:50
Meredith Whittaker
1:32:55

I open up my laptop. Yeah, well, you know, obviously it's not just me, nothing like this is a singular thing. We work with a pretty broad coalition of folks. I'm sure many of your friends, many listeners, perhaps, are part of that. Signal doesn't have a policy arm. It's a very kind of lean, targeted, pretty senior organization. But we do work with people around the globe, EDRi in the EU and a number of other organizations, to keep tabs on what's happening. We're also in a good position: we're a non-profit, we are very committed to rigorous communication, so we don't have a history of hyper-marketing, and we don't do hyper-marketing now. So we're very careful: when we make a claim, when we make a statement, we're backing that with citations. It's accurate. We're really marshalling the technical knowledge and prowess that we have. I almost think of it as clarifying the record. If there's a report that says client-side scanning is actually safe, we know it's not safe. Okay, well, there's an academic coalition that has written this letter. Signal can write a letter. We can begin to put a bit more weight on the scales that have been fairly light, given the dynamics I just outlined. And then I do media, I do public speaking. I think a lot about how to tell this story in a way that isn't boring or alienating for regular people, particularly because the story on the other side is so arresting. It's like, we have to save children from abuse. And every one of us, it hits you in the heart, right? Myself included, right? My amygdala is activated. Suddenly I just want to do something. I want to help, you know, give me the thing to do. How do we do that, right? And then sitting across from that and being like, well, let me tell you about a one-way function, right? That's not going to work, right?
And so, how do you enter into that debate in a way that isn't dismissing the very grim and real problem that is being evoked, and make it clear that the solution being presented, one, will not solve that problem, and two, will cause drastically worse problems for many people around the world? And that's the task at hand right now.
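The "one-way function" invoked above is the asymmetry that makes the technical answer so hard to narrate: trivial to compute forward, infeasible to invert. A minimal sketch using SHA-256 as a stand-in, to illustrate the concept only, not Signal's actual protocol:

```python
import hashlib

# Forward direction: one cheap function call.
message = b"meet at noon"
digest = hashlib.sha256(message).hexdigest()

# The function is deterministic: the same input always yields
# the same digest ...
assert digest == hashlib.sha256(b"meet at noon").hexdigest()

# ... and tiny input changes produce unrelated digests, so going
# backwards, from digest to message, has no known shortcut: an
# attacker is reduced to guessing inputs one by one.
assert digest != hashlib.sha256(b"meet at one").hexdigest()
print(digest)
```

This is the shape of the argument that has to compete with "save the children": the security of everyone's communications rests on functions that cannot be selectively reversed for the good guys only.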

Tim Pritlove
1:35:32
Meredith Whittaker
1:35:41
Tim Pritlove
1:36:05
Meredith Whittaker
1:36:08
Tim Pritlove
1:36:20
Meredith Whittaker
1:36:23

Jimmy Savile's walking around. You know, like, what are the infrastructures in place to make sure that when children are going through this, they're believed, they're protected? You know, what happens when it's your priest? What happens when it's your teacher? What happens when it's your brother, right? These are the questions that are really hard to look in the face, because they implicate social pathologies and interpersonal relationships and power dynamics that are really, really difficult, and often relate to emotionally challenging factors or people's past experiences or what have you. So you're going right into very traumatic subjects. But I don't think we can have that conversation without having a real conversation. And then when you begin to pull back the layers there, you say, oh, well, the UK has been pushing for client-side scanning as a remediation to child abuse. But the UK government in 2023 funded social services at 7% of the amount recommended. Right? So, you know, the roofs of schools in the UK are collapsing. There isn't support for this. And then if you look at, and I don't have public numbers to share, but I've had a number of personal conversations: okay, how many law enforcement people are tasked with actually pursuing the criminality that may be reported via online imagery? In one case, it's two in one country. Two people, right? So if you begin to map this, what you see is a story that does not add up. And this is where I get enraged. Because I'm like, you are fucking trading...
On children's pain, to get your backdoor, whatever the fuck you want, pretending that you're solving it, taking up the space for actual solutions that could actually help real children who are suffering now, and turning no attention to every glaring problem in this massive list, which is pretty obvious even to me, and I'm not an expert here, I've just sort of sifted through this. So I think that's the dynamic we're walking into.

Tim Pritlove
1:38:47
Linus Neumann
1:38:55
Meredith Whittaker
1:39:16
Linus Neumann
1:39:50
Meredith Whittaker
1:40:57
Linus Neumann
1:41:48
Meredith Whittaker
1:41:50
Linus Neumann
1:41:59
Meredith Whittaker
1:42:00
Tim Pritlove
1:42:04
Linus Neumann
1:42:04
Meredith Whittaker
1:42:07
Tim Pritlove
1:42:12
Meredith Whittaker
1:42:14
Tim Pritlove
1:42:29
Meredith Whittaker
1:42:29

Right? And when we're designing Signal, we're actually very, very careful not to be a social media platform. We think about that in the design phase, so everything we do can be as encrypted as possible, so that we know nothing about you, or as close to zero about the people who use Signal as possible. What we do know is: we can say, yes, this phone number did sign up for a Signal account, we know when that phone number signed up, and we know the last time it accessed the service. But we would like to not even know that, if it were possible. On the other hand, Telegram is a social media platform, which retains huge amounts of data, has a duty under law to cooperate in turning over that data, has search functions, has directories so you can find new things. So it's a very different beast. And I think, one, because Durov has, I'm trying for a diplomatic word here, made statements that are not supported by fact around Telegram being private and secure, and kind of taken on this yeoman's-defender-of-free-speech-and-privacy position, people often think Telegram is private and secure because it has a DMs feature, right? But, you know, Signal is just private and secure. So the TLDR on that is: there's really no danger for Signal here, because we are very, very far away from Telegram. And we have set ourselves up so that, one, such cooperation isn't required, and two, such cooperation is not possible. Because literally, you could put a gun to my head, I don't have that data. Whereas Telegram has servers and servers and servers full of that data.

Tim Pritlove
1:44:22
Meredith Whittaker
1:44:35
Linus Neumann
1:44:38
Meredith Whittaker
1:44:40
Tim Pritlove
1:44:43
Linus Neumann
1:44:44
Meredith Whittaker
1:44:45
Tim Pritlove
1:45:28
Meredith Whittaker
1:45:31
Linus Neumann
1:46:15
Meredith Whittaker
1:46:24
Tim Pritlove
1:46:35
Meredith Whittaker
1:46:38
Linus Neumann
1:46:41
Meredith Whittaker
1:46:46
Tim Pritlove
1:47:05
Meredith Whittaker
1:47:10
Tim Pritlove
1:47:42
Meredith Whittaker
1:47:48
Linus Neumann
1:48:23
Meredith Whittaker
1:48:30
Tim Pritlove
1:48:34
Meredith Whittaker
1:48:40

Well, I would never presume to speak to the consciousness of another person. But I think, yeah, I am very happy. I think a lot of the people who are at Signal are very happy. And I think it's also like we're kind of part of a project. And this shows that what's built in the tech industry is not inevitable. There's a series of choices, a series of incentives, a business model that has shaped tech into the form we have now, but it does not have to be that way, right? We can rewrite the stack, we can build alternatives. Nonprofits can work, right? We need capital, we need will, we need talent, we need all of those things. But the thing we have now is not inevitable. And I think of Signal as a keystone species in the ecosystem, kind of setting the bar, kind of regulating the rest, right? Like, you can have privacy. You can have the right to private communications. You can subsist outside of this paradigm. And the future I want is that it's not just Signal, right? There are many, many other organizations and efforts doing it differently, rejecting that paradigm, drawing in capital there, and away from the other place, and beginning to marshal the type of political will that is often very shallow, like the 500 million AI fund, but marshaling it for something that is actually substantive and is actually making the kind of change to the tech ecosystem that I think we need to have a livable world.

Tim Pritlove
1:50:18
Meredith Whittaker
1:50:28
Tim Pritlove
1:50:43
Meredith Whittaker
1:51:04
Linus Neumann
1:51:53
Meredith Whittaker
1:52:27

Well, I think my answer to that is that that future will rely on laying an independent infrastructural bedrock and actually transforming some of the way we govern and think about digital technology generally, including being really attentive to things like: how is data created? Who gets to decide what data we use to reflect our complex lives and realities? Who gets to decide how patterns in that data are made sense of, what analysis is done to that data, and then what we do with the sense we make of it, right? What decisions we make, right? If we do all that, we transform what AI is and what it means. Because you're no longer just scraping all the detritus off the stupid web, which was deposited or created via this surveillance business model, packaging that in an LLM and calling that intelligence, right? You're actually having to grapple with the epistemic process by which data becomes a proxy for reality, and that proxy shapes our lives and institutions. And so, AI itself: right now we're talking about these massive models, right? These laws of scale, this sort of big American guy dream of the largest in the world. But AI is a very slippery term. It's not a technical term of art. It can apply to many, many different things. And there are small models. There are heterodox approaches. There are expert systems, which they're now trying to bolt onto the side of generative systems, because, wait, probabilistic answers aren't true, so we need to bolt truth back on. And we're kind of repeating a lot of the history of AI, kind of speedrunning it in the search for a business model.
So my answer there is that a lot of the things that need to be done to simply disarm and draw down the centralized power of these surveillance and infrastructure companies are the same things that would need to be done to redefine, in a sense, what AI is and our relationship to how, I don't want to use the word truth actually, but how information is made via analyzing data, and who gets to control that. And I think my sensitivity to data in that answer comes directly from my measurement experience, right? Where one upgrade to the Linux kernel across our server fleet fundamentally changed the kind of data we were able to create, how it populated the schema, and meant that that data wasn't necessarily fungible with the data collected on the older version of the kernel, right? And in order to solve that problem, I had to get a guy to go sit with the kernel maintainers for like two years to make sure that the update wasn't going to fuck up the way we got tcpdumps, basically. So, that's, and then think about social data, then think about data that reflects who gets access to resources, then think about all of the other things. And I think it's actually an exciting idea to think on: how do we create systems where we're much more attentive to that, and recognize that it really matters how those choices are made, how data is created, who gets to create it, and what it gets to say?

Linus Neumann
1:56:01
Meredith Whittaker
1:56:50
Linus Neumann
1:57:15
Meredith Whittaker
1:57:19
Tim Pritlove
1:58:18
Meredith Whittaker
1:58:21
Linus Neumann
1:58:32
Meredith Whittaker
1:58:34
Linus Neumann
1:59:05
Meredith Whittaker
1:59:10
Tim Pritlove
2:00:27
Meredith Whittaker
2:00:40
Tim Pritlove
2:00:51
Meredith Whittaker
2:00:52
Tim Pritlove
2:00:56
Meredith Whittaker
2:01:07
Tim Pritlove
2:01:58
Meredith Whittaker
2:02:02
Tim Pritlove
2:02:07
Meredith Whittaker
2:02:12
Tim Pritlove
2:02:13
Meredith Whittaker
2:02:15
Linus Neumann
2:02:22
Meredith Whittaker
2:02:25
Tim Pritlove
2:02:32
Linus Neumann
2:02:33
Meredith Whittaker
2:02:34
Tim Pritlove
2:02:41
Linus Neumann
2:02:47