LNP504 The politics of intellectual shame

A discussion with Meredith Whittaker

For this episode we made an exception and decided to record an interview in English. We are talking to Meredith Whittaker, president of the Signal Foundation, the non-profit that develops the Signal app, the world's most popular multi-platform private encrypted messenger. We talk to Meredith about her way into the digital realm, how she shook up Google by organizing walkouts and more or less erasing all memories of its "don't be evil" motto, how Signal came to be and what its principles are, how she views Europe and the regulatory policies of the EU, and much, much more.

Linus Neumann
Tim Pritlove
Meredith Whittaker

For this episode of Logbuch:Netzpolitik, a full transcript with timestamps and speaker identification is also available.

Please note: the transcript was generated automatically and has not been proofread or corrected afterwards. This process is not particularly accurate, so the result certainly contains a number of errors. In case of doubt, the spoken word as recorded in the show always takes precedence. Formats: HTML, WEBVTT.


Transkript
Tim Pritlove 0:00:00
Good morning, Linus.
Linus Neumann 0:00:01
Good morning, Tim.
Tim Pritlove 0:00:02
Linus, what watch?
Linus Neumann 0:00:04
Ten watch.
Tim Pritlove 0:00:05
Such much?
Meredith Whittaker 0:00:06
You'll get along beautifully in America.
Tim Pritlove 0:00:30
Logbuch:Netzpolitik number 503, from October 16, 2024, and you will already have noticed.
Linus Neumann 0:00:41
We're speaking a different language.
Tim Pritlove 0:00:43
We're speaking a different language today, because we have a guest, and that is a conversation we want to have in English.
Linus Neumann 0:00:52
But let's briefly correct our episode number first: it's actually 504.
Tim Pritlove 0:00:56
What did I say?
Linus Neumann 0:00:57
503.
Tim Pritlove 0:00:58
Oh. Yes, for once I'm not living in the future.
Linus Neumann 0:01:03
You can tell we're excited. Tim has a British passport, after all, and now has to put his British skills to the test.
Tim Pritlove 0:01:11
What do you mean, I have a British passport? I don't have a British passport at all, I'm German.
Linus Neumann 0:01:14
Okay, but you did have a British passport, didn't you?
Tim Pritlove 0:01:16
But it expired.
Linus Neumann 0:01:17
Oh, it expired. Okay, that explains a few things.
Tim Pritlove 0:01:20
Anyway, we say hello to Meredith. Meredith Whittaker. Hello, welcome to our podcast.
Meredith Whittaker 0:01:26
I'm so happy to be here. Thank you for having me.
Linus Neumann 0:01:29
Meredith, you are, in case somebody doesn't know you, the president of the Signal Foundation.
Meredith Whittaker 0:01:37
I am. I am.
Linus Neumann 0:01:38
And Signal is a messenger application?
Meredith Whittaker 0:01:42
It is the world's most widely used, truly private messaging application. I always have to put the full sentence in there. A very special messaging application.
Linus Neumann 0:01:55
What makes it special?
Meredith Whittaker 0:01:55
Um, we build against the grain, to put it in kind of flowery language. We go out of our way to build for privacy and to get as close to collecting no data at all as possible, which, as you and I assume many in your audience know, is a lot more work.
Linus Neumann 0:02:15
Than simply encrypting.
Meredith Whittaker 0:02:18
Well, encrypting is part of that, but then you need to make sure the encryption works, and then encryption of message content is not enough, because you also have metadata. You also have libraries, you have core services. So there is a lot of ingenuity required to actually create a system that rejects the normative assumptions of the tech industry today: we all want a ton of data, and we all want to monetize that data. We want to serve ads with that data. We want to sell it to other customers. We want to sell it to data brokers. We want to train AI models with that data. Most of the infrastructure in the tech ecosystem now assumes that as a given. And so we have to rewrite things, too.
Linus Neumann 0:03:04
And you negate all these assumptions and try to build a truly private messagingplatform in a world that has a completely different business model.
Meredith Whittaker 0:03:14
Yeah. Well, I'm going to just flip that to be a little bit rhetorical and say those assumptions negate the human right to private, intimate communication. And we are trying to rebuild a tech ecosystem that actually honors those.
Linus Neumann 0:03:29
That's an even higher goal.
Meredith Whittaker 0:03:31
Yeah.
Tim Pritlove 0:03:32
Before we dig more into the Signal project, which in itself I think is quite interesting, we'd like to understand how you got to where you are right now. So what was your introduction into this digital world? When did it start?
Meredith Whittaker 0:03:53
I don't have one of those stories where I'm like an Atari. I was four. I'm a hacker.I didn't care about tech, qua tech. I wasn't interested.You remember back in school, there were two kinds of nerds.There were the math nerds and the book nerds.
Linus Neumann 0:04:13
You were the book nerd?
Meredith Whittaker 0:04:14
I was the book nerd. But in the story we tell now, because nerd-dom has been overlapped with monetary success and sort of a career in math and science, I think we forget about the book nerds. But hello, I'm here to remind you. We exist. We're here. And that's who I was. And so I studied literature and rhetoric at Berkeley, which I just thought were, I mean, they're still beguiling. Like being able to read and write is pretty fundamental for anything. And then, I was poor, so I took a job at Google, because they were the first people to offer me a job, and then I got very fascinated with what on earth was going on there.
Linus Neumann 0:04:50
When was that? When did you join Google, as someone who studied literature and rhetoric?
Meredith Whittaker 0:04:54
July 10, 2006. I had come out of Berkeley and I needed money, and I put my resumé on Monster.com, which is a precursor to LinkedIn, and a recruiter reached out. And then I talked to my friend Michelangelo, because at the time you still needed an invite to join Gmail, as I recall. And I was like, hey, Michelangelo, can you get me an invite so I can make a Gmail address so I can appeal to this recruiter? And now my spam-filled Gmail address dates from that moment. So I was hired as something called a customer operations associate, and I didn't know what that was. No one knows what that is. It's just a bunch of words. But it sounded, I was like, that's a business job. It sounds like a business title. I had no idea. It was basically a customer support person who wrote technical documentation, or kind of user documentation, some technical documentation, and answered inquiries about Google's free products. And I was doing that for Writely, which was an acquisition that Google made that they then rebuilt to become Google Docs. And at that time, Google's whole business was just Search. So Search was all on the main campus. That was where the money came from. And then there was this dinky little building where everything else, Gmail, Blogger, Writely, Reader, all of them sat, and they were experimental projects. And that was called Apps.
Tim Pritlove 0:06:23
Did you ever have exposure to actual customers or did you just write documentation?
Meredith Whittaker 0:06:28
Well, they would email in. I only did this for a second, because I kind of figured out I wanted to do other things. And that's part of the story. There's a part of the story, actually. So I did answer inquiries sometimes. And there were auto-responses, like hotkeys you'd use in this really janky ticketing system. And this is back, we didn't have laptops. There's no such thing as a smartphone. Our desktops were chained to the desk. And if you wanted to work from home, you had to check out some clunky Lenovo or something. I don't remember exactly, but it was... I was like, okay, I get rewarded in my job based on how many tickets I close. But I don't know the engineers who have to close these tickets, because I'm reporting bugs and they have to fix them. And if they don't fix the bugs, then I'm not rewarded. And I was like, this is really silly. So I went over to the Apps building, because I sat in a different building, and I met with the engineering team and was like, hey, I'm the person who gets these emails that you're answering. And I was able to convince one of them to go to Costco, which is this big warehouse store, a Walmart-y store in the US, put a giant couch on his own card, and bring it into the Apps building, so I could sit with them. Because if I sat with them, then I could talk with them, and then they would be more likely to fix my bugs. And so all my bugs started getting fixed. But then my manager got really upset, because, like, obviously I had breached the hierarchy, and I was like... and so this meeting appeared on my calendar that was basically about me being insubordinate. But I'm also like, I didn't come from that class, I had no... I was like, what the fuck did I do wrong? I don't know. And I was like, oh god, I did something. And then, the day before, I think, that meeting was going to happen, this email hit. We had this alias or some group, I don't know, it was like the group for the consumer operations team. And one of the engineering directors in Apps must have had like a beer or something during one of the many, many, many drinking parties that happened during the day at Google in that era, and sent some email that was like, Meredith, this couch is a model for collaboration.
Linus Neumann 0:08:40
I was going to say, that's why you hire somebody from Berkeley: to write your tickets, because they come up with ideas like that.
Meredith Whittaker 0:08:47
And then that meeting just disappeared and I was like, well shit, I need another job, because I just burned someone who's going to figure out how to get back at me. And that's where my street sense comes in. So I took a bike around campus for a couple months and asked every VP during their office hours, which is a thing they used to have, like, hey, I want another job. And I would just drop in and be like, hi, you don't know me. I'd like a job. And then I got a job doing basically standards work, like trying to push document interop standards. And so I started in standards, and then standards parlays into measurement pretty quickly, because measurement is ultimately the methodological standard. And co-founded MLab, and then that became the nucleus around which the research group I founded existed, and then that was the... That was when the fresh air of sort of these political and social complexities started hitting my technical work. And I was like, oh, standardization is power. Oh, creating data means you own the narrative. Oh, shit, none of this is neutral. All of this is contingent. I started seeing the balance sheet. I started seeing the capital that was involved in running infrastructure. And I remember around the time I met you, Linus, like maybe over a decade ago.
Linus Neumann 0:09:59
Yeah, it must have been about a decade or longer. Yeah, probably a bit longer even.
Meredith Whittaker 0:10:02
Yeah, aging is weird. But I had these balance sheets I'd printed out from Google, which showed how much the uplink bandwidth and the infrastructure and power costs for MLab were. And I would show that to all these civil society funders, who were like, we can give you 250k. It costs 40 million a year in bandwidth alone, just for MLab. You don't understand the economics of this, because we're in the era of civic tech: all we need is a good idea, right? And I think I was really lucky to get sensitized to the political economy and the fact that we're talking about infrastructure, capital, network effects, economies of scale, and not some kind of brilliant idea that just ephemerally transformed our world, and that our side just needs to wait to have one of those to get our turn.
Tim Pritlove 0:10:58
That sounds like a quick upgrade from a customer support person. Yeah, yeah, yeah. So, you mentioned MLab, which stands for Measurement Lab.
Meredith Whittaker 0:11:08
It does, yeah.
Tim Pritlove 0:11:09
Can you explain what this is all about and how it came alive and why you were involved?
Meredith Whittaker 0:11:13
Yeah. It was a project that was Derek Slater, Vint Cerf, Stephen Stuart, Sasha Meinrath at Open Tech Institute.
Tim Pritlove 0:11:29
The elders of the internet.
Meredith Whittaker 0:11:30
Yeah, some old internet guys and me.
Linus Neumann 0:11:35
Let's say the elders of the internet.
Tim Pritlove 0:11:38
You're one of them now.
Meredith Whittaker 0:11:41
And the lady with the couch. And the conceit there, which felt really simple to me at the time, is everyone is buzzing about net neutrality. And I was a Kool-Aid drinker. I still think the value underlying that kind of mythology, let's say, like, yes, we should not have one gatekeeper deciding which news source, right? This is old-school common sense. It goes back to Western Union, which, there's a little history here, you know, in the US would refuse to carry telegrams from political candidates the company didn't support, right? So there's a real bedrock precedent here. And I was like, yeah, of course we need net neutrality. Neutrality itself is a really wafty, it's a loose concept. It needs to be anchored in some type of benchmark against which we can assess: is this neutral or not.
Tim Pritlove 0:12:32
You need to quantify it.
Meredith Whittaker 0:12:33
Yes. And then you're just flung into the abyss of philosophy the second you try to do that. But we started, and I would say it was not necessarily built to succeed. It was built as a sort of hypothesis project where we stood up three servers that hosted open-source measurement clients. And the thing that we were doing was putting up servers that were all configured the same, that gave each client a dedicated slice of resources, and then way over-provisioned the uplink between the server and the switch, so that we could, for all intents and purposes, guarantee that any artifacts that were detected through the measurement methodology, like some TCP RTT or something, were not interfered with by our infrastructure and weren't suffering for bandwidth.
Tim Pritlove 0:13:31
So for practical reasons you had unlimited bandwidth and zero latency to everywhere.
Meredith Whittaker 0:13:36
Yeah, and let's, you know, and that's very expensive. But then we immediately DDoSed the servers, or like, oh shoot, because Vint put a blog post out and, you know, he's a big name. And then it was this job of years and years of getting servers into different interconnection points, because we wanted to measure across consumer... you know, we're not just measuring to a server in the network, exactly, and we need to cross interconnection points to do that, because that's where you begin to see interesting business relationships and kind of feuds. And it taught me about how difficult and contingent and ultimately subjective creating data is, and the political process of then sort of defending that data as a reliable proxy for reality. Because I would be at the Federal Communications Commission, and I had Comcast across from me being like, no, we want to measure multi-threaded TCP versus single-threaded, because you get a, let's just say, higher number that way. It's more forgiving. And we would be sort of defending the methodology of our tests and the openness principles. So it was open-source code, the database architecture was open. I don't remember all the server architecture, it was old school, old-school ideological.
Tim Pritlove 0:15:00
You really don't sound like a book nerd now. You've become a real network nerd.
Meredith Whittaker 0:15:05
Let's just say book nerds read, and network stuff is written down. I read the fucking manual of it. I'm like, no one told me I was allowed to not read it and still have an opinion.
Tim Pritlove 0:15:24
I love networking.
Meredith Whittaker 0:15:26
It taught me everything about so much, because you get down to the network layer and you're like, fuck, everything is broken. It's all tied together with string. What's going on? Okay, so your promises in the pitch meeting are guaranteed by this crusty old library that's maintained by a guy on a boat with a Bitbucket account. Like, what is going on?
Tim Pritlove 0:15:51
It's scary, isn't it?
Meredith Whittaker 0:15:53
It is. I mean, it's a catalyst into today for my work.
Linus Neumann 0:15:59
You did that for, I think, a couple of years, the MLab stuff. And at this point in time, I think, you also began, you know, the ties to the, let's say, broader internet freedom community. I remember OONI Probe at the time, I think, was another. They still exist, right? They do.
Meredith Whittaker 0:16:19
And we actually talk to them at Signal. They'll detect Signal blocking. And I believe, so one of the things MLab did was provide backend infrastructure for projects like OONI Probe. And I don't actually know what that relationship is today. So academics and hackers and developers could write a test or kind of a measurement methodology, deploy a client to consumers, like you test from your laptop or whatever, and then we would be the ones paying for the bandwidth and infrastructure that would allow that to scale.
Linus Neumann 0:16:54
OONI is the Open Observatory of Network Interference.
Meredith Whittaker 0:16:58
If I remember the acronym right. Hi, Arturo! So...
Linus Neumann 0:17:04
Disclosing a whole social network here voluntarily. So, but eventually somethinghappened that is known as the Google Walkouts.Was that following your MLab activities or during your MLab activities or didyou rise even higher in the Google hierarchies before that happened?
Meredith Whittaker 0:17:25
Well, I mean, there are many. I think, so... I was not working on MLab by that point. MLab, there's a part of this story that I didn't cover, which is around, I don't know, 2014, 2015: we started seeing really interesting artifacts in the data, which showed essentially that at particular interconnection points between particular telcos, we were seeing drastic drops in performance. And what we were able to do was look at, you know, every intersection of Telco 1 and Telco 2, we see this drop. Intersections of Telco 3 and Telco 2 don't see this drop. And so what we were able to do is say, like, there's actually a business feud going on, and the interconnection point is the locus of that feud. And using traceroute data, we were able to say, like, look, it's not just that they're sharing a path in some other region of the network that's slowing it down.
Linus Neumann 0:18:30
They're actively throttling.
Meredith Whittaker 0:18:32
They're actively throttling. And we put together a report on that. And this is where I was sort of, I think, kind of sideloading myself into some academic research work. I was like, okay, how do we do this? Document our methodology; make sort of everything open. So we pulled that together. And it showed, I don't know if you remember, there was a sort of Comcast, Cogent, Netflix...
Linus Neumann 0:18:56
Oh, yeah.
Meredith Whittaker 0:18:57
Netflix was shoving all its traffic through Cogent. What we had done is expose that, and in exposing that, it exposed the principle that if you wanted to ensure net neutrality, you have to take the interconnection points and the interconnection agreements into account. And that led Obama to add interconnection to the reclassification under Title II, which kind of moved toward net neutrality, and that was later nullified. But, you know, that was kind of the swan song, let's say, for my MLab time, because, of course, that was a huge deal, right? Like, that's where the business-model rubber hits the road. And, you know, I'll just shorthand it: that bought me a lot of capital at Google. And with that, I was, you know, I was already interested in a lot of things. MLab was sort of humming along and, like, it had grown from a hypothesis project to, like, a global thing that was working and doing stuff. And I started getting agitated by AI and a lot of these privacy and security concerns, being part of the community that you were part of, you know, kind of thinking around tech alternatives, and getting less comfortable with the business model. And from there, I always had like eight different projects going on, but I went on to found, to co-found the AI Now Institute, which was really trying to bring the conversation on what we called machine learning back then, but AI was like a flashier term, bring it down to the ground a little bit, and stop talking about superintelligence and start talking about, you know, political economy: what are these technologies, how are they being deployed, how do you oversee them, who uses them on whom, and what are the social and political dynamics of that.
Linus Neumann 0:20:36
But the AI Now Institute is not hosted at Google, is it? No.Which university is it again?
Meredith Whittaker 0:20:43
It was at NYU, and it's moved out of NYU for reasons of that being easier.
Linus Neumann 0:20:48
Okay.
Meredith Whittaker 0:20:49
And cheaper, because they take like 40%, those universities.
Linus Neumann 0:20:53
Okay.
Meredith Whittaker 0:20:54
Word to the wise, your money.
Linus Neumann 0:20:59
What do you think?
Tim Pritlove 0:21:00
Like whatever donations you get. Okay.
Meredith Whittaker 0:21:03
Yeah. So it's an expensive brand.
Linus Neumann 0:21:10
Well, for a reason, right? I mean, NYU Institute, that's something to have, I think, co-founded.
Meredith Whittaker 0:21:17
Co-founded, yeah. I mean, the work was really good. And this was,again, gaining the capital at Google, cementing a reputation,being able to get to a level where I had a budget.And then part of what I was always trying to do is how much can I pull out ofGoogle and get into the hands of people doing work that I think is cool.How do we carve tributaries in the massive river of this huge,rich company and get it out?
Tim Pritlove 0:21:46
So you started that institute at the end of 2017.
Meredith Whittaker 0:21:51
Well, beginnings are sometimes hard to date.It was born out of a request from the Obama administration to be the host of one of his AI summits.I don't remember exactly the contours there, but it was in 2016.And so it started as like, like the idea then was like, let's do this and let's do it big.Like, let's make this the most polished, flashiest, like hard to ignore kindof spectacle using all the tools we can get from like hiring an events agency,doing good press, all of that.But let's also make this the one that is the most engaged with these political topics,that's actually forcing the debate in that direction and kind of making it facethese questions that are much more grounded and, you know,hopefully much more healthy for our population.So it was very successful and from there we got offers of funding and a lotof encouragement and the work just kept going.
Tim Pritlove 0:22:53
Well, we in Europe, we are used to being a bit behind, but as far as I can recall, in 2016 there was no discussion about AI in Europe at all. And so it's quite interesting to see that Obama actually, you know, decided that it was finally time to do something about this topic nobody had ever really talked about.
Linus Neumann 0:23:18
Germans are currently making up their mind whether that is a topic.
Tim Pritlove 0:23:21
Yes, we're talking a lot about Europe today. But my question is: can you describe what kind of discussions were going on ten years ago in American society, such that this came up as a topic for the near future, one which actually became a big one? Where did this discussion take place?
Meredith Whittaker 0:23:49
Yeah, to answer this question, I'm going to be drawing on a lot of the researchand the historical work, the work I've done since then.Because when this dawned in my life, when it started being a thing,I had basically that same question. What is this stuff?Why is it kind of... At Google, you would see...A shift toward a new paradigm or a new trend by the incentives that were structuredinto the OKRs, the quarterly goals,that kind of, you know, there'd be all these training modules that would popup and it's like make your software engineer into an AI developer,you know, and you'd be like, there's an incentive here.
Linus Neumann 0:24:30
Seems like they weren't too successful though, right? I mean,that's not like the biggest AI wave at Google.
Meredith Whittaker 0:24:37
Well, they were, that was DeepMind era. So, okay, okay, good.
Linus Neumann 0:24:40
Sorry.
Meredith Whittaker 0:24:40
I think they were ahead. It was them and Meta for a long time. And the history...
Linus Neumann 0:24:46
Didn't make it to the business though at the time, right?
Meredith Whittaker 0:24:48
No, but they're chaotic. It's a kind of court in decline. So that, you know, actually the business model part has never been their strong suit beyond search. To be real, you know, like cloud is like the best technology presented confusingly, with 18 versions, all deprecated, right? But the AI stuff was, so if you look at the sort of recent history, which is something I've spent a lot of time on, because I think that gives us a really different picture than the Elon Musk narrative or the kind of popular narrative. There's a very important paper that was published in 2012 that introduced the AlexNet algorithm, and that was Geoff Hinton and his students, you may recognize.
Tim Pritlove 0:25:39
Who just got the Nobel Prize. Physics, interestingly enough.
Linus Neumann 0:25:43
Yeah, because there is no AI.
Tim Pritlove 0:25:45
Yeah, because there is no computer Nobel Prize.
Meredith Whittaker 0:25:48
Well, if you claim that your technology is everything, then you can get a prize in anything.
Tim Pritlove 0:25:56
Which is a proven point now, yeah.
Meredith Whittaker 0:25:59
And it was Ilya Sutskever and then Alex. I'm sorry, Alex, I'm not grabbing your last name from the ether right now. But nonetheless, this was a paper that kind of pulled together key ingredients that became the foundation of the AI boom now. So this is deep learning algorithms, which is the paradigm we're still in. It doesn't matter, there's architectural sort of rejiggering, but nonetheless, it's deep learning. Huge amounts of data, so what I've called the derivatives of the surveillance business model.
Tim Pritlove 0:26:31
He found all the cats on YouTube.
Meredith Whittaker 0:26:34
I mean, yeah, that was Jeff Dean, I think. And then, you know, powerful compute, right? And they showed that, using gaming chips and a lot of data, you could beat the benchmark. So, score much better against standard evaluations than past models, and thus sort of catalyze the industry's interest in AI. And why were they interested? I think this is a key point, because these optimization algorithms are really good at curating news feeds. They're really good at figuring out ad targeting, right? I don't think it's an accident that Geoff was immediately hired, that Yann LeCun, who worked on the deep learning algorithms that seeded this moment in AI back in the late 80s, went to Meta. It was the platform companies with the real investment in squeezing more ad dollars out of the data, better ad serving, all of that. That's Google with DeepMind. You see Meta and Google as the leaders in this, as measured by different evaluation standards; the measurement question here is actually really interesting and troubling. Until this generative moment, where I think the ChatGPT Microsoft products shifted people's perception of AI and what it can do and just rearranged the leaderboard. But the paradigm is still the same. And the paradigm is still that AI is applying old algorithms on top of the sort of massive platform monopoly business model, availing itself of huge amounts of data, which is produced via this surveillance business model, and really powerful compute that was designed, built up, you know, I would say consolidated in the hands of these platform companies via the imperatives of the surveillance business model, right?
Linus Neumann 0:28:43
Would you say that, so clearly, I mean, you say,so here are Google and Meta that have these massive amounts of data,like larger amounts of data than probably ever existed or were in the handsof anybody or any organization.Would you say that they had amassed all this data and eventually learned,okay, we probably can't handle this anymore, so we're interested in this newparadigm to even monetize this data any further?Or was it rather like, oh, we have all this data, we're monetizing it well,here's another way to monetize it on top of that. because clearly these deeplearning algorithms are not of much use if you don't have large data.
Meredith Whittaker 0:29:29
Sets, and large compute for both training and inference.
Linus Neumann 0:29:34
But is it the hen or the egg? What's the hen, what's the egg there?
Meredith Whittaker 0:29:37
I think this is almost a perpetual motion machine, right? Like, every quarter you have to report progress, you have to report growth, the logic is metastasis. And so, you know, you're trying to squeeze more out of what you have, and you're trying to get more of what you have so you can squeeze more, right? There are also these sort of laws of scale. Remember big data? We used to call it that. And so I don't actually know the answer to your question. Which came first? But, well, I mean, the business model came first, right? You had to have the ingredients to know what they did together. And I think deep learning and AI had sort of languished in the backwater with some interesting experiments through the 2000s, because its history is always promising too much and disappointing, since the mid-50s when it was invented. And I think the goal was really to sort of supercharge their existing business model.
Tim Pritlove 0:30:45
I think deep learning was just the technology that perfectly served their current beliefs, in that they have to work on the data, that they have to build up algorithms to somehow predict your personal future and be there with an ad before you even know it. And we've seen this as ads everywhere, and we've also seen it in political influence, as we saw in the Brexit decision and also in elections, and we've heard there's going to be one in the near future in the US that might be influenced as well.
Meredith Whittaker 0:31:24
Let me check my calendar.I mean I think we can like peel back also just this concept of like what isan advertisement right? It's an influence.Is it trying to get you to buy something? Is it trying to get you to like something?Is it trying to get you to believe something? Is it trying to get you to vote a certain way?And I think that the term advertisement is usefully deconstructed when we startto think about the connection between all of those.
Linus Neumann 0:31:52
The term advertising is one of the best tricks the devil ever pulled. Because it's behavior manipulation, that's the stated goal. But I remember when the first debates came up about, for example, the Cambridge Analytica scandal. It's like, they used Facebook data to manipulate voters. And it's like, no, there's a red line over here, you can't manipulate behavior. I mean, Facebook and Google were basically built to manipulate users' behavior and ideally capture their attention to change their behavior. We can probably get you to buy this shirt, or that one, or that one. How can we even do something like that? That is unheard of. So viewing advertising as, oh, this is just an offer, oh, we're just presenting our product: no, no, I have a limited amount of attention per day. You're capturing it, and you're doing it for one simple reason. And that is changing my behavior in the direction that you aim for. And I guess eventually, it sounds like, eventually you questioned your Google career, right? Because, I mean, clearly you had an impressive career there in just a few years. But you began to question not only Google and the company culture, but apparently also a little bit the way you wanted to continue making use of the influence and the knowledge you have.
Meredith Whittaker 0:33:32
Yeah. I think.
Linus Neumann 0:33:34
Is that the walkouts?
Meredith Whittaker 0:33:35
I mean, it's all interlinked, and kind of periodizing your own consciousness is hard. But I think I'm pretty earnest, and I also don't come from that world, I don't come from that class, so there are often places where I just didn't, you know, I would take things sincerely or be really committed and then only realize, like, two thirds of the way through whatever it was, like, oh, no one else really cares about this, they're just networking, or, you know, whatever it was. So I think there was an element there where, like, you know, when I was doing MLab, I was like, I really want to win net neutrality. And then we won net neutrality. But then it turned out that that wasn't really it. Google has a bigger network than Comcast; Comcast isn't the gatekeeper. But that was a sincere thing. And then I was like, okay, I can, um, move money to all these cool privacy hacker projects. That was sincere. And then I got into AI, and I was like, okay, and I think this is something that changed for me. I think I used to have a lot, lot more faith in the power of ideas to influence real change, right? And I still think, you know, I spend a lot of time in kind of thinking through discourses, how do we shape them? Like, how do we kindly walk people into understanding things that, you know, they may have an interest in not understanding, or they may have been, you know, misinformed about, or what have you. But I began, you know, I began around the time I was looking at AI and sort of making all these cases that everyone loved, right? Like, I was out there giving talks that were completely against the Google party line and I was getting, like, applause, I was getting promoted. Like, I was like, this is a perfect job. And then, you know, like, I was, you know, the house troll, but then I was getting more influence, so I was becoming known outside, and inside I was, like, the person you'd call into your team when it was like, oh, we want to implement this, is there an ethical way to do it? And I would say, no. I would be like, my dear friends, let us sit down. And then, I don't know, that was sort of my life, and we kind of took the AI Now Institute and really did a lot to reshape the debate. I was very focused on that discursive intervention and how do we begin to talk about AI in a more realistic way, and that was working outside of Google, but it wasn't really influencing core decisions at Google, and that was kind of the thing I kept hitting up against more and more strongly, until I got a Signal message in... late 2017.
Linus Neumann 0:36:34
From Moxie?
Meredith Whittaker 0:36:36
No, I mean, I probably did get one from Moxie at that time, but this one was not that one.It was from a friend of mine at Google who said, yo, there's a really disturbingproject that is hidden that I'm very close to.And you should know about it because you're the AI person who cares about thisand you have standing at the company around it.And this was the secretive contract that Google signed with the Department ofDefense to build AI drone targeting and surveillance systems for the drone war.And of course, like I was politicized post 9-11, post Snowden.Like this was, you know, the drone war and the signature strike and all of thatwere really core in kind of my, you know, things that I ideologically rejectedand, you know, felt like we needed to disarm, not supercharge.And I had like a, you know, like a righteous anger. I was just like, fuck this.
Linus Neumann 0:37:32
Because you're there running around, you know, trying to shape the discourse on AI.
Meredith Whittaker 0:37:38
And making them look good, right? Because they get to be like, look, we platform such heterodox voices, we're surely benevolent, right? And then I'm like, okay, and then you're inking this deal with the DoD behind the scenes for technology that, one, we know doesn't work for the purpose, right? Like, you know, it's not going to better identify a worthy target of death or whatever the fuck, you know, we know this is bullshit. And, you know, this is a multinational company, like, more than half the employees are outside the US. There is an issue with yoking yourself to one nation's government, not that many tech companies care that much about that. And then there was just, you know, the structural danger, which is deeply acute, of a massive surveillance company with more data than the world has ever seen, more compromised than you can imagine, like, yoking their fortunes and a key dependency to the US military, right? And, you know, we know from Snowden that that's already, like, you know, seen as, you know, as long as it's a corporation gathering it, it's not, you know.
Tim Pritlove 0:38:47
Was Google still running under the motto of don't be evil at the time?
Meredith Whittaker 0:38:51
They were, they were. And that was, we marshaled that actually.
Tim Pritlove 0:38:54
Kind of like the end of it, wasn't it?
Meredith Whittaker 0:38:58
They quietly, like, the lawyers removed that from the Google Manifesto. It was slowly fading. Yeah, it was like, they were like, just don't open that closet.
Tim Pritlove 0:39:08
Which motto? I don't recall any. It's like, do you know something about this?
Meredith Whittaker 0:39:13
Try not to be bad is the new motto.
Tim Pritlove 0:39:18
Don't be as bad as that.
Meredith Whittaker 0:39:20
Yeah, like, lay off the evil. Yeah, so, I mean, and this was, there were, like, a lot of old school people there at that time who really did drink the Kool-Aid, and so, you know, I just put my energy into organizing against that, and that was when I turned toward labor organizing and thinking through traditional methods and approaches to combating that type of corporate power, or, you know, industrial power. And, you know, that was the on-ramp to the walkout. So the walkout was, like, a big, that was like a rupture, like a manifestation. It got a lot of press, and it was the biggest labor action in tech.
Tim Pritlove 0:40:02
So just to make that clear, the walkouts were actually people walking outsideof the Google buildings for, I don't know, how long?
Meredith Whittaker 0:40:12
I think it was November 11th, 2018. And everyone walked out for 20 minutes at 11.11am in their local time, so we called it Rolling Thunder, and it started in the Singapore office as I was going to bed in New York, and I was seeing the photos. And this was chaos. I hadn't slept in days, there were so many meetings, there were so many tears, it's hard to organize something like that. And I remember going to bed and seeing the images from Singapore, with a couple hundred people in Singapore, and I was like, this is great. And then I got up New York time at 5 a.m. to go to our location in New York and prep it, to make sure we were there. And then I just remember seeing, like, there was this little park near the New York office, and then it just, like, grew outside the park. And then no one could get into the park. And then I was looking at my phone, and there are live helicopter feeds. And we don't have bullhorns, because I'm one of the speakers. There are, like, speakers standing on chairs to address the crowd. And then this guy, like, some, you know, there's this sort of type, I don't know if in Germany you have this type, but they're, like, kind of, like, the leftist at every protest. And, like, some guy had, like, found out about it and came in, and I just remember this, like, this man I'd never seen, like, handing me a bullhorn from below, and I picked it up, and we were like, bullhorn, like a megaphone, like a megaphone, yeah, yeah, yeah. And then it was, you know, and then we walked over to a Mexican restaurant and sat at the table and had a press operation that we were running. And what was cool was, like, everyone organizing that was kind of a professional, most of them were femme, and most of them had jobs at Google that were, like, organizing the company, so organizing their comms, being an administrator for, like, 13 different directors. This particular type of hyper competence at coordinating activity across a number of people who you may not have direct power over, and, like, for that, much more...
Tim Pritlove 0:42:27
Than developers in that.
Meredith Whittaker 0:42:29
Respect, I mean, let's just say, yeah, yes. You know, you can, like, somebody who can write an email and get someone to respond to that email by doing a thing is kind of a witch. And it was all these, like, these preternaturally competent femmes just turning that energy toward organizing across the company, which was building on this sort of base of meetings and locals and all of this sort of work that we'd done to kind of form a discursive environment inside Google, where we had, like, weekly meetings to discuss the news, what Google could do, what campaigns we could do. So we had this sort of energy and solidarity already at that point.
Tim Pritlove 0:43:10
What percentage of the Google employees would you say have at one point taken part in this?
Meredith Whittaker 0:43:17
I don't know, we were really careful not to keep lists. Yeah, I mean, you can dream up a number now, you know. 20,000 was the number estimated from the sky photos and the local reports. We had local leads at every office. We sent them zip files of the kit for handing out the flyers, the talking points, how to treat media, all of that. They all had that, and then they organized their local. And then they had reporting back in from press, reporting back in to the central organizers around numbers. And then we were issuing the press releases. But then, you know, there's employees of Google, and then there's contractors of Google, which more than double the employees, so, you know, that number is... And then there's people who couldn't participate but were really supportive, because they, you know, need their health insurance or they'll die, or they are on a visa. So it wasn't, um, I think there was a huge amount of support. I know that we were able to get Google to drop their military contract because there was enough support, and there was, you know, there was, like, a spreadsheet of people who were quitting, conscientious objectors. We had, you know, like, every week at the town hall meeting there was a table we would set up with, like, banners, and we would have questions. Like, it was a very, it was a very rigorous campaign, and it kind of laid the groundwork, and I think it built some muscle that people are probably, almost definitely, using now, even if they don't call it organizing. It's like, how do we marshal the resources from this?
Tim Pritlove 0:44:54
So did Google then still love you at that point?
Meredith Whittaker 0:44:57
Well, I think Google does still love me, but it doesn't love itself enough to admit it.
Tim Pritlove 0:45:10
They have issues, I think.
Meredith Whittaker 0:45:16
I do think that. Come on, come on. Which side do you want to be on?Which party do you want to go to? Let's be real.
Tim Pritlove 0:45:23
I mean, it leads to an interesting point, because I would say, looking from the other side of the ocean, I think to us, this whole tech scene, the startups, the new stuff, the internet, everything that has developed in the last 20 years or so, always had this liberal touch to it. It felt as if it was mostly about an open, world-loving agenda, and it's good for everybody. And Google kind of tuned in with their motto, and some other companies did as well, some not so much. But it was always this feeling that liberal thinking is at the core of everything that is driving the internet forward. And I think we stopped thinking that now, because it looks totally different right now. Currently, we have more the feeling that it's turned into a total right-wing apocalypse somehow. And I haven't really seen this coming. Can you explain what happened to this tech scene? What happened?
Meredith Whittaker 0:46:50
I, well, you know, I think about post 9-11 and a lot of the fights over surveillance and tech. And there's a talk that, who was it, Frank Rieger and Rupert Gronkripp?
Tim Pritlove 0:47:11
Rop Gonggrijp, Rop Gonggrijp.
Meredith Whittaker 0:47:12
Sorry, I don't, that's my American, you'll do beautifully in America.
Tim Pritlove 0:47:17
It's complicated for us.
Meredith Whittaker 0:47:17
Yeah, but I apologize for not getting that right.
Tim Pritlove 0:47:21
Rop and Frank.
Meredith Whittaker 0:47:21
Yeah, I saw that talk, We Lost the War, I believe it was 1994. No, 2004.
Tim Pritlove 0:47:27
Ja.
Meredith Whittaker 0:47:28
I was doing the same thing. And not everyone, I think, was equating the growth and monetary success of the US-based tech industry with, sort of, you know, values of social progress. I mean, I think we can, yeah, yes, it was liberal, and then we can get back to a critique of liberalism and what have you, but it was, I think there were people who were looking at the infrastructure, who were looking at its capabilities, who were looking at the gap between the promises and the reality of what this tech did, and calling that out. And I feel like, you know, when I entered into this kind of privacy, security, hacker development scene, there was a lot of that skepticism there around Google. That educated me a lot. There was a lot of skepticism around surveillance. I immediately recognized, yes, we need privacy, because it doesn't matter if these people are good or benevolent. What we're doing is setting up an infrastructure that could be turned over at any moment to another regime. Logically, all of that made sense. But I don't feel that, until the Snowden revelations, any of that was anywhere near in the nervous system of a kind of tech consciousness. And a lot of the work I've done has been spent sifting through the 1990s, trying to sift through the crypto wars and sift through what happened with tech regulation to set up these surveillance giants and to permit this monopoly platform business model, and it has kind of looked at that gap between the rhetoric of liberal, rights-preserving, open, free tech, and what was actually being built, right? And one of the things, if you look at, there's a scholar named Katharina Ryder, who I would really suggest, for show notes, I can send some of these links. But she did her dissertation looking at some of the negotiations in the crypto wars. And what you begin to see is that, yeah, we, you know, and this is a thesis I sort of build on top of in some of my work. Yeah, we won liberalized encryption, right? By 1999 in the US, it was finally legal to build, share, implement strong cryptosystems without approval from the government, without some threshold that made them useless. But the agreement there was basically, yeah, you can have encryption, but we're going to permit mass surveillance by companies. And so you don't actually, like, you can just get the, you know, we're going to permit, we're going to endorse the advertising business model. We're going to endorse, and I can actually, I can start this point over, actually.
Linus Neumann 0:50:19
Would you say those two decisions were... strategically interlinked, or were they just coinciding?
Meredith Whittaker 0:50:30
I can't say that there was a conspiracy. What I can say is that Katharina's work shows Microsoft saying: liberalize encryption, don't worry, we're not going to encrypt all the data, we need it. Just come to us quietly and we'll give it to you. Instead of fighting over a backdoor, instead of doing this in sort of the public domain, where we're kind of losing the fight on technical and other grounds, allow companies free rein to surveil, because that allows us to implement this ad-supported business model. And then the data agreements can happen behind the scenes. Now, I've, like, completely compressed a very complex history into basically a meme. But I think the point there is that there was always that gap between this sort of rhetoric and what was actually going on. And I think the, I don't know, like, there's, like, a kind of internet people, right? This type maybe misunderstood exactly, like, how this, you know, who would have power over this technology, right? Like, encryption is liberalized, but it's not going to be applied to protect personal communication. It's applied to protect transactions, right? It's not, you know, like, the people who get to choose whether or not it's used aren't us. In terms of, you know, actually this sort of mass infrastructure, in terms of the tech ecosystem that's being built by the actors who are calibrating their decisions based on the surveillance business model that has been instantiated, right, you know, through regulatory decisions made by the Clinton administration.
Linus Neumann 0:52:07
So then there were a couple hundred thousand nerds worldwide that used PGP encryption, using additional software and plugins to send an email with nonsense information to avoid government or private sector surveillance. But it was probably never really a significant number of individuals that got to the point of, you know, having mass encryption out there.
Meredith Whittaker 0:52:40
Yeah, well, I tried. It's just that my friend group didn't overlap exactly withthose couple of thousand nerds. So what am I going to do, right?And I think that's, you know, this is the network effect. This is why it's actuallyvery difficult to do that.And this is why if one of those actors that controls these infrastructures doesn'tmake the choice for us, it's really difficult to make that choice.
Linus Neumann 0:53:01
And none of these actors voluntarily got the idea, right? I think it's partof the founding myth of Signal that Moxie at the time wanted to implement the,it's now called the Signal Protocol.I still remember it as being Axolotl or whatever it was called.
Meredith Whittaker 0:53:17
It used to be Axolotl. And then, when Signal was launched, which was RedPhone and TextSecure integrated into one app, for iOS in 2013, it was changed.
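[Editorial note: the core idea behind the Axolotl / Signal Protocol mentioned here, the ratchet, can be illustrated with a minimal symmetric hash ratchet. This is a simplified sketch only, not the actual double ratchet, which additionally mixes in Diffie-Hellman outputs; all names and the placeholder secret are illustrative.]

```python
import hmac
import hashlib

def ratchet_step(chain_key: bytes) -> tuple[bytes, bytes]:
    # Derive the next chain key and a one-time message key from the
    # current chain key, then discard the old one. Compromising the
    # current chain key cannot recover earlier message keys.
    next_chain_key = hmac.new(chain_key, b"\x01", hashlib.sha256).digest()
    message_key = hmac.new(chain_key, b"\x02", hashlib.sha256).digest()
    return next_chain_key, message_key

# Both parties start from the same shared secret (placeholder here)
# and step their chains in lockstep, one step per message.
chain = hashlib.sha256(b"shared secret from key agreement").digest()
message_keys = []
for _ in range(3):
    chain, mk = ratchet_step(chain)
    message_keys.append(mk)

# Every message is encrypted under a fresh, never-reused key.
assert len(set(message_keys)) == 3
```

The real protocol interleaves this symmetric ratchet with a Diffie-Hellman ratchet, so that sessions can also heal after a key compromise.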
Linus Neumann 0:53:28
Right. But initially, I believe Moxie wanted to implement crypto for Twitter direct messages, if I'm not mistaken. I think he was at Twitter before. And so his idea was to roll it out at Twitter. Definitely, he devoted a few years of his life to implementing and rolling out mass end-to-end encryption, right? But I believe he wanted to do it at Twitter at the time. And maybe I'm wrong here, then our listeners will correct me two minutes into the show. But clearly, none of the large...
Meredith Whittaker 0:54:06
And I don't actually know for sure that story, so don't correct me.
Linus Neumann 0:54:11
It's my mistake.
Tim Pritlove 0:54:14
I have a different point, but I think that Twitter didn't see itself as a messaging platform, which was probably a mistake on their part, I mean.
Meredith Whittaker 0:54:25
And the regulatory issues. Yeah. I mean, I talked with the security team, and there were always these ideas. I always had a project where we would work on the keystore for other services, and you'd have these really exciting conversations with the security guys, and then it wouldn't go anywhere, because then, you know, the other guys would get involved.
Linus Neumann 0:54:47
The point I was trying to make is, none of the big platforms ever rolled out mass-scale end-to-end encryption until eventually WhatsApp did. And it took the Edward Snowden revelations and an ongoing scandal for, I think, roughly one year, until Facebook at the time said, okay, we'll implement this.
Meredith Whittaker 0:55:08
Well, actually, I mean, that's... So, Signal existed before the WhatsApp integration.
Linus Neumann 0:55:13
Sure.
Meredith Whittaker 0:55:14
And the WhatsApp integration was driven by Brian and Jan, who are the co-founders. And my understanding there is that they were rushing to get that done before the Facebook integration, to make sure that they weren't selling something that would violate their principles. And I know Moxie was working on that, I remember that period of time. And then Facebook bought it. And I don't know what the deal there was, but it stayed with the Signal Protocol. It was a post-Snowden moment. You saw Android and iOS implement full disk encryption. You saw Google encrypting its... HTTPS for its networks. And I think a lot of this was just, like, we need to distance ourselves from bad government spying by adding encryption that proves that we're not actually part of the problem. They have just sort of, the bad government has attacked us as taxpaying corporations, taken this data that we really want to protect, but God, how could we have known? And so now we're encrypting things, and, ultimately, good, right? But it was a way of not looking at the full story of, like, why is that there to begin with? And what else is not being encrypted? What other data is being given over? And why do you have the choice to do that to begin with, instead of a more socially beneficial, democratic process of determining how we're comfortable with technology entering our lives?
Linus Neumann 0:56:58
You probably could have taken an even stronger perspective on it: it seems like the governments didn't really attack these corporations. The governments made their business model look bad. So now they needed to change something to convince people: no, no, no, your data is safe here. We'll encrypt something, and moving on, moving on.
Meredith Whittaker 0:57:21
I was, for the gentle listener, I was kind of joking, because that rhetoric around being attacked and being like, oh my gosh, was very much the mood at that time. I was at Google when Snowden dropped, and I remember things just popped off, and I actually had to, I got on a plane. I don't know, you all probably have a memory similar to mine of, like, when the Guardian stories with the Verizon order, like the Glenn Greenwald stories, dropped. It was night in New York. It was probably morning the next day when you guys saw it. I remember sitting on my couch and being like, holy fuck. And realizing just how big that was, because it was the kind of thing we'd been talking about, speculating about, in the rooms that you and I were in, Linus. And then it was like, oh, receipts. Shit. And there was a lot of unclarity. There was that PRISM slide where it was like, is Google just giving them full access? People were rioting inside. Security engineers were threatening to quit. And then that morning I got on a plane to TorDev. And so, yeah, in Berlin, actually.
Linus Neumann 0:58:29
I remember that at this time, we at CCC really had to bite our tongues and write a media communications strategy that said: nobody says "I told you so." We needed to at least act surprised as well, right? And not say, well, we told you all along.
Meredith Whittaker 0:58:55
I give you all a pass, because you did tell us all along.
Linus Neumann 0:58:59
But of course it gave a lot of international media attention to our cause.And we needed to play that public attention wisely.
Meredith Whittaker 0:59:11
Exactly.
Linus Neumann 0:59:12
I think we maybe did.
Tim Pritlove 0:59:15
I find it quite interesting how encryption as a topic has changed over time. It's just... more or less 10 years ago that Facebook actually changed to HTTPS on their website by default. And so there was a time not so long ago, you know, where most of the data was flowing around on the internet mostly unencrypted. And that, although there were these already mentioned crypto wars, you know, about general encryption, but it was also always for nerds and for specific applications. Then it also got this nice paint with this whole cryptocurrency craze going on, which made it somehow popular and almost took the word away.
Meredith Whittaker 1:00:02
We're still wrestling.
Tim Pritlove 1:00:05
And it was also the rise of encrypted messaging that was really giving it new fuel. So Signal was in the middle of all of this, as we already heard. So, I'd like to focus on Signal for a moment as an organization that you now head. What's your understanding of what Signal is and what it's not, and how does the organization deal with it?
Meredith Whittaker 1:00:37
Well, I love Signal, and I'm really... Yeah, it's the only cool tech company, in my view. And I think... So boiling down what Signal is in one word is a little, you know, I'll just, I'll start somewhere and we'll end another place, because I think it's actually a number of things, and it kind of represents even more. You know, Signal started back in, you know, the late 2000s, right? And we can, you know, we can date it to whenever, right? Signal as the integrated app was 2013, but RedPhone and TextSecure predated that. And this is, you know, there's no iPhone. Jabber is the competition, right? It's, like, web client-based chat. You know, people aren't carrying smartphones. WhatsApp doesn't exist. iMessage doesn't exist. You have a very, very different marketplace.
Tim Pritlove 1:01:38
Were there still ICQ and AIM, I think, at the time?
Meredith Whittaker 1:01:42
Yeah, yeah. I mean, I remember using, like, I don't remember, I'd, like, send text messages sometimes, but they were expensive, on my BlackBerry, maybe. But we're talking about a drastically different tech ecosystem. And this is particularly important in the context of messaging and communications apps. Because, of course, you need a network effect for those to work. No one buys the first telephone, because you can't use one telephone. Your friend has to have a telephone. All your friends have to have an app if you're going to use it. Particularly, it takes two to encrypt. Group encryption, it takes everyone in the group. And if everyone isn't using your app for communication... It's very different in a saturated marketplace, where you have WhatsApp, where you have iMessage, where you have these normative models that people go to just to have their regular communication, to introduce a new platform for secure messaging or insecure messaging, right? Because people don't switch outside of the network unless their friends switch outside of the network, and there's a collective action problem there, and an inertia problem there, and all of that, right? So when I think about it, there are many things that are very precious about Signal, but the fact that Moxie carried it on his back for that decade and was actually able to keep it going and surviving without selling out, without selling data, and actually creating something that is now able to scale to hundreds of millions of people, means that Signal actually has a position in this ecosystem that makes it useful to people. That means that it's actually providing encrypted communication to people all over the globe, because their friends are using it. And my contention here, and I'm willing to discuss this, is that I don't think we can recreate Signal, right? You could shift, you know, because it has that user base, right? You can introduce a new app, but how do you get people to use it without an OEM, without an existing installed user base, without some way of, you know, kind of making it useful to people? Because again, it's, you know, one telephone, or, you know, a couple thousand hackers who all use PGP, but they can't talk to their dad on PGP, right? They can't talk to anyone outside of themselves on PGP. So Signal has both sort of kept this form that is very heterodox in tech. It's a nonprofit, so I couldn't really sell out profitably. I could try to sell out, but I'm not going to get any money from it. It has to be reinvested in the mission, and there are certain rules around mission alignment there. So we're not pressured to monetize surveillance or to collect any data. We're really able to stay focused on that mission, and in an ecosystem where the diametric opposite is the norm, that's really, really important, irrespective of the flaws of the nonprofit model more generally. So it's achieved this pretty rare, like, it's the only thing like it in the ecosystem. And I think it also serves as a model for how we could think about building tech differently. Like, how do we disarm, deconstruct the massive centralized power of a handful of platform companies that basically control most of the infrastructure and information ecosystem in our world, and in our jurisdiction in the US? And how do we build other models that may be interoperable, that are more open, that are more rights-preserving, and that aren't subject to the pressures and incentives of the surveillance business model?
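[Editorial note: the collective action and inertia problem described above can be made concrete with a toy threshold model. This is an illustrative sketch with made-up numbers, not data about any real messenger: each user switches only once enough of their contacts have already switched.]

```python
import random

def simulate_adoption(n_users: int = 1000, n_friends: int = 10,
                      threshold: float = 0.4, seed_fraction: float = 0.05,
                      rounds: int = 20) -> float:
    """Fraction of users on the new app after `rounds` of peer influence."""
    rng = random.Random(0)
    # Each user gets a random set of contacts.
    friends = [rng.sample(range(n_users), n_friends) for _ in range(n_users)]
    # A small seed group of early adopters starts out on the new app.
    adopted = [i < int(n_users * seed_fraction) for i in range(n_users)]
    for _ in range(rounds):
        adopted = [
            adopted[i]
            or sum(adopted[f] for f in friends[i]) / n_friends >= threshold
            for i in range(n_users)
        ]
    return sum(adopted) / n_users

# With a 5% seed and users who only switch once 40% of their friends
# have, adoption stalls near the seed: the network effect blocks entry.
print(simulate_adoption())
```

Raising the seed fraction (an existing installed base, an OEM deal) lets the cascade take off, which is the asymmetry the conversation is pointing at.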
Linus Neumann 1:05:30
Now, I find it interesting, you mentioned the network effect that for a long time worked against you, right? You said, okay, it took Moxie a decade. Now, that network effect, you know, the EU regulatory bodies have an idea on how to weaken it by enabling messenger interoperability, thus pretty much trying to force the large messenger operators, be it WhatsApp or whatever else people use, to offer an interoperability interface. So any new messenger would have it easier than Signal did to reach the critical amount of users to actually have the network effect. Now, I know Signal's position strongly opposes this idea, or you decided not to participate in this interoperability or make use of it. There are two forces or two goals of Signal, I guess, that contradict each other here, and that is its security and building open communication systems. Maybe you can explain a little bit why, after, you know, having understood how hard it is to build Signal against a market, you would still oppose messenger interoperability.
Meredith Whittaker 1:07:05
Yeah, I like this question, because I want to clarify our position, which has a little bit of nuance. I don't oppose interoperability in principle. Like, if the interoperability mandate of the DMA were: you all have to, you know, interoperability needs to be an option, and it has to happen at this rigorous security bar. You have to make sure that you're implementing metadata security, sealed sender. Basically, you're adopting the Signal privacy and security bar as the condition for interoperability. That would be really cool, right? And I think that's the issue here. I mean, and I want to put an asterisk here for a moment, saying, like, there are a lot of other complexities around policy. Like, you know, who deals with a law enforcement request, right? Even if you have no data to give them, you can't just ignore that. You know, like, if a user has a complaint, who do they write in to? If you're interoperating with a platform for communication that also has a social media arm, there's a totally different regulatory environment for Telegram or WhatsApp with channels than for a Signal. How does that sort of work? So I want to say, this isn't simple.
Linus Neumann 1:08:38
It's a whole can of worms.
Meredith Whittaker 1:08:39
It's a massive can of worms, as the EU often opens. Um, but, you know, like, the conditions of interoperability are actually really, you know, they're really political here, right? So, in order to interoperate with WhatsApp, am I going to be giving Signal user data to Meta? Well, that would violate the entire premise of what I'm spending my life's energy doing at Signal, right? Is Meta going to decide to, you know, cut off the account of one of our users? Who gets to decide that, right? Are they, you know, are they collecting, you know, other data because they aren't implementing some of our libraries, or whatever? And so I think, you know, that's where the rubber meets the road. And we have to have a duty of care to the people who rely on Signal. That is to say, we are absolutely not going to forget, if we're real about this, that there is a woman in jail in the US because Meta handed over the Facebook messages between her and her daughter, which were used to prosecute her, to the state of Nebraska, after the Dobbs decision. That's the stakes of this conversation, even when we're talking about the technical details of interoperability, right?
Linus Neumann 1:09:59
The Dobbs decision being the one reversing Roe v. Wade, right?
Meredith Whittaker 1:10:03
Yeah.
Linus Neumann 1:10:04
Okay, so just for some of the listeners that are not that much into the US policy.
Tim Pritlove 1:10:09
Basically outlawing abortion.
Linus Neumann 1:10:11
Allowing states to outlaw abortion.
Meredith Whittaker 1:10:14
Access to life-saving healthcare that more than half the population may need.
Linus Neumann 1:10:21
So your answer, in short, is: interoperability is fine, but you do not see any path currently being debated that would result in upholding the security promises that Signal has worked long to be able to make to its user base.
Meredith Whittaker 1:10:41
We will continue to advocate for a path that raises that bar, that meets or exceeds Signal's bar. And if that succeeds, I'm like, yeah, I want to talk about that. But those are the conditions under which we would interoperate. So we don't take a stand against it. We just say: look, these are the complexities, and Signal stands with the people who rely on Signal, not with some vision for a muddy middle where we're all interoperating but have sold people out and made them susceptible to what we described with Meta.
Tim Pritlove 1:11:17
Do people in the EU understand what you're talking about if you're offeringthese technical explanations why it's complicated?Because we have the impression that they don't really get it.
Meredith Whittaker 1:11:28
Is this the mean or the median person? I mean, I think most people don't understand this at all, because they've got laundry to do and this isn't their area, right? Some of the politicians I've talked to seem to get it, but it's not... I think there's an inertia to the process, which means it's not clear how much purchase these points have, whether they reach bedrock, so to speak. And I would also say, as an American, I'm here, for today. But my instinct on the general workings of the EU is not something I rely on.
Tim Pritlove 1:12:14
Which brings me to an interesting point, because we are actually very interested in your view as an American, knowing how things work on that continent. What's your impression of how Europe deals with tech, these new technologies coming up, and how they impact society? Can you just give us a feeling for how this is to you, in a good way, in a bad way, whatever you feel, just to...
Meredith Whittaker 1:12:52
One meta-observation, having been here for a while now... and I've thought this before, but it's interesting that people think of Europe as one thing. When you go to different countries and meet different people, you're like: wow, this is not one thing.
Linus Neumann 1:13:14
I mean, that's the same in the US, right?
Meredith Whittaker 1:13:16
It is in the US, yeah. I mean, you know, it's big.
Tim Pritlove 1:13:19
But there is a common theme somehow. I'm just focusing on what you can probably match to Europe in general, or at least to the kind of discussions you have on a political level when you face EU institutions.
Meredith Whittaker 1:13:33
It's split in an interesting way. Because on one side, if you go to the startup ecosystem, the VC ecosystem, that world, there are a lot of smart people and cool people doing cool things. And there's sometimes a bit of magical thinking that I see, which is really like: if we wish hard enough, if we're able to figure it out, we're going to be able to create competitors to the US incumbents, right? And we're going to have our own thing.
Tim Pritlove 1:14:09
Own search engine.
Meredith Whittaker 1:14:12
Which often, you know, sometimes I'm just like: okay, that's a money play, right? Like, you get enough in your Series A, Series B, and then you'll get acquired and no one will do anything with it, or you'll get rich, or whatever. You may not necessarily believe it; markets float on hype. So, okay. But there is this thread where it's almost a willful misunderstanding of the reality of incumbent platforms, of the history that accrued that type of power to US companies, and of the dependencies that Europe and most of the rest of the world have on these companies. The three cloud companies based in the US have 70% of the global market. You have five major social media platforms.
Tim Pritlove 1:14:58
Amazon, Google and Microsoft.
Meredith Whittaker 1:14:59
Yeah, AWS, Azure, GCP. And then I think the other percentage is made up by US companies as well, and then there are some Chinese companies. And then you have five platforms that effectively shape our global information ecosystem, like our perception of reality. The four biggest are in US jurisdiction, right?
Tim Pritlove 1:15:22
Which five platforms?
Meredith Whittaker 1:15:26
That's where I'll say four, because there's always a last one on the list. So TikTok is the non-U.S. one, and that's the one people freaked out about, um, recently.
Linus Neumann 1:15:37
Because it's non-U.S.?
Meredith Whittaker 1:15:38
Yeah, I think, put flatly, it is. Facebook, Instagram, X. And then, it's not Twitch, but somehow there's another one... YouTube. And then there's Twitch, too. For all intents and purposes, that's a huge amount of concentrated power that, again, relies on network effects, relies on economies of scale, relies on all kinds of global infrastructure. It's trillions of dollars that can't just be interrupted by investment. This is a kind of...
Tim Pritlove 1:16:17
This is social media companies.
Meredith Whittaker 1:16:19
Social media and platform companies, right?
Tim Pritlove 1:16:21
And isn't Telegram one of them?
Meredith Whittaker 1:16:23
Well, Telegram doesn't, I think, run most of their own infrastructure.They don't have a cloud business model.And they also don't really have a business model. It seems like they have this crypto play.But it's not clear how that money moves. There's a lot of UAE investment.So you're not talking about these big tech...
Tim Pritlove 1:16:42
Okay, you're focusing on cloud, not so much on the social media aspect.
Meredith Whittaker 1:16:45
And what makes it difficult? Where does the normative shape of the tech industry come from, right? Like, if the cloud companies all of a sudden decided to cut off half their APIs and change their infrastructure, then for most startups in the entire world, including organizations like Signal, Telegram, whoever's riding on top, all their engineers' pagers go off. They've got to respond to that, right? It's unidirectional that way.
Linus Neumann 1:17:14
Right?
Meredith Whittaker 1:17:15
If sanctions go up... let's say a wild dictator gets elected in the US and decides that Europe is now sanctioned and that Amazon can't do business with Europe. What happens?
Tim Pritlove 1:17:29
A dictator in the US? That would never happen.
Meredith Whittaker 1:17:31
I make things up as a creative person.Remember, I come from literature and rhetoric.
Tim Pritlove 1:17:41
Just stories in your head. Yeah.
Meredith Whittaker 1:17:43
And then you look at the social media platforms, where we know that there is a far right that has really had the intelligence and the interest to build alternative media ecosystems on top of these platforms and on the affordances of surveillance-advertising-driven media platforms.
Tim Pritlove 1:18:08
Okay, back to the magical thinking of Europe.
Meredith Whittaker 1:18:12
Yeah. I'd like to understand how you see it.
Tim Pritlove 1:18:16
So in this world.
Linus Neumann 1:18:19
There are like a few Europeans that say, here's 50 million, let's compete with these guys.
Meredith Whittaker 1:18:23
Yeah, yeah, yeah.
Tim Pritlove 1:18:23
And that's probably what the European search engine is.
Meredith Whittaker 1:18:25
It's like a 500-million-dollar European sovereign AI fund. And then you're like: but that's half a training run. What are you buying with that? Which is disturbing, because, okay, that's a lot of money, let's not be flip about it, and it could be going to really good things. It could be supporting interesting open projects, it could be supporting alternative, interoperable things, like smaller clouds for more heterodox open source projects. There's really cool stuff that is languishing without that money. And I think: where is that money going? Well, if it's going into AI, it's going to one of those three cloud companies. It's renting infrastructure from Microsoft, Amazon or Google for model development or for deployment, which is inference. And inference is really expensive. You don't just train once; using a model is way, way more expensive than normal information retrieval. So it's just massively computationally expensive. And you haven't got European sovereignty, you've got a feeling of... I don't know, of not being behind, a feeling of not being ashamed of being technologically...
Tim Pritlove 1:19:49
But apart from the magical thinking, is there anything else you would stick to in Europe?
Meredith Whittaker 1:19:54
And then there's the other side, where I often find a much more sophisticated and clear-eyed view of these problems, right? Like, having this discussion about concentrated power in the hands of infrastructure and media ecosystems is way easier in Europe. I mean, people feel it, right? They see it. And there's been a history of pushing back against the encroachment of US tech, both effectively and often very ineffectively, that I really enjoy. And particularly in Germany, there's a very high sensitivity to privacy, very often a clear-eyed view on some of these debates, which doesn't always translate into policy. But I find the intellectual environment around this stuff, when you talk to people who are knowledgeable and have thought about it, to teach me a lot and be really sophisticated.
Tim Pritlove 1:20:43
Yeah, the GDPR is probably a German thing somehow at its core, for sure. So how does this affect your talks with European politicians, and how do you see the trends in regulation, in trying to apply new laws and regulations to this whole tech industry?
Meredith Whittaker 1:21:07
Yeah, I mean, there are both threads. You have two wolves inside European politics. One wants its own tech industry, and one doesn't want to submit to US tech colonialism. And I think you get some weird laws out of that. You have the AI Act, which kind of had this last-minute brinksmanship around whether foundation models, these big LLMs that are now the trendy kind, should be included or not. And you often see bold regulatory attempts that then get shaped in odd ways, trying to have it both ways, right? Like: how do we regulate the Americans away and get our own, right? But how do we do that in a way that is reflected in principles, without actually declaring that as the intent? And I think you're seeing a huge amount of money being spent by the US companies in Brussels right now, which is also influencing things in interesting ways. And then, and this is something I'm theorizing a lot in my intellectual work, and I think it's really important: you also see what I'm calling the politics of intellectual shame be really pervasive in this conversation. And this is not just Europe, this is across the board. I mean that there is a real, a real fear among a lot of people who are in decision-making positions, politicians or academics or whoever, and not even in decision-making positions, but it matters when it's them, of being stupid about tech, of being behind the ball on tech. And this plays right into patriarchal dynamics. Men hate when someone else knows something more than them, in particular if that's a small woman.
Linus Neumann 1:23:08
I think we enjoy it right now.
Meredith Whittaker 1:23:09
Yeah, yeah, yeah. Totally.
Linus Neumann 1:23:11
We're having a good time.
Meredith Whittaker 1:23:12
Yeah. Well, you all are, generally, and I don't want to gender this in such a schematic way, but there is an ego that can be very, very fragile here. And the way I've put it before is that it kind of turns uncertain men into yes-men. Like, they don't want to ask the dumb question. They don't want to be like: what's an LLM? What's a server? How does that work? And that type of insecurity, the fear of being behind, the fear of being called technically unsophisticated, or of hampering progress, or of putting your finger on the scales of science (look at the Nobels, how could you stand in the way of all this progress?), I think really gives the upper hand to the companies and to those who have an interest in creating products and growth and domination via these technologies. Because people really don't want to challenge them, because challenging their dominance or their plans gets conflated with somehow being anti-science, or being stupid about tech, or not being smart enough to have a position on the topic. And I'm sensitive to that, because I kind of came up through Google asking every dumb question in the book, because I didn't come from that world, right? So I had to ask: how does a computer work? I'm like, can someone diagram what a function is? I didn't know any of this stuff, right? But I think I have a sensitivity to it because I remember feeling it. I remember people being mean about it, back in the day when I was trying to learn this stuff. And I think that a discourse that collapses scientific progress into the success of a handful of tech companies preys on that type of insecurity, and has created an environment in which people have no idea what AI is and are still professing boldly on how to regulate it.
Tim Pritlove 1:25:10
So I read this as: you think that the European positions might be slightly under-informed and probably not well thought out in the current situation.
Meredith Whittaker 1:25:22
I should be clear: the politics of intellectual shame that I was describing is not unique to Europe, but it is at work in Europe as well, particularly among folks who feel like, you know, the Americans beat us, we've got to get ahead. I think where I see the European position being most, let's say, under-informed, or perhaps in some cases just pernicious, is in the chat control regulation and the desire, the apex of magical thinking, which is: let's rename a backdoor "client-side scanning", and then let's mandate scanning everyone's private messages, comparing what's in those messages against some database of permissible or impermissible content, and then taking action on those, in the name of protecting children, which is the justification during this instantiation of the crypto wars.
Tim Pritlove 1:26:20
Let's stick to this topic, because it's still an ongoing battle right now. We are more or less talking about this in every one of our shows. And yeah, it's still totally unclear what's going to come out of this. How do you see this discussion evolving?
Meredith Whittaker 1:26:38
Well, I see this as an ongoing power struggle, right? Between whom? Well... this is not a misunderstanding. I think a lot of the people pushing for this understand that backdoors are dangerous, and understand that the pretext is flimsy, but asymmetric power constitutes itself in part through information asymmetry. And there's a deep discomfort that dates back to 1976, when Diffie and Hellman were trying to publish their paper introducing public-key cryptography and the U.S. government was trying to suppress it, trying to say: don't publish this, right? Back then, databases weren't quite big enough, networks weren't quite big or ubiquitous enough, for it to matter. But they were already looking at it like: oh shit, we don't want this in public, right? And then you go through the 90s, and there's the Clipper chip and key escrow, and you have Stewart Baker writing in Wired magazine, like: PGP is just for terrorists, we have proof... no, PGP is for pedophiles, right? Which really echoes what we're hearing now. Like, who even has a computer in 1994, I believe, when this op-ed is written? And then we have post-9/11, and then it's like: actually, PGP is for terrorists, right? And encryption is for terrorists. All the while, our dependency on digital infrastructures for communications is growing and growing and growing. Our dependency on digital infrastructures generally is growing. And the need for encryption to protect commerce becomes existential to the internet industry. And then what do you do about communications, right?
And I think this has been an anxiety that is pervasive among those, law enforcement, governments, whoever, who feel that they need to constitute their power via information asymmetry. And any encryption that protects people, not just commerce, is a threat to that, right? And so what I don't see is that we're going to win an argument, or that we're going to win this via strength of argument. I do think we can fight, and I think we're in a position now where we're seeing it with chat control: I believe Hungary just tried to raise it and didn't get the support. There was the Belgian proposal a few months ago, which also didn't get the support at the last minute. And we just had the Dutch law enforcement authorities writing a memo to the government there saying: yo, don't support this, you're talking about a very dangerous backdoor that would undermine Dutch cybersecurity, right? At the same time, we have reporting in the Wall Street Journal that gives a receipt for what all of us should have suspected all along, which is that the backdoors built into US telecommunications infrastructure for government intercept have been accessed by Chinese intelligence and maybe others. So I think, at this moment, the facts are on our side, and that is permeating this discussion and making it harder and harder for them to push it forward in the European Commission.
Linus Neumann 1:29:46
But that usually means they just do another attempt next year.
Meredith Whittaker 1:29:49
Exactly. And that's why I think we're not going to win. There's going to beanother pretext if we win this one, right?There's going to be another angle if we win this one.We just have to keep building our muscle to sustain this fight probably forever,because I don't think the will to power is going away.I think they're just going to keep trying to rearrange the reasoning.
Tim Pritlove 1:30:10
But how do you do that? If you say that the strength of the argument is not enough, what is?
Meredith Whittaker 1:30:15
I do yoga every day.
Tim Pritlove 1:30:16
Does it help?
Meredith Whittaker 1:30:17
I mean, yes, it helps.
Tim Pritlove 1:30:18
I mean, in terms of political discussions.
Meredith Whittaker 1:30:24
In terms of political discussions, it helps that we are right. We bring in a huge amount of evidence that a lot of people haven't seen in these political discussions. I think our side has been on the back foot for a while. In civil society, there has been a cutting of funding to privacy advocacy since around... there's a sort of history here. There was a move toward tech accountability after the 2016 election. There was the Cambridge Analytica scandal, there's all of this, and it's like: okay, we need to hold tech accountable. And the way to hold tech accountable is to attack the business model, in my view. But there aren't that many pieces of legislation or proposals that actually do that. Many of them use the wrapping and the language of accountability, but are actually just expanding surveillance, right? It's like: we're going to hold them accountable, so we need a database, so we need to know who's logging into websites so we can find the bad guy. We need to know what's in your messaging so that we can make sure these tech companies aren't allowing crime on their platforms, et cetera, et cetera. So it was basically a hijacking of this, in many cases, kind of righteous moment, where people recognized that this business model was pretty harmful, to fulfill wishes that had been pervasive since well before then. At the same time, privacy advocacy, a lot of the things, Linus, you and I had been doing for a long time, was receiving less and less support and falling out of the limelight. And so I think it was in that environment that things like chat control, things like the Online Safety Bill and other paradigmatic examples of this client-side-scanning-to-save-children meme, grew up. And then, one of the reasons, there are many, many reasons I decided to move from being on the board of Signal to full-time at Signal. One of them was that I saw this moment, and I realized there weren't that many people fighting it, and that one of the things I could bring was a staunch willingness to fight it.
Tim Pritlove 1:32:50
And how do you do that? I mean, can you walk us through a day?
Meredith Whittaker 1:32:55
I open up my laptop. Yeah, um... well, obviously it's not just me, nothing like this is a singular thing. We work with a pretty broad coalition of folks. I'm sure many of your friends, many listeners perhaps, are part of that. Signal doesn't have a policy arm; it's a very lean, targeted, pretty senior organization. But we do work with people around the globe, EDRi in the EU and a number of other organizations, to keep tabs on what's happening. We're also in a good position: we're a non-profit, and we are very committed to rigorous communication, so we don't have a history of hyper-marketing, and we don't do hyper-marketing now. And so we're very careful when we make a claim. When we make a statement, we're backing it with citations; it's accurate. We're really marshalling the technical knowledge and prowess that we have. I almost think of it as clarifying the record. If there's a report that says client-side scanning is actually safe, we know it's not safe. Okay, well, there's an academic coalition that has written this letter; Signal can write a letter. We can begin to put a bit more weight on scales that have been fairly light, given the dynamics I just outlined. And then I do media, I do public speaking. I think a lot about how to tell this story in a way that isn't boring or alienating for regular people, particularly because the story on the other side is so arresting. It's: we have to save children from abuse. And every one of us, it hits you in the heart, right? Myself included. My amygdala is activated. Suddenly I just want to do something, I want to help, give me the thing to do, how do we do that, right? And then sitting across from that and being like: well, let me tell you about a one-way function... That's not going to work, right? And so, how do you enter into that debate in a way that isn't dismissing the very grim and real problem that is being evoked, and make it clear that the solution to that problem that is being presented will not solve that problem, one, and two, will cause drastically worse problems for many people around the world? That's the task at hand right now.
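As an aside for readers of the transcript: the "one-way function" Whittaker invokes here can be illustrated with a few lines of Python. This is a minimal sketch using the standard library's SHA-256 (the message string is made up for the example):

```python
# A one-way function is easy to compute forward but infeasible to invert.
# Cryptographic hashes such as SHA-256 are practical examples.
import hashlib

message = b"meet me at 10"
digest = hashlib.sha256(message).hexdigest()

# Forward direction: instant, deterministic, 64 hex characters.
print(digest)

# Reverse direction: there is no known way to recover `message` from
# `digest` short of brute-forcing every possible message. Even a tiny
# change to the input produces an unrelated digest:
print(hashlib.sha256(b"meet me at 11").hexdigest())
```

This asymmetry (cheap to compute, practically impossible to reverse) is the kind of building block end-to-end encryption rests on, and part of why "just scan the messages anyway" is not a small technical tweak.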
Tim Pritlove 1:35:32
So basically the discussion is led by pointing the other side to, like, the infeasibility of the approach.
Meredith Whittaker 1:35:41
Well, the infeasibility, the danger of the approach,a lot of evidence around the infeasibility of the approach that is either kindof willfully ignored or just not understood,and then figuring out how we explain that without being either accidentallyor genuinely callous about the concerns that have brought people to the table.
Tim Pritlove 1:36:05
Then they will say, but we have to do something.
Meredith Whittaker 1:36:08
Well, how about funding social services?How about, you know, what do you do? And I mean, like, if we're going to gothere, we're going to go there.Prince Andrew's walking around.
Tim Pritlove 1:36:20
No, you're suggesting to make the world better.
Meredith Whittaker 1:36:23
Jimmy Savile's walking around. You know, like, what are the infrastructures in place to make sure that when children are going through this, they're believed, they're protected? What happens when it's your priest? What happens when it's your teacher? What happens when it's your brother, right? These are the questions that are really hard to look in the face, because they implicate social pathologies and interpersonal relationships and power dynamics that are really, really difficult, and often relate to emotionally challenging factors or people's past experiences or what have you. So you're going right into very traumatic subjects. But I don't think we can have that conversation without having a real conversation. And then, when you begin to pull back the layers there, you say: oh, well, the UK has been pushing for client-side scanning as a remediation for child abuse. But the UK government in 2023 funded social services at 7% of the recommended amount, right? The roofs on the schools in the UK are collapsing. There isn't support for this. And then, and I don't have public numbers to share, but I've had a number of personal conversations: okay, well, how many law enforcement people are tasked with actually pursuing the criminality that may be reported via online imagery? In some cases, it's two. In one case, it's two in one country.
Two people, right? So if you begin to map this, what you see is a story that does not add up. And this is where I get enraged. Because I'm like: you are fucking trading on children's pain to get your backdoor, or whatever the fuck you want, pretending that you're solving it, taking up the space for actual solutions that could actually help real children who are suffering now, and turning no attention to every glaring problem in this massive list, which is pretty obvious even to me, and I'm not an expert here, I've just sort of sifted through this. So I think that's the dynamic we're walking into.
Tim Pritlove 1:38:47
They might only have two people, but at some point in time they might have doubled that, by adding another one.
Linus Neumann 1:38:55
I agree. I mean, it is quite telling how much emphasis is being laid on: hey, we really need client-side scanning, and then the world is going to be safe. And if you say: well, how about we fund support or any kind of preventive activities in, um, social care? It's like, yeah...
Meredith Whittaker 1:39:16
Yeah, yeah. Well, sorry, we used the prefix "online", so that's not ours. Um, and I think there's also something where people gravitate to the abstraction, right? If this is online child abuse, then we don't have to deal with it in our real lives. It becomes an abstraction that we can almost blame on the same platforms that have been so unaccountable; we can blame it as an internet phenomenon, not a phenomenon. As opposed to: oh wait, our church doesn't have the infrastructure to actually deal with this in a humane way, right? And I think that's a dynamic we're also seeing here.
Linus Neumann 1:39:50
This is, by the way, one thing I find so interesting about Signal as a secure messenger. Well, it has become mainstream, but it has also managed to maintain a reputation of goodness, right? I mean, saying: okay, text me on Signal. Oh yeah, that's the secure messenger, blue symbol, looks nice, very friendly user interface. Whereas it would be like: let's text on Threema. Oh, that's the complicated black one. Well, how about Telegram? And that's like: okay, that's a completely different end of the internet. And that makes me think of the curious case of Pavel Durov being detained in France, and apparently at least charged, because he refused to cooperate in numerous cases. Why do you dare to come to Europe, Meredith?
Meredith Whittaker 1:40:57
Well, I'm a brave person, Linus. But, that said, I think this is one of the places where more public education is necessary, because Telegram, you said the other end of the internet, is very, very, very different from Signal. Telegram is a social media platform. It allows mass broadcasts to millions of people. You can go viral on Telegram. You can find strangers on Telegram via directories. There's a near-me feature that will geolocate things happening near you. All sorts of things that are not private, are not secure, and are regulated completely differently from private and secure communications like Signal, which is solely a private and secure interpersonal communications app.
Linus Neumann 1:41:48
What about Signal Stories, though?
Meredith Whittaker 1:41:50
Signal Stories don't go viral. If I sent all the people in my contact list oneby one a photo or something...
Linus Neumann 1:41:59
I don't get your stories.
Meredith Whittaker 1:42:00
They're so cute, Linus. I wish, like, it's, you know.
Tim Pritlove 1:42:04
I hate it.
Linus Neumann 1:42:04
I mean, when that feature was implemented, I was like: okay, where can I turn it off?
Meredith Whittaker 1:42:07
I'm sorry. Well, you're all missing my stories is all I'm saying,and they are pretty good.
Tim Pritlove 1:42:12
Have you sent us some? No.
Meredith Whittaker 1:42:14
Well, have you activated them to check? But we do let you deactivate them forever and never bother you about them again, which is part of the way I think we maintain our reputation for not being shitty: we try to literally not be shitty.
Tim Pritlove 1:42:29
Yeah, thanks.
Meredith Whittaker 1:42:29
Right? And when we're designing Signal, we're actually very, very careful not to be a social media platform. We think about that in the design phase, so that everything we do can be as encrypted as possible, so that we know nothing about you, or as close to zero about the people who use Signal as possible. What we do know is: we can say, yes, this phone number did sign up for a Signal account; we know when that phone number signed up, and we know the last time it accessed the service. But we would like to not even know that, if it were possible. Telegram, on the other hand, is a social media platform which retains huge amounts of data, has a duty under law to cooperate in turning over that data, and has search functions, has directories so you can find new things. So it's a very different beast. And I think, one, because Durov has, I'm trying for a diplomatic word here, made statements that are not supported by fact around Telegram being private and secure, and has kind of taken on this yeoman's defender-of-free-speech-and-privacy position, people often think Telegram is private and secure because it has a DMs feature, right? But Signal just is private and secure. So the TLDR on that is: there's really no danger for Signal here, because we are very, very far away from Telegram. And we have set ourselves up so that, one, such cooperation isn't required, and two, such cooperation is not possible. Because literally, you could put a gun to my head: I don't have that data. Whereas Telegram has servers and servers and servers full of that data.
Tim Pritlove 1:44:22
Leaving out Signal completely now, what do you think happened? Pavel Durov was put into custody, he's now free on bail, and France talked to him.
Meredith Whittaker 1:44:35
I have no idea. I mean, this is like...
Linus Neumann 1:44:38
Didn't he send you like a telegram message?
Meredith Whittaker 1:44:40
I haven't checked.
Tim Pritlove 1:44:43
You missed his story.
Linus Neumann 1:44:44
Yeah.
Meredith Whittaker 1:44:45
I mean, this is all overlaid with the French legal system. I am not a lawyer, especially not in France, and there are the vagaries of their legal system, in which, you know, any judge can open an investigation and the basis for the charges will not be known until, you know, trial. And we're looking at years and years until then, with me not speaking French well at all, with, like, weird translation, you know. So I want to stay away from speculating there. But what it looks like, based on the charges that were released in the press release, is that it was, you know, failure to comply with requests for data, and then, you know, kind of a handful of other charges added on that aren't as severe as those.
Tim Pritlove 1:45:28
So does Signal get these requests too?
Meredith Whittaker 1:45:31
You can go to signal.org/bigbrother, and every request that we have been forced to comply with (because we fight them) and have unsealed is posted there, showing exactly how close to no data we are able to turn over. And, I think this is interesting for some of your listeners probably: you see what the law enforcement agencies in these requests are requesting, and it's often huge lists, massive amounts of data, which gives you a sense of just how much data surveillance, like a Telegram or another platform, is commonly able to provide that Signal is not.
Linus Neumann 1:46:15
Oh, so it's actually every single request. I thought it would maybe be aggregated.
Meredith Whittaker 1:46:24
Yeah, no, it's PDFs of the request. It's not a transparency report, it's transparency.
Tim Pritlove 1:46:35
I like that.
Meredith Whittaker 1:46:38
Although it's just the ones that we can unseal, so we do have to go through that.
Linus Neumann 1:46:41
I would assume there are probably some with gag orders, right? Which you wouldn't even be able to talk about?
Meredith Whittaker 1:46:46
There's no gag order on me being able to say that. And there are some, but that's the fight to unseal them. What I don't recall right now is whether we're in one of those fights at the moment or not. I'd have to check with my friends at the ACLU.
Tim Pritlove 1:47:05
So Signal as a company, how does it work?
Meredith Whittaker 1:47:10
It's a non-profit, so we're funded by donations only. And we are thinking about, in the future, maybe having a paid tier for some features, something like encrypted backups, which we're, you know, building right now. Could we charge people for media storage or other expensive features? But, you know, that would be in addition to donations, and squarely within the non-profit structure that keeps us safe from pressure to surveil.
Tim Pritlove 1:47:42
And how do you get your talent? How do you get people to work for Signal?
Meredith Whittaker 1:47:48
We pay very well. And, I mean, it's a really cool mission, right? The jobs in tech are kind of depressing in many cases. Not everyone wants to go optimize an ad server. And then at Signal, you know, you can work on core infrastructure for dissent and human rights work and journalism around the world, without which a lot of those things would be deeply imperiled. Like, it's a really cool thing to get to do and support. And we pay well.
Linus Neumann 1:48:23
So not only do you get respected for your six-digit salary, but for doing a good thing earning that money.
Meredith Whittaker 1:48:30
It's the original Silicon Valley dream.
Tim Pritlove 1:48:34
So is it the last place in tech where people are actually happy?
Meredith Whittaker 1:48:40
Well, I would never presume to speak for the consciousness of another person. But I think, yeah, I am very happy. I think a lot of the people who are at Signal are very happy. And I think it's also like we're kind of part of a project. And this shows that what we have in tech, what's built in the tech industry, is not inevitable. There's a series of choices, a series of incentives, a business model that has shaped tech into the form we have now, but it does not have to be that way, right? Like, we can rewrite the stack, we can build alternatives. Nonprofits can work, right? We need capital, we need will, we need talent, we need all of those things. But the thing we have now is not inevitable. And I think of Signal as, like, a keystone species in the ecosystem. Kind of, you know, setting the bar, kind of regulating the rest, right? Like, you know, you can have privacy. You can have the right to private communications. You can subsist outside of this paradigm. And I think the future I want is one where it's not just Signal, right? There are many, many other organizations and efforts sort of doing it differently, rejecting that paradigm, you know, drawing in capital there and away from the other place. And beginning to marshal the type of political will that is often very shallow, like the 500 million AI fund, but marshaling it for something that is actually substantive and is actually making the kind of change to the tech ecosystem that I think we need to have a livable world.
Tim Pritlove 1:50:18
You mean not only a model for other communication companies but also a modelfor any kind of technology company?
Meredith Whittaker 1:50:28
Yeah, well, I mean, we build a communications app, but we rely on telecom networks. We rely on server infrastructure. We rely on core libraries.
Tim Pritlove 1:50:43
That's not what I meant. I mean, I understood you to mean that the modus operandi of Signal as a company might be something that other companies could also leverage and adopt. It's not only limited to this one much-needed service.
Meredith Whittaker 1:51:04
I do think that there's a model there. I'm interested right now in researching hybrid structures and tandem structures. Are there sort of for-profit areas of tech that aren't driven by surveillance? Are there ways you could fund nonprofits, fund some of this core infrastructure, libraries and other things that have been languishing for decades? How do you revitalize that? And then, are there ways to build truly independent infrastructure outside of the, you know, three-companies, five-platforms model that, I think, is just clearly critically dangerous at this point?
Linus Neumann 1:51:53
So, when it's about building infrastructure in our own hands, that's not going to get easier in an AI world, right? Where it's model data that we need, huge investments into these models, first for training, then for operating them. Do you see any future for this whole AI thing in users' hands, in our hands, serving our actual privacy needs and, let's say, private and business needs?
Meredith Whittaker 1:52:27
Well, I think my answer to that is that that future will rely on laying an independent infrastructural bedrock and actually transforming some of the way we govern and think about digital technology generally. Including being really attentive to things like: how is data created? Who gets to decide what data we use to reflect our complex lives and realities? Who gets to decide how patterns in that data are made sense of, what analysis is done on that data, and then what we do with the sense we make of it, right? What decisions we make. And so, if we do all that, we transform what AI is and what it means. Because you're no longer just scraping all the detritus off the stupid web, which was deposited or created via this surveillance business model, packaging that in an LLM and calling that intelligence, right? You're actually having to grapple with the epistemic process by which data becomes a proxy for reality, and that proxy shapes our lives and institutions. And so, AI itself... right now we're talking about these massive models, right? These laws of scale, this sort of, like, big American guy dream of, you know, the largest in the world. But AI is a very slippery term. It's not a technical term of art. It can apply to many, many different things. And there are small models. There are sort of, you know, heterodox approaches.
There are expert systems, which they're now trying to bolt onto the side of generative systems. Because, wait, probabilistic answers aren't true, so we need to bolt truth back on, and we're kind of repeating a lot of the history of AI, kind of speedrunning it in the search for a business model. So my answer there is that a lot of the things that need to be done to simply disarm and draw down the centralized power of these surveillance and infrastructure companies are the same things that would need to be done to redefine, in a sense, what AI is, and our relationship to how... I don't want to use the word truth, actually, but, like, how decisions are made, how information is made via analyzing data, and who gets to control that. And I think, you know, my sensitivity to data in that answer comes directly from my measurement experience, right? Where, you know, one upgrade to the Linux kernel across our server fleet fundamentally changed the kind of data we were able to create, like how it populated the schema. And it meant that that data wasn't necessarily fungible with the data collected on the older version of the kernel, right? And in order to solve that problem, I had to get a guy to go sit with the kernel maintainers for, like, two years, to make sure that the update wasn't going to fuck up the way we got TCP dumps, basically. So, like, that's TCP dumps. Then think about social data, then think about data that reflects, like, who gets access to resources, then think about all of the other things. And I think it's actually an exciting idea to think on, like, how do we create systems where we're much more attentive to that, and recognize that it really matters how those choices are made, how data is created, who gets to decide, and how it's made sense of.
Linus Neumann 1:56:01
Would you say, in that future... so there are these large gen-AI models and whatnot, and from other discussions we've had, I know that you probably believe there is a stronger future for specialized models, expert models, not the generative ones. Would that be a prediction of how this whole AI thing is going to evolve? Not from the business-model perspective or from the political perspective, but from the actual technology and research perspective. Do you think there is still going to be exponential improvement in the gen-AI world, or do you think it's now the time for the smaller speedboats?
Meredith Whittaker 1:56:50
Well, I think it's definitely time for smaller speedboats, and I want to index on that word, improvement. Because if we scratch the surface on some of these large models, some of which are generative, you begin to realize that a lot of the claims to improvement and accuracy are based on really narrow benchmarks and evaluations that don't reflect the performance of these models.
Linus Neumann 1:57:15
That's how Germans optimize their diesel engines.
Meredith Whittaker 1:57:19
Exactly. So, there are things that gen-AI models can do, and I don't see them realistically going away. But I do see the struggle for a market fit that can produce the kinds of returns necessary to prop up a massively energy-intensive, massively infrastructurally intensive, extraordinarily capital-intensive industry, right? So you have billions of dollars for a training run, just huge amounts of energy and effort needed to create a model. But okay, who's going to keep paying for a chatbot that's wrong? And so I think there is a struggle for market fit. I think you see this with things like Microsoft Recall, where they pushed to implement this... I don't know. Microsoft Recall was supposed to ship with Windows 11.
Tim Pritlove 1:58:18
But it's not on by default.
Meredith Whittaker 1:58:21
Yeah, we have that small thing. But it's an AI system whose value proposition is that it will remember everything you were doing on your device for the last N months.
Linus Neumann 1:58:32
That is a nice value proposition.
Meredith Whittaker 1:58:34
I know. Value to your boss, maybe. And so you can type, oh, I want to find this browser tab, I want to find this thing, and it will get it for you. I obviously don't use it. And how it remembers is really the key here. It remembers because it's taking screenshots of your device every five seconds, creating a library of those screenshots, and accessing those as the data on which it is able to claim intelligent memory.
Linus Neumann 1:59:05
That is not an efficient way to approach the problem.
Meredith Whittaker 1:59:10
And I don't need to know that I was doomscrolling. Like, you know, that's not a proud moment of memory for me. So, yeah, I mean, to me what that says is, like, that's not a very useful purpose. It's probably going to be marketed to enterprises for, you know, worker surveillance, is my guess. But it shows that Microsoft is really trying to find a market for this, right? Because they clearly circumvented their QA process. They clearly circumvented their security evaluation. There were a lot of things that clearly didn't happen, and didn't happen at a company that is actively... They have OpenAI, they're investing a huge amount in Azure, they're in the infrastructure everywhere, all of it. But Microsoft is yoked to OpenAI, so it had the leader position for a moment. And I think, given that Microsoft has really tried to regain a good reputation in the security world, it's indicative of how desperate that rush to market fit, and the AI exceptionalism driving it, is, that they just messed that up so egregiously. And I can hear some hacker in a Microsoft hallway being like, I don't fucking know, I just left that meeting. Because you can kind of sense how those things happen.
Tim Pritlove 2:00:27
What would you say is the dividing line right now between useful applicationsfor machine learning, expert systems, AI stuff, and the hype?
Meredith Whittaker 2:00:40
Well, I mean, the question to dig into is useful to whom?Because hype is useful to investors, right? You know, yada yada.
Tim Pritlove 2:00:51
I'm not talking about, yes.
Meredith Whittaker 2:00:52
I mean, I think AI is...
Tim Pritlove 2:00:56
In a computer-science-y way. I mean, really, like, making applications possible that haven't been before, that actually do useful stuff for people or society.
Meredith Whittaker 2:01:07
Yeah, I mean, one use I think about a lot, that is definitely, I'm sure, useful to intelligence services: you know, I'm sure we all assume that POTS telephony data is being collected en masse by every intelligence service that can, and has been for many, many, many years. And that data was probably not that useful for a long time, because, you know, you're going to have to have a human review it, or, you know, something like that. It's probably a lot more useful now that you can quickly transcribe it with AI and sort of synthesize and search using these generative systems, right? So that's one example where I think it's almost certainly very, very useful, and it changes the calculus on, like, how dangerous this surveillance business model is as well. But who is that useful to? It's not me and you.
Tim Pritlove 2:01:58
Yeah, you're going right into surveillance capitalism again.
Meredith Whittaker 2:02:02
No, well, that's my thing. I'm really good at making that turn.
Tim Pritlove 2:02:07
I'm looking for a rosy outlook into the future, like hope.
Meredith Whittaker 2:02:12
Yeah, well, I mean.
Tim Pritlove 2:02:13
Anything in store?
Meredith Whittaker 2:02:15
I refuse to plant my hope in delusion.
Linus Neumann 2:02:22
Okay, thanks, thanks.
Meredith Whittaker 2:02:25
And I am probably one of the more optimistic people you'll meet.
Tim Pritlove 2:02:32
All right.
Linus Neumann 2:02:33
Meredith, it was a pleasure.
Meredith Whittaker 2:02:34
So fun. Always so fun. Thank you for having me. It's been a really,really delightful conversation with you all.
Tim Pritlove 2:02:41
All right. Thank you. Thanks for listening and see you next week. Goodbye.
Linus Neumann 2:02:47
Bye-bye.

Shownotes

17 thoughts on „LNP504 The politics of intellectual shame"

  1. Hi,
    you may have forgotten the HTTP status code.

    But I'm actually here for something else ..
    .. since it was mentioned several times in recent episodes that "the topics might run out" ..
    .. maybe you have a view on OpenDesk to share.

    I think it is "sufficiently politically driven" and at the same time "open source" and "a thorn in Microsoft's side" to deserve a mention here.
    Also after the "failed" Linux migration in Munich.

    If you find a suitable guest for it or something similar, even cooler.
    I would just find it a pity if it didn't even show up in the short news; measured against open-source progress in Germany, it feels 'somewhat bigger'.

    Best

    • Agreed. Right now the entire public administration is about to be captured by proprietary cloud stacks from Microsoft & Co., nicely coupled with cloud applications that purely coincidentally only run in Azure (and soon are supposed to no longer be available on-premise). Alternatives for getting out of that, such as OpenDesk, are being slashed in the federal budget. Incidentally, the coalition agreement and the federal government's cloud strategy demand something entirely different (priority for open source). Not to mention VMware alternatives, the topic of operating systems, and so on.

  2. Maybe I missed it, but what about the elephant in the room? Why was there no question about how holding privacy high squares with the fact that Signal requires me to provide my phone number? (which neither Telegram nor Threema does)

    • I think that question has been asked often enough and answered often enough. I found it more interesting to focus on current topics.

      If that old debate interests you, I recommend (in this order)

      https://media.ccc.de/v/36c3-11086-the_ecosystem_is_moving

      https://support.signal.org/hc/en-us/articles/6712070553754-Phone-Number-Privacy-and-Usernames

      https://support.signal.org/hc/en-us/articles/6829998083994-Phone-Number-Privacy-and-Usernames-Deeper-Dive

    • Honestly, the privacy aspect bothers me the least here. What I find far more annoying is that I have to make sure I always keep my mobile number; the phone number thereby becomes a single point of failure. By now you can at least create an alias on Signal, but as far as I know you still can't avoid registering a number. Number porting generally works, as long as you don't emigrate or have other complicated life circumstances, but it's still a PITA, even though it no longer costs anything, or perhaps precisely because it's no longer allowed to cost anything. With Threema I have to register neither an e-mail address nor a phone number, and the only thing I have to take care of is my password safe. For many people who care even less about privacy and simply want a messenger that can reach all their contacts, though, reachability and discoverability by phone number is a huge plus. That's why I now use Signal (or Molly) mainly for work and Threema privately. Since MMS is now a thing of the past in Germany, RCS is also supported by Apple, and RCS in principle offers end-to-end encryption too, you could actually do without the whole lot. The nice thing about phone numbers, as with e-mail addresses, is still that they work across providers. I've always found the fragmentation, or app-ification, of the internet rotten, and maybe that is also *one* reason the "dead internet theory" exists: the bubbles sealed off from each other barely notice one another. Abroad, especially in poorer countries, WhatsApp has almost completely replaced business communication via telephony and e-mail, or rather is the only published communication channel. I find that quite scary, and a perversion of the idea behind the internet of having a network that is as decentralized and redundant as possible. It is practically self-inflicted autocracy.

      • You can also change your phone number on Signal, so you don't have to keep your phone number forever; you just have to update it in Signal when you switch.

  3. Great guest. I would love to listen too…
    …but unfortunately my English isn't good enough.

    The latest "Alles gesagt" podcast was with Bryan Ferry, also in English.
    They translated it into German with AI.
    The result is really, really great.

    Here the original:
    https://www.zeit.de/kultur/2024-10/bryan-ferry-interviewpodcast-alles-gesagt-english

    and here the German AI version:
    https://www.zeit.de/kultur/2024-10/interviewpodcast-alles-gesagt-bryan-ferry

    Could you maybe do that with this podcast as well, or is it too much effort?
    It would be great in any case.
    There are surely more people who don't speak English and would love to hear it.

    Regards
    Andreas from Frankfurt/M.

    • I don't think Meredith Whittaker would be too happy to see her two-hour interview run through an AI (although that presumably happens more and more anyway).

      I myself find it a pity that people can't listen to this interview because of a language barrier; Whittaker is a cool person. So you can, of course, do with that information whatever you want/believe.

  4. Hello :)

    she mentioned a dissertation by a Katarina Ryder(?) that deals with the negotiations in the course of the crypto wars? Does anyone here happen to have more info? I couldn't find anything further, but maybe I misheard the name :D

    Best regards, and many thanks for the exciting episode!

  5. Many thanks for the lively interview,
    I think the time you took was worth it for everyone involved!

    Thanks also to Meredith,
    for the open communication: "transparency [instead of a] transparency report".
    I (like many here) have been donating to and supporting Signal for a long time. Still, it is important to be able to trust the people who make up the organization. This interview helped me with that.

    As a politically active person, I and my fellow campaigners depend on low-threshold, secure group chats. Signal is currently irreplaceable for that.
    With the coming elections in view, that gives me a lot of backing.

  6. At this point I simply want to say thank you once more. Thanks for this brilliant episode and this interview.

    …as my teacher always used to say: "In English, please."

    To join the show and listen to Meredith's remarks was such a pleasure. It reflects what the values of privacy and transparency really are, and even what efforts it may take to keep 'em high. The work of Signal, as well as her personal engagement, is therefore highly appreciated.

    I would also like to credit the achievements of the hosts, in being a constant voice on all these topics, for those who listen!

    Thank y'all! And keep goin'!
