Why Signal’s Meredith Whittaker thinks OpenAI isn’t that open | StrictlyVC LA

Host: Thank you, thank you, all. So there's, excuse me, there is a switch on the microphone. Thank you all for coming out, and thank you, Meredith, for joining us. Delightful to be here. So I want to start out on a positive note: what are your personal top three existential threats to privacy and free expression?

Whittaker: Yeah, let's dig in, Devin.

Host: Just go for it, whatever comes to mind.

Whittaker: Unlike most people in tech, I don't usually think in stack ranks, so I'm not sure this is my top three, nor that I would weight them accordingly.

Host: As we go through your mental castle, what do you see?

Whittaker: Well, there are many threats coming from different realms, so let's start in the legislative realm, where we're seeing a number of, I would say, parochial and very politically motivated pieces of legislation, often indexed on the idea of protecting children from crimes done in the dark. These have been used to push for something that's actually a very old wish of security services, governments, autocrats, which is to systematically backdoor strong encryption: there should be no network that cannot be accessible to lawful or unlawful intercept. And of

course, since 1976 we've known that a backdoor for the good guys is a backdoor for the bad guys. But this wish has not died or gone away; it's more of a power struggle than an intellectual argument. And we are seeing a lot of new legislation, often under the guise of child protection, often, I believe, pushed by well-meaning people who just don't have the knowledge or education to understand the implications of what they're doing, that could fundamentally eliminate the ability to communicate privately, digitally.

Host: And is it the legislators who have no education or information? I mean, I'm not in the room. Who's the lobbyist? Is a biometrics company in the US funding this because they want to be the people doing some scanning or monitoring?

Whittaker: I don't know. I think there are a number of interests converging, but the overall theme I'm seeing is a deep desire for accountability in tech, which we saw animated in the mid-2010s and which has since been weaponized. We're seeing a kind of surveillance wine in accountability bottles, where accountability looks like more monitors, more oversight, more backdoors, more elimination of places where people can express or communicate freely, instead of actually

checking the business models that have created these massive platforms whose surveillance advertising modalities can be so easily weaponized for information ops or doxxing or whatever it is. There's an unwillingness to hit at the root of the problem, and instead what we see is effectively proposals to extend surveillance into government and NGO sectors in the name of accountability. So that's one.

Host: Well, let's double down on that before we get to your next two. We'll get there, don't worry.

Whittaker: One is AI.

Host: Oh, don't worry, we're getting there. And if any of the threats are in this room, we can talk about it later, or you can raise your hand and point with your eyes. No, I know it's not necessarily anybody here. But specifically, one of the laws we were talking about is in the UK right now: using the Investigatory Powers Act, they're bringing it out of the broom closet to say, well, now we're not going to allow updates that we don't approve, and then you can't roll those out globally. There's this effort, as you were saying, to stifle strong encryption, end to

end encryption, for the purposes you mentioned. Is there anything that you can do? You're not in the room, so I guess we just have to let it happen?

Whittaker: I mean, we make our case clearly, and we make our opinion known, and we have a lot of technical credibility, so we advise on this. We were very outspoken when the Online Safety Bill was being proposed, and we see a kind of dual attack happening. One is the Online Safety Bill and similar legislation proposing effectively backdoors, in the name of this sort of accountability and protecting people from harm. And then there are things like the Investigatory Powers Act, which is effectively claiming for the UK the ability to demand that any tech company, across all jurisdictions, check in with the UK government before shipping a security patch, because the government may be exploiting that patch somewhere for some business it wants to keep doing. There's a form of, again, parochial magical thinking here, but it's very dangerous, because we are being threatened with a return to before the liberalization of encryption in 1999, a kind of early-90s paradigm where the government has a monopoly on encryption and the right to digital privacy, and where the ability to deploy encryption

or privacy updates, or anything that would secure and harden your service, becomes something you have to get permission from the government to do, which was the paradigm up until the late 90s. So we need to pay attention to this, and honestly, I think we need the VC community and the larger tech companies more involved in naming what a threat this is to the industry, and pushing back.

Host: It's very presumptuous, I must say, to think that you can replace the US as the global security force on thought police and that sort of thing.

Whittaker: I can't speak for the UK.

Host: I don't know if anybody could do it; maybe it's them, or someone in that general region. Since we're in that geographic region, though: the EU is actually moving toward this interoperability thing with messaging. Obviously that could be a huge opportunity for Signal, but maybe I'm not thinking it through and this is also bad news?

Whittaker: Well, who here has worked in a standards process, or a standards body? Anyone done the IETF or W3C? Okay, me too, whoever you are. It's a really messy process. It's the original design-by-consensus, which came from a really good place, right? These

were academics, the graybeards of the internet, recognizing that for a network to interconnect, we have to have standards we agree on. That's the logical reality, and we still need these standards. And the DMA is this law that wants to figure out a solution to the fact that you have a handful of messaging monopolies that don't talk to each other, the blue and green bubbles on iMessage, but it's doing so through something like a standards process. So I think the spirit makes a lot of sense. But of course Signal can't interoperate with another messaging platform without them raising their privacy bar significantly, because we don't just encrypt the contents of messages using the Signal protocol. We encrypt metadata: your profile name, your profile photo, who's in your contact list, who you talk to, when you talk to them. None of that is data we have, that anyone who runs our servers has, that anyone has except the people involved in that communication. And so that would need to be the level of privacy and security agreed across the board with anyone we interoperated with before we could consent to interoperate, because we make very clear,

technologically backed promises to the people who rely on Signal, often in places where physical security is directly linked to digital privacy, and we simply are not willing to adulterate those. So that's just one layer of complication. I think there are real issues around integrity, trust, and safety: how do you adjudicate things when you have accounts from one platform talking to another? So I really agree with the spirit of it, but what we are concerned about is that it could actually drop the standard of privacy, creating a kind of interoperable monolith that further relegates those demanding a standard of privacy with a lot of integrity to a more marginal position.

Host: Gotcha. So you may still be the weird app standing in the corner in a couple of years as the DMA progresses.

Whittaker: I mean, we've never been the weird app standing in the corner.

Host: So, speaking of safety and privacy, there's a new Signal standard and feature: using usernames rather than phone numbers as the primary interface for contacts. Can you tell us a little about that? It sounds significant, but I can't say quite how, or why it took this long to make this the default.

Whittaker: Well,

let me start by explaining that with an example. In India, recently, it has become a requirement, in order to obtain a SIM card, to submit to a biometric facial recognition scan. That is not just happening in India; we're seeing a number of jurisdictions where, to obtain a phone number, you are required to provide more and more personal information. In some places, like Taiwan, that is linked to government ID databases that often get breached and cause a lot of problems. So that identifier: in the US you can still get a burner, and you can buy that data from cell phone companies, so it's not always private, but it's not as acute a privacy issue in the US as it is in many other places, and that is simply increasing. So a request we got frequently from journalists in conflict zones and from human rights workers was: hey, we love it, but the phone number is a real issue for us. We need to be able to speak with people without sharing this information. We need to be in groups of strangers where we're not afraid they can scrape that. And we need to be able to initiate conversations with others without sharing our phone number, because, again, that's my biometrics, that's everything else, and that can leak a significant amount of information. So, I mean, we

basically turned our architecture inside out to support this, and to support it in a way that I'm really proud of, because as Signal we do not want to take responsibility for content; we are not entering into the content adjudication business. But of course, with usernames, traditionally you create a new namespace, right? You create something that you in effect have to monitor, perhaps police, perhaps censor, deal with. And so we did this in what I would call a safety-by-design way that allowed us to stay very true to our principles, which is: we just don't take on that work. We're unwilling to create a blocklist or other things to determine what is and is not appropriate. We're also unwilling to create new surfaces for harm; we recognize that can be a real issue. So what are we going to do? We're going to design it so that we've minimized, or I believe eliminated, the harm space. The username is not a handle. It's not shown inside the app. It's not something we have a directory for. But it replaces the phone number when you go to initiate contact. So I think there's actually a paradigm here around safety and designing with integrity that we're also pushing forward as we add a very essential layer of privacy to

the app. And you can read about that. It should launch soon in the next client version, 7.0; it will be launching to everyone who has that version, and if you're a beta user you can go in and claim your username now, if you're about that.

Host: So, speaking of designing with integrity, let's talk about AI. We talked for so long at Disrupt about AI, and you wrote this great paper, everybody should read Meredith's paper, with your co-authors, about how the claims of openness by AI companies and AI developers aren't really true. We kind of talked that to death, but there's something else I want to talk about, which is the power behind the power that we've all seen emerge, erupt, out of the new AI economy, and that's Nvidia, which is just this huge, gigantic, I don't even know how to describe it. Without Nvidia there's no AI; it doesn't exist, it's not possible.

Whittaker: The chip monopoly.

Host: Yeah, and the CUDA monopoly.

Whittaker: And the CUDA monopoly, yeah.

Host: It's all these things. So give me your Nvidia take. Everybody here uses something that ultimately drills down to an Nvidia GPU, an H100, or

whatever. What is your take on this? Has it become a dangerous company in this way?

Whittaker: I mean, we have a lot of Spider-Men pointing at each other here, right? Yes, Nvidia is a core dependency, and it is not being displaced soon. They have the fabs tied up. They have the contracts with that Dutch company that does the extreme ultraviolet lithography. They have CUDA, which is not a thing that can be easily displaced: when you talk about a lack of talent for machine learning, you're talking about a lack of CUDA developers. They started that framework in 2006. They've been integrating libraries and de facto standards from the academic research community for a long time. They've been hosting developer communities. So the water that AI research swims in, that AI development swims in, is Nvidia. The analogy, for those of you who don't spend time in this realm, is basically the Apple developer ecosystem: great libraries, beautiful UX affordances, all open to use, but of course it only runs on Apple's proprietary systems, their hardware. That's what CUDA is, but for the entire AI/ML ecosystem. And that's why AMD has a plausible competitor chip, but they don't have

CUDA, right? And of course Google has TPUs and TensorFlow, but those are used internally, and for some jobs they're a little more efficient, but they still license GPUs, because that's the standard; that's how machine learning gets built. So Nvidia is a giant core dependency, and it's not going to be easy to displace, particularly if you look at TSMC and the process to actually create a new fab, to create new capacity. There's just no time; the timelines we're looking at are not AI-hype-cycle timelines. But I'm seeing Microsoft pointing fingers at Nvidia now and saying, if you're worried about monopoly, don't look to poor Microsoft, look to Nvidia, they're the ones. And you also look to Google. Google put out this PR missive last week, their AI access principles, and they talked about Google being the only vertically integrated company from app store to chips, and that's true. But then a couple of days later the claim was that Microsoft is actually the monopoly, because it has the OpenAI and Azure monopoly. So no one is innocent

here. It's the "I Think You Should Leave" hot dog guy: we're all trying to find the guy who did this. And I think we need to recognize that AI is dependent on Big Tech. It requires Big Tech resources. It is not open in any sense we can be honest about. If you need $100 million for a training run, that is not an open resource. If you need $100 million to deploy at scale for a month, that is not open. So we need to be honest about how we're using these terms. But I don't want the deflection toward "Nvidia is the culprit of the week" to detract from what we're dealing with, which is massively concentrated power, and I don't think there's a way to diffuse that without actually radically shifting the bigger-is-better AI paradigm that is dominating the field right now.

Host: And it's dominating because it's essentially been effective in creating, largely, the LLMs, but some other models too. It's creating the use cases that people are gobbling up, although I'm still not sure if people are actually...

Whittaker: Use cases?

Host: Well, they're the ones everybody is using.

Whittaker: No, but, I think I am not clear what the real business model is,

right? Like, ChatGPT is an advertisement for the GPT APIs you can license from Azure, but it's not making Microsoft any money. What's making Microsoft their money is the ability to license those APIs. And if you look at the cost of training, and then inference, for these models, for an email prompt or an information-retrieval intermediary or even Copilot, it's not clear that the amount of money going into the solution is really how we want to solve those problems. So I see the business model not being baked. I do see the only pathway toward customers still being through these large tech companies. The Lightspeed gentleman who was up here before said that more diplomatically, but what did Mistral need? They need compute and they need access to customers, and that's cloud contracts, or that's owning a massive platform like Meta. There's still no business model beyond that, and what problem these solve, for the cost, is still not clear. I'm sure you all are diversified; you'll be fine.

Host: Sure, it's fine. And what do you think about the databases that are being built? I think about how much ChatGPT or Claude or whatever is being used

every day, thousands and thousands, millions and millions of prompts asking for this and that, and that's all going into a gigantic bucket, and they're thinking, boy, is this bucket going to be valuable in a couple of months when we make the next one, or when we try to sell it to the next guy. All the data they are both collecting and creating.

Whittaker: Yeah, I mean, I think this is a surveillance technology, right? It relies on surveillance, which is the mass data scraped from the web and labeled or calibrated, depending on the type of model you're building. It is a technology that produces surveillance data, even if through inference rather than direct observation or its proxy. And it is a technology where, in using it, you are providing very intimate data to these companies, which they're using for, you know, whatever. And we don't have a federal privacy law, so "for whatever" is still kind of the paradigm there.

Host: We only have a few seconds left, but I wanted to give you the opportunity to clap back at Apple for the quantum shade they threw at you, just in case you wanted that.

Whittaker: Oh no, being talked about by Apple is fine. You know, welcome to the party; we were there

in May 2023. And it is really important now to begin securing your cryptosystems for a future in which sufficiently powerful quantum computers could crack encryption and read all the data that we're protecting with it today. So Signal did that: we deployed it in 2023; we were the first messenger to do so. Apple followed suit very recently, and their marketing team spun up beautiful branding, they're so good at that, with an explanation of adding post-quantum. But we're a nonprofit; we're not here trying to sell Signal over Apple. If we can raise the bar, and keep raising it, that's actually our goal. So it is really good that Apple is doing that. And you all should actually worry about this now, because you don't want to be ten years from now realizing that everything in your encrypted database is open to scrutiny because you didn't secure it with a PQ algorithm.

Host: Oh well, that's something to worry about ten years from now, I guess. Or we can talk about it then. That, or climate, I don't know. All right, well, thank you very much, Meredith. Always a pleasure.

Whittaker: Thank you.
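The post-quantum upgrade Whittaker describes, Signal's PQXDH, takes a hybrid approach: it derives the session key from both a classical (X25519) shared secret and a post-quantum (CRYSTALS-Kyber) one, so that a future quantum computer breaking the classical primitive alone is not enough to decrypt harvested traffic. Below is a minimal illustrative sketch of that hybrid idea in Python, not Signal's actual implementation: the placeholder byte strings stand in for real key-agreement outputs, and the function names and `info` label are invented for this example.

```python
# Illustrative sketch only, NOT Signal's PQXDH code.
# Idea: bind a classical and a post-quantum shared secret into one
# session key, so the key stays safe if either primitive survives.
import hashlib
import hmac

def hkdf(ikm: bytes, salt: bytes, info: bytes, length: int = 32) -> bytes:
    """Minimal HKDF-SHA256 (RFC 5869): extract, then expand."""
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()  # extract step
    okm, block, counter = b"", b"", 1
    while len(okm) < length:                            # expand step
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

def hybrid_session_key(classical_ss: bytes, pq_ss: bytes) -> bytes:
    """Derive one 32-byte session key from both shared secrets."""
    return hkdf(classical_ss + pq_ss,
                salt=b"\x00" * 32,
                info=b"example-hybrid-handshake-v1")

# Placeholder secrets; a real system would get these from an X25519
# exchange and an ML-KEM/Kyber encapsulation respectively.
key = hybrid_session_key(b"A" * 32, b"B" * 32)
print(key.hex())
```

Because both secrets feed the key derivation, an attacker must recover both inputs to reconstruct the session key; compromising only one changes nothing about the output's secrecy.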

