Signal President Meredith Whittaker discussed the privacy concerns raised by the rise of AI, since companies need to collect user data to train models. This data imperative also creates a dependency on private companies for governments that need AI tools.
Transcript
Good morning, hello, and welcome to day two of the India Today Conclave 2026. We're in the age of disruption, and the disruption is not just taking place in West Asia and the world in terms of conflict; more importantly, it's taking place in the world of big tech. Our first subject this morning is designed for the times in which we live: machine versus mankind, agentic AI, and the battle for human agency. I'm delighted to welcome a very special guest. Meredith Whittaker is president of the Signal Foundation and a leading voice on tech accountability and on privacy, and she is here to discuss the issue of agentic AI, because as machines grow more autonomous, the question looms: who really is in control? In the battle between machine capability and human agency, where should society, each and every one of you here and watching this program, draw the line? Welcome, Meredith, to India. Good to have you here.

Thank you, it's a pleasure to be here.
Thank you. Let's start, Meredith, with that central question, one which should be consuming the world that is consuming AI at the moment: who really is in control of this crazy world of AI around us? Is there anyone at all, individual or government or organization, that really is in control of this AI revolution?
Well, I hate to break it to you: it's me, I am in control. That's a joke; it didn't land well, but we'll move on. To answer this question, I want to give a little bit of history, so please bear with me. I have academic tendencies here, but I think it helps, in a world of so much hype, where AI is everything and nothing in so many conversations, to ground us materially. So there's a question we might ask: why are all the dominant AI companies right now the same companies that emerged through Web 1.0 and the platform economy to become the hyperscalers and the giants? We do see a bipolar universe when it comes to AI, where you have the giants in the U.S., and then some in China, duking it out for AI supremacy; that is the rough narrative. And I think we have to look back to the 1990s and a number of moves that were made in the U.S. to privatize formerly government and university infrastructure, turning the internet over to the private sector in a way that allowed for these platform monopolies. If you look at some decisions made by the Clinton administration, what they did was effectively hand over these networks to private companies with no restriction. Anyone who knows the history of communication networks understands that they function as natural monopolies; from the post office to telecoms, you always have guardrails around these networks, so that you can manage them and avoid that monopolistic tendency. In this case, those guardrails weren't put in place. And so a business model built on scale at all costs around the globe, on data collection so you could create and sell ads, and on infrastructural dominance to support that data, sort of cemented itself among key players in the 1990s. And those are exactly the ingredients that, in the early 2010s, we all realized in tech, where I was at Google, could be put to good use if we applied them in combination with deep learning, an old AI approach, approaches from the 1980s. So all of this may sound a little dry, a little academic, but I think it's really important to that question you just asked: who controls this?

So let me just simplify that academic answer, to explain the first big challenge.
Therefore, to my question of who is in control: you're flagging the fear of the concentration of power, and of the data of millions of people across the world, in the hands of a few private companies. Am I broadly correct? Without necessarily having the guardrails that are needed to ensure privacy, safety and security?

That is absolutely correct; yes, you distilled that beautifully. And I think we have to look at this as not just data in the hands of these companies. It is the models, the AI models that they can train; the infrastructure to train these models; and, importantly, the distribution networks: the cloud services through which they can license the AI, the APIs that people white-label to turn into different products. It is the platforms, the Facebooks, the Instagrams, the Amazon marketplaces, that they can integrate the AI into for dynamic pricing or for ad serving. And so these resources combined are what's needed to create and then use AI from head to tail. I think what this also gives these entities, which we haven't discussed enough, is that under the banner of intelligence they begin to have the authority to tell our stories for us, to define us, to define our reality, to define our worlds, in ways that don't map to traditional data-privacy arguments but nonetheless need a lot of scrutiny, given the power and the authority that these models, and the companies who control them, are gaining as these technologies are introduced throughout our institutions and infrastructures.

Let me come, therefore, from what you've just said, to two points. Point number one: in India at the moment, AI, for the common person, and the large number of
people consuming it, is associated with the ChatGPTs, with the Geminis, with the Claudes; you're using it for basic information. But we are already pivoting into a world of agentic AI, which is the next level, where essentially you've got machines making decisions, often on our behalf, and perhaps, as a result, making life easier for us. And therefore a number of people are going to choose, possibly, convenience over privacy, which is what we possibly did when the social media revolution came: we just embraced it, not realizing that our data might be compromised. We are choosing, in many ways, convenience over privacy. Do you believe that there is a real threat, or is it exaggerated, that as a result we are eventually going to be led into a world where we surrender human agency to algorithms, where an algorithm is going to set how our lives are led? Do you really see that happening in the next decade, or even in the next year?

Well, it's not an absolute; we're already living that way, in some sense, right? If we go into a bank and an agentic system determines we don't get a loan, we have just been told how we live, in a certain respect, by an algorithm that informed a bank; that has very material consequences on our lives. So we already are steered and sorted and ordered, allocated resources or denied them, due to these systems. I think this agentic turn, however, is very important, and it's very important for Signal.
How many people in the room use Signal? How many people in this room use Signal? Well, we love to see it.

So that's less than 10 percent. What's your message to them?

What's your message to them? Everyone else should probably use Signal. I make no money when you use Signal; we're a non-profit. But we are the gold standard for private communication. Militaries, governments, boardrooms, really anyone who recognizes that private communication, the ability to strategize, the ability to game-plan, requires infrastructure that actually keeps your secrets, uses Signal. And we've become a core infrastructure for a reason: we are the only large-scale messaging app that is open source, that guarantees the right to private communication not because I say so, but because you can scrutinize our encryption protocol. The security and hacker community have spent over a decade looking at our code and have found no active vulnerabilities. And we exist as a non-profit with one mission: to ensure that this right survives in a world where almost every other service or interaction is collecting data, often letting that data out in a breach, selling it to data brokers, or using it to model your preferences, train their AI models, whatever else it is.

So, to distill what you're saying very quickly: what you're telling those who are not using Signal is that, if you are zealously concerned about your privacy and the privacy of your messaging, you should be using Signal. Am I broadly correct?

You are completely correct. If you have anything serious to say, use Signal. You don't want that coming out in discovery.
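The end-to-end property described here, that only the conversation's endpoints hold the keys, can be illustrated with a deliberately simplified sketch. This is a toy one-time pad for demonstration only, not the actual Signal Protocol (which uses X3DH key agreement and the Double Ratchet); all names below are invented for illustration:

```python
# Toy illustration of the end-to-end idea: only the two endpoints hold the
# key, so a relay server in the middle only ever sees ciphertext.
# NOT real-world cryptography -- a one-time pad sketch for intuition only.
import secrets

def xor_bytes(key: bytes, data: bytes) -> bytes:
    """XOR data against a same-length key; applying it twice round-trips."""
    return bytes(k ^ b for k, b in zip(key, data))

message = b"meet at the conclave at 9"
shared_key = secrets.token_bytes(len(message))  # known only to the endpoints

ciphertext = xor_bytes(shared_key, message)    # all a relay server ever sees
recovered = xor_bytes(shared_key, ciphertext)  # only a key holder can do this
assert recovered == message
```

The point of the sketch is architectural, not cryptographic: if the service in the middle never holds the key, it has nothing meaningful to hand over.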
But what if, Meredith, and this criticism has been made in the U.S. and in other parts, what if there is an anti-social criminal network that uses Signal, because they will then want to create their own little private universe, one which is not governed by the rules and norms or laws of the rest of the world? I am an anti-social element, a mafia of sorts, that is using Signal to pass these encrypted messages.

Well, I think the "what if there were a bad guy" thought exercise is very convenient for people who want to withdraw rights, and I need to be very blunt about that, because we are in a golden age of data collection and surveillance. Law enforcement and governments have never had more access to data. Over the past 30 years we've become a datafied society, in which our interactions, our transactions, our social networks, our employment history, our educational history, I would have to take many, many more breaths to get down that list, are in the hands of private and government actors, and are now being queried by powerful AI models that are finding patterns and identifying people across those data troves. We have never had more data; the art now is finding that needle in that haystack of data. The idea that the only barrier between law enforcement and catching the criminals is one application that holds the human right, and it's a human right for a reason, to private communication sacrosanct, and ensures that, in a world where almost everything else is surveilled, that right survives: that is not an evidenced belief. And if you look around at how law enforcement is resourcing its task forces on some of the most heinous crimes, what social supports are going to victims, how we are prosecuting the people who appear in prominent dossiers and files as very clearly at least aware of criminal behavior, if those people are very prominent people, how is all of that working on a social and political level? We have to begin to roundly question the assumption that the only problem here is that we ensure the right to private communication. There is no evidence for that.

Which brings me
to my second point, because, to function well, AI assistants need access to our emails; they need access to calendars, contacts, conversations. Therefore, from what you seem to suggest, we are already building, or have already built, the most intimate surveillance system that mankind has created. We are already being completely surveilled; our lives have, in a way, been completely intruded upon. Where does that leave the right to privacy? Do I even have a right to privacy, given this kind of surveillance system that has already been created?

That's a great question, because with the introduction of these agents we have to question that, and this is something Signal is very concerned about. One of the issues we have is that "agent" is a bit of a marketing term, right? What it indicates is an AI-enabled system. Most agents, most of the big agents, the Geminis, the Claudes, have a handful of AI models working behind the scenes, but they're software systems that run on operating systems; they run on hardware; and they are increasingly being implemented into operating systems at a very low level.
And in order for an agent to work, there are two things that are true. An agent, to do the things you want it to do, needs to have access to data; it needs to know as much as possible about you, or it can't really act on your behalf. And it needs to be able to act; it needs permission to act like it's you. If it's going to plan a car trip with you and your kids, well, it has to have your credit card, so that it can book the car. It has to have your calendar, so it can find a good time. It has to have, maybe, access to your Signal, so it can text your kids and make sure they're waiting at the right time when the car picks them up. Well, all of those things are effectively creating vectors for invasion of privacy. When you look at the security architectures of many of these agentic systems, they are the nightmare of cybersecurity professionals; they are extremely vulnerable. And what we have here, I think, is a very uneasy tension, in which the amount of capex, the amount of investment, that has been allocated to these extremely expensive AI models and infrastructure is pushing a lot of these companies to introduce AI in ways that bypass traditional cybersecurity safeguards or safety engineering. And when it comes to integration into our operating systems, where we have three large operating systems we all use, iOS, macOS, Windows, Android, this creates a systemic vulnerability that we're very concerned about, because of course we're an application; we also have to build on the operating system. And if the operating system, as the foundation and the toolbox that we as Signal rely on, cuts a hole in the blood-brain barrier between the operating system and the application, letting people in, you have bypassed our encryption and you have undermined key security guarantees. Signal is the example that proves the rule, but this danger is coming for anyone who develops at the application layer, and anyone who has to rely on these operating systems, which is all of us: that's your military, that's your mom, that's everyone. And so I think there is a need to press pause, to bring some of the adults into the room, to realize there's a lot of hype clouding our vision, and that the consequences of this kind of cybersecurity recklessness are significant: these are institutions, core infrastructures, highly regulated industries, that can be attacked through these vectors.

But how am I going to press the pause button? It's too late to press the pause button; you know, that engine has already rolled on. How am I, today, going to press the pause button? Do I have the capacity to make that choice? I mean, is it about the individual? I've surrendered my individual freedom, let's be very clear, I've surrendered my individual freedom the moment I've entered into the world of AI, or agentic AI. So, having surrendered it, how do I now put in safety guardrails to ensure that at least some minimal privacy of my life, of my data, stays with me? Is that at all possible?
I love this question. We can look at a little historical antecedent here. How many people here know that the fossil fuel industry invented the concept of the individual carbon footprint? I know that because I read weird stuff. They did that in part because a lot of criticism was coming to the fossil fuel companies for their role, institutionally, industrially, and they said, you know, it's much easier to narrate this as individual choice: I feel bad because I didn't recycle; I feel bad. But ultimately, when we look at who could actually shift the needle, it's the large companies who are doing the lion's share. And in a sense, I think with this narrative we're dealing with something similar. What can I do, me, as Meredith, not actually being in control? I can't do that much. But I did just name a fact here: there are three major companies that control these operating systems. Where does the pause button exist? I worked at Google for over 13 years; we had security review of every launch. They would have looked at an architecture like this five years ago, and they would have said to the person who presented it: you're fired, because clearly you didn't pass 101; this is not something that we would ever ship to production. And so the pause buttons exist. But I think we need to recognize that AI is not an ephemeral pollen floating around in our air that just escaped a lab somewhere. AI is a material set of infrastructures, built on very specific resources, that are in the hands of a handful of actors, and so that does give us a target when we're talking about how you pause, and who actually has the levers for pause. This is not saying no AI; this is saying stop reckless deployments, and recognize the collateral consequences of making short-term decisions on the backs of our long-term security and privacy.

You know what I
find fascinating at the moment, for example: you've got a situation where Anthropic finds itself in conflict with the Trump administration. The U.S. government wants control over mass surveillance systems; Anthropic is saying, that's not possible, we've got safety guardrails in place. So suddenly Anthropic is unwilling to give the kind of mass surveillance systems that the government of the United States would like access to. So it almost seems as if some of these companies, some of these monopolistic, or oligopolistic, companies, are actually more powerful now than governments. They are the ones who can actually almost dictate terms to the government. Is that the sort of dark world that we're entering, where individuals don't matter, governments don't matter, and it is, as I said at the outset, the algorithm engineer sitting in San Francisco, or these large companies, which are virtually going to dictate terms? Is that the future we are heading to unless we put these guardrails in? Can regulatory systems be put in place by governments? Is it possible for governments to press the pause button at all, or is that not possible anymore, too late?

Well, I don't actually think that showdown is decided yet. What we do have is very clear evidence of the type of concentration of power, and of its locus. The U.S. state obviously understands these technologies as part of its geopolitical advantage, and it obviously understands the type of dependency on these technologies that it takes on when it integrates one or another model into its systems. However, these are not technologies that the state could create; it doesn't matter how much money they have. It actually wouldn't be legal for the U.S. government to collect the kind of data that private industry can collect; this is another Clinton-era legacy. So this is built on the backs of a business model that has pooled the resources for AI in private industry, in a way that could not be replicated by the government: world-spanning cloud services; these massive platforms to both distribute and collect more data; and the ability to collect the kind of data that is in the hands of these companies. And so you are seeing a tug of war, a showdown, between the imperatives of the state, which is increasingly self-aware of this dependency and its potential danger, and what we can call the free market, and, frankly, the ability to build and shape technologies per one set of imperatives or another.

You know, just to give an Indian example: in India we are digitizing
everything: payments, identity, even governance. Therefore, in this age of AI, do you believe that empowers our citizens? Governments say that's empowering you, but are we being exposed, without realizing it, to new forms of surveillance? I am surrendering my individual freedom to the government, in a democracy. Is that the way it is? Is that the way you see this entire digital world, even pre-AI, the way the world was getting digitized?

Well, I think you can't say "getting digitized" without recognizing
that as a form of surveillance. The census is a form of surveillance. Is that surveillance used to ensure that everyone has food and everyone has care, or is it used for oppression and division? That's the question. But of course this is a form of surveillance: data is being collected; it is being analyzed; that's what it is. I think we need to also recognize that governments are not like sports teams that we root for irrespective of who plays on them; they are containers for power. And so we need to recognize that that surveillance could be used justly, and it could also be used in oppressive and harmful ways.

Can I ask a simple question that a lot of people in this audience, somebody has put it: where do you store the data? Where do Signal and the others store our data, and where do the AI models store our data? Where is our data, actually? Who has it? It comes back to the question of control, but if you can explain to us: where is this data being stored at the moment?

I love this question, because Signal does not store your data. We have built a system in which
we have access to as little data as possible.

You don't store my data?

We... you could put a gun to my head; subpoenas come to Signal; we cannot hand over the data, because we don't have it. We have rewritten, up and down the stack, to ensure that your messages stay between you and whom you talk to: not Signal, not anyone else, can see those; you're the only ones with the encryption keys. The same goes for your profile information and your contact list, whom you're contacting.

Is it being archived somewhere? Is it being archived somewhere?

No. No, literally no. The only way to protect it is not to collect it. Because I could leave; you don't have to trust me. A new head comes in, right, and they want to make money, they want to sell out, their ideology is very different. So technologically we build it so that, at the level of code, you cannot get the data.

And where are the other AI models storing
their data?

I mean, let's say it's the opposite of that, because AI requires huge amounts of data for training; it has, let's call it, a data imperative. Your prompts are data, and we know that these companies collect those prompts; they're often reviewed by multiple people; we don't know where they go; they go back in for training. And then there is more data being gathered every day. We can think about Meta's Ray-Ban glasses, which were apparently recording people in their bedrooms, in their most intimate moments: not only recording them without their clear consent, but sending those recordings to other human beings, who were then labeling those images, of people fighting, people crying, people going to the bathroom, people having sex, the most intimate things, which, unbeknownst to them, were sent to human moderators who labeled those images so they could be reused to train the AI. So the answer to your question, where is the other data, is: we don't know, because we don't have control. But we should assume it is proliferating throughout data brokers and companies, and, given the hunger for AI, given the amount of money going into it, we should assume that that is accelerating, because people want more and more and more data, and sensitivity to privacy in the industry is being numbed because of that.

So you've shaken, and stirred, and shocked a few people by what you've just said, which is a great warning in a way; I'm glad you put that out as plainly as that. But, Kali,
you wanted to respond; you seem to have been shaken.

Yes, I have been shaken. I wanted to ask Meredith: let's say Signal doesn't store the data, but you have a Pegasus type of virus on your phone; then how safe is your message, from one person to the other, on Signal? And my colleague here, sitting next to me, has another question as well, so we're going to give you two questions.

Yeah, so, just quickly: one is, if you're not storing the data, is there still the possibility, during relay, for someone to be able to catch it, because you're on other people's networks?

Sorry, there's white noise behind me, so I missed some of your question, but I got the Pegasus question. Let's start with the Pegasus question, and we'll come to your question in a
moment.

That Pegasus question is so smart. So, one of the reasons, when I answered the agent question, that I indexed on operating systems is because that's the water that Signal swims in: as an application, we have to trust the operating system; there's no way around that. And Pegasus, for anyone who's not familiar, is very powerful Israeli spyware that governments and corporations license to exploit vulnerabilities in these operating systems, in ways that can take over your phone. Now, it's targeted; it's not that everyone in the world gets Pegasus; it's very expensive. So if I'm a government, I go sign a contract and I get access to this, and then, as we have seen, journalists, dissidents and others are targeted by Pegasus. But basically what it gives the attacker is the ability to access your phone on the same terms you have, which is exactly that: they can listen to your Spotify; they can look at your Gmail, iMessage, Signal, WhatsApp; they can use those apps and see what's on them. So best practice there is to use disappearing messages, so at least there isn't a lot of legacy, and to run Lockdown Mode if you're on iOS. But this is one of the reasons we work closely with Citizen Lab and others who are pushing back against targeted spyware, because it undermines the protection that we can provide to our users, and frankly it makes us all less safe, because it incentivizes keeping open vulnerabilities that should be closed and fixed in the operating systems, in ways that are bad.

So, if I distill what you've said: targeted spyware is one of the big challenges of our times, and the fear I have is that maybe my phone has been infiltrated by Pegasus, and there's a fair possibility of that. But sir, you had a question?

Yeah, so, two parts
to the question. Can you hear me now? Yeah, okay, go ahead. So, one part of it is that Signal, like others, when you join it, asks you for your contacts, right? So, as I give access to my contacts, where does that get stored? And the second part of the question is: is the encryption strong enough that during relay, because you're piggybacking on other networks, there's no sort of spyware that can get in there to be able to actually see your messages?

Okay, the answer to the second question is: yes, it is; the relay is not a vulnerability point for our system. The first part of the question: when you download Signal, which I'm sure about half of you just did, you will see that it asks to find your contacts, and that's basically so you can build a network; there's no such thing as a communication network you can use by yourself. But all of that is happening on your device; Signal never sees your contacts. What it's doing is going through and looking for account identifiers that are correlated with phone numbers. We don't know whose phone number it is, or anything else; we just know, oh hey, this phone number correlates with an account identifier. We're doing that all on device, and then it says: hey, these people are on Signal, so you can start using Signal and communicate with them. But Signal never has access to that; we never see that; that's all just happening on your machine.

Okay, our time's up.
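The on-device matching described above can be sketched, very roughly, in a few lines. This is a toy illustration only, not Signal's actual contact-discovery system (which performs the lookup through a private query to the service rather than against a locally held set); the function name and the sample data are invented:

```python
# Toy sketch: walk the local address book and keep only the entries that
# correspond to registered accounts. In this sketch the set of registered
# numbers is assumed to be available locally (an invented simplification);
# the point illustrated is that the address book never leaves the device.
def find_registered_contacts(address_book: dict, registered_numbers: set) -> dict:
    """address_book maps name -> phone; returns only entries whose phone
    corresponds to a registered account identifier."""
    return {name: phone
            for name, phone in address_book.items()
            if phone in registered_numbers}

contacts = {"Asha": "+91-98xxx-00001", "Ben": "+1-555-0102"}  # fake numbers
registered = {"+91-98xxx-00001"}                              # fake accounts

print(find_registered_contacts(contacts, registered))  # only Asha matches
```

The design point is where the computation runs: because the matching happens on the phone, the service only ever learns which accounts you message, not your whole address book.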
We've got a few questions, maybe, that Meredith can answer very quickly. Mr Parigi, very quickly; we just have a minute, so keep the question short.

I could go all day.

Yeah, she could go on; she's done some research at NYU, among other things. Go ahead.

Could you give us an idea on the dark web? Because there is a belief in certain electronic circles that the dark web already has AI agents. Is that true?

Thank you so much for this question. The dark web is a bit of a euphemism; it's kind of a metaphor that's often used in political circles to describe nefarious things that happen out of sight, maybe chat rooms and other things not as easily accessible. I'm sure there are people who have spun up technical systems using AI agents; we're certainly seeing scams that use AI agents; and we do know that AI agents in general are being leveraged for nefarious activity.

Okay, okay. I
just have 30 seconds. I just want to ask you something that I keep asking myself when someone asks me these questions about privacy, and I keep saying: I have nothing to hide. What would you tell those who say, like I do, I have nothing to hide, so I'm okay with this world of surveillance? What would you tell them?

I would tell them to close their eyes, and I would love everyone to do this with me. Close your eyes for just 30 seconds. Imagine every email you ever sent in your life; every message you ever sent in your life, to your family, to your boss, to your oncologist; every dating-app swipe you've ever made; every mean little thing you've ever said over DM. All of that is put in a database, and you walk into a room, and there's an intelligence official sitting there saying: we're going to send the link to this well-structured database to everyone you know, to your pastor, to your teacher, to your boss, to your mother-in-law, and they're going to be able to search through it for their name, and they're going to see everything. What would you be willing to do to prevent that from being sent?

Okay. At the end of
this conversation, I hope each and every one of you has gotten onto Signal. The German ambassador certainly is nodding his head furiously, so he's already on Signal, sending, hopefully, all those encrypted messages to the leadership in Bonn. But to all of you out here: I think this was a warning signal, given by the president of the Signal Foundation, Meredith Whittaker, to shake up this audience, and hopefully the viewers out there, about the dark world of AI, if I may call it that. Thank you very much; great to have you here at the India Today Conclave. Thank you. Thank you, Meredith.

You know, one quick question, because you've practically said it for what it is, but a lot of us here are wondering, and if you can say it: is WhatsApp compromised, then? Because all of us are on WhatsApp. Is it compromised?

Signal is much better than WhatsApp, and there's no "but": yeah, Signal is much better than WhatsApp; I would prefer Signal. But WhatsApp is better than Telegram.

But it's owned by Meta; is that the reason?

When it really comes down to it, sometimes even the White House uses Signal over WhatsApp. Look, everyone who is serious about recognizing that their conversations need to stay between themselves, everyone who is serious, uses Signal.

All right, thank you, Meredith. Do stay with us.