00:00 The gentlelady's time has expired. I now recognize Mr. Frost from Florida.
00:05 Thank you, Chair. The first few months of the Trump administration have been unhinged.
00:10 At several events in my district, I've heard from outraged constituents.
00:15 And as I've traveled across the country, I've heard the same from people in many different districts.
00:19 And there are two common themes I keep hearing about.
00:22 Number one, people who are angry at the administration's attempts to degrade vital services like Social Security,
00:29 Medicaid, FEMA, and the VA; and number two, fear about what Elon Musk and the folks at DOGE have done
00:38 and could be doing with the data of the American people,
00:44 things like health records, banking information, and Social Security numbers.
00:48 These conversations are really just the tip of the iceberg.
00:50 Hundreds of constituents have also called into my office with the same fear: that their private information
00:55 is falling into the wrong hands.
00:57 Mr. Schneier, reporting in May revealed that DOGE is now feeding unknown amounts and unknown types
01:05 of sensitive government data into Grok, the AI program owned by Elon Musk.
01:10 I find this extremely concerning.
01:12 What are the potential harms that individuals now face from DOGE doing something like this?
01:17 Well, we don't know.
01:21 There are two things I would worry about: feeding it to Grok, and also giving it to Palantir to train their models.
01:26 So once it has left the government, it's going to different companies for different things.
01:32 And I think the fact that we don't know what the data is, what's being done to it, how it's being used,
01:38 and how it's being protected is a grave danger, right?
01:42 We secure that data because it is important, and now it is somewhere else also.
01:47 Also, we don't know the accuracy.
01:49 And one thing I haven't talked about yet:
01:50 I would worry about data poisoning.
01:52 If you were an American adversary, one of the things you might want to do is go to that ill-protected data
01:56 and change things, because now the model is being trained on false data.
02:01 And we've talked a little bit about hallucinations and making bad decisions.
02:05 In what way could the dumping of Americans' private data into personal and corporate AI programs
02:12 create conflicts of interest or personal enrichment for certain people?
02:16 Well, I mean, data is power, and whoever has the data has the power.
02:20 And for us to give that data to corporations, without even charging them for it, gives them an enormous amount of power.
02:28 How they can use that is going to depend on what they're doing,
02:32 and future technologies are going to make different things possible.
02:35 But think of it broadly: it increases the power of those who have the data.
02:40 And it's important for everyone to remember that we've seen many different conflicts of interest throughout this administration.
02:45 We've also seen Trump, Musk, and other folks dismantle data privacy protections and ignore law after law.
02:53 Since we can't rely on the president or his administration to obey the law,
02:56 what other safeguards currently exist to keep our data safe?
02:59 That's all you've got.
03:01 I mean, as a citizen, my data that is at the IRS, I can do nothing about it.
03:07 I am relying on you, the government, to protect my data.
03:11 If you're not doing that writ large, I'm kind of screwed.
03:13 For 130 days, we've seen Musk and other actors lurk inside our most sensitive databases,
03:20 grabbing whatever private data they want with little to no oversight,
03:24 and especially no oversight from the United States Congress, including from Republican members.
03:30 How can we, as the Oversight Committee, assess the damage that's been done and hold people accountable?
03:36 I think you actually have to do the work.
03:39 There's an enormous amount of investigation that has to happen.
03:42 We don't know what data was taken.
03:43 We don't know where it went.
03:44 We don't know where it is.
03:46 And it's going to take work to figure that out.
03:49 But I think it's work that has to be done, because the first step to fixing the problem
03:54 is understanding what it is.
03:55 I mean, we can stop the flow.
03:57 You can close down the loopholes.
04:00 You can get these people out of the organizations.
04:02 You can start protecting the data.
04:03 But now the data is gone.
04:04 You have to figure out where it is, who has it, and then take it back.
04:08 Interestingly enough, you brought up the word accountability, which is really important.
04:13 When the administration changed over from Democratic to Republican, I think little things matter.
04:18 I think words matter.
04:19 And there's a word that changed in the name of this committee.
04:23 The word accountability used to be in the name of this committee,
04:25 and they changed it from Oversight and Accountability to Oversight and Government Reform.
04:29 It seems like a lot of my Republican colleagues have no interest in holding this administration
04:35 accountable for the misuse of data.
04:38 And Congress has a horrible record of legislating when it comes to technology.
04:46 I think there are huge educational barriers.
04:48 We bury our heads in the sand until the problem gets out of hand,
04:51 and then someone comes up with a grand solution of banning it.
04:54 This is what happened with social media,
04:56 and unfortunately it's what I think might also happen with AI unless we get our stuff together here.
05:02 I yield back.