Transcript
00:00 Interviewer: I am one of these people who is a little wary about what our future holds. When I think about what you created at Wikipedia, it wasn't just a repository of information; it was a repository of trustworthy information, or at least information with the perception of being trustworthy. And we're now in an era where everyone seems to disbelieve their own eyes for whatever reason, whether it's the distortion from AI or, even before that, political discourse that has bent the truth. So where does a place like Wikipedia, where does a person like Jimmy Wales, fit into our future?
00:32 Jimmy Wales: I think people are still very, very interested in finding quality information, sources of information that they can trust. I actually hope that some of the things that have happened with AI have helped people understand that we need journalism, we need quality information, we need to get it right. And one of the reasons people don't trust AI is not that they've lost trust in everything; it's that AI actually isn't trustworthy. We've all seen AI make mistakes and errors, sometimes quite hilarious and sometimes not so funny. So I'm optimistic. I do think people are really passionate about wanting the truth.
01:14 Interviewer: You do. What about the idea of collaboration? Wikipedia was one of the ultimate examples of collaboration: complete strangers, people who didn't know each other, finding a way to work together. And now we're in a society where it seems like no one trusts anybody unless they're part of your tribe.
01:30 Jimmy Wales: Well, it seems that way sometimes, but I would caution that if we're not careful and we spend a little too much time on Twitter or the like, we can get a very negative view of human nature. We have to understand that the algorithms promote content that's divisive, and even where the algorithms don't, people are drawn like moths to a big argument or a fight or a flame war. But if you think about normal day-to-day life and the people you meet, you still pretty much trust people. Most people are basically decent and good. And online there are still wonderful places, and there are still horrible places; all of these things are true at the same time. But I do think people can come together. They can have thoughtful conversations. They can move the needle on their lives and on trust.
02:16 Interviewer: I hear what you're saying, that social media is not real life. It's a refreshing perspective: go out and meet people, and you're going to have a different experience than if you spend all day on X, for example. I do want to talk about Wikipedia and how it fits into a world where more and more people are using chatbots, building up muscle memory with them. Do you have any existential concerns that if people get so used to using ChatGPT, they'll just skip the step of going to Wikipedia?

02:52 Jimmy Wales: We do think about it. Obviously, you have to think about changes in the technology and in the media landscape, and it's something we've been thinking about for years. For example, the move from largely desktop to largely mobile page views raised a lot of questions, such as how you edit Wikipedia from a mobile phone, which is quite hard. So we have to be aware of that. But we're not super concerned, because, for one thing, we're a charity and we're volunteers; we edit Wikipedia because we love it. We're happy that it's popular, and that's great, but that isn't why we do it. We do it because we care about knowledge. We care about getting into the weeds on the details of some geeky topic we really care about.

But I do worry for the world if people are not aware of how flawed large language models are and take what they say at face value. One of the biggest problems with large language models is that even when they're wrong, the way the technology works, guessing the most plausible next token, the next word, generates things that are wrong but plausible. And that's actually worse in a way. If you asked a question and it said something completely crazy, you would go, oh, that sounds crazy. But if it sounds plausible, you might believe it. So people need to be aware of that; they need to understand that you really can't trust it.
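To make concrete what "guessing the most plausible next token" means, here is a toy sketch of greedy next-token decoding. The two-word contexts and the probabilities below are invented for illustration, but the failure mode, fluent text that is plausible rather than true, is the one Wales describes:

```python
# Toy sketch of next-token generation (not a real model): at each step,
# pick the most probable continuation given the recent context. Nothing
# here checks truth, only plausibility learned from text statistics.
# All contexts and probabilities below are invented for illustration.

NEXT_TOKEN_PROBS = {
    ("the", "capital"):  {"of": 0.9, "city": 0.1},
    ("capital", "of"):   {"Australia": 1.0},
    ("of", "Australia"): {"is": 1.0},
    # "Sydney" is the statistically more plausible word here, but wrong:
    ("Australia", "is"): {"Sydney": 0.6, "Canberra": 0.4},
}

def generate(prompt: str, max_tokens: int = 10) -> str:
    tokens = prompt.split()
    for _ in range(max_tokens):
        context = tuple(tokens[-2:])            # condition on the last two tokens
        dist = NEXT_TOKEN_PROBS.get(context)
        if dist is None:                        # no known continuation: stop
            break
        tokens.append(max(dist, key=dist.get))  # greedy: most plausible token
    return " ".join(tokens)

print(generate("the capital of"))
# -> "the capital of Australia is Sydney"  (fluent and plausible, but wrong)
```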
04:07 Interviewer: I love that point: you built up Wikipedia, and your folks, all the volunteers, continue to edit it because you love it. There's something so pure about the fact that you're just pursuing information. And Wikipedia has seen so many different cycles of the Internet; think about how many decades it has existed. This moment in time, when we're talking about AI, the Internet, and trust: can you compare this period to any other in Wikipedia's history?
04:44 Jimmy Wales: Yes and no. There's always something new, and there's always some threat to Wikipedia. But I think people forget; we can have a bit of a rosy view of what the early days of the Internet were like. The truth is there were massive flame wars and angry people, and random crackpots putting forward their views have always been with us. The changes in technology are really important, and we do have to focus on them. But what we're really focused on is that question of trust: how do we make sure that our community is healthy, that we're working really hard for quality, and that we're relying on quality sourcing? As long as we do that, we'll have a place in the world.

05:28 Interviewer: How do you police that, though? It's one thing to establish what the sourcing is and whether it's reliable. But there's also the issue of not wanting to tamp down someone else's free speech or their opinions, even if those opinions may seem flawed.
05:50part of it is, and this is rule number three from the book, is have a central purpose. So the purpose
05:55of Wikipedia is to write a free, neutral, high quality encyclopedia. And that gives a lot of
06:01clarity to decisions we make. So, you know, if you say, oh, well, how do you adjudicate between
06:07the New England Journal of Medicine and a social media influencer? We say, well,
06:12it's an encyclopedia. So we want quality sources. So we want that. And it turns out, you know,
06:17actually, there's loads and loads of people who are actually quite nice and calm. And so we try to just
06:21tune out all the noise of the world and just have constructive dialogues, constructive debates in
06:27the community. It's never easy, but it's just what we do. You reference the idea that you are a
06:27 Interviewer: You referenced the idea that you are a charity, a nonprofit. How secure is Wikipedia's future with that?

06:32 Jimmy Wales: We're okay, but we always have to take fundraising very seriously. The vast majority of the money that supports Wikipedia comes from small donors; the average donation is just over $10, and that's really important for our intellectual independence. We're coming up on our 25th birthday, we've been doing this for a long time, and we're confident the public will continue to support us; our donors are happy and all of that. But you do have to worry. Certainly one of the things we're raising our hand to complain about is that all of the AI bots scraping Wikipedia are actually costing us a lot of money. Our donors aren't donating to subsidize Sam Altman; they're donating to help Wikipedia. So we think those companies should be using our enterprise product.

07:17 Interviewer: Has there been any discussion about suing, or any kind of legal action?
07:23 Jimmy Wales: No, not yet. Not yet. We're too friendly. But Google, for example, has always been a great customer of the enterprise product, and we're talking to the others. As for whether or not we would use Cloudflare, I don't know; I'm not on top of all that. But using something to block the AI crawler bots, to say, look, it's really not fair, you need to pay, because you're costing our donors a lot of money and that's not what this is all about: that's something we would probably consider.
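Wales doesn't specify a mechanism, but as a rough sketch of what "using something to block the AI crawler bots" can look like at the application layer, here is a minimal WSGI middleware. The user-agent substrings are published crawler names (GPTBot, CCBot, ClaudeBot); real deployments more often enforce this at the CDN edge, as with Cloudflare, or signal it via robots.txt, since honoring user-agent rules is voluntary:

```python
# Minimal sketch: refuse requests from known AI crawlers by user agent.
# Assumes a standard WSGI application; wrap it with block_ai_crawlers.

BLOCKED_UA_SUBSTRINGS = ("GPTBot", "CCBot", "ClaudeBot")

def block_ai_crawlers(app):
    def middleware(environ, start_response):
        ua = environ.get("HTTP_USER_AGENT", "")
        if any(bot in ua for bot in BLOCKED_UA_SUBSTRINGS):
            # Return 403 instead of serving (and paying for) the page.
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Automated AI crawlers should use the paid API instead.\n"]
        return app(environ, start_response)
    return middleware

# Usage: app = block_ai_crawlers(app)
```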