00:00Does Wikipedia have a left-leaning bias? Its co-founder, Larry Sanger, believes it does, citing a blacklist of conservative outlets that are not allowed to be used as sources on the site.
00:09Sanger wants Wikipedia to improve its neutrality and increase its transparency, and he has developed a nine-point plan he hopes administrators and editors at the encyclopedia will follow.
00:21Why do you think Wikipedia needs reforms?
00:23Wikipedia has got a lot of problems that it's accumulated over the years, and this is for a number of reasons, but I think the main root reason why it has as many problems as it does is that it has no way to legitimize policy proposals and big new changes.
00:48So it ends up being institutionally conservative. It simply lacks any method of major reform.
00:58Sanger wrote nine suggestions that he hopes Wikipedia's current leadership will adopt.
01:02They include enabling competing articles, abolishing the blacklist, reviving the original neutrality policy, allowing articles to be rated, and ending indefinite blocking.
01:13He also wants Wikipedia to reveal who its current leaders are.
01:16Consider joining Wikipedia at some point in the next few months and making your voice heard there.
01:30You do have a right to edit there, and I think Wikipedia needs fresh blood from a wide variety of ideological and religious and other perspectives, national perspectives, for example.
01:46Right now, Wikipedia is very much a sort of echo chamber, I find, and they won't like me saying this, but they should, because I'm simply encouraging people to participate and to get behind the proposals that I've put out there.
02:11I think that really would make the world better by making Wikipedia better.
02:18Wikipedia has a list of reliable sources that can be used for entries.
02:22The sources are given one of five ratings, generally reliable, no consensus, generally unreliable, deprecated, and blacklisted.
02:31Fox News is listed as unreliable, which means it shouldn't be used, while MSNBC is considered generally reliable.
02:38Do you think that's a good system, or do you think that system needs to be changed entirely?
02:44Because I see that there could be a difference between thinking the system itself is bad and thinking it's an acceptable system that's just not executed properly.
02:50You see what I mean?
02:52I think it's a pretty bad system, generally speaking.
02:55My view is that sometimes facts are reported only in sources that are currently disallowed on Wikipedia.
03:10Maybe the person who is the source of the information decided to speak only to that publication.
03:16And in that case, if that person doesn't speak to any other publication that is allowed on Wikipedia, then that information is never going to appear on Wikipedia.
03:27So the way it has to work, I think, is all sources are allowed, but we can still make a selection of the most credible or the best sources, and we can have a debate about which those are.
03:43Controversial views need to be appropriately attributed to the people who hold them.
03:48And that also goes with opinions about the sources themselves.
03:53You can see that Ad Fontes has this media bias chart, and it's got two separate ways in which it grades news outlets.
04:01One, based on their partisanship, and also based on their reliability.
04:05And as you can see, some of the outlets on the right that you described, like Newsmax and the New York Post, have equal reliability to MSNBC, the New Republic, and Mother Jones.
04:19But none of them are within that green box, which would strike the right balance of not being partisan, having neutral news, and also being very reliable.
04:29So I want to just push back on this idea that certain outlets should be allowed, and wonder whether perhaps the standards should be stricter.
04:38No, I think most of the information that can be found about what's going on in the world is reported in sources that are outside of the box.
04:50That's not a problem.
04:53I think we can learn a lot from biased sources.
04:59Speaking as a sort of conservatarian myself, I think it's possible to learn things from the New York Times.
05:10I wouldn't simply take its word for it if it were the only source or if I hadn't checked it out on controversial issues.
05:24But there's a lot of things that I might trust it on.
05:26And if it's among multiple sources that are reporting a certain thing, well, its fact-checking tradition is pretty strong.
05:38So I would say, I suppose the New York Times is okay, but it doesn't belong in the green box.
05:45It's very far left.
05:46When do you believe Wikipedia was most neutral and best fulfilling its mission?
05:52And when do you believe it started moving to the left?
05:55And how do you think that happened?
05:56Best fulfilling its mission and being most neutral are distinct things.
06:01I would say it was most neutral in its first three years or so, but it actually fulfilled its mission better in later years, maybe between 2005 and 2010.
06:16Even then it was already starting to be biased; I remember complaining about that a bit at the time.
06:26Nevertheless, it wasn't nearly as bad as it is now.
06:29And it had expanded into just an absolutely enormous resource.
06:35So there never was a time that I'd want to go back to where it was just the way I would like it to be.
06:49I think in terms of the qualities of the community, I actually like how things were in the very first year.
06:58There were a lot of decent people around, there was a lot of give and take, and people were not so full of themselves or so busily negotiating the rules.
07:16So now as to the question when things really changed in the sense of becoming more biased,
07:26I would actually break it down into a few different periods.
07:30So first of all, the left really started descending on Wikipedia in what I would call a noticeable way, even in the second half of 2001.
07:44Not a lot of people, but there were some people who were very clearly on the left.
07:50And at first they didn't really crowd other people out.
07:54And here, when I speak of the left here, I'm not just talking about people who might happen to vote for the Democratic Party.
08:01That's not what I mean.
08:02I mean people who are serious full-time or at least part-time activists and who generally tend to be on the far left.
08:18So there were a number of people toward the end of 2001, and a lot more started showing up in the following year.
08:27And before too long, they really dominated what was going on, so that by 2008, 2010, the Board of Trustees was pretty much dominated by such people.
08:48And it's been that way ever since, but here we're just talking about the people who are on the Board of Trustees.
08:58And actually, in general, the community simply became more radicalized as essentially the political discourse on the left became more radicalized.
09:14So there was a big change, I would say, during the Bush years, especially over the Iraq War.
09:26And then there was an even bigger change, of course, due to Trump's election.
09:32So that's when really, when they became, I think, extremely radical.
09:39And that was reflected in the choice of sources.
09:46So it was just a couple of years after Trump was in office.
09:52I don't want to interrupt you, but you talked about the community, the Wikipedia community, and how it's changed over time.
09:59And I find that very interesting.
10:01So I saw a statistic that nearly 260,000 volunteer editors contribute to Wikipedia every month.
10:09Do you have any insight into who exactly they are and how that process works?
10:13They're able, of course, to be anonymous.
10:17So that has been the rule since the founding of Wikipedia.
10:22And in this, it follows much of the rest of the Internet.
10:26So we don't know who they are in that sense.
10:30We can make some generalizations about what kinds of people edit.
10:35They aren't all lefties.
10:39And indeed, there's a lot of people who edit Wikipedia on topics that have nothing to do with politics.
10:43Of course, a lot of people edit Wikipedia on topics about music and pop culture and apolitical history topics and geography, and on such topics it can be quite good.
11:01Because, you know, Steven Pruitt is the single biggest editor over the years; I believe he's made more than 6 million edits to 33,000 articles.
11:11No matter what the numbers are, they're astronomical.
11:14So what do you think of one single person having that much influence or impact?
11:19Well, he has every right to do it under the system.
11:23That's how the system is set up.
11:27It doesn't bother me if that's what you're asking.
11:29I think there's nothing really wrong with somebody being able to do that.
11:36If he is, however, a front for an operation, that's another matter.
11:43But I'm not accusing him of being one.
11:47I think he might run a lot of bots, and that might be one of the main reasons why he has as many edits as he does.
11:54But I actually don't know.
11:55So the House Oversight Committee started an investigation into what they call, this is their words, organized efforts undertaken in violation of Wikipedia platform rules to influence U.S. public opinion on important and sensitive topics by manipulating Wikipedia articles.
12:11The chairman, James Comer, wrote a letter to the Wikimedia Foundation's CEO requesting documents and information.
12:18And they are saying that some volunteer editors have been caught violating platform policies, and they're curious to hear about the CEO's efforts to thwart that.
12:28I wonder what you think of congressional involvement in Wikipedia.
12:32It's sad that it's necessary, but at this point, I wouldn't second guess Congress if they think that it is necessary.
12:43It never makes me happy when Congress has to get involved in overseeing any part of the Internet.
12:53It's extremely important that the Internet be free.
12:59And even if it is a Republican Congress, it makes me nervous and it should make us all nervous.
13:10That said, I do think the issues that they are taking up in that subcommittee hearing are very important.
13:21And let's just say that the Wikimedia Foundation, in my opinion, has not done the best job of staying on top of all of the problems that are going on on the platform.
13:41And I don't know why that is, and I think it'll be interesting to see if Congress is able to shed any light on that.
13:52Let me just go back to the 260,000 volunteers real quick, just because I think it's fascinating to get your insight into how the process works.
14:00Once one of those volunteers makes an edit, is there a check and balance system in place to make sure that what they wrote is accurate?
14:08Not as such. The edit is immediately available for others to look at, and they will.
14:20Generally, the way it works is that if you have made edits to an article, then unless you tell it otherwise, the system will add that article to your watchlist.
14:34So if somebody else edits an article that you have edited, you'll be alerted, and then you'll go and look over the edits that other people have made. It's a sort of informal peer review system in that way.
14:48And in a lot of ways it works; if I didn't think so, I wouldn't have started the thing.
14:58So, it is quite amazing that wikis work as well as they do.
15:05And, you know, we actually do have this process to thank, and of course, the human beings that have followed it, for a lot of the ease of access to information that we enjoy today.
15:25It's just that, unfortunately, let's just say it hasn't gone well in recent years.
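[Editor's note: A rough way to picture the watchlist mechanism Sanger describes is that an edit adds the article to the editor's watchlist, and any later edit to that article by someone else alerts the earlier editors, who then review it. The sketch below models that loop in Python; the class and function names are illustrative placeholders, not MediaWiki's actual code.]

```python
# Illustrative model of the watchlist-based informal peer review loop.
# Names (WatchlistModel, record_edit) are hypothetical, not MediaWiki APIs.
from collections import defaultdict


class WatchlistModel:
    def __init__(self):
        # article title -> set of editors currently watching it
        self.watchers = defaultdict(set)

    def record_edit(self, article: str, editor: str) -> set[str]:
        """Add the article to the editor's watchlist and return the
        other watchers who would be alerted to review the new edit."""
        alerted = self.watchers[article] - {editor}
        self.watchers[article].add(editor)
        return alerted


wl = WatchlistModel()
wl.record_edit("Encyclopedia", "alice")       # alice now watches the article
print(wl.record_edit("Encyclopedia", "bob"))  # {'alice'}: alice is alerted to review bob's edit
```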
15:36And then, finally, let me cover one more topic with you.
15:39You point out in your article that Wikipedia is the free encyclopedia.
15:44But it seems like Elon Musk wants to change that.
15:47He recently called for people to join xAI to help build Grokipedia, which he says is going to be an open-source knowledge repository that is vastly better than Wikipedia.
15:57How do you feel about Elon Musk trying to build a competitor?
16:01Well, I hope he does a good job.
16:04I'm happy about it.
16:05I think that's great.
16:07The more encyclopedias in the world there are, the happier I am, I suppose, unless they're extremely irresponsible.
16:20I do worry, however, that Grokpedia, or whatever it's going to be called, will reflect the same sort of biases that Wikipedia has and that the Grok chatbot LLM has reflected.
16:45So in the last year or so, Grok, actually shortly after it rolled out, started drifting to the left.
16:57And in the last six months, if you ask it a question on anything remotely political, anything having to do with the culture war, it answers from a left-wing perspective.
17:15It's pretty annoying unless you actually support that.
17:19What do you make of the use of AI to build something like this?
17:22I've anticipated it for quite a while.
17:26For over a couple of years, I've been thinking about what's going to happen.
17:32I do think that AI can be used to write a decent encyclopedia article.
17:38I've done it myself.
17:40I've observed others using it to add articles to encyclopedias.
17:49So I know it can be done, and it can be done reasonably well.
17:53When I asked you the original question about Grokipedia and xAI, you said you hope he does it right.
18:00What is your advice to make sure he does it right?
18:03Well, there are these prompts that are in the background when you make a query to a chatbot.
18:16So you type in your prompt, but then it's wrapped essentially by another prompt, and then the whole thing is submitted to the LLM, to the system.
18:33And it's possible to edit that wrapping so that it elicits a more fair and balanced sort of response.
18:49I would like to see that in Grok, and I would like to see that reflected in Grokpedia.
18:56But I'm not sure that we're going to see that.
19:00I think Elon Musk and his people know that, of course, it's possible to edit those background prompts.
19:15I forget what they're called.
19:16Let's just put it this way.
19:26They have to know that the system as it exists now produces this sort of biased result.
19:35Therefore, I think they must intend it.
19:37And if they intend it, that means that they're probably going to come up with a similarly biased encyclopedia.
19:48I hope not.
19:50I hope I'm wrong.
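[Editor's note: The background "wrapping" Sanger describes is commonly called a system prompt. Below is a minimal sketch of how a user query might be wrapped before being sent to a language model, assuming a generic chat-completion style interface; the model name, the neutrality instruction, and the call_llm function are placeholders, not xAI's actual configuration.]

```python
# Minimal sketch of wrapping a user prompt with a background (system) prompt
# before submitting the combined messages to an LLM. The message format
# mirrors common chat-completion APIs; call_llm and the model name are
# hypothetical placeholders.
NEUTRALITY_SYSTEM_PROMPT = (
    "Present competing viewpoints on contested topics fairly, "
    "attribute opinions to their sources, and avoid taking sides."
)


def build_messages(user_query: str) -> list[dict]:
    """Wrap the user's prompt with the background instruction that
    shapes how the model answers."""
    return [
        {"role": "system", "content": NEUTRALITY_SYSTEM_PROMPT},
        {"role": "user", "content": user_query},
    ]


def answer(user_query: str, call_llm) -> str:
    # call_llm is whatever client function actually submits the request.
    return call_llm(model="some-llm", messages=build_messages(user_query))
```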
19:51Well, thank you very much.
19:52I really appreciate your time.
19:53I'm Ray Bogan for Straight Arrow News.
19:55For more reporting straight from our nation's capital, download the SAN app.