00:18 As far as election ads go, there are tens to hundreds of millions of dollars being spent on ads, across TV and across the internet: social platforms, Facebook, Twitter, everything. Yeah, they're everywhere. You can't escape them.
00:52 I think the big difference is from an ad perspective. When this thing got started, ads were being placed all over the internet without advertisers having any knowledge of where they were going. So we started with Breitbart News, and that became a really interesting test case, because here was a site that was, in a lot of ways, really racist, and there was a lot of disinformation, a lot of sexism, and a lot of bigotry and xenophobia.

01:38 And really big advertisers, the biggest advertisers in the world, were showing up on that site, thereby sponsoring it and sending their money to it. This has never really been about politics, as far as I'm concerned; it's about bigotry and disinformation. Of course, those things can become intertwined. But by and large, I think there's a difference in our country between right and left, and right and wrong. You can say what you want politically, but when it veers into racism and xenophobia and bigotry, that doesn't seem to be something advertisers should want to be a part of. So since then, there's been a much greater awareness, and a much greater reckoning across the advertising industry, about where their dollars are going. And I think that's generally good.
02:50 Sure. It used to be that a popular show was on television, and you knew exactly what time, what channel, and what network it was going to be on. Then the internet came along, and it worked much the same way for a little while. And then programmatic advertising came along, meaning algorithmic advertising. What happened was, you'd send your ad to a programmatic company, Google, Facebook Audience Network, there are a bunch of different companies, and you'd say, here's some money, not as much money as I used to spend, so it was a bargain, and it would spread your ads all over the internet looking for eyeballs.

03:32 So it became about engagement and algorithmic reach rather than about rewarding quality content. Whereas before, advertisers would just go to the places where more people were, it started to become about engagement: how many eyeballs can I get across the internet, no matter where those eyeballs are? It became really about attention. And unfortunately, hate and disinformation get more attention than facts and quality content. So that's where the big change happened.
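The shift described above can be sketched in a few lines. This is a toy model with made-up numbers, not any real ad platform's logic or API: a greedy allocator that puts a budget wherever its objective is highest. Swapping the objective from a quality score to predicted engagement is all it takes to move the same spend from a quality site to an outrage site.

```python
# Hypothetical toy model of programmatic ad allocation.
# Each site: (name, quality_score 0-1, predicted clicks per dollar).
# All numbers are invented for illustration.
sites = [
    ("quality_news", 0.9, 1.2),
    ("niche_blog",   0.7, 0.8),
    ("outrage_site", 0.2, 3.5),  # inflammatory content drives engagement
]

def allocate(budget, key):
    """Greedy sketch: put the whole budget on the site maximizing `key`."""
    best = max(sites, key=key)
    return {best[0]: budget}

# Buying on quality vs. an engagement-only algorithm, same $1000 budget:
by_quality    = allocate(1000, key=lambda s: s[1])  # optimize quality score
by_engagement = allocate(1000, key=lambda s: s[2])  # optimize clicks per dollar

print(by_quality)     # {'quality_news': 1000}
print(by_engagement)  # {'outrage_site': 1000}
```

The point of the sketch is that nothing about the site changes between the two runs; only the optimization target does, which is the mechanism the transcript describes.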
04:21 Facebook is by far the worst. Twitter is getting a bit better; Twitter listens to people a little bit. YouTube is still not great. Some of them are starting with the idea of responsibility, which I think is a good thing. When you start in one place, it's hard to move somewhere else, but if you start with the idea that you're going to be more stringent about your rules, then it's much easier to hold to that over time. But Facebook has allowed, and still allows, disinformation, hate, live-streamed murders; across the board, they've allowed all kinds of really horrendous things. And fundamentally, at their core, they don't believe that moderation is a good thing, because it's baked into their business model: the more engagement they get, the more money they make, and the higher their stock goes. Because they've had that attitude from the beginning, and still do, it's harder for them to change at all. They really don't want to.
05:30 Money doesn't matter; they've got more money than most companies in the world, and they can afford to do anything. The problem, again, comes down to their core belief: that anyone should be able to say anything, anytime, on their platform, and that they should make as much money as they possibly can. Ideally, they would heavily increase both the number and the quality of their moderators. They would bring all those moderators in-house, rather than farming the work out; right now they subcontract it to different companies. They need to provide adequate mental health checks and support for the moderators doing the work, because those people look at the worst things in the world every single day. They need to pay them much more than they're paying them. And they need much more human moderation, rather than relying on algorithms, because every time, they're going to choose algorithms, since that's the cheaper way to go.

06:54 The FBI, our own government, has listed it as a potentially dangerous organization. And it's not even an organization; it's obviously a conspiracy theory. But it was able to spread because of the social networks. The social networks had every opportunity to stop it from spreading, and they didn't. They took a very hands-off approach, which they always do. Now they're trying to pare it back, but they're closing the barn door after the horse has already escaped. They've been responsible for its rise. It's probably one of the biggest cults ever created online, and they're wholly responsible for it.

07:39 As far as the politics of the situation: I think because of the politics of the moment, the social networks didn't want to do anything, because they didn't want to risk raising the ire of the current administration. So they allowed it to continue. Across the board, I don't think moderating decisions should be made based on politics, or the politics of the moment. I think that's incredibly dangerous. It's incredibly dangerous, no matter who's in office, to have anyone, right, left, or center, be catered to by social networks that have that much control over information. That's an incredibly dangerous place to be. And I think that's why they need an outside source to govern them, whether that's regulation or an outside oversight board. I think it's really important.