In today’s digital age, misinformation and disinformation have become pervasive forces, impacting everything from personal decision-making to public policy. This session will explore the critical need to recognize, understand, and combat these harmful narratives that distort truth and threaten the integrity of our communities. In this discussion, we’ll unpack how misinformation and disinformation spread, the societal consequences they have, and strategies to protect ourselves, our communities, and our institutions from their impact.
Transcript
00:00How are y'all doing? Y'all excited?
00:04We're here to celebrate culture, diversity, equity, inclusion, right?
00:11My name is Ebony McMorris.
00:12Look, I'm excited to be here because, as the announcer said,
00:16I'm in the White House every day, and I see literally every day
00:20how propaganda has become policy.
00:24And it's July 4th, a day that the country is celebrating independence,
00:27while we are literally seeing people lose their freedom in real time.
00:34So I'm thankful to those who are watching.
00:36I'm thankful to Essence for even creating this space where we can talk.
00:41And I know the panel before broke down what's happening in the White House
00:46with this big, ugly bill that's robbing the people
00:49and giving to those who are wealthy.
00:52But I also want our panelists to just talk about,
00:55as we discuss misinformation and disinformation,
00:59what it's looking like in their spaces.
01:02And if y'all have any questions, look, I don't know, we're not doing too much Q&A,
01:05but I do like amens.
01:07Can I get an amen this morning?
01:09Amen.
01:10Amen, amen, amen, amen.
01:12So let me ask you this.
01:14We're talking about misinformation and disinformation.
01:18Talk about the difference and how it's showing up in your work.
01:21Yes, absolutely.
01:24So there's something called the information disorder spectrum, right?
01:29It's really called like the spectrum of lies.
01:32And we've got misinformation, disinformation, and malinformation.
01:37Misinformation is just incorrect information, any incorrect information.
01:41So let's say incorrect polling hours, right?
01:45Disinformation requires intent.
01:48It's lies with the intent to lie.
01:51And so if somebody's actively pushing incorrect polling hours to communities,
01:56they're spreading disinformation.
01:58And then the third is malinformation.
02:01Now, malinformation is information that's either true or has a kernel of truth in it, right?
02:07And is being used to manipulate and harm communities.
02:10That can be things like hacked materials.
02:13That can be things like putting your personal information online.
02:17Or, as we see right now, taking a kernel of truth and twisting it and pushing it towards communities
02:27to make them believe that certain things are happening that are not.
02:32And so all of them are just basically, you can just call them all lies.
02:35I was about to say, it sound like lies.
02:36Because I had never heard of malinformation, but it sounded like spin.
02:39Got a little bit of kernel of truth, but it's still a lie at the end of the day.
02:44Catherine, let me ask you, you're building an entire digital museum
02:50around preserving the stories of Afro-descended people globally.
02:56What role does historical erasure play in misinformation?
03:00And how can we begin to re-educate people?
03:03I mean, our history is how we understand ourselves.
03:09It shapes our understanding of who we are, what legacies have come before us.
03:16And in all of that, I think one of the biggest things when it comes to our historical erasure,
03:22it disconnects us from our sense of identity.
03:26When we lose that, we lose an essential part of who we are.
03:30And I think the biggest way for us to re-educate ourselves and how we begin to re-educate ourselves
03:34is, one, by having humility.
03:37It's recognizing that we do not know everything, but we are capable of learning everything.
03:42And it's really powerful because once we're able to do that, we allow ourselves to have room
03:49to do what we need to do in figuring out the next steps.
03:54Azanine, let me bring you into the conversation because you've competed on global stages
04:01where media narratives don't often tell the full story, especially for black women, for different cultures.
04:10Have you experienced that misinformation or disinformation?
04:13I'm just calling it lies.
04:14Lies.
04:14Lies.
04:15The lies people tell in the world of sports and in the world of media.
04:20How are you flipping that script also?
04:22For example, I'm flipping the script just by showing up myself authentically every day,
04:28unapologetically myself, being who I've always been my whole entire life.
04:33Being in a male-dominated sport, especially like basketball, oftentimes when you fight for certain rights
04:38or when you fight for better treatment or even better pay, you get this black woman
04:43or this angry black woman narrative.
04:45And it's all about just trying to reshape that.
04:48You know, it's okay to stand up for yourself.
04:50It's okay for you to fight for better pay, better wages, or just better conditions in your work setting.
04:55So I think for me it's just about showing up through my nonprofit, showing young girls how to tell their stories,
05:01tell it the truthful way, using my platform to kind of—it's my responsibility to use my platform in a way
05:06where I can give the correct information to young girls, to the youth, to whoever's listening.
05:11And I think I'm doing a great job at that now, flipping the narrative.
05:14But it's important for us to just show up as ourselves and not allow these narratives to kind of tear us down.
05:20And sometimes that's kind of hard, showing up.
05:24And I know there may be some people here, you go in and out to your job, and you feel like your voice is being silenced
05:29even when you're hearing disinformation.
05:32And so many times we've just—we've taken it.
05:34We've leaned into it.
05:36But it is so—so talk about some of your work with Onyx Impact that's rooted in data and narrative strategy.
05:43Yes.
05:43What—how does that—how are you answering that, that issue that we're having sometimes
05:48and being able to combat that disinformation and misinformation?
05:51So Onyx Impact, we're a research and digital innovation lab that is fundamentally focused on fighting disinformation
06:00and using AI and tech to help build brighter black communities online.
06:07And I think what's so important here is to realize that, one, disinformation works on everybody.
06:12It works on everybody in this room, everybody in this crowd.
06:16Why?
06:17Because our brains work in a similar way.
06:19And that is that the more times you hear something, the more likely you are to believe it's true,
06:23and the more likely you are to believe other people believe it's true as well.
06:26Our brains substitute repetition for truth.
06:29And in a moment that we are in of hyperamplification, when social media can put the same image or message in front of you
06:40over and over and over and over again, it's never been easier to convince people of things that are not true.
06:47And so Onyx Impact, we've got amazing digital tools, the digital green book.
06:52If y'all take out your phones right now and go to digitalgreenbook.org, you will find tools, tips, resources for how to spot disinformation,
06:59how to fight back, where to go.
07:01And you'll find an AI tool named Aisha that can answer all types of questions.
07:07What is going on with this federal bill?
07:09How is it going to impact black people?
07:11What is AI?
07:12How do I use it?
07:13How do I find investment opportunities in my community for black people?
07:17Whatever you got, Aisha is built on black news.
07:19She's built by us.
07:20She's built for us.
07:21And she's here trying to fight this exact issue that you're talking about today.
07:24Can we give it up for Aisha?
07:26Ooh, yeah, Aisha.
07:27Can we give it up for a digital green book?
07:30I love it.
07:30But let me just poll the audience really quickly.
07:32How many of you are on social media, Instagram, TikTok?
07:37How many of you all, leave your hand up.
07:39Out of those who have lifted their hands, how many have shared something that you thought was true,
07:46later discovered that it wasn't, but you shared it?
07:49I'm going to leave mine up.
07:50Yeah, me too.
07:52How many have then gone back to take it down?
07:56You just leave it.
07:56Okay, sometimes, right?
07:58Y'all go back and take it down.
08:00But we're talking about how quickly lies travel.
08:04I'll give an example.
08:05I just saw something on Instagram.
08:07It was a preacher cussing.
08:08And I was like, ooh.
08:10And then I looked in the comments.
08:12And then everyone that was like, that's AI.
08:14When I tell you that thing looks so real, I was sharing it with everybody.
08:17But that's how quickly information travels.
08:21And we treat it like it is truth.
08:24Catherine, let me ask you.
08:25We're seeing AI-generated images now rewriting history.
08:30We can even look politically at what happened with the gentleman Abrego Garcia, who they had sent away.
08:37And then I was in the briefing room when they said that he had MS-13 tatted on his hands.
08:43But that was photoshopped.
08:45That's coming from the highest levels of government, right?
08:48So can you talk to us about AI images rewriting history?
08:53I think the biggest thing about AI is recognizing that AI isn't a human, right?
09:00AI can't create a soul.
09:02And so we can utilize AI as a tool and a helpful tool in a beautiful way.
09:08But it can't take away what we as humans have, that soul that we have and what we bring to the table and the history that we have for that.
09:16So, again, like I think AI is one of those things that I love to use personally.
09:21How many use it?
09:22I'd love to use Aisha.
09:23You use it.
09:24I want the hand.
09:25A hundred percent.
09:25But remembering that, like, in using AI, AI can't work without you.
09:30It can't work without what you bring to the table.
09:32So making sure you're bringing yourself to that.
09:34Absolutely.
09:35And there's a danger, though, in not confronting how it erases; it is erasing things in very real time, right?
09:44So we have to be careful.
09:45This is just a warning to people.
09:47When you're looking at things, don't be so quick.
09:50Don't be so quick to share.
09:52Because it feeds the algorithm.
09:54It feeds the beast.
09:55And just on that point, one of the things we try to tell folks, when you're online, if there is any type of post that elicits a strong negative emotional reaction, regardless of where you are on the political spectrum, pause and double-check that it's true.
10:12Always.
10:13Yeah.
10:14All right, y'all.
10:14We pausing for a second before we start sharing, okay?
10:18Or read the comments, too.
10:20I'm a comment person.
10:22Everybody goes to the comments.
10:23I love the comments.
10:24I live for the comments.
10:25Yes.
10:25But, Azanine, let's talk about your nonprofit a little bit more.
10:29Yes.
10:29Because you're teaching young women in leadership.
10:32Yes.
10:32How are you teaching media literacy to help them make smart moves as regards algorithms and things like that?
10:41That's a great question.
10:42I love to say that the algorithm is not your friend.
10:46Likes, comments, that doesn't equal truth, right?
10:49You have to do your research.
10:50You have to make sure that information you're putting out there is valuable information, not only just to you, but to the people that you're feeding it to as well.
10:56So, young girls, we're just teaching just how to dissect what's real and what's not real.
11:00What you're putting out there, it lasts forever.
11:03One post can be on the internet forever.
11:05Unless you're, like, working with the FBI or whoever where they can just digitally delete everything.
11:10The stuff that you put out there into the media, it's going to last a long time.
11:13So, you have to know how to dissect between what's real and what's not.
11:16And also, too, the information that you want to put out about yourselves.
11:19A lot of the young girls, you know, they're making TikToks and they're making videos and they're making, you know, Instagram, Snapchat, and that's fine.
11:25Be your authentic self.
11:26You know, live your truth.
11:27Have a great time.
11:28But know when to separate what's real and not real, okay?
11:31When it comes to the Photoshop or when it comes to the AI or when it comes to just adding certain audios or tones on your video that can shift your algorithm, that's something that you don't want to do.
11:41So, it's just all about making sure the girls are being mindful of what they post on their social media accounts.
11:46Make sure that they're aware of what their algorithm is, where it's reaching to, what audience they're trying to attract.
11:51Because, for example, if you have a young teenager who's 13 years old and she posts a video of her, like, maybe getting ready for school, you don't want that algorithm to show up on an older guy or older person's account.
12:03And then now they're in the likes and comments.
12:05So, it's just about still them being the authentic selves, but just knowing what to post, how to post, and what messages that they're trying to send about themselves.
12:13Let me say, on the comment with the young women, there was a study that came out years ago by Dove.
12:18And it asked young women, thousands of women, actually, that they had surveyed, about their view of themselves and where they get that view from.
12:27Over 80% said that they got it from social media.
12:32Wow.
12:33So, when they looked at how beautiful they were, what mattered, how they viewed themselves, it came from social media.
12:39And we live in a generation where most people are like this, oh, I'm surprised there's not something wrong with our thumbs, scrolling all day long.
12:47And so, they're getting more of that, and that could be a bot or something else.
12:51So, the tools, what tools can we utilize?
12:54So, again, if you go to digitalgreenbook.org, there's an entire section on how do we protect our kids, right?
13:02How do we protect black kids in this moment of increasing digital harm?
13:06When they talk about things like content moderation policy, what does that actually mean?
13:11That means that they are allowing more racism and anti-blackness into our kids' social media feeds before we've even talked to them about what race even means, right?
13:23It teaches how do we talk about what does digital fluency look like for kids?
13:26How do we make sure that they know that the algorithms prioritize European-centric beauty standards, right?
13:33That their content might get deprioritized, right?
13:36Again, and this is also why it's so important that we have AI that is built with a perspective of black history, black excellence, black joy, like Aisha.
13:49And it's also important that we are creating healthier online communities.
13:53And so, one of my favorite parts of Aisha is if you just go there and you type your state in, whatever state you're from, she will give you a list of all the black newspapers in your state and the links to how to just subscribe right now to make your online profile healthier, to make your online community healthier.
14:15Ebony, sorry, I just want to piggyback off of the question you answered.
14:18I think also, too, it's not just about critical thinking.
14:22You got to have a lot of emotional discipline to be scrolling on social media and not kind of get that imposter syndrome like, oh, my God, this person's doing better than me or this person's doing this, this person's doing that.
14:32You got to have a lot of emotional discipline to be able to keep scrolling and say, I'm not going to feed my attention to that.
14:38I'm going to get my attention into something else.
14:40So, it's not just about knowing, like, your self-worth, but also just having the emotional discipline to know that person is separate.
14:46Their path is different than mine.
14:48My path might be slower.
14:49My path might be faster.
14:50But you got to be intentional on not getting imposter syndrome because I've had that oftentimes.
14:57Like, yes, I played in the Olympics twice.
14:58Yes, I played on global stages before.
15:01But when I see other athletes that have done it before me, I'm like, wow, am I going to get there?
15:05Is the work showing?
15:06So, it's also about me being emotionally sound and emotionally disciplined and knowing my time is going to come and not allowing what I see on social media to influence that or influence my work ethic.
15:16No, no, that's good stuff.
15:18I think all of us here, young, doesn't matter where you are in life, it is very easy to get caught up in what looks real, right?
15:27What looks real and you can begin to compare yourself and put yourself someplace.
15:31And I'm just saying this, that God ain't told you to go.
15:34That's really not a part of your journey.
15:36But because you see it so much, it has a mental health impact on you.
15:41That's right.
15:42It really does.
15:43Not to mention that on our black and brown community.
15:46So, when the president comes out and says they're eating their cats and dogs and people play that clip over and over, and then you see people purposefully create and curate content that negatively impacts communities.
16:01Now, people are scared in their own neighborhoods because other people feed off of that and then come attack them.
16:07It has real life and real world consequences, right?
16:11So, Catherine, for you, how do you see black communities fighting back and reclaiming our truths?
16:18That's a really good question.
16:21I think one of the things that comes to mind is, I know we were talking about critical thinking, but when I think about it, I think of, one, how we shift our mindsets, right?
16:33I think right now, especially in my generation, we think of accountability as an attack.
16:38We think when people are correcting us on misinformation and disinformation, they're actually criticizing us when they're really trying to make sure that they're spreading truth, right?
16:46And so, one, I think one of the biggest things is, like, take a beat, right?
16:52Take a deep breath and be able to respond and not react.
16:57And then utilize the platforms and resources that are available to us.
17:06I think Aisha is a great example of, like, what exists.
17:06And there's also, like, Looking Glass, which is an AR platform that's developed that talks about black history in Pittsburgh.
17:12There are so many that are out there that if we just spend a little bit more time doing that research, we can reclaim our history because we'll know it.
17:22I love it.
17:23And I want to just give to our other panelists how we can use, just some steps, how we can use technology not to be a problem but to also be a solution.
17:35Yeah.
17:35You talked about the digital green book, but what are some other ways that we can use it to be a solution?
17:42One of the things that I think we can use technology and social media to do is to push back against the amplification of lies around our community, around black communities.
17:55And because when we see the virality of lies about black communities, let's take DEI.
18:06What did they do with DEI?
18:08It was a disinformation tactic, right?
18:10They defined DEI as corporate classes that kind of suck.
18:15And as long as they could define DEI as corporate classes that kind of suck, they were able to defund sickle cell research.
18:24They were able to cut funding for HBCUs, take millions and millions of dollars away from black entrepreneurs, right?
18:32They were able to cut all types of Medicaid, health care programs for black people.
18:36Why?
18:37Because every time they said it was DEI funding, our brains were already trained to believe that they were talking about corporate classes that sucked.
18:46So we need to be uplifting using technology, using the amplification that technology can do.
18:54What is actually happening?
18:56What do our truths look like?
18:58Because if we don't reclaim our truths, other people are going to tell it for us.
19:03I think the sad part of that is, and you can jump in, is that especially on the issue of DEI, that some people were okay because they thought it just meant black.
19:14Right.
19:15They was okay with it.
19:17Not realizing they're coming for you too.
19:19They're coming for all America.
19:20They're coming for farmers.
19:21They're coming for veterans.
19:22They're coming for Asians.
19:23They're coming for everybody except for white land-owning men.
19:27And so now people who lean one way and thought that that vote was going to be helping them, you ended up not just destroying one community, but destroying your own backyard.
19:41Yeah.
19:42And that's why it's so important that we begin to tackle issues of misinformation and disinformation, but also to consider, like you said, the source.
19:50That's right.
19:51On the website, you said that there's a source that talks about black-owned news sources.
19:58That's right.
19:58How many here in this audience, when you get up, I know you all may look at mainstream TV, but are actually in the habit of consuming black news?
20:09I love it.
20:10Let's double down on that.
20:12I love it.
20:12And not just sharing that, but letting other people know, those who are in your own backyard.
20:17That's right.
20:17In your field, especially dealing in sports, give us some mistruths that we have, some lies, the lies that people tell.
20:30We really want to know that, that you hear, that you want to come against some misinformation and disinformation.
20:35Yeah.
20:36Especially with sports, right?
20:37A lot of times you see headlines about a player getting traded or this player is injured or this player getting a specific salary or this player going to this team, right?
20:47And you don't really, you think all that is true until the player actually comes out and say, okay, this is true.
20:52I'm going to this team.
20:53I'm going to that team.
20:54So for me, I think it's about believing the actual source.
20:57If it doesn't come from the mouth of the actual person who is being reported on, you have to take everything that's said with a grain of salt.
21:04I mean, it's been quite a few times where I played in maybe another country and maybe there's been a headline about me being injured or there's been a headline about me not being professional overseas.
21:16And then sometimes I have to set the record straight and say, no, this is what it is.
21:21This is how it is.
21:22And that happens.
21:22You know, you go to other countries and people don't really know how it works or what's your system in America so they think that they can put out maybe like a certain headline or maybe like a certain agenda.
21:32But it's up to you to, like I said, stand to your truth and kind of put that misinformation to bed and really tell it like it is.
21:39And it happens a lot in sports, especially in the male world, especially in the women's basketball.
21:44I'm pretty sure who here watch women's basketball, WNBA, right?
21:47And you guys see the narratives with the Caitlin Clarks and the Angel Reeses, there's so many different narratives.
21:53And my thing is, if it's not coming from Angel Reeses herself or Caitlin Clark herself, then I ain't trying to hear it.
21:58That's just what it is.
21:59Say that one more time.
22:00Yeah, exactly.
22:01And I'm sorry to be so long, Ebony, but I wanted to piggyback off the concept of technology, right?
22:09I say, if you can't beat them, join them.
22:11We have to take back our narrative and start pushing positive information out there, right?
22:17Let's start following pages that's telling the truth.
22:19Let's start following black pages that give you accurate black information on our black people.
22:24Let's start following pages.
22:25Let's start sharing pages.
22:26Let's start posting the information that we need so that our people around us and our community are getting the correct information.
22:33I love that.
22:34And I think that's important because how many can feel the fatigue sometimes of trying to figure out whether something is truth or a lie?
22:41Like, it is tiring sometimes.
22:43There's a fatigue behind that, and I think to combat that we have to go against it and set a new algorithm.
22:51Let's make sure that we are putting forth things that are positive, those things that are lovely, those things that are good to strengthen our communities because it is leading to poor mental health in many communities.
23:03It is driving many people to have a lot of issues mentally because of what we see online.
23:09Talk about what you're doing, what people can expect with this digital museum.
23:13We just have, like, about two minutes left, but I'm so excited about the work that you're doing.
23:17Just talk about that very quickly.
23:19Yeah, so the whole point of the museum is to connect the African diaspora.
23:24Growing up, my family was from Uganda, and I didn't hear about anything about East Africa or Africa in my college classes or really in my high school, middle school.
23:33And so I was like, hey, as I'm learning about history, as I'm traveling to these places, I really feel like it would be impactful for people across the world to be able to have a platform to be able to see this.
23:43I was able to live in Brazil for a few years, and I got to see all these different things about the African diaspora that connected that I had no idea about.
23:52And so I was really big on how do we disrupt this narrative that we're so disconnected when, like, in reality, there's a through line between all of this and our history.
24:02And so I really wanted to do a good job at promoting it and promoting the right history, right, not having the misinformation.
24:08So going back to the people and asking them questions, going straight to the sources and asking them, hey, what makes sense?
24:15What looks good?
24:16What is the thing that really ties in our history all together?
24:21And so we have just, like, 30 seconds left.
24:24Just a quick wrap.
24:27And Azanine, I'm going to start with you.
24:27Just a quick 10 seconds as we leave and talk about this, what you want people to leave knowing?
24:33I want you guys and ladies, guys, ladies, all of you to leave knowing that it's okay to be yourselves.
24:42It's okay to speak your mind, speak your truth.
24:45Just do it in a way where nobody can put this stigma or this narrative on you that you're angry or that you're frustrated.
24:50No, you're speaking your truth.
24:52Do it how you see fit.
24:55Read, read, read.
24:56Do your research.
24:57If you feel like information is not real, don't share it.
24:59If you feel like you're passionate about something, share it.
25:01If you see something positive, share it.
25:03Don't feed into negativity because that negativity will come back to you tenfold.
25:08And Esna, I'll let you have the last word.
25:10Yes.
25:10I would say that right now in this moment, AI is already determining the news we see, who gets hired, who gets healthcare.
25:20Right?
25:20It's determining our very reality.
25:23And if we're not building it, someone else is going to be building it for us or more likely against us.
25:29Right?
25:30And so we need to focus on and uplift black news, black resources, black information.
25:38And we need to go to places like Aisha and the Digital Green Book because, like, creating AI for our community should not be new.
25:47It should not be novel.
25:48It is fundamentally necessary at the end of the day.
25:51I love that.
25:52And I'm just going to piggyback as we end to say, in this space of AI, that we must be not just consumers but developers of it so that we can also control our own narrative.
26:03I also want people to support black-owned news outlets and black-owned businesses as well.
26:10I heard it said earlier that all we got is us, but that ends with all we need is us.
26:18I want to thank everybody for listening in on this panel.
26:21Thank you so much.