00:00Hi, everybody. I'm Brittany Lewis, a breaking news reporter here at Forbes. Joining me now
00:07is Melissa O'Leary, CSO at Fortalice Solutions. Melissa, thank you so much for joining me.
00:14Thanks for having me.
00:15I want to talk about two stories today that have made headlines, big stories. They are
00:20vastly different, but each contains, when it comes down to the crux of the story,
00:25a cybersecurity element that I would love to get your take on. So first, I want to talk
00:29about 23andMe. They filed for Chapter 11 bankruptcy earlier this week, and they're looking to
00:35sell the company. As most of us know, 23andMe is a genetic testing company. A lot of people
00:41sent their personal genetic data to them. What do you make of this news?
00:48I think there are two sides to the coin. A few years ago, there
00:55was so much interest, and really groundbreaking things occurred for many families, many
01:05different people across the world with this new technology. The brand promise at that
01:11time was that the information would be secured and not be sold. I don't think anybody really
01:18thought through the implications of a bankruptcy and what this could mean for the data of hundreds
01:24of thousands, maybe millions of people.
01:27I think that's my next question that I want to pose to you. What is the implication here?
01:32If you have your data with 23andMe, what is going to happen to people's data?
01:39We don't know yet. What we know from 23andMe is that they're stating that any purchaser
01:46of the company would be liable for the data, would have to meet the same standards that
01:5323andMe is meeting today. And that could be the case in terms of, you know, inking a sale.
01:59But I think, you know, there's certainly a lot of value in the data, current uses of
02:07that data and future uses of the data as well. And it remains to be seen how any purchaser, you
02:14know, could kind of get around anything that 23andMe has in place today.
02:2023andMe wrote an open letter to their customers, and this is what the company said. I want
02:24to read it in part. Quote, your data remains protected. The Chapter 11 filing does not
02:29change how we store, manage or protect customer data. Our users' privacy and data are important
02:35considerations in any transaction, and we remain committed to our users' privacy and
02:39to being transparent with our customers about how their data is managed. Any buyer of 23andMe
02:45will be required to comply with applicable law with respect to the treatment of customer
02:49data. As a security expert here, does that exactly inspire confidence? Do you think people's
02:54data will be safe? I mean, what would the data be used for?
03:00I think the data as it's stored today is very likely safe. I don't personally believe
03:06it will be safe in the future. And I think that, you know, the number of organizations
03:13that are saying, you know, prioritize going into 23andMe and deleting your data now are
03:20absolutely right with their guidance. We just simply cannot know what the future holds.
03:25And in terms of how that information can be used, I think a real use case would be,
03:32let's say that the future purchaser is part of the insurance community and they use that
03:37data to create actuarial models, or they use that data, you know, in a worst case scenario
03:43to pinpoint what genetic issues you or your family members might have in the future and
03:48then use that to set insurance rates. We don't know at this point in time if that's, you
03:53know, something that's in the making or if there's any interest there, but it's just
03:58an example of the value of that data and how it can be used in ways that folks never intended
04:04when they provided it. That's a really scary thought when it comes
04:08to insurance companies, that an insurance company can buy this, see that you're predisposed
04:13to X, Y, or Z, see your health information and say, hey, you are charged a higher rate
04:18of insurance as opposed to Joe Smith over there. So I'm thinking about if someone nefarious
04:23comes in, does your data have to have the same protections that it does right now with
04:2823andMe as it is today? Would your personal information be more hackable if someone else
04:33buys it? I can't really say whether it would
04:36be more easily hackable. I think it would be, you know, again specific to whatever protections
04:43are in place by any purchaser of the information. But if somebody were to access the information,
04:49it's not only the genetics and the DNA, it's usernames, it's passwords, which have been breached
04:54everywhere for many of us in different breaches already, but also family connections. So something
05:01that I find interesting is I personally have not done genetic testing, but members of my
05:06family have. So, you know, I had kind of thought, well, I'm safe from this and from
05:13having any implications for me, but you could certainly draw a connection between
05:18myself and my family and their genetic makeup that can be used, you know, kind of in future
05:24cases or it could even be used in social engineering campaigns that we see every day to make those
05:30a little bit more sophisticated because perhaps you couldn't determine, you know, who my second
05:35cousin is that I grew up with, but now you could go into the data and potentially find
05:40that and say, hey, cuz, what's going on? And have a real conversation there. So there's
05:48just so many different uses of the data for, you know, nefarious people, but then also,
05:54of course, those that want to monetize that kind of data.
05:57I think that's an interesting point, because when I was discussing this story with a
06:01host of people, even if you haven't used 23andMe yourself, you have a sibling, an aunt, a cousin
06:08out there who has put their data in here, who has submitted that. So what do you think
06:12people should do to protect their data if they did submit it to 23andMe?
06:19If you submitted your data to 23andMe, I do suggest going in and deleting that data; there
06:24are a number of guides available. I think that you should do it both within the app
06:30or within the site, but then to also email them to state, I've deleted my data from your
06:35website so that you have a record of the deletion. It's not to say that 23andMe is not deleting
06:42data, but there's so many users going in and very likely doing this right now that they
06:47might not have a good trail of that occurring. So I think the best case scenario is go into
06:53the app, follow the steps in the settings, get the information removed, and then also
06:59just follow up with an email stating that you did it. I've also read that swabs are being stored,
07:06and in those preferences and in the settings, you can state that you don't want
07:11your swab stored, that you want it removed. And then there's, of course, the
07:16third-party use of the information and you might have opted into that. You can still go into your
07:22settings and opt out of that. It's really interesting. In the beginning of this conversation,
07:27you said how these types of companies a few years ago were really touted as revolutionary. It can
07:32tell you so much about yourself. It can tell you about your family, who you're related to. It can
07:38give you insight into your health you otherwise wouldn't know. Now there are a lot of privacy
07:43concerns here. So in light of this Chapter 11 bankruptcy filing, what, if anything,
07:49should be done in the future to make sure that going forward,
07:53competitors of 23andMe, companies like this, will take more steps to protect people's data?
08:02You know, there's already a law in California and I'm sure other states will follow.
08:06A challenge that we have in the privacy area is that it's a state-by-state patchwork. It's also
08:11a patchwork when you factor in the global regimes. So I think that there is a need for a federal
08:17mandate on this that protects people and doesn't make it, you know, challenging for somebody to
08:24opt in or opt out of a service like this. If we don't have those protections in place, companies
08:30are going to kind of fall back on their terms of service and we'll continue to have these challenges.
08:35So I do think it's an area where the federal government should be involved and kind of
08:40lay out a framework and policy for companies. I do now want to take a total left turn here
08:47and talk about some news that broke just a few days ago. And that's The Atlantic's editor-in-chief
08:52wrote that he was inadvertently added to a group chat with the nation's top security advisors and
08:58leaders that detailed U.S. plans to conduct airstrikes in Yemen against the Houthis.
09:04What are your thoughts here as a cybersecurity expert?
09:09You know, definitely mixed thoughts on this topic. Signal is really a best-in-class app for secure
09:15and encrypted messaging. And, you know, according to the Director of National Intelligence, Tulsi
09:22Gabbard, it's actually pre-installed on government laptops and phones. So it kind of remains to
09:29be seen whether the communication there was on personal devices or government devices, all of that
09:35back and forth. But I think that in terms of, you know, inadvertently adding a journalist,
09:42you know, certainly unprecedented. And there's just a lot that we need to continue to kind of
09:49learn here. I know that there's a meeting on the Hill in progress right now on what happened,
09:56why it happened. And we'll learn a lot from that. You know, and just a final thought is that I
10:03worked in the government under the Bush administration. And a lot of the policies and
10:09the laws in the space have to do with the classification of the information
10:13and protecting that information. So the administration has stated that there was
10:17no classified information in the exchange, which I think will have to be, you know, fully vetted
10:25by the Intelligence Committee and by the Director of National Intelligence.
10:30But that would really kind of drive, I think, you know, whether the administration ran afoul of
10:37anything. I guess my question to you is, as the cybersecurity expert, you're not as concerned
10:44that Signal was used. You're more concerned that the journalist was inadvertently added,
10:49plus that this type of classified, sensitive information was shared on this app? Or it's
10:55not even the app, it's just that he was added? I think, as a cybersecurity
11:03expert, it's that he was added. I don't personally have issues with the Signal app. My philosophy
11:09is that everything is hacked or will be hacked. If there's a motivated adversary on one side of
11:17the equation, they're going to find a
11:23way to get in. And it doesn't matter if it's Signal, WhatsApp, Telegram, iMessage, an adversary
11:31is going to get it. It also doesn't matter if it's, you know, kind of a highly classified
11:35system in the government. Those can also be infiltrated, though obviously it's a lot harder for an
11:40adversary. So I do think it comes down to the information and how we're trained to share
11:45information and where we're sharing it. The White House, the Trump administration,
11:50National Security Advisor Michael Waltz himself said that they're still trying to figure out how
11:55exactly Jeffrey Goldberg, that journalist from The Atlantic, was added to the chat.
12:00According to the screenshot, it says, "Michael Waltz added you to the chat."
12:04Is there any way he could have been nefariously added to this chat? Or, based on
12:09what you know of Signal, did Michael Waltz himself, or whoever was using his account, have to add him?
12:16That's a really intelligent question. And, you know, from my experience with Signal,
12:21Michael Waltz would have had to have added him. The logging is pretty good on Signal. So I do
12:27think that, you know, Michael Waltz's true account is the one that
12:33would have added the journalist. There's obviously been a lot of
12:37criticism here on what was shared on Signal, what was in those text messages,
12:42the fact that there were even different emojis used to talk about airstrikes in Yemen.
12:47As someone who worked for a White House in an administration, what would you advise the
12:53government to do here at this point? Should you still use Signal, or is a SCIF, which is a room
12:58where you go in, you can't even bring a phone or a laptop, and no information can leave the
13:02SCIF, is that protected room really the best way to keep classified information secret?
13:11You know, as somebody that worked in the administration, I was incredibly focused on
13:16what I put in writing, in all courses of action, to include, you know, just a simple
13:22kind of one-liner. Back in the day, we weren't using emojis as much. So I would hope that
13:28the future Melissa would not speak in emojis in official communications. But maybe I would. I
13:33think, you know, we're kind of in that era where emojis are a form of communication. And, you know,
13:41I think it's a lesson to the administration that, you know, your information can and will leak,
13:48even if you might be the source of your own leak as well. So to really perform that
13:54litmus test of what I'm saying right now: if this were in the press, would this be
14:00something I would be proud to have said? Or should I maybe scale back what I'm
14:05saying? With regard to whether communication should occur in SCIFs,
14:13I think the challenge there is really in the human user story. And, you know, all of these
14:17officials seem to be mobile at the time of the planning and the coordination for the strike.
14:25And, you know, I think it's on the administration, as well as the civilian government and the DoD and
14:30others to come up with better ways to securely communicate while officials are mobile, because
14:37that is just the way that things are today. We're not often, you know, at the office and
14:44especially in politics, you have a busy schedule, and you're kind of running around. So, you know,
14:50that's definitely something that I think needs to be pursued here.
14:55I think, to your point, a wise piece of advice for everyone is to always watch what you put in
15:00writing. As a cybersecurity expert, again, do you think that there's anything missing then
15:05from this really national conversation when it comes to this story?
15:08What I think is missing is the magnitude of, you know, just our simple actions and the ripple
15:19effects that they can have. We're all communicating
15:26on our phones, we're probably in our phones too much, and our personal communications
15:33and official communications are all conflated in these apps. And I think what's important is to
15:40talk about, you know, how do we break that apart? How do we make it easier to have more streamlined
15:46communication and not have to, you know, kind of be in this bifurcated world of communication,
15:54especially when it comes to personal and professional, because a device now isn't what it was,
15:58you know, 20 years ago, when you had your work cell phone, and you did your work stuff on it, and
16:03maybe you had a personal cell phone. Now today, it's all in one place. And we need to do a
16:10better job as security practitioners, but then also as technologists to give better options to folks.
16:16Melissa O'Leary, I appreciate your insight into these conversations.
16:20Thank you so much for joining me. You are welcome back anytime.
16:23Thank you. It was nice to meet you.