Future Living: Ethics and Opportunities of Biohacking

Transcript
00:14Welcome back to VivaTech and here at stage 3.
00:19It's time to launch our third deep tech session entitled Future Living, Ethics and Opportunities of Biohacking.
00:28Now it's the first of two discussions that will focus on exciting progress in the healthcare sector and this will
00:34be an interactive session so this is your chance to ask your questions.
00:39Don't hesitate to submit those questions or comments you might have for a panelist as you listen.
00:44You just go to the VivaTech platform on the app, select the tab in the drop-down menu titled Interactive
00:51Sessions with Slido, select stage 3 and you are ready to roll.
00:56So please get typing and send in your questions. Our moderator will make sure to take time to get you
01:02some answers.
01:03So let's get started. Please join me in welcoming Baldomero Alonzo Latore, partner at McKinsey and Gabriela Ramos, Assistant Director
01:11General for Social and Human Sciences at UNESCO.
01:42Hello everyone and welcome to this session on the topic of biohacking. It's a really exciting topic.
01:49We have a fantastic set of panelists with us today. But let me quickly introduce myself. I'm Baldomero Alonzo, partner
01:57at McKinsey, where I support life science companies on innovation topics, usually having to deal with data, digital and analytics.
02:05And today I will be moderating this session on biohacking, right? And we will start it with a fireside chat
02:14with Gabriela Ramos here today, Assistant Director General for Social and Human Sciences at UNESCO.
02:23Gabriela, welcome to this panel and thank you so much for joining us.
02:27Thanks. Thank you so much for the invitation and it's a pleasure to be with all of you in this
02:31fantastic exhibition out there.
02:33Yeah, super. So let's deep dive into it, Gabriela. And actually, as we start diving into the topic, I thought
02:42it could be helpful for everyone in here, myself included as well, like to frame a little bit the boundaries
02:49of the discussion in terms of like, what do we mean by biohacking? What is your definition of biohacking, Gabriela?
02:57Well, I think it's not, there is not a common definition of biohacking. And I have to confess that when
03:04I was invited to this panel, I was like, why is it framed like that?
03:07But biohacking for all of you, it's about getting to know your own biology and to do things to improve
03:16it. And therefore, we were talking with our speakers in the back, and it can be when you need to
03:23study and you take a coffee to keep you awake.
03:26That could be biohacking, because you're trying to improve your performance by external means. So you can just have an
03:33array of many ways in which we can biohack ourselves to improve our performance.
03:37Then you can do with medicine, you can do with many things. But in the world of artificial intelligence, in
03:45the world of progress in the human genome and the understanding of the human genome, in the world of neurotechnologies
03:51and the learning of the brain, this is becoming more and more a mainstream issue where you can really, really
03:59improve your endowment as a human, your capacities.
04:04And it's a growing, growing industry. Right now, it's 14 billion, but it's growing 20% per year. And therefore,
04:13it's very important that we look at all of the angles because it's not an individual issue also.
04:18Of course, it's your own decision if you want to be faster, if you want to track your running capacities,
04:24if you want to track your blood pressure with all these tools that we are developing.
04:28But there is also a systemic issue of how do we do that?
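The growth figures cited above (a 14 billion market compounding at 20% per year; the currency is assumed here to be US dollars) imply the market roughly doubles every four years. A minimal sketch of that compounding:

```python
# Sketch of the market growth cited in the panel:
# a ~$14B market compounding at ~20% per year (figures as stated on stage).
def project_market(start_billions: float, annual_growth: float, years: int) -> float:
    """Projected market size in billions after `years` of compound growth."""
    return start_billions * (1 + annual_growth) ** years

for year in (1, 4, 10):
    print(f"Year {year}: ~{project_market(14.0, 0.20, year):.1f}B")
```

At 20% a year the market passes double its starting size just before year four (1.2 to the 4th power is about 2.07, so roughly 29 billion), and would be over 80 billion within a decade if the rate held.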
04:32Okay. Really helpful. And thank you for sharing that with us, Gabriela.
04:38And I was wondering, on the back of this, I know that making sure that science and technology, right, is
04:47used for humans' benefit without actually, you know, within the boundaries of human rights and human dignity is at the
04:54core of UNESCO.
04:56And especially in your sector, right, human and social, social and human science, right?
05:02I was wondering, Gabriela, do UNESCO's frameworks on bioethics and the ethics of science and
05:16technology, do they address biohacking?
05:19By all means, by all means, because remember in '97, when everybody was so hyped because Dolly the sheep
05:28had been cloned, you remember, we were going to clone sheep and we were going to clone each other and we were
05:33going to get these cells to do impressive things.
05:36UNESCO came out with the Universal Declaration on the Human Genome and Human Rights just to ask the question: should we?
05:45Because I think this is not a technological discussion. It's a societal one.
05:50We all talk about technologies, artificial intelligence, generative AI, all these things are just amazing.
05:56But we need to ask ourselves, and this is what UNESCO is doing with our work on the ethics of
06:01science and technology, being in the human genome, being on the question of bioethics, being on the question of artificial
06:07intelligence,
06:08because in 2021 we just approved our recommendation on the ethics of artificial intelligence, is to put the conversation back
06:15of how much these technologies are going to help us to have more inclusive societies, fair outcomes, not abuse, rule
06:26of law, putting the conversation back where it belongs.
06:30Okay, I like that.
06:33And Gabriela, biohacking often involves the convergence of multiple innovative technologies, like neurotechnology, like gene editing, right, as you have
06:48described earlier.
06:49How does UNESCO's work on the ethics of these frontier technologies, right, how does it work?
06:57Does it have relevance for the regulation and the discussion on biohacking?
07:03Well, just very fast, it's super timely, the discussion, because on the 13th of July, at UNESCO here in Paris,
07:12we're going to have an international conference of neurotechnology.
07:15If you think about biohacking, imagine the brain enhancements that you can do, to be faster, to form more synapses,
07:24to get your brain to develop and perform much better.
07:28These things are at hand, they can be done, but we need to have this ethical conversation.
07:34What are we doing with artificial intelligence, first and foremost, ensuring that we all understand, because this is not a
07:41question of the companies or the government, of course, each one of them have their responsibilities, it's a societal debate.
07:48Everybody needs to be aware of what is the impact of these technologies in your family, in yourself, in your
07:54daughter, and therefore to put the guardrails at the individual level,
07:58but also to understand how do we ensure that we have the right regulatory frameworks, and this question of regulating
08:05or not is completely out of place, because it's how do we do it right.
08:10This is what we're doing with artificial intelligence, with our recommendation.
08:14We are now working with 40 countries, looking at how they can put in place rules that will at least
08:21hold actors accountable, which seems quite straightforward.
08:28Logical, indeed.
08:30And actually, now that you say that, and you have touched on human enhancement as well, Gabriela, what can UNESCO
08:37do to promote safe and responsible practices for human enhancement?
08:43First, you need to get the countries to agree.
08:46Yeah.
08:47That's exactly what we did with the human genome, that's exactly what we did with artificial intelligence.
08:52We need them to agree that, first and foremost, we need to ensure that there is a protection and promotion
08:58of human rights and human dignity, which is not always easy, because there is geopolitical competition, economic competition, technological interest.
09:07But first and foremost, for them to agree that this is the most important issue whenever we talk about technologies.
09:15And that's what we are hoping we will do with the neurotechnologies, because again, you need to have everybody at
09:22the table.
09:23The goal is for the governments to take action, but of course you need to increase awareness, to inform, to
09:30provide more elements for people to understand what is happening, and then to work with all the countries, because as
09:37usual, these are highly concentrated technologies.
09:40It's the G7, it's France, it's the US, it's Canada.
09:43What happens with the rest of the world when you have half of the population not connected?
09:48Half of the world population not connected?
09:51So this is a very important role that institutions like UNESCO play in bringing the countries together, and at least
09:58to have the same narrative, and to ensure that they take action to channel these amazing technologies for the common
10:05good.
10:06I love, Gabriela, that you're touching on the issue on how to handle potential inequalities arising from biohacking.
10:16I think we're definitely going to keep touching on these topics in the next part of the panel.
10:22These were brilliant insights.
10:24Thank you so much for sharing, Gabriela.
10:25And I suggest we actually welcome our two other panelists that are going to be joining us today for the
10:31rest of the discussion, and we get the conversation going.
10:34So, I would like to welcome Ramses and Bill.
11:00So, Ramses and Bill, thank you so much for joining.
11:04Thank you.
11:05Short introduction of the two of them.
11:07Dr. Ramses Alcaide, he's the founder and CEO of Neurable.
11:13Neurable is a company that he founded, right, where they developed a brain-computer interface allowing people to control software
11:21and digital devices using only their brain activity, which sounds extremely, extremely impressive, Ramses.
11:30He has been working for over a decade on neurotechnology topics, right, so he's definitely like a great addition
11:36to the panel here.
11:37Please, a warm welcome to Ramses.
11:44And we also have Bill, Bill Liao.
11:46He's a general partner at SOSV, which is a venture capital firm dedicated to funding early-stage startups in deep
11:56tech, which has, and I know you have over 1.5 billion in assets under management currently, right?
12:02Many of them actually dealing with, or that are, biohacking companies.
12:07So, a warm welcome again to Bill, please.
12:11So, I think, you know, we have a really fantastic roster to touch on on this topic today.
12:17And now that we have Ramses and Bill on the stage, you know, I wanted to ask with like a
12:23very, let's say, you could say generic question, but I think it will still be helpful to keep painting the
12:30picture for what biohacking can do.
12:33How can biohacking help improve health conditions or optimize human performance?
12:40Ramses, do you want to start and give us some examples?
12:43Yeah, definitely. So, bio-hacking has the ability to be incredibly empowering technology.
12:49Just to give you all some kind of highlights in some of the work that we do, we actually develop
12:54these headphone devices that have these silver bands, and they record your brain activity.
13:00And so, what we can do with this is we can track cognitive health.
13:03We can understand when somebody's tired, let them know when they should be taking breaks, which prevents depression and other
13:10issues associated with that.
13:11And on top of that, we're also working with some key military officials to prevent aircraft accidents.
13:17So, you can already see that by understanding ourselves more, we can then build on top of that to create
13:22solutions, right?
13:23And the biggest issue with the body, it's even though we feel a lot of things happening to us, for
13:30example, being fatigued, people don't take breaks early enough.
13:34And so, you really don't get this biological response until you're already burnt out.
13:37Or in the case of dehydration, you don't feel like drinking water until you're already thirsty.
13:42And that means you're already dehydrated, you're already getting the performance loss.
13:45So, by self-understanding, we're able to then create solutions that we can intervene on.
13:50Fantastic.
13:51So, I'll give you a couple of examples.
13:54I also am chair of Social Tech Trust, and we invest for tech for good.
13:58A very nice biohack that we invested in is 3D printed limbs for kids who have had a limb loss.
14:08And it's called Open Bionics.
14:10It's a fantastic company.
14:12Now, as kids grow, they need new limbs if they've lost a limb.
14:17And guess what?
14:18You can hack together with a 3D printer artificial limbs that grow with the child.
14:23So, that's a fantastic biohack.
14:25Another fantastic biohack that's direct impact on health is an organization called Open Insulin.
14:32That is a group of biohackers in the U.S. that are creating a system where people can make their
14:38own insulin cheaply at home.
14:40It's a moonshot, but what an amazing difference to be able to make a drug like insulin,
14:46which is actually a natural hormone, in a lab in your home.
14:50And then a further biohack might be to actually do a genetic change to your own pancreas so that it
14:56starts making insulin again.
14:58So, there are lots of potential benefits.
15:02I love it, Bill.
15:03And picking up on these examples that you were giving, could you tell us a little bit more about what
15:10you think is the landscape for the integration of biohacking practices into, let's say, mainstream healthcare practices in the healthcare
15:19system?
15:20So, about 20% of all the technology you use was invented by a hacker.
15:25You know, you look at your phones, a lot of that tech wasn't invented in the mainstream.
15:30A lot of the software wasn't built in the mainstream.
15:33There are enormous opportunities to integrate into the mainstream ideas that have been developed by individuals or small teams.
15:42And those ideas and those developments, they're often in the nature of invention.
15:48You know, industry is very good at innovation, taking an idea and making it better.
15:52Hacking is where you actually have come across a stop, and you need to break through that wall.
15:58And that's the sort of ethical good hacking where you're trying to solve a problem.
16:04There is, of course, bad hacking.
16:06You know, somebody running a meth lab could be considered a black hat bad biohacker.
16:12Somebody who's doing psilocybin research in order to cure, you know, their own depression could be considered a good biohacker.
16:20And that psilocybin research is actually coming mainstream, even though it was banned back in the 60s.
16:26I think one thing I want to add to that, too, is that every startup that we see to a
16:30certain degree is hacker, right?
16:32The only difference is that we gave them money, so now we call it a company, right?
16:37But think about all these things that they do to break the rules.
16:39There's a reason that we see huge innovation in startups and larger corporations want to buy them up.
16:44It's because you need those small teams not following the rules to actually innovate and break the establishment to really
16:52create a new solution to a problem.
16:54Okay, I love it.
16:56And you were alluding to the notion of, like, white-hat biohackers versus black-hat biohackers.
17:01What's the balance that you think exists between those two?
17:06Currently, in terms of, like, the scary thing that people talk about is, ooh, gene editing, you know, you're going
17:11to make a horrible disease and kill us all.
17:13There are no black hat bad biohackers in that space.
17:17None.
17:17Because it's just not that easy.
17:20Like, making a change to your own genome for a good purpose is so dangerous and difficult that we're probably
17:27going to see somebody die horribly in the not-too-distant future trying that.
17:30But trying to kill everybody else using biohacking, really hard.
17:36Much more likely that you kill yourself.
17:38So, the balance on black hat, I mean, black hat biohacking is really at the stage of people having meth
17:46labs making new recreational drugs.
17:48That's about where it's at.
17:50And it's not that big a deal.
17:52White hat, there's a huge, enormous movement of people that are getting their hands on this democratized technology.
18:01AI is enabling that.
18:03And, of course, they are bumping up against regulation.
18:06Like, for instance, in the EU, genetically modified crops have been banned for a long time.
18:11There has not been a single death from a genetically modified organism in our history.
18:16And there's no real chance of that happening.
18:19But that blanket ban is causing people to starve.
18:22So, the deaths are on the side of the ban and on the side of preventing a perceived black hat
18:28when actually it's all white hat stuff.
18:30Okay.
18:31That's good to hear.
18:32Gabriela?
18:33I just want to chip in.
18:35And that's why it's so incredibly refreshing to have the industry.
18:42Because you are like the go-getters.
18:44And that's fantastic.
18:46I have to say from the systemic perspective of somebody working in an international institution but also being a government
18:52official before.
18:54Governments shape markets.
18:56This idea that everything occurs in a little garage with people.
19:00Of course, you need this entrepreneurial mindset and the way to put the things.
19:05But lots of these inventions are financed by public funds.
19:09The U.S. is an amazing funder.
19:11Here, you have a lot of financing from the French government.
19:15So, let's just get that right.
19:16I think that it's always a public-private partnership.
19:19And in many ways, of course, the innovation comes from where it comes from.
19:24How much is good and how much is bad?
19:26We don't know.
19:28These are very fast-moving markets.
19:32And the way we should think about it is do we have the framework that will incentivize people to invest
19:41more in the good technology than in the bad technology?
19:45I don't know.
19:46Because if you think about what President Biden did to put 300 billion to invest in climate technologies, that is
19:55what's driving the climate transition in the U.S.
19:58and the Europeans are doing the same. So governments have a role to play, and I'm not sure...
20:02The question of the benefits, and I completely agree with you, saving lives because you have this in your head,
20:09that's fantastic.
20:11What would you do if the people that gathered that information of somebody that got tired,
20:17oh my God, this worker is getting tired too often, so I'm not going to promote him.
20:21And therefore, then you are getting information that is so private, so, and you don't even know what efforts this
20:29worker can be doing to do something else.
20:31My concern is not about these technological improvements that I completely agree with both of you.
20:37My concern is that we are producing a massive amount of data in terms of your brains, of your feelings,
20:43of how you do, how you perform,
20:46who is going to decide how to use that information?
20:50I just want to rebut that because I agree with you to a significant extent.
20:55But the part that makes me worried, you know, and you brought this up, there's going to be a UNESCO
20:59talk about brain ethics and brain data.
21:02And I actually just came back from a panel with the CIA on export controls.
21:06They wanted to put export controls on brain-computer interfaces.
21:10And the thing that resonates with me, and I'm sure Bill feels the same way, and I want you to
21:15speak afterwards,
21:16is we have a lot of people who don't know the technology making rules for the technology.
21:20Now, I agree with you that I have concerns, and we've actually written numerous books at Neurable when it comes
21:26to neuroethics and how to think about them.
21:30And I agree that employers are going to want to use that technology in ways that are wrong.
21:35But the thing is, we've been approached from that perspective, and we told them no.
21:40Actually, what you want to do is you want to use it as an empowerment tool.
21:43You want people to self-understand their cognitive state and them to make that decision.
21:49You're going to get a more productive employee.
21:51They're going to be happier.
21:52They're going to have lower stress.
21:53We've shown this in all our research.
21:54If you're using this to monitor them, they're just going to get pissed off.
21:58And the thing is, we already have monitoring equipment.
22:00Like, whether it's from the brain or whether it's from software tracking your keyboard,
22:04if your employer is going to use it, they already have easier ways of getting that data
22:08that is tied to other things than whether you should take a break or not.
22:12And so my biggest fear, and I've seen this time and time again,
22:17is individuals who don't understand the technology making decisions for those who do understand the technology.
22:22And that's a really big issue with AI as well, too.
22:26And I have to say, I was an environmentalist long before it was cool.
22:32I've planted 100 million trees and lifted 3 million women out of poverty doing that.
22:37And I can tell you, Biden's initiative is about 20 years too late.
22:44It wasn't government that inspired me and our team to go and actually start the work.
22:49But regulation is often, more often than not, a tailwind, not actually out in front.
22:56You say, oh, we're spending money on research.
22:58Actually, a lot of this biohacking stuff gets no government funding at all, can't get it,
23:04because the people who control the grants don't understand it.
23:08So, you know, as much as I agree with you that regulatory frameworks are useful,
23:14I cannot say that government is the leading edge.
23:18In fact, it is often a tailwind.
23:20And those tailwinds, when they are ill-informed, have hampered progress significantly,
23:28while not delivering the perceived benefit of protecting us from something.
23:35I think that we're closer than it looks.
23:40First, institutions like UNESCO, like the government,
23:44trying to deal with things that they don't know about.
23:46We equip ourselves.
23:48I'm not a neurotechnologist, but I have a committee of experts,
23:52the best and the brightest in the world.
23:55[inaudible]
23:56I think you believe they are the best and the brightest,
23:58because I've been on those panels.
24:00You cannot, you cannot.
24:00And a lot of them, I think, like, wow.
24:02Please, please, Ramses, Ramses, please, please,
24:06hold it, hold it, hold it.
24:08And you will be in my 13th July conference,
24:11and you will tell me if I don't have the greatest experts.
24:14What I agree with you is that governments to deliver on their duty of care.
24:20Governments have the duty of care.
24:22I pay my taxes to be protected from the downsides of these technologies.
24:28You are not going to do it.
24:29I'm happy that you are a responsible person
24:31that are looking also at how to protect the people
24:33and how to use these technologies.
24:35You are not in charge of that.
24:37You are in charge of innovating.
24:38It's for the governments that we pay taxes for
24:41to get their equipment to be in shape to deal with these issues.
24:46When we develop the readiness assessment methodology
24:49for the artificial intelligence,
24:51I'm not going to tell countries
24:52how are they going to regulate artificial intelligence
24:54because it depends on their societal preferences,
24:58on whether they are light touch, hard touch.
25:00Today, the European Union is going to do that.
25:03But if things go wrong, you have to be accountable.
25:07And who is going to hold accountable
25:09whomever did something wrong?
25:11It's not the technologies.
25:13It's going to be the governments,
25:14and they need to be well equipped.
25:16I'm going to tell you a story of a lack of regulation.
25:19No, let me just finish.
25:21A lack of regulation that then the government
25:23needs to pick up the pieces of an economic sector
25:26that created so much harm,
25:28which was the financial sector in 2008.
25:31Light touch regulation, we're lagging behind.
25:33We don't know how to do it.
25:34They're very innovative.
25:35Thank you very much.
25:36Then, boom, trillions.
25:38$15 trillion that the governments have to put in
25:41to save the financial system.
25:43Therefore, I think we better be well equipped
25:47before these things happen
25:48and try to do it also with you
25:50because you are the ones that know.
25:52But I have to say there is a responsibility.
25:55I have to respond immediately to that.
25:56I agree with you 200% on accountability.
26:00Right?
26:00Accountability, fantastic.
26:02Have accountability.
26:04Have governments regulate accountability.
26:05My problem is accountability looks like
26:10I'm going to ban this at the moment.
26:13A lot of the time.
26:15Try and get a necessary genetic modification
26:19to a crop in Europe.
26:21It's impossible.
26:23Like, totally impossible.
26:24Forget accountability.
26:25You're never going to be accountable for anything
26:28because you're never going to do it.
26:30Right?
26:30And let's take the financial crisis.
26:33How many of those bankers went to jail?
26:36One.
26:37One.
26:38So, don't talk to me about accountability for it.
26:41There was none.
26:43Okay.
26:43I love the energy of this discussion.
26:46This is a lot of fun.
26:48But actually, you know,
26:49what I think we can convene
26:52is that certainly there needs to be collaboration
26:55between biohackers, biohacking companies,
26:58scientists, policymakers, right?
27:01And I would love to take your view on, like,
27:05how can we foster that collaboration, right?
27:08Maybe the start of it is what we're doing here today.
27:10But I would be keen to get your take
27:13on how can we foster the collaboration
27:15between all of these actors
27:16in order to actually lead and handle
27:19and address the regulatory and ethical implications
27:22of biohacking in the best way possible
27:23for the general public.
27:25Gabriela.
27:26Well, I think recognizing who is responsible for what
27:30and who is in charge of what.
27:34International institutions,
27:35we are not going to be developing
27:37the regulatory rules in the member states.
27:39That's for legislators.
27:41That's for the governments.
27:42That's for the citizens to get engaged
28:44and to deal with that in their specific context.
27:46What we can do is to raise awareness
27:49about the downsides
27:50and the problems of not doing it.
27:52And that's exactly what we're doing.
27:55I give you the example
27:56of how we are implementing
27:58the recommendation on the ethics
27:59of artificial intelligence in UNESCO.
28:02First and foremost,
28:03we decided,
28:05193 member states,
28:07that these technologies
28:09should not impinge
28:10on human rights and human dignity.
28:12Whatever that means in your context,
28:13you define it.
28:14But this is the rule.
28:16Should lead to fair and inclusive outcomes,
28:19which is not the case many times.
28:21should be used to solve more problems
28:24and not to create more.
28:25I mean, this is a narrative.
28:27How do you get that done
28:28with accountability,
28:30with transparency,
28:31and we're getting into the detail
28:32and with policies.
28:34Policies matter.
28:36How you enact the policies to deliver.
28:38What did we do?
28:39We created a diagnostic tool
28:41through which member states will be delivering
28:4340 national reports.
28:45And we're working with Microsoft
28:47and with Telefonica.
28:48Because they think
28:49that it's much better
28:50to have a sound regulatory framework
28:52than the uncertainty
28:54that we're getting into,
28:56where you have regulatory fragmentation,
28:58where nobody has the interoperability
29:01necessary to understand
29:02that these are global technologies,
29:04and they are in the room.
29:05But the responsibility
29:07of developing the protective laws
29:10for when something goes wrong,
29:13and it's not getting...
29:14I'm going to give you another example.
29:16The rules about having
29:20safe provision of food
29:22in the restaurants.
29:24You and I were not going to go
29:25into each kitchen
29:27to see that the food is good.
29:28You create the norm,
29:30you do some audits,
29:31and if something goes wrong,
29:33there should be the mechanisms
29:35to repair.
29:37That's all what we're doing.
29:38It's not getting into the nitty-gritty
29:40to tell you how to do the technologies,
29:41because that's not possible.
29:43But at least there should be a framework
29:45that when something goes wrong,
29:46when a young lady in the UK
29:48takes her own life
29:49because she got access
29:50to all this information,
29:52there should be somebody responsible.
29:54And who is the one that protects
29:55and we pay our taxes
29:56to be protected?
29:58The governments.
29:59So yes,
30:00they need to equip themselves.
30:01And I think they have not been investing
30:03a lot in these areas,
30:04but they better catch up.
30:07I think that
30:09I agree with the core concept
30:11that you're saying.
30:11There has to be accountability.
30:13Where I get a little tied up
30:15is especially with this new type
30:16of technology like AI,
30:18like brain-computer interfaces,
30:19it's not like the telecom industry
30:21where you break up monopolies.
30:22It's not well understood.
30:24Just to give you some highlights,
30:26when I went to this
30:27export controls meeting,
30:28the whole meeting was focused on
30:30should we send this technology
30:32to China?
30:34Should we send it
30:34to these other countries?
30:35What if they make fighter pilots
30:37that can control planes
30:38with their brains?
30:39And, you know,
30:40the scientists
30:41and the startup companies
30:42and they are looking at each other
30:43and they're like,
30:44why would anybody
30:44want to control
30:45a plane with their brain right now?
30:47It doesn't make sense.
30:48We want to make this technology
30:49so that somebody with ALS
30:51can communicate
30:51to their caregiver.
30:52And we're so far away
30:53from even just that.
30:55And whether somebody
30:56is in China
30:57or in the United States,
30:58it doesn't matter
30:59if they have ALS,
30:59they should be able
31:00to communicate
31:00with their caregivers.
31:02And we shouldn't even be thinking
31:04about fighter pilots
31:05or all this other stuff
31:06maybe until like 100,
31:07200 years from now.
31:08Then we can at least
31:09start that debate.
31:11And so it's because
31:12this technology is so new
31:15and the solutions
31:17that are being shared
31:19are so nuclear,
31:21it's should we just ban
31:22all exports
31:22of this technology,
31:24that that's actually limiting,
31:25for example,
31:26how researchers in China
31:27can work on
31:28brain computer interfaces
31:29as well too.
31:30How we can make money
31:32in other countries.
31:33For example,
31:34we're not going to be doing
31:35our launch in Europe
31:36for a little bit longer
31:37because there's so much regulation
31:38that we need to get to
31:40that all these things
31:41get in the way.
31:42And they get in the way
31:43in a way that doesn't
31:44actually help the consumer
31:45or doesn't make them
31:46any safer.
31:46It just means
31:47it's going to cost more money,
31:48it's going to cost more time.
31:50And what does that mean?
31:51More startups are going to die
31:52on the way there.
31:53Luckily,
31:53we're not going to be one of them.
31:54We're well-funded,
31:55we're executing really well,
31:56but a lot of other innovation
31:58is going to be stifled.
31:59And that's the part
32:00that I worry about.
32:01Okay.
32:02And I would just say
32:03that the number
32:05of startup entrepreneurs
32:06that I've met
32:07traveling the halls
32:08of the UN
32:09is kind of close to zero.
32:12And when you actually
32:14talk about
32:15the best and the brightest,
32:16you've got to actually
32:17change the way
32:19you interact
32:20with the best
32:21and the brightest
32:21because the true best
32:22and the brightest
32:22don't have a lot of time
32:24and they are not
32:26at big corporates
32:28and they're not
32:29at academic institutions
32:31a lot of the time.
32:33And when it comes
32:33to biohacking,
32:35most of the time.
32:36So,
32:37there has to be
32:38new ways
32:39of government
32:39to create outreach
32:41to find the best
32:42and the brightest
32:42and actually make
32:44efficient use
32:45of their time.
32:46and like,
32:47I'll show up anywhere
32:48that anyone asks me
32:49about anything
32:51that I care about
32:51provided I can find
32:54that a useful
32:55use of my time.
32:56Many times
32:57I've shown up
32:58and have had
33:00my time
33:01brutally wasted
33:02and I know
33:03other best
33:04and the brightest
33:05also have that problem.
33:06So,
33:07if you want to look
33:07at engagement
33:08and you want to
33:08work together,
33:11meet us halfway.
33:12Yeah.
33:13And just to add to that,
33:15I agree.
33:16I actually just spent
33:17a significant amount
33:18of time at a conference
33:19with some of the
33:20best scientists
33:20in the world
33:21and the ones
33:21that are being asked
33:22to join these panels
33:23on brain computer interfaces,
33:24sure,
33:24they have 50 years
33:25of experience
33:26but 50 years
33:27isn't what's happening
33:28nowadays.
33:29They're not caught up.
33:31What startups are doing
33:32is so much farther ahead
33:33than what I see
33:34in a lot of lab
33:35and research work
33:36that,
33:36you know,
33:37like,
33:37we're missing
33:38that entire scope.
33:39Just because the resume
33:40looks impressive
33:41doesn't mean that
33:41they're working
33:42on the best work.
33:43It means that they
33:43used to work
33:44on the best work
33:45and they've been there
33:45for a very long time.
33:49Helpful insights
33:50and a sparkling discussion.
33:51I don't know
33:52if we have time
33:53for one last question
33:54in here.
33:54I don't know
33:55if you can help me
33:55with that.
33:56But anyway,
33:57there was a question
33:58that was asked
33:59in the Q&A
34:00here that I have
34:01that I would love
34:02to get your quick take on.
34:04It is the following.
34:05Biohacking is sometimes
34:06attacked
34:07because it might
34:08widen inequalities.
34:09What do you think
34:11about that
34:11and what can we do
34:13to avoid it?
34:14I don't know
34:15who wants to take that.
34:16Gabriela?
34:17I think that this
34:18is exactly
34:19the kind of service
34:20that institutions
34:21like us do
34:22and I found
34:23it a little bit
34:24arrogant to say
34:25that we don't invite
34:26no,
34:27we can have
34:28this conversation
34:29among ourselves
34:30and then
34:30you will vet
34:31my list
34:32and you will tell me
34:33if I got the right
34:34people or not.
34:35the point here
34:37is
34:38and by the way
34:39when we created
34:39the question
34:41is not even
34:41to invite
34:42the hackers
34:43or the not hackers
34:44this is a societal
34:46conversation
34:47and we need
34:48to bring
34:49everybody on board
34:50and you cannot say
34:52because they are
34:52not equipped
34:53with the understanding
34:54it's our responsibility
34:55to make things
34:57understandable
34:58and to explain
34:59them in a way
35:00that anybody
35:01can participate.
35:02I was very proud
35:03when we developed
35:04the recommendation
35:05on the ethics
35:06of artificial intelligence
35:07with all of our offices
35:08around the world
35:0960 offices
35:10around the world
35:11we did global consultations
35:12you know what happened
35:13I got 55,000 comments
35:15from indigenous communities
35:17from single moms
35:19from
35:19I'm not
35:21I'm not privileging
35:22those that can speak
35:23to your level
35:24I'm privileging everybody
35:26because everybody
35:26is going to be impacted
35:27and one of the worst
35:29impacts
35:29is that these technologies
35:31are benefiting
35:32the lucky few
35:34when you have
35:35half of the world
35:36population
36:36not connected
36:37to the internet
35:38or not having
35:39stable access
35:40to broadband
35:41then we are just
35:43missing the whole picture
35:45and it's institutions
35:47like us
35:47that can say those things
35:49because of course
35:49you are
35:50with your ecosystem
35:52that is so dynamic
35:53and so interesting
35:54and so exciting
35:55that then you turn around
35:56and you say
35:57excuse me
35:57this example
35:58that you give
35:59about being banned
36:00or not banned
36:00I don't think
36:01that's what we are
36:01talking about here
36:02because banning
36:04actually
36:04is not about
36:05human rights
36:06it's about
36:07geopolitics
36:08it's something else
36:09what we are saying
36:11is like
36:11whatever we do
36:12let's ensure
36:13that human rights
36:13and human dignity
36:14are really there
36:15and first and foremost
36:17and this is for all
36:18the people here
36:19and for you
36:19and for everybody here
36:20let's get informed
36:23let's know
36:24what does it mean
36:25what does it mean
36:26the neurotechnologies
36:27the neurotechnologies
36:29are there already
36:30you get there
36:31and suddenly
36:32you want to go to Mexico
36:33you and I
36:33go to Mexico
36:34very often
36:35and you receive
36:36packages of information
36:37about Mexico
36:38why
36:38I don't know
36:38well maybe
36:39because there is
36:40information about yourself
36:41that is being tracked
36:42by an algorithm
36:43with artificial intelligence
36:45that is telling you
36:45to do so
36:46what happened
36:47with misinformation
36:48why some people
36:49are getting tricked
36:50to go and attack
36:52a congress
36:53or a president
36:54or to contest
36:55because there are
36:57some weaknesses
36:58and some cognitive biases
37:00that can be abused
37:01that's what I care about
37:03and that's not your duty
37:05it's the duty
37:06of international institutions
37:07to ring a bell
37:08and then governments
37:10to take action
37:10I think we're all
37:12on the same side
37:13when it comes to
37:14protecting human beings
37:15and making this technology
37:16as accessible as possible
37:17I think that
37:18where there has been
37:20some difficulty
37:21in communication
37:22is that
37:23you know
37:23we're dealing with
37:26really trying to bridge
37:27the gap of information
37:28between the two groups
37:29so that the decisions
37:30that get made
37:31are made in a way
37:32that minimizes
37:34the impact to innovation
37:35while still protecting
37:36the individual
37:37and on top of that
37:39every single time
37:40we do add one of those barriers
37:41you know
37:41you spoke about equality
37:43making sure people
37:43have access to it
37:44now it raises costs
37:45right
37:46every single time
37:47we add one of those barriers
37:48it only makes it more difficult
37:49to get into that space
37:50and create that change
37:51and so
37:52you know
37:53at least for these
37:53newer technologies
37:55my perspective
37:56is leave them alone
37:57for the next
37:57at least 10 to 20 years
37:59and then
38:00you know
38:00we're going to get
38:01to a better place
38:02because we're able
38:02to innovate quickly
38:03you know
38:04and I know you laugh at that
38:05but really
38:06especially
38:07you know
38:08we see this with AIs
38:09there's so much hype
38:10about them
38:10really what we created
38:11is a glorified calculator
38:13for the English language
38:14right
38:15and
38:16you know
38:17understanding sentience
38:18that's
38:18we're still really far away
38:20from that
38:20you know
38:21understanding general intelligence,
38:22we're still really far away
38:23from that
38:23and so
38:24you know
38:25from an outside looking in
38:26you might think it's fear
38:28that we shouldn't be
38:29you know
38:29creating these barriers
38:30but
38:31if you're the one
38:32actually working on the tech
38:33you kind of understand
38:34how far you really are
38:35Bill please
38:35last intervention
38:36we have 30 seconds
38:37two very quick things
38:39one
38:39it's every human's
38:40moral and ethical duty
38:42you know
38:43like every
38:44it's not
38:44you know
38:44I have a duty
38:45two
38:46something like open insulin
38:48benefits
38:48the poorest of the poor
38:50because it's being produced
38:51without a profit motive
38:53and therefore
38:54is an equalizer
38:55and that's the kind of biohacking
38:57that we need
38:58because biohackers
38:59don't just work for money
39:00they work for what they believe in
39:03and that's how you
39:06cure inequality
39:07more than anything else
39:08fantastic
39:09well
39:11a sparkling
39:12and energizing conversation
39:14this was a lot of fun
39:15I hope it was
39:17for all of you as well
39:18I take like maybe three things
39:20I take many actually
39:22but if I have to summarize it
39:23it's about
39:23there is impact
39:24I love the work that Ramses is doing
39:26on Neurable
39:27the example that you presented
39:30Bill about like
39:31building your own insulin at home
39:34there is a need for
39:36collaboration
39:37between like all the different players
39:39like companies
39:41biohackers
39:41scientists
39:42policy makers
39:43to make sure that we can
39:45govern this
39:46right,
39:47putting the right boundaries
39:49and there may be like
39:51an opportunity also
39:52for like regulators
39:53to boost
39:55their capabilities
39:56because like everything
39:57is moving
39:57so quickly
39:58right
39:59that it's definitely difficult
40:00to catch up
40:01so
40:02again just to close it here
40:04I would like to thank
40:05Gabriela
40:05to thank Ramses
40:07to thank Bill
40:08for being here today
40:09this was like
40:10extremely energizing
40:11and fun
40:12and I would like to thank
40:14all of you
40:14for attending this session
40:16and please remember
40:17in five minutes
40:18we're going to have another one
40:19on the topic of gene editing
40:20so stay here
40:21stay tuned
40:22because it's going to be
40:22as fun as this one
40:24so
40:25thanks everyone
40:42yes
40:43thank you