00:00 Oh boy, well folks, I'm writing it on the fly, and we'll see where it goes. I went long.
00:09 First of all, it's just an honor to be here and to be among all of you. I'm
00:14 humbled by all of you in this room. Anyway, hey folks, indeed I want to echo
00:25 the big boys with a resounding thanks again to Melania's stellar education
00:31 programs. Like that viral video said, wherever would our country be without
00:37 those? I mean that sincerely, you know, just in case. Three years ago I found
00:46 myself onstage at the Time 100 Gala in Manhattan, espousing a measure of
00:50 concern around the potential for a collective tumble down into the world of
00:54 AI without clear guardrails, or OpenAI, or open eyes into the looking glass with
01:01 no road map home. Well, at the time it seemed quaint, you know, charming,
01:07 inconsequential, a non-threatening speech so random, ahead of its time, and off
01:14 topic that I was even able to call upon beloved and fringe ukulele performer Tiny
01:19 Tim, who sang "Tiptoe Through the Tulips" to sweet laughs and nods of agreement all
01:25 around. Now the feeling in the room was unanimous and palpable. Why ever would we
01:30 be silly enough to give up and let AI win the day when we were so clearly at the
01:37 peak of our powers as leaders and innovators? Well, that speech slash warning
01:42 came off as a comfort among that crowd. We shared this mutual knowingness, you know,
01:48 that we're all part of this beautiful and perfect human ride together. I'm grateful to
01:54 Time for listing me yet again. After all, in the ensuing years, in fact, I fully went down the
01:59 rabbit hole with you all. I co-founded a studio with brilliant filmmakers and engineers, and we had
02:06 the profound distinction of the first underlying licensed model, of which I am deeply proud. I also
02:14 had front-seat access to the wider world of this wild west. And I'm not talking Tinseltown and movies.
02:22 I mean the real world. And boy, do I gotta say, I am quite shocked by the accelerated pace of our
02:30 current situation. For reasons unknown, we have willingly submitted to a full surveillance state,
02:37 done away with all copyright law, agreed to data theft for illusory convenience, and perhaps most
02:43 egregiously allowed for sweeping and irresponsible data farming in our working-class communities,
02:48 a clear and present danger to our environment and society. What are we doing, friends? Like,
02:53 not to ask the obvious, but just, like, why? I mean, maybe, maybe my question tonight is a naive one,
03:04 and that's A-okay. After all, I'm just an over-the-hill bimbo who sounds like Sylvester Stallone.
03:14 But I guess it did feel incumbent on someone to just step up and at least question that old notion.
03:20 You know, the one about how with great power comes great responsibility and so forth.
03:28 Well, I need to believe, and hope, that we can wrestle some grace back here if we get unified around our
03:35 shared humanity. Wait, what's that? An optimist in this economy? In this regime? And yet here I stand,
03:44 humbly before you, wanting to believe. Real talk. Whatever happened to being cool?
03:52 Wouldn't all those heavy hitters rather a legacy of some form of good? Like, for real,
03:57 what's the vibe around here? It's just a bad look, friendos. Hot pitch. Let's turn it around.
04:07 Back when I first learned of the existential risk posed by AI, I immediately had tentacles up. How
04:14 swiftly we went from quaint paperclip problems to an empire of AI. One of our honorees tonight,
04:19 Karen Hao, has made it quite clear: we are now consolidating extraordinary amounts of political
04:24 and economic power, laying waste to the environment, shortchanging labor, stealing data and IP, and
04:30 eroding democracy. What gives? When I was your age, that used to be the kind of thing that gave us
04:36 collective pause, to responsibly reassess. Well, still a dilemma. This promise, right, of progress,
04:43 simultaneously appealing and yet eerily, increasingly inevitable despite all better judgment. It seems the
04:49 least we should do is get to protecting one another, maintaining this tidal wave rightly.
04:54 To quote the great and recently deceased Assata Shakur: "Nobody in the world, nobody in history,
05:00 has ever gotten their freedom by appealing to the moral sense of the people who were oppressing them."
05:05 So one wonders, why bother? Why stand up here? And yet, to quote Buckminster Fuller,
05:11 we are called to be the architects of the future, not its victims. So I guess this one says,
05:17 hey, fuck it, why not? Yeah, let's pause a moment, right? So we don't lose the plot, celebrate some?
05:26 Congratulations to everyone here. You're all already winners. Let's toast to that.
05:33 Okay, I'm guessing that this speech won't win me too many invites to New Zealand safe houses,
05:38 but I'd rather die in Manhattan anyway, so I'm not sweating it. You know, I've gotten some flak for,
05:45 like you, being curious about all sides of this seductive conundrum, and maybe I'm guilty as charged.
05:51 But, well, I believe it is a crucial moment, a crucial move in this moment, to ensure that
05:57 artists have a seat at the table rather than the other way around. You know, back when we set out
06:02 to create this first licensed model in town, everyone said it was impossible. Turns out that was just a
06:08 euphemism for "costs more," meaning anything is possible at scale. I learned a lot from that lesson,
06:17 and I'm proud of what we pulled off, and now I'm here and in it, in this stew with you all, with more
06:24 intel and more, you know, information than most, perhaps. So yeah, I feel pretty committed to looking
06:33 out for one another, seeing this thing through, before this whole thing derails into Upton Sinclair's
06:39 The Jungle. You know that classic. Oof. Let's get some appropriate attributions and regulations going,
06:47 rather than just burning down the house and acting like it was inevitable. I know it's inconvenient,
06:53 it's expensive, but let's get real as technologists: it's possible if we want it and we fight for it,
06:58 right? Let's also keep championing hope when we see it. Hey, shout out to AB 1064, the Leading Ethical
07:07 AI Development for Kids Act, which had real teeth. Kudos. Let's keep that kind of needed courage up.
07:15 Call me a fool, but I've got to believe in us and believe in hope, or at the very least the hope that
07:19 we get very serious about those respectable throwbacks, you know: shame, accountability, tithing.
07:26 Jesus is still popular, right? Whatever happened to philanthropically mandating wide sums towards
07:33 clean energy? Let's protect our rural communities as we scale. It reminds me of, uh, once you had one,
07:41 you know, Jacques's favorite dude. Oh, uh, Augustus Octavian. Festina lente, make haste slowly,
07:49 meaning thoughtful, mindful, measured ingenuity, lest we find we've played ourselves a mess of baffled
07:58 kings. And we all know that's coming, right? I swear there's room for nuance here if we just
08:04 demand it of ourselves, a sacred pause to look out for each other as we scale. There's no need to be
08:10 quite so ruthless with our own species, folks. Yeah, I know, I know, it's Altman's kink, but holy hell,
08:16 is it reckless? May this be received with the humility and heart and open mind I intend it.
08:26 I care deeply, and believe this room is crucial enough, to say this stuff and give it a shot.
08:33 Now, before I peace out to a lifetime of being over-audited by the IRS, I'd just like to thank
08:42 Time for this honor and the chance to be here with you all tonight. The opportunity to express some free
08:49 speech while we still can. So, to quote the other Tiny Tim: may God bless us all, and to all a good night.
08:59 Thanks.