00:00To put it very quickly, I think there are many positives that didn't exist even
00:09five years ago, but there are just as many negatives that one has to be conscious of.
00:13I think technological advancement is inevitable.
00:16It's bound to happen.
00:17I remember similar conversations happening when phones were just coming in, when phones
00:21became smartphones, when the internet went to high-speed internet; people talked about it then too.
00:25So technological advancement will continue.
00:29A few of the advantages for actors, I would say: I remember I came to Bombay at a time,
00:33not so long ago, when the only way to communicate with a production house, a studio, a producer
00:38or a casting director was to give them a hard-copy photograph.
00:41This makes me sound very old, but it was 10 years ago, when they wouldn't accept emails.
00:46They would say, we want to meet you and we want to see a hard copy photograph.
00:50That went to digital pictures, to WhatsApp, to now Instagram.
00:55So the fact that you have this immediate connection with a large audience is a positive
01:00that opens up different avenues.
01:02The downside, at least from my perspective, is algorithms: you leave it all to
01:07machines to decide what is popular and what is not, and somewhere in there, originality
01:13takes a hit.
01:14And it kills the scope for experimentation.
01:17The scope for experimentation, because at the end of the day it is show business, and an
01:21enterprise that's trying to play it safe would any day want to bet
01:26on something that is tried and tested.
01:28But the beautiful, intangible thing about entertainment, acting, films and series is that sometimes
01:38the most untried, untested thing can be the biggest success.
01:43True, true.
01:44And Arunoday, you've been sitting quiet for a while now.
01:47Everyone has such interesting things to say.
01:51The art won't change.
01:52The artist won't change.
01:54We have more information, more access to things.
01:56So they'll either get rid of us or they won't.
01:59But our job doesn't change.
02:01So technology can get as advanced as it wants; it still can't feel what I feel.
02:05So till then, I have a job.
02:07Yeah, even as far as AI is concerned, at the moment, it can give you an amalgamation of
02:11everything that it has learnt.
02:13But it can't come up with something original.
02:15Till such time as that happens, I think we're okay.
02:19We're in a good space.
02:21And Aanchal?
02:22Well, I'm petrified of so many developments happening and the kind of technology that's
02:31coming.
02:32Like, you know, the other day I read somewhere, I think it was in Japan, that
02:41you can have a robot as house help.
02:45I mean, that sounds very dangerous to me.
02:47Because it's a fact: like today, whatever we are talking about right now, I know the moment
02:52I open my phone, we'll be reading about the same things.
02:55So I'm actually very scared of technology.
02:57And I'm not very technological; like, I can barely use apps.
03:04So yes, you're right, there are a lot of positives to it also.
03:08But I think it's so important to keep things separate and not let technology get involved
03:15in our lives to that extent.
03:17And for artists and actors, yes, as they said, technology doesn't have emotions,
03:23right?
03:24AI cannot do that.
03:25We can do that.
03:26Yet.
03:27Yet.
03:28No, no, but I'm sure, like, see.
03:29No, no, they'll get there.
03:30No, I don't think humans can create what God has created: the heart.
03:35I don't feel that.
03:37But what is scary is that we have deepfakes today.
03:41So there are a lot of cons to it.
03:45So I think we should have measures where we can control it.
03:49But the way things are going ahead, I don't think...
03:52It's a very unregulated space as well.
03:54It is very unregulated.
03:55What Black Mirror showed eight years ago, we're living now.
03:58Yeah, absolutely.
03:59I don't know if you've seen Black Mirror, yeah.
04:01And Shweta and Guru?
04:03I think things are not bad.
04:05The way we use it, that responsibility is on us.
04:09Like, for example.
04:10If we want to make a person, we make him worse than the condition.
04:11Wow.
04:12We are quoting from the show.
04:13Can we keep him with us for the rest of the talk?
04:18The show will be good.
04:19Yeah, yeah.
04:20I think what is very important is expression, a voice that we have collectively and as
04:28individuals.
04:30If that starts going, when the algorithm, stats, numbers start dictating,
04:38then there is a problem.
04:40Because then we are changing the fabric of the society.
04:42Because we are telling them it's a demand-and-supply game.
04:46But if you focus so much on that, then voices like the ones that came out in the 70s and 80s
04:52won't be there; people won't get a chance to express themselves.
04:56And as sapiens, one thing that brings us together is the conversations we have, what we think,
05:06what we like.
05:07But when this spreads so much, it goes into very dangerous territory, because then
05:13the powerful will keep getting more powerful.
05:16And technology is tied to money, so those who don't have it, then they...
05:21So that is what scares me because of the misuse of technology.
05:25And Guru, closing notes?
05:55Exactly.