Transcript
00:00To do Christmas shopping, because to make is to elaborate, to make is to create, so in this case you are simply doing an action but you are not elaborating anything, you know, to do some shopping, so what else?
00:17And I also, just today it was just like a before day care, where tomorrow it is, so it was just that.
00:44Continue.
00:47I need a full description, you know, I mean, the idea is like supposedly you are having a conversation, right, as if you were having a conversation with any of your friends, so you have to, you know, whenever you have a conversation with any of your friends, you continue, you make questions, you interact, you know, so you try to follow the conversation.
01:10So, what did you do to get there to the supermarket? Did you take a taxi, a bus, your car, or what method of commuting did you use?
01:22I took the car, we, well, I drove, you know, we went, I drove the car, or we went there by car, you know, so try to repeat that, we went there by car.
01:44We went there by car.
01:46We went there by car.
01:48Okay, one more time. How did you, how did you get to the supermarket? Did you use a taxi, a car, or you went there by walking, or what?
01:56We went there by car.
01:58We went there by car.
02:00Excellent, very good. How long did it take you to get there?
02:04Like, 18 to 20 minutes.
02:08So, give me a full statement using the simple past. For example, what is the simple past of take?
02:14Took.
02:16So, elaborate your statement.
02:18It took us, like, 18 to 20 minutes because we lived on the, well, outside the city.
02:34Okay, the supermarket is outside the city.
02:38No, I, I, we lived, my family and I, we lived, uh, outside the city.
02:49Okay, my family and I live, my family and I live. We don't say, my family and I will live, because that's adding a pronoun that is not necessary. So, my family and I live.
03:00Okay. My family and I live outside the city. Okay. So, that's the reason that, uh, it take us, uh, it took us, like, like, 18 to 20 minutes.
03:18That's good. Excellent, very good. So, for example, what about your, what about your job? Does it take you the same to get to your job?
03:25Yes, it, it, well, to arrive, like, the, the, the, like, the city, it took, it took us, it took us, one more time.
03:39Okay, one more time, one more time. How, how much time do you usually take to get to your job?
03:44Like, another five minutes, like, 25 minutes or even 30 minutes.
03:54Okay, remember to use, you need to use the action, Naomi. You know, you need to use the action to describe the action, the verb that you are using here.
04:04So, the, the common answer, you would say something like, it usually takes us from 30 to 45 minutes to get to my job.
04:14Or it usually takes us 30 to 45 minutes to get to the supermarket. So, you need to elaborate a full statement.
04:20I mean, the answer is okay. If you simply say 15 to 30 minutes, that's fine. But I need you to use your English.
04:26And most importantly, I need you to conjugate the verbs, right? If you don't know how to conjugate the verbs, then there is a problem.
04:34Because conjugating a verb is necessary. Right? And I see that whenever I ask you a question, you try to avoid the verb. And that's not okay.
04:45You need to create a full statement. Because if, if you don't create a full statement, it means that you have some issues at the moment of speaking.
04:53And remember that the most important aspect of speaking is to be clear and concise. And in order to be clear, verbs should be used consistently. Right?
05:02So, one more time. How much time, how much time do you take on getting to your job daily? You know?
05:11Well, it takes me about 30 minutes to get to work.
05:28Okay. Excellent. Very good. So, Avelia is here, right? Avelia, good evening. How are you tonight?
05:33Hi, teacher. Welcome. Welcome. Where were you yesterday, Avelia? We were waiting for you.
05:45After you started, I didn't, I didn't, possibility for connecting.
05:54Okay, one more time. Can you, can you re-express that? Can you re-elaborate that? Because that's not the way that you should usually say that, you know?
06:02So, one more time. Why weren't you here yesterday? Just to know, you know? There's nothing wrong with that.
06:07But I know that it's Christmas time and everyone likes to be with the families and all that.
06:12But just to know, you know, just to have a little bit of a conversation here. Why weren't you here last night?
06:21Yesterday, I need to go to the office.
06:31What is the past tense of go? What is the past tense of the verb go?
06:35Gone. No. No. What is the past form of the verb go?
06:44You know, I go, you go, he goes, she goes. So, what's the past form? I mean yesterday, two hours ago. What is the past form of go? Do you remember?
06:55No, the job. Sorry? No? Okay. The past form is went. You know? So, I went. You know? I went to the park yesterday. I went to the burger shop yesterday. You know? I went to the coffee store yesterday. I went to the library yesterday. I went to the hardware store to buy a new computer yesterday. Right? So, I went. I went to visit my family. I went to visit my parents. So, where did you go yesterday?
07:24I went to the office? Okay. To your office? Okay. Okay. I worked for a long time. Really? I thought you were on vacation. I have a vacation.
07:40I... I... I have a vacation.
07:44You do have vacation or you don't?
07:46No. Yes, I have... Yes.
07:50Yes or no?
07:52Yes. Yes, I have a vacation.
07:55How many days do you have to rest? Like two weeks or what?
08:00No, only one week.
08:02Oh, wow.
08:04So what about the plans of Elia?
08:06Do you plan to go somewhere with your family, with your friends?
08:09No.
08:10No. With your husband, right?
08:12So what's the plan for this Christmas?
08:15The plan is to stay in my home.
08:18Okay.
08:19In the two or three days.
08:21And the next day I plan to go outside of my home.
08:27Okay, excellent. Very good. That sounds fantastic.
08:29Resting is important all the time.
08:31So tell us, Elia, what did you do today?
08:34How was your day?
08:35Tell us what kind of activities did you... did you have today?
08:40Today I am very busy in my job because today is the last...
08:45You were very busy.
08:46Last day, the working or formal working and I need close to reporting, the different areas,
08:58or set areas.
08:59The kids made the information or specific in this day is the last day for the registered
09:13work account in the system.
09:15Mm-hmm.
09:16And the, the, the, my guys, my guys, uh, I, uh, very busy.
09:25Well, um, um, for, um, I, I, I, we need, uh, elaborate the, the final report.
09:38Oh, wow.
09:39Oh, wow.
09:40Yes, that's a lot of work.
09:41You need a year.
09:42The all in the year.
09:43I, uh, reviewed the whole information in the all year.
09:49I need the, the, this, today, this, the, the last opportunity, the change, for example,
09:59the difference in the accounts, um, or whatever, other adjustments.
10:06Okay.
10:07Okay.
10:08Okay.
10:09That sounds like a little bit heavy.
10:10Right?
10:11That sounds like a lot of work.
10:12Okay.
10:13So, um, besides of that, what do you usually like to do in Christmas with your family?
10:17You know, like, what are the activities that you, that you usually like to do with your,
10:22with your relatives?
10:23Um, in the, with my family, I, it's common, uh, spend time, uh, in my home.
10:37At home?
10:38Uh, I prepare, I have prepared, uh, the, the dishes, the different, typical dishes, uh,
10:44favorite, uh, family.
10:46Mm-hmm.
10:47Mm-hmm.
10:48So, what's, like, your favorite dish that you like to prepare for Christmas?
10:53Uh, um, for example, I prepare the, uh, turkey.
10:58Oh, delicious.
10:59Yes.
11:00With the chili and the seafood, the dishes, uh, inside of the, the turkey.
11:09Um, salad and pasta.
11:13Oh, delicious.
11:14Yes.
11:15That's good.
11:16And that's, that's the point.
11:18Perfect.
11:19Mm-hmm.
11:20And what about for Christmas Eve?
11:21Are there any, uh, activities that you like doing for Christmas Eve?
11:25You know, for December 31st?
11:28Um, in December 31st, I, normally, I have spent, and, uh, in my home, too.
11:39Mm-hmm.
11:40Or, when the, my family, uh, um, father and mother are now.
11:45Mm-hmm.
11:46But, in this case, it's very sad, or, very trist, trist.
11:52Ah, no.
11:53No, no, no.
11:54What, what do you mean by trist?
11:55Trist.
11:56I mean, that's not a word.
11:57You know?
11:58Sad.
11:59Sad.
12:00Why, why, why do you feel that these, uh, these are sad?
12:02Sad, because my father died.
12:04Oh, I'm sorry about that.
12:07Mm-hmm.
12:08Uh, I, we don't, uh, more animated.
12:14Yes, I understand.
12:15Yes, I understand the experience, you know?
12:18The same happened with my grandfather, you know, a couple years ago.
12:21He, he passed away in Christmas, unfortunately, you know?
12:25Uh, quite a, quite a particular day to pass away, right?
12:28So, that's, uh, that's really sad, but I really hope that you, uh, that you feel better
12:33soon with your family, Avelia, right?
12:35I mean, that's important.
12:37Right.
12:38Okay.
12:39Excellent.
12:40Very good.
12:41Well, uh, the most important thing is, uh, well, to spend time with our relatives,
12:45with, uh, with our loved ones, you know?
12:48Do you have children, Avelia?
12:50Yes, two children.
12:51Yeah.
12:52I have, uh, two children.
12:53Okay.
12:54Two daughters.
12:55Oh, fantastic.
12:56Well, that's good, you know?
12:57I mean, I mean, now you, you can, you can spend precious time with your, with your children,
13:03with your daughters, and, uh, and I think that, that's actually helps a lot, you know?
13:09I mean, that helps a lot.
13:10I don't have children personally, but, uh, but, uh, a lot of people say it's quite an experience,
13:16right?
13:17Yes.
13:18Okay.
13:19Excellent.
13:20Very good.
13:21Well, that's the point.
13:22Yesterday, we were discussing a little bit of, uh, technology.
13:25Right?
13:26Last, uh, last week, we were also discussing a little bit of, uh, of technology.
13:31Let me show you, right here, the, uh, little bit of the readings that we were, that we were
13:40doing here.
13:41Right?
13:42Uh, Naomi is here, you know?
13:43I think it's a great opportunity for us to have a conversation all together, you know,
13:48to discuss about different topics.
13:51Now that Naomi is here, I think it would be fantastic for you, Avelia, to, to practice
13:56English with her, you know?
13:57So, I think, uh, I think that's important because, because you can, you can listen to
14:03me, but, um, but sometimes it's, uh, it's also necessary to listen to, to other people,
14:09you know, to other people, you know, to other people that also speak English because, um,
14:16because sometimes listening only to the teacher may be a little bit, uh, you know, uh, common,
14:22you know, or regular.
14:23So, it's always good to have other people to practice with.
14:27Mm.
14:28So, uh, Naomi, what about you?
14:30Do you like, uh, technology or what is your experience with technology?
14:37Technology like, um, that like the tablet or laptop or...
14:43Yeah, or computers, right?
14:45A lot of people say that, that people and teenagers are becoming so dependent on technology,
14:50right?
14:51So, there's a lot of, a lot, there is a lot to discuss about that, but I would like to
14:56know what's your opinion about technology.
14:59Like, did you like it?
15:00You don't like it?
15:01Uh, like, what's the deal with that?
15:03Well, I, I grew up with, uh, a lot of technology.
15:10Oh, fantastic.
15:11And I, I, I think, in my opinion, uh, everything, like, like this, need to be, uh, need to be,
15:26like connected.
15:27Like connected?
15:28You did, you did, you did in a, in a, in a good way, because you can be dependent on
15:33that, and you can, um, you can, well, you can, how, how can I say they have it?
15:48Uh, to, to, to stop, stop doing something?
15:52Yeah, you can stop doing so much other important things.
15:57Okay.
15:58That's a good point.
15:59Like, go, go to, I don't know, to, to the major or...
16:04To the park with friends, you know, to have a beer.
16:07Yes.
16:08Yes.
16:09Okay.
16:10Excellent.
16:11So, that's the point.
16:12So, do, do you know who, do you know who Elon Musk is?
16:18Do you know who he is?
16:20Yes.
16:21You know, that's like, he's like very top these days in technology and all that, you know,
16:27he's like creating, uh, so advanced computer these days to, to analyze a lot of information
16:36and to try to improve the artificial intelligence in, in a great way.
16:42So, what do you think?
16:43Do you think that actually he's planning to, to, uh, send, uh, or I don't know if he has
16:50already sent some satellites to the space?
16:53Yes.
16:54So, what's your opinion of, uh, Elon Musk?
16:56Do you think, uh, it's a visionary?
16:59It's an inspiration?
17:00It's, uh, a bad influence for the world?
17:03Uh, what do you think?
17:09Well, I, I just know Elon Musk for Tesla Motors.
17:14Mm-hmm.
17:15Oh, that's true.
17:16I think we, we, we, we, we're, um, wait, learning, learning, learning, learning.
17:30Mm-hmm.
17:31Yes, don't worry.
17:32You can also write your ideas, right?
17:34I mean, that's a lot easier.
17:36Mm-hmm.
17:37Well, we don't, we don't, we, we, we don't deserve Elon Musk, you know?
17:49We are not ready for Elon Musk, you know?
17:51So, okay.
17:52Yes.
17:53We are becoming, like, very dependent of, of technology and, and even, even more on AI.
18:02That's true, right?
18:03So, so, so, like, the ideas and the, the things that he wants to, to make.
18:12Mm-hmm.
18:14There are good, but, maybe not for our time.
18:20Like, we are not ready for, for all of that.
18:24Okay.
18:25I think.
18:26Yes, that's true.
18:27That's, uh.
18:28Because we, we don't, we don't know how to, to maintain, like, to hold on every, everything.
18:39Okay.
18:40That's good.
18:41So, uh, so you mentioned artificial intelligence.
18:45What do you, what do you think about that?
18:47Uh, is it something, uh, do you like it?
18:49I mean, do you use it?
18:51What's your opinion about artificial intelligence?
18:55Yes.
18:56In the beginning, where, I, I just use, uh, ChatGPT.
19:01Mm-hmm.
19:02With the only AI that, that, that, I, I know.
19:08Mm-hmm.
19:09And, um, it's good, but, like, I don't know if you, you saw something, like, on TikTok,
19:20that they, they made, um, some kind of videos with AI.
19:26Oh, yeah.
19:27You know, that some people.
19:28You, you, you don't really know what is true.
19:32What's real, right?
19:34Yeah, that is true.
19:35Mm-hmm.
19:36The, that, those, those, those kind of videos, that is for me, because you, when you accept
19:45all the rights, or all the, um, yeah, the rights on, on, in South Korea, you have, you, you
19:57give them all, like, permission to, to make those kind of videos, or, or, or audios, or
20:04photos, of you, with, with your information.
20:08So, that's kind of weird.
20:12That's true.
20:13Yeah, that's, um, that's the point.
20:16That's, uh, I mean, it's, uh, becoming something really weird.
20:20Uh, more and more people are, uh, depending on artificial intelligence.
20:24But, artificial intelligence is good, right?
20:27I mean, uh, it creates, or it makes that everything, it's so simple these days.
20:32You know, for example, if you want to consult any information, you simply ask to ChatGPT,
20:38and it gives you the instant, the immediate answer.
20:42So, isn't that good for us?
20:45What do you think?
20:46Yeah.
20:47Isn't it good?
20:48It is good, but, but, some people, um, use it in a bad way.
20:55Like, making those videos, or, or even, uh, fake your, your information, or, or something like that.
21:08And, and I know that is, that is better.
21:12And, if we want to, like, grow, grow up, like, as a, I don't know, as a society, or a civilization.
21:23If you have, though, maybe, maybe we just need to learn how to use it in a good way, or, or develop more, like, security, security, um, security.
21:41Mm-hmm.
21:42Mm-hmm.
21:43Like, for example, in your opinion, Naomi, what would be, like, a good strategy to, um, like, to learn how to use artificial intelligence responsibly?
21:57Like, what would, what could be a good strategy?
22:00Because I've seen that in students, you know, especially in teenage students.
22:04Yeah.
22:05Like, in high school, or in middle school, I've seen that they practically depend on the artificial intelligence for everything, right?
22:12So, so, so what would be, like, a good way to teach them, or to teach children to use that responsibly?
22:20Maybe.
22:21Maybe.
22:22Mm-hmm.
22:23Okay.
22:24Come on.
22:25We can, we can put, well, the AI or the activity in, in specific, um.
22:45Uh, well, that's a good point, you know, let's start with ChatGPT, right?
22:53How to, how to teach students, or how to teach, uh, teenagers to use ChatGPT responsibly, because,
23:00uh, well, artificial intelligence, they can use it to make videos, you know, but most of
23:05those videos are, like, like, surrealistic videos, you know, like fantasy videos, but
23:10whenever it comes to, to studying, they use ChatGPT to answer practically anything about
23:16history, about math, about, uh, um.
23:20In English, it's a little bit more difficult, you know?
23:22In English, it's a little bit more difficult because with English, we, uh, we, we have to
23:28speak, you know?
23:29And, uh, and to this point, artificial intelligence is not able to speak for us, you know?
23:35Uh, to this point, but in the future, perhaps it will, but, uh, but now, children are, um,
23:42like, using, like, using, uh, AI for, for answering all of the questions related to their subjects
23:50in school.
23:51So, what would be, like, a good strategy to tell them, hey, guys, you know what?
23:56You shouldn't be using that often, you know?
24:00Well, I think maybe, um, we need to teach them the beginning, like, the, the AI, they, well,
24:11it doesn't really think they just process information.
24:16Mm-hm.
24:17So, when you, um, tell them, like, uh, I need to make a resume of, uh, something, they just
24:31don't really think, and, you know, be able to, I don't know, to elaborate, I could recommend
24:40that maybe you, you need.
24:42Yes.
24:43Yes.
24:44Yeah?
24:45You need to, um, you need to, um, how, how can I say pedir?
24:55You need to...
24:56Like, to ask?
24:57Yeah.
24:58You need to ask, ask more specific, in a, in a, in a specific way, um, how to, how, how
25:11do you want to do it, how do you want them to do it, and, and teach them that the, the,
25:26the AI doesn't came to, you know, replace us, or replace our minds.
25:33They just, I mean, to help us, to make, um, easier the, the, our activities to, I, I think,
25:45like, to have more time to make other things.
25:48Mm-hm.
25:49I think that, that's, uh, that's a really good point, but at the same time, it's gonna be,
25:54I think that it's gonna be something really difficult to achieve, you know, I mean, that's,
25:58that's, that's gonna be something really difficult to do, um, because it, it occurs, like, with
26:07the calculators, you know?
26:10Yeah.
26:11But, but I think, uh, the AI, or, or, well, ChatGPT, they have, like, an option, or,
26:19or, uh, well, not an option, but you can ask them, like, if, if you make, uh, a resume,
26:27or, or, uh, project.
26:32Yeah.
26:33You cannot, you, you can give them, like, all the bases.
26:37Okay.
26:38Of, of that, and then, you can ask them to explain your mistakes.
26:44That's true.
26:45And, and you can learn.
26:47It, it's a win-win, because when you give them the, the information, you, you are, like,
26:56a, okay.
26:57Yes.
26:58Well, they are growing up, like, all their systems, all, all them systems, and, and all
27:05of that, and you can win to learn even more.
27:11Okay.
27:12Excellent.
27:13Very good.
27:14Well, that sounds fantastic.
27:16It's, like, uh, like, uh, it still has, uh, certain limitations, right?
27:22We, we can say that, uh, so that's very interesting, Naomi.
27:26That's great.
27:27So, let's see, uh, Avelia, what about you?
27:30How do you feel with, uh, like, let's write this question right here.
27:34How do you, uh, feel personally with, uh, the advancements that technology, that technology
27:47is having these days?
27:49Right.
27:50So, what about that question, Avelia?
27:52What do, what do you think?
27:53How do you feel personally, uh, as a person, um, with your children, you know, with relatives,
28:01uh, I don't know, tell me?
28:04Um, nowadays, I think it's very important, um, essential, uh, the intelligent artificial
28:14in the, all activities.
28:16Uh, like to, can we say, like, to integrate it?
28:20To all the activities?
28:21I integrated, integrated the, this technology, and all the activities in, in, in the job.
28:30Uh-huh.
28:31Uh-huh.
28:32Not only in the personal activities, but it's essential, the, um, included the technology,
28:37uh, in the, um, different, uh, different, uh, different, uh, like different activities.
28:50A company, company, for example, in my case, for example, my case.
28:55The, in the, in the, in the, in the, in the, in the, in the, in the, in the intelligent,
29:00artificial, in different apps.
29:01Intelligent.
28:57For example, yeah, ChatGPT is internal chat, uh, AC, AstraZeneca, uh, included the special
29:12chat.
29:13Okay.
29:14So that you, you can, um, um, compare, or, um, chart, or compare, or share, uh, the information.
29:23Yes.
29:24Or you, you can use this technology, this app, for, um, create, uh, presentations, or, uh,
29:35different projects.
29:36Uh-huh.
29:37Like to, to make expositions, you know, to make expositions and stuff.
29:40Uh-huh.
29:41Uh-huh.
29:42To, uh, imagine, uh, um, um, I don't know, for example, the lands, or, um, brands, brands,
29:49for the, uh, the sales.
29:51Uh-huh.
29:52Uh-huh.
29:53Or, um, or in different projects, you can use this app.
29:59Uh-huh.
30:00Uh-huh.
30:01Uh-huh.
30:02Uh-huh.
30:03Uh-huh.
30:04Uh-huh.
30:05Uh, this, this app is, is, is internal of the company
30:08Okay.
30:09Because, uh, the secret, cyber security, uh, review the information that you, uh, shared,
30:15and internal, uh, internal, information.
30:20Uh-huh.
30:22And that's all right.
30:23So, uh-huh.
30:24So, uh, so you mean that companies are surveilling the, the information of, uh, of their users?
30:29Do you know what's the meaning of surveil?
30:33Surveil is like to spy, you know?
30:38So let's make a question.
30:42Now let's say, are you referring that companies are surveilling their users' information?
30:54That's what you mean?
30:59Like spying the information from other people?
31:04Yes, you can share information but not outside with the other person outside the company.
31:16Only it's internal app.
31:18Okay.
31:19It's limited access on the employees with your ID.
31:23But it's similar to GPT and the other application in the person, but this is internal access.
31:34I permit or I provide the information with security when they do violence, the information, or...
31:45It's more controlled for the company.
31:48Okay.
31:49But that's legal or illegal, Avelia?
31:53What do you know about that?
31:54It's legal.
31:55It's legal?
31:56Okay.
31:57It's legal.
31:58Okay.
31:59That's good.
32:00Well, because sometimes there's a little bit of a discussion, you know, between...
32:03If it's ethical for business and, well, different companies to actually track or to actually surveil their users' information, their users' personal information.
32:22You know, I mean, there was like a scandal a couple years ago because seemingly Israel and the United States were using a software, a computer program that actually gathered information from many computers all around the world.
32:41And a lot of people dislike that because they said, hey, you are spying on us, you know, you are checking our personal information, you are invading my private life.
32:54So that's a...
32:56It's a very controversial topic, right?
32:59Yeah.
33:00So that's the point.
33:01Excellent, very good.
33:02So, and what about artificial intelligence, Avelia?
33:05What's your opinion about that?
33:07Have you used it?
33:08Do you like it?
33:09You don't like it?
33:10You don't like it.
33:11How do you feel with that?
33:12I like it because I need the artificial intelligence.
33:25Permite?
33:26How do you say permit?
33:27Allows?
33:28Allows the answer very quickly or very easy answers.
33:33Okay.
33:34It's more fast that you finish the general reports or...
33:38It's really fast, right?
33:39It's really fast, right?
33:40It's immediate.
33:41Yes.
33:42I think we don't depend on the artificial intelligence.
33:46We need to review the information received for the AI and the next share with the other people.
34:07It's necessary to review what information the intelligence received.
34:13Is it true?
34:14Is it not true?
34:15Is it confidential information?
34:20I need to review this information.
34:21I need to review this information.
34:22But it's very easy obtaining the answer.
34:35It is good.
34:36It is good.
34:37And nowadays, I think in the next year, more companies introduce this artificial intelligence
34:50in your own apartment.
34:54Oh, yes.
34:55Yes.
34:56That's a goal, actually, from Elon Musk.
34:57You know, I think Elon Musk is working on that.
35:04So, would you like to have that possibility, Avelia?
35:06I'd like to have this technology at home, for example, with your voice to control your,
35:12I don't know, your house, your apartment, your entrance door with your voice?
35:19Or do you completely disagree on this?
35:22What's like your perspective about that?
35:24In my opinion, in my personal life, I think it is the more, the more, the more, the more,
35:42let's say, what are you trying to say?
35:44Like the most, what, careful?
35:49Most careful?
35:50How most careful when I introduce this technology in my home?
35:54Okay.
35:55Because, for example, in my case, I have two daughters.
35:58Yes.
35:59I need, I need, what is the access to daughters?
36:03Because it's the general tool, the total access for Sheep.
36:11But, I think it is the technology that uses for help, for the, for example, the security, no?
36:22For example, in this, in this photo, we can see the camera.
36:29For example, my case is the camera.
36:32If you use it in different homes, they check, they check what is the activities in, in, in, in that home,
36:43what is the people introduce, or open the door, what is the, the access in, in, in the home.
36:56Yes.
36:57Okay.
36:58But, I think, eh, the other, eh, technology is the very careful, eh, they use it.
37:09Mm-hmm.
37:10Because, the, this technology, eh, is, eh, is, eh, is introduced, eh, more information.
37:19Mm-hmm.
37:20Mm-hmm.
37:21That is, for example, the finance case, or.
37:24Oh, that's true, yes.
37:26Information, ah, information, very confidential, is, a, is, a very careful, no?
37:35Yes.
37:36Because, eh, they attack, I suppose, the, the, the, the, the, the, the, the, the, the damage.
37:42Mm-hmm.
37:43For, the finance, or access.
37:47That's true, well, there, there's a lot of people that has, eh, extremely personal information in their, in their cell phones, in their computers, for example, their credit cards.
37:56You know, their debit cards, and, and what to do with that, you know?
38:00Exactly.
38:01Exactly.
38:02Exactly.
38:03It's a big problem.
38:04Mm-hmm.
38:05It's a big problem.
38:06Mm-hmm.
38:07But, you don't, I use it, well.
38:10For example, in, in your case, Avelia, do you, do you trust on a cell phone, or, on a technology to have, for example, your, your credit card number?
38:23Do you actually trust on that, or, or you prefer to use cash, like, physical money, to buy stuff?
38:30Personally, I prefer to use cash, you know?
38:33I, I, currently, I don't have a credit card, or a debit card.
38:37I know that sometimes is necessary, and it's useful in certain situations, but I, I wouldn't like to have that kind of an information, or my money, in the, in the cell phone, you know, in a credit card.
38:48I think that's extremely dangerous, because if you lose a cell phone, you are also losing your personal information, you know?
38:56And that's, and that's a problem, you know?
38:59So, what, what is your opinion about that?
39:02In my opinion, it's, it's, it's difficult to use the ID card, for example, when I, to travel, or when I, to travel, or when I need a phone, by internet.
39:17But if, for example, in my case, I prefer, pay for cash, physical, but, in a specific, specific, store, it's more comfortable, or security, to buy for my ID card.
39:37But, for example, when, when, when, when we, to buy, to buy internet, it's important to verify, verify, verify, verify, verify, verify, verify the access, currently, when, when, when we, to buy, in, by internet.
40:01Mm-hmm.
40:03It's, it's, it's more careful, that the, the, the, the, the money, or, ...
40:07Mm-hmm.
40:08Because, it's, more, easy that, drag your, or access.
40:13Mm-hmm.
40:14And, keep up, or, the, the, the money, or, ...
40:19Yes.
40:20... or, ... attack.
40:22Mm-hmm.
40:23Mm-hmm.
40:24Just, like, to keep our money safe, you know, from any, any, any cyber attack, from any hackers, and all that.
40:30Uh, I think that's really a controversial topic, since, since a credit card was able to, to, to be linked with the cell phone.
40:40A lot of people, a lot of people, uh, were worried on, uh, on, uh, the possibilities, you know?
40:47That they could actually have on, on having their personal information in their cell phone, because it's a great tool.
40:54I mean, it's something that you can really take advantage from.
40:57It's something that really helps you to, to have, uh, everything at hand.
41:02Yeah, but at the same time, if you don't know how to, um, how to handle that responsibly, it's really, it's, it's really, uh, dangerous, you know?
41:13To have that information in your, in your cell phone, especially in Mexico, you know?
41:17In Mexico, we, we, we go through so many situations where, for example, you're on the bus, or you are, you're on the bus, or you're on, uh, on a taxi, and, uh, I don't know, something may occur, a robbery, and suddenly you lose all of your information that is contained in your cell phone, and, well, that's, that's very dangerous, you know?
41:39I mean, you also have your credit card, and now when the government is trying to implement this aspect of, uh, scanning your, your iris from your eye, to have your personal information, you know?
41:54To have, like, your biometrics information, that's gonna be very dangerous as well, because a lot of people could actually, could actually replace your identity.
42:04And, uh, and, uh, well, that's, that's the situation that we are currently living, in terms of, uh, technology, right?
42:12Okay, so that's the point. Uh, Naomi, what about you? What do you think about, uh, having your money on, um, on a credit card, or on a debit card, and at the same time, having that information on your cell phone?
42:27Uh, do you usually do that, or what's your opinion on that?
42:32I think it's really practical, you know? But, I don't know. In my case, I, I, I try to be cautious with that, you know?
42:40Yeah, um, it's, it's a little bit funny because I, I don't usually have, uh, cash.
42:49Okay, I, I, I, you don't like it?
42:53Well, it's not that I don't like it, but I, I think in my personal case, I spend more money if I have cash.
43:03Okay, yeah, okay, okay, that's a good point.
43:07If you don't see money, don't, don't, don't.
43:11You don't, if you don't see your money, you don't spend it. Okay.
43:14Yeah.
43:15Okay, that's good.
43:17Um, I, I usually have, like, a more, uh, more, uh, better way to...
43:26Mm-hmm, like more control?
43:28To secure.
43:29Yeah, yeah, more control.
43:30Uh, in, in my bank account.
43:33Mm-hmm, okay.
43:35And I can see where, where I spend it, and, and all that.
43:39Mm-hmm, mm-hmm.
43:41And that's a good strategy, you know, that's a good point.
43:44Yeah.
43:45Well, I, but I don't know, I think that, well, with a credit card, you have a lot of money
43:50available for you to use, you know?
43:52Yeah.
43:53But it's like a temptation that you have it right there, you know?
43:56It's like, I could actually buy something using my credit card, but eventually, I will
44:01have to, I will have to pay it using my money to pay the bank that provided that credit
44:08for me.
44:09Yeah.
44:10So I think it's, uh, it's, it's a kind of a psychological exercise, you know?
44:14Mm-hmm.
44:15Yeah, so that's, uh, I don't know, uh, I, I, for me, that's, that's really, uh, complex
44:21to analyze, and, uh, and it requires a lot of, uh, well, a lot of discipline to know how
44:27to handle responsibly your, your bank account, your credit card, your debit cards, and so on.
44:32Okay.
44:33That's good.
44:34So, for example, here, yes, tell me.
44:36Tell me.
44:37Yeah.
44:38And we were talking about all the technology and that, and I, I think, I thought, on a,
44:45on a Netflix show.
44:48I don't know if you know it.
44:50Oh.
44:51It's Love, Death and Robots.
44:53Mm, I think no.
44:54You know?
44:55I think no.
44:56Is that like a series?
44:57No.
44:58It, it, it, it is a good one, and it also talks about all the, the, the, the technology
45:05development, and all of that.
45:08Mm-hmm.
45:09And they, they, they doesn't have like a, uh, a full story.
45:14It, it doesn't, even, even chapter.
45:17Mm-hmm.
45:18It doesn't have like a, a picture with someone.
45:22Okay.
45:23That they talk about the, the human relationship with the technology, and how that, if we don't
45:30use it in a better way, they can have like a really bad consequences.
45:37Mm-hmm.
45:38So, I think it's, it's a good show.
45:42Mm-hmm.
45:43So, if you have time, and if you like it, you can watch it on Netflix.
45:49What is the name?
45:50Love, Death and Robots.
45:51Oh.
45:52Okay.
45:53Which is?
45:54It is, it is not for a, well, I, I think I, I, at least that you have children.
46:05Maybe you, you, um, watch with, with, with, with her, with them.
46:12Mm-hmm.
46:13Okay.
46:14It's for a, it's a, uh, an animated show, but it's not for kids.
46:21Mm-hmm.
46:22This, this is the one, right?
46:23Yes.
46:24Okay.
46:25Yeah.
46:26That's the one.
46:27You know, it's called, Love, Death.
46:28Okay.
46:29Yeah, it looks fantastic.
46:30You know, I like animated shows, yes.
46:31Mm-hmm.
46:32And that's like on 3D, like, it looks like really futuristic and all that.
46:37So, that's, uh, the recommendation, so in these, mm-hmm.
46:41Mm-hmm.
46:42All the chapters are in a different, they, they have a different story, different visuals,
46:47different everything.
46:49Mm-hmm.
46:50So, it really attracts you.
46:52Okay.
46:53Excellent.
46:54Very good.
46:55So, that we have, that we have Naomi's recommendation tonight, right?
46:58That's a good point there.
47:01I mean, there are so many, uh, good series out there that actually portray, uh, the, uh,
47:07well, the, the, these kind of, uh, interaction between humans and artificial intelligence that,
47:13at some point, could actually become really, um, crucial for everyone and, and possibly dangerous,
47:20you know, because we're, we're heading to a future where we will be more and more dependent
47:26to technology because everything now runs on technology.
47:30All of the services, you go to a store and in order to pay something, they use a terminal
47:36to track the payment and, uh, you go to the bank and they use, uh, well, your bank account
47:42is linked to a system and, uh, and now they're trying to implement that, uh, on identifying people,
47:50you know, on identifying people, on having your credit card, sorry, your identification
47:55card on a, on a database, in a server, you know, and, uh, that you can access that information
48:02online.
48:03So, that's, uh, that could be very dangerous because a lot of people would be able to, to
48:10see your, like, uh, well, your information, you know, your personal information and where
48:16do you live and where you live and what you do, you know, and, and all that.
48:21But, but the promise for people is that in the future, everything is going to be, well,
48:27a lot easier, you know, a lot more practical.
48:30So, you don't, you won't have to, you won't need to have, like, uh, anything physical kept
48:37on your, kept on your, uh, on your wallet, you know, because, um, because in that case,
48:46uh, why would you need to keep something on your wallet if there is available on your, uh,
48:52if there is available on your, on your database information, you know?
48:56So, that's the point.
49:01So, there we have it. Excellent.
49:04So, we're gonna have some questions here to analyze for, for homework.
49:10We're gonna have three questions here so that you can elaborate a little bit here.
49:15So, we're gonna write, uh, number one, how do you imagine, how do you imagine that, uh,
49:25uh, future will become, let's say, in 50 years, in terms of technological advancements?
49:38Advancements, right, so that's number one.
49:42Uh, what other problems could humankind, what other problems could humankind face?
49:54In the future, related to artificial intelligence, intelligence, intelligence, and technology.
50:07Right, that's another one. So, then we have three.
50:13How do you, how do you manage, uh, how do you manage, uh, how do you manage, or how do you create,
50:25how do you manage a responsible strategy to use technology in your daily life?
50:35Right, so that's, we're gonna discuss those, uh, three questions, so that we can analyze this.
50:44So, let me send you these, uh, these questions, right, so that you can, you can get them, uh, for homework,
50:51like to analyze, and once that you finish, you can send your, your, uh, response to the WhatsApp group,
50:59so that we can see how, how you can, uh, create this idea, and, uh, and how you interact, right?
51:07So, you can use, like, uh, you can also quote different references, you know, you can also quote different references
51:14from authors or any other person that you might have read that actually talks about technology.
51:20So, I think that would be fantastic, right? So, do you have any, any questions in, in terms of, uh, this, uh, assignment that we have tonight?
51:33Not the term.
51:34Really?
51:35No.
51:36No, not the term.
51:37Well, uh, the, sorry, um, how many lines or how many...
51:42Oh, like, like three to four, you know, three to four, that would be good, you know, like a common response
51:48where you can elaborate a little bit, you know, where you can describe, right?
51:53So, I think that would be fantastic.
51:58So, that's the point.
51:59Okay.
52:00So, that's gonna be it for today, and, well, we will continue, uh, well, until next week,
52:06because, well, tomorrow is 24th and, uh, Thursday 25th.
52:11So, if I'm not mistaken, I will see you, well, on Monday, right?
52:16Right.
52:17Right.
52:18However, I'm gonna send you some, some links, you know, I'm gonna send you some links so
52:21that you can, you can, uh, check some, uh, interesting English analysis, and, uh, uh, and,
52:30well, we will do it that way, right?
52:33So, do you have any other questions so far?
52:36No, that's right.
52:38No?
52:39No?
52:40Okay.
52:41Here.
52:42That's good.
52:43That, uh, that you have a great, uh, Christmas, you know, that you share a great time with
52:48your family, and, uh, thank you.
52:50Thank you for coming, and I will see you really soon, right?
52:53Bye.
52:54So, have a good night.
52:55Happy Christmas.
52:57Happy Christmas.
52:59Bye-bye.
53:00See you soon.
53:01Take care.
53:02Bye-bye.