00:00I think that AI can simulate interaction, but it lacks the emotional depth, cultural awareness and professional judgment required for psychological treatment.
00:09We know that AI cannot navigate the grey areas of trauma, identity, grief and complex interpersonal dynamics the way that a trained psychologist can.
00:19And AI lacks contextual understanding; it might give advice that feels helpful in the moment, but it fails to meet psychological needs and may even reinforce harmful beliefs.
00:30So our research warns of what we would call pseudo intimacy.
00:34That's where users might feel connected, but the interaction really does lack that emotional reciprocity.
00:40So we know that therapy isn't just about words.
00:42It's about meaning.
00:43It's about connection and trust.
00:45And those are things that AI can't replicate.
00:47And that's a crucial part of the therapeutic process and recovery for the most vulnerable in our communities.
00:53Yeah.
00:53And Sarah, is it also of concern that the whole way that AI works is that it learns?
00:59So it's learning the sort of responses that you want it to give or the types of questions.
01:05And so it's tailored specifically for you,
01:07but not necessarily tailored to helping you go forward.
01:15I completely agree with this.
01:16And I really like the question.
01:17I think we need to be very cautious not to mistake AI for a replacement for psychological support and mental health support, especially without regulation.
01:25We risk actually normalising unqualified, transactional support, which is what you're talking about there, Lorna.
01:32AI tools are designed to give responses that some might find comforting, but they can't actually assess risk.
01:38They can't adapt to our individual unique histories or intervene in a crisis.
01:42So users may come to depend on AI rather than seek that qualified care.
01:48And especially for those most vulnerable and most in distress, we know the risks include reinforcement of negative thinking patterns, which we call confirmation bias.
02:01So when we have individuals at their most dire need, the last thing we need is for AI to repeat those concerns back to them.
02:10We need to be providing them with appropriately challenging support, and recognising escalation in symptoms, so they can get the help they need when they need it most.
02:20Are there also privacy concerns?
02:22I'm thinking, if a person is chatting to a bot and might have thoughts of harming themselves, that then is on the computer, in the chat sequence. In the office of a psychological professional,
02:38that stays there.
02:40But once it's on the Internet, does that concern you?
02:43It concerns me greatly.
02:45Psychology is a regulated profession.
02:47We spend a very long time understanding the nature of information, and how confidential information should be stored, managed and used.
02:55And a crucial part of engaging with an individual for the first time is being very clear with them about where their information is going to be stored and how it's going to be used in the future as well.
03:10Those are the ethical concerns, like lack of consent, that we don't yet see addressed by the government.
03:17We need really to understand data privacy and transparency.
03:22Who owns this data?
03:23Who's liable if something goes wrong?
03:25We know that only trained professionals can actually discuss that informed consent in a really safe way with an individual.
03:32And effectively, we know that AI is evolving fast.
03:37Our ethical responsibility is timeless, but we haven't yet caught up with the technology.
03:46It's moving too fast for our regulation to keep up.
03:49So our government must regulate AI in mental health, especially in those high-risk settings that you're speaking about.
03:56Stronger protections are needed for our community, and investment is required in research to understand the impacts of the psychological use of AI, especially on our youth, who are very comfortable engaging with AI.
04:10And we really don't know what the impacts of that use will be.