this post was submitted on 15 Feb 2025
-20 points (30.0% liked)

science

[–] [email protected] 6 points 1 week ago (1 children)

Must be because AI 🤔 The study itself suggests it could be an improvement, not that a therapist should be replaced by AI:

Using different measures, we then confirmed that responses written by ChatGPT were rated higher than the therapist’s responses suggesting these differences may be explained by part-of-speech and response sentiment. This may be an early indication that ChatGPT has the potential to improve psychotherapeutic processes. We anticipate that this work may lead to the development of different methods of testing and creating psychotherapeutic interventions. Further, we discuss limitations (including the lack of the therapeutic context), and how continued research in this area may lead to improved efficacy of psychotherapeutic interventions allowing such interventions to be placed in the hands of individuals who need them the most.
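The abstract attributes the rating gap partly to part-of-speech patterns and response sentiment. As a rough illustration only (this is not the study's actual pipeline, and the lexicon below is a made-up toy), a minimal lexicon-based sentiment score over a response could look like:

```python
# Toy lexicon-based sentiment scorer -- illustrative sketch, NOT the
# study's method. Real analyses use large validated lexicons/models.
import re

# Hypothetical mini-lexicon: positive words score > 0, negative < 0.
LEXICON = {"help": 1, "hope": 1, "support": 1, "glad": 1,
           "sad": -1, "afraid": -1, "hopeless": -2, "alone": -1}

def sentiment(text: str) -> float:
    """Average lexicon score of matched words; 0.0 if none match."""
    words = re.findall(r"[a-z']+", text.lower())
    scores = [LEXICON[w] for w in words if w in LEXICON]
    return sum(scores) / len(scores) if scores else 0.0

print(sentiment("I am glad you reached out; there is hope and support."))
print(sentiment("I feel hopeless and alone."))
```

Under a scheme like this, a more positively worded reply scores higher, which is the kind of surface feature the authors suggest may explain why ChatGPT's responses were rated above the therapists'.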

[–] [email protected] 13 points 1 week ago

It's possible that AI was the reason.

But the far bigger reason is that ChatGPT does not mix with mental health. Not because it is bad, but because it is unreliable, potentially hazardous, controlled by a corporation (and soon elonazi, iirc), and has the potential to manipulate a person into even worse things than advertising is currently doing.

Science should take an ABSOLUTELY NOT stance toward ChatGPT (and all corporate AI) in therapy.

Please, everyone, watch the mandatory viewing exercise called Idiocracy as many times as needed until it is understood that we are on the totally wrong path.