this post was submitted on 31 Jul 2023
2 points (62.5% liked)

Health - Resources and discussion for everything health-related


Health: physical and mental, individual and public.

Discussions, issues, resources, news, everything.



It's a weird study: it compares ChatGPT answers to answers provided for free on Reddit, with the results rated by physicians rather than patients.

Still, in my fairly rich experience with doctors, I'd also prefer to talk to a robot if I knew the answers to be reliable.

top 1 comments
[email protected] 2 points 1 year ago

I mean, it's really not that shocking when you get down to it. Most doctors act like they're being pursued by the Terminator, and talking too long with you might just be the thing that costs them their escape. Of course, they are being pursued, just by private equity instead of the Terminator.

Then there are doctors who are simply bankrupt of social skills. Nobody's happy to admit it's a problem, but it is: they can't communicate (and that means listening as well as speaking) with the patient the way they'd need to to be as effective as they could be.

On top of that, there's the human element, and doctors are human. Sometimes they're tired, frustrated, or distracted, and they forget things. Sometimes you'll get some dino doc who insists we should just go back to giving kids whiskey-cocaine-morphine tonics for influenza or some shit, because that's how they did it when they went to school in the 1920s, instead of evolving their practice along with the evidence. And sometimes, very rarely, you'll get a bad actor, maybe a huckster, maybe just a mean-ass burnout, who isn't working with the patient's best interest in mind.

ChatGPT suffers from none of this. It has as much time as you do to talk, and it's happy to speak to you on your level and explain as much as you need it to. It always listens to what you have to say and makes no character judgments; it's just there to function in a clinical capacity. ChatGPT is trained with the latest medical advice, evidence, and recommendations, and isn't attached to "the old way of doing things". ChatGPT doesn't get tired, distracted, mean, or burned out. There's a lot to like there. I think we're not far from seeing LLM-driven care become a thing, at least as an entry point for the average consumer.