this post was submitted on 15 Apr 2024
57 points (74.4% liked)

Technology

This is the official technology community of Lemmy.ml for all news related to creation and use of technology, and to facilitate civil, meaningful discussion around it.


[–] Etterra 28 points 7 months ago (11 children)

I wonder why nobody seems capable of making an LLM that knows how to do research and cite real sources.

[–] NosferatuZodd 15 points 7 months ago (1 children)

I mean, LLMs pretty much just try to guess what to say in a way that matches their training data. Research, on the other hand, usually means testing or measuring something in reality, looking at the resulting data, and drawing conclusions from it, so it doesn't seem feasible for LLMs to do research on their own.

They may be used as part of research, but they can't do the whole thing: a crucial part of most research is the actual data, and you'd need a LOT more than just LLMs to get that.

[–] BigMikeInAustin 11 points 7 months ago

Yup! LLMs don't put facts together. They just look for patterns, without any concept of what they are looking at.
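The "guessing what matches the training data" point can be sketched with a deliberately toy example: a bigram model that generates text purely by following word-to-word patterns seen in its training data, with no representation of facts at all. Everything here (the training string, the `generate` helper) is made up for illustration; real LLMs use neural networks over tokens rather than word counts, but the pattern-following principle is similar in spirit.

```python
import random
from collections import defaultdict

# Toy "training data" -- the model only ever sees surface patterns in this text.
training_text = "the cat sat on the mat the dog sat on the rug"

# Record which words follow which in the training data.
follows = defaultdict(list)
words = training_text.split()
for prev, nxt in zip(words, words[1:]):
    follows[prev].append(nxt)

def generate(start, length=8, seed=0):
    """Emit statistically plausible next words; no facts, just patterns."""
    random.seed(seed)
    out = [start]
    for _ in range(length):
        choices = follows.get(out[-1])
        if not choices:  # no pattern to follow -> stop
            break
        out.append(random.choice(choices))
    return " ".join(out)

print(generate("the"))
```

The output is always fluent-looking (every word pair appeared in training), yet the model has no idea whether "the dog sat on the mat" is true, which is the gap the comment above is pointing at.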
