this post was submitted on 11 Jun 2023
2 points (100.0% liked)

ChatGPT


A community to discuss ChatGPT and AI. Not affiliated with OpenAI.

Rules:

  1. NSFW must be marked as such

  2. No porn

  3. Don't be toxic

  4. Jailbreaks are allowed, for now at least

  5. No trashposts.

  6. Keep the memes to a minimum

founded 2 years ago
[–] [email protected] 1 points 2 years ago (2 children)

Why would anyone want to cite something in a scientific paper that the reader cannot access or verify, and that may or may not be partially or completely fictional?

[–] ndr 1 points 2 years ago

That's a fair point. I don't think it should be used as a reference in scientific papers (at least the way it works now), but there should still be a way to let readers know where you got that information.

This is more feasible with things like Bing Chat, which uses external sources to craft its answers, so you get "real" citations. But yeah, ChatGPT output is currently impossible to verify.

I wonder if a seed could be implemented in the future to re-generate the same answers.
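The idea behind a seed: if the model's sampling were driven by a seeded random number generator, the same prompt plus the same seed would reproduce the same answer. A minimal sketch of that principle, using Python's standard `random` module and a toy "sampler" (the `sample_tokens` helper and its vocabulary are illustrative, not anything ChatGPT actually exposes):

```python
import random

def sample_tokens(vocab, n, seed):
    # Seeding the RNG makes the "sampling" deterministic:
    # same seed + same inputs => same output sequence.
    rng = random.Random(seed)
    return [rng.choice(vocab) for _ in range(n)]

vocab = ["the", "cat", "sat", "on", "mat"]
a = sample_tokens(vocab, 5, seed=42)
b = sample_tokens(vocab, 5, seed=42)
assert a == b  # identical seed reproduces the identical "answer"
```

For a real LLM service, reproducibility would additionally require pinning the exact model version, since the same seed against different weights would still give different answers.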

[–] [email protected] 1 points 1 year ago

At the very least, when it comes to analyzing the LLMs themselves, e.g. to discuss biases, there should be some common understanding of how to reference the outputs being discussed.

But this is a poor attempt. They just point to the general ChatGPT URL and a date, ignoring that multiple models are available at any given time.
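A reference that pins the exact model would at least be unambiguous about what was analyzed. A hypothetical sketch of what such a reference might record (the `chatgpt_reference` helper and its format are illustrative assumptions, not an established citation standard):

```python
from datetime import date

def chatgpt_reference(model, accessed, prompt_summary):
    # Hypothetical helper: instead of citing only the generic
    # ChatGPT URL and a date, pin the specific model version
    # and summarize the prompt that produced the output.
    return (f"OpenAI, {model} (large language model), "
            f"accessed {accessed.isoformat()}; "
            f"prompt: {prompt_summary}")

ref = chatgpt_reference("gpt-4-0613", date(2023, 6, 11),
                        "biases in model output")
```

Even this still isn't verifiable by a reader, but it at least distinguishes which of the concurrently available models the claims are about.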