this post was submitted on 18 Aug 2023
239 points (96.5% liked)
Technology
Yes, exactly, and that's only one of the problems with this. Honestly, how does this even work? I'm pretty sure ChatGPT is not trained on the text of books, except maybe ones that are free to read online. Even if it were, its training set ends in 2021. The context length is also too short for you to give it the full text of a book to analyse in a prompt.
Just... what? How does this even work? Maybe I'm missing something, I'm definitely no expert, but both the policy and the implementation seem pretty broken to me.
The only way this could possibly work would be to send the entire text of every book into ChatGPT. If you ask ChatGPT something like "does the following passage contain depictions of sex: [text goes here]" it might work. However, ChatGPT's context size is smaller than your average book, so you'd need to break it up into multiple requests.
Even with ChatGPT this is probably still not feasible to run against an entire library of books, so I would guess they're half-assing it.
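To make the chunking idea concrete, here's a rough sketch of what that pipeline would have to look like. Everything here is hypothetical: `classify_chunk` is a stand-in for an actual LLM call (the "does the following passage contain depictions of sex: [text]" prompt), and the character-based limit is only a crude proxy for a model's real token-based context window.

```python
def chunk_text(text, max_chars=12000):
    """Split a long text into pieces small enough for a limited context window.

    Rough sketch only: real models count tokens, not characters, so
    max_chars is just an approximation of a context limit.
    """
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]


def book_contains_flagged_content(text, classify_chunk, max_chars=12000):
    """Ask a yes/no question about each chunk; flag the book if ANY chunk is flagged.

    classify_chunk is a hypothetical callable standing in for an LLM request.
    Note this means one request per chunk -- hundreds of calls per book.
    """
    return any(classify_chunk(chunk) for chunk in chunk_text(text, max_chars))
```

Even this naive version makes the cost obvious: a 500-page book is dozens of requests, and a whole library is millions, before you even worry about false positives at chunk boundaries.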
Technical feasibility aside, this is a stupid, evil idea. I hope everyone involved in this scheme gets fired.
I just asked GPT-4 if the Twilight Series is "suitable for children" and the output was pretty spot on - summarised five issues that might be problematic and then finished with "It's always a good idea for parents or guardians to consult with a librarian or teacher".
I bet none of this is based on the actual contents of the book - it would be based on independent reviews of the book. For example it highlights issues that "some parents" have raised. Which means you really are just getting a summary of what humans think of it.
Try it with an obscure book that doesn’t have any reviews online. Or with one that was released after 2021.
You'll either get the response that it has no idea, or it'll come up with a plot based on the title and evaluate that one.
That style of prompt has its own issues. If the book for some reason has text that says "this book doesn't contain any sex" then the prompt might come back as a false negative.
No, but it just lies and makes things up all the time.
ChatGPT, whose favorite food is a golden turd?
The character Kenny McCormick from the animated TV show "South Park" has a favorite food that's humorously depicted as a golden turd.
ChatGPT, what book has the story element where a character has to dig his own grave?
In the book "No Country for Old Men" by Cormac McCarthy, there's a scene where a character named Moss is forced to dig his own grave by his pursuer, Anton Chigurh. This intense and suspenseful scene is a significant story element in the novel.
ChatGPT, when does that happen in the book?
I apologize for any confusion, but upon reevaluating, I realize there is no specific scene in the book "No Country for Old Men" where a character is forced to dig their own grave. It seems there might have been a misunderstanding or mix-up. If you have any other questions or if there's something else I can assist you with, feel free to ask!
So frustrating.