this post was submitted on 01 Sep 2023
235 points (95.7% liked)
Technology
This is the best summary I could come up with:
OpenAI is preparing teachers for the back-to-school season, releasing a guide on how to use ChatGPT in the classroom, months after educators raised the alarm about students turning to AI to cheat.
Bad news for teachers and professors, though: OpenAI says that sites and apps promising to uncover AI-generated copy in students' work are unreliable.
Such content detectors also tend to flag work by students who don't speak English as a first language as AI-generated, OpenAI stated, confirming a problem reported earlier by The Markup.
Teachers are concerned, however, that students are cheating by presenting ideas and phrases from the chatbot as their own, and that they are becoming over-dependent on a tool that remains prone to errors and hallucinations.
Professors began to detect students using ChatGPT to cheat on college essays a little over a month after the chatbot was released in November 2022.
OpenAI also acknowledged that ChatGPT is not free from biases and stereotypes, so "users and educators should carefully review its content."
The original article contains 360 words; the summary contains 171 words. Saved 52%. I'm a bot and I'm open source!
That's cheating.