this post was submitted on 01 Sep 2023
There's no way for teachers to figure out if students are using ChatGPT to cheat, OpenAI says in new back-to-school guide::AI detectors used by educators to detect use of ChatGPT don't work, says OpenAI.

[–] [email protected] 1 points 1 year ago (1 children)

If someone can use the tool to do the job successfully, I don't see how that learning was actually necessary. Like I learned to use a phone rather than a telegraph. I learned how to drive a car rather than ride a horse. I learned to use a calculator rather than a slide rule.

Of course we're still at the stage where you need to double-check the tool, but that skill is maybe more like supervising someone than directly doing the task.

I can imagine prompt engineering will actually be a thing, and asking the AI to fix parts that don't work is the short term. We can already ask the AI to look over its own work for mistakes, and I have to imagine that's going to be built in soon...

The worst part is that if the student can actually optimize the learning away with the AI, so too can employers optimize away the potential employees.

[–] [email protected] 4 points 1 year ago (1 children)

This is a very output-driven perspective. Another comment put it well, but essentially when we set up our curriculum we aren't just trying to get you to produce the one or two assignments that the AI could generate - we want you to go through the motions and internalize secondary skills. We've set up a four year curriculum for you, and the kinds of skills you need to practice evolve over that curriculum.

This is exactly the perspective I'm trying to get at with my comment - if you go to school to get a certification to get a job and don't care at all about the learning, of course it's nonsense to "waste your time" on an assignment that ChatGPT can generate for you. But if you're there to learn and develop a mastery, the additional skills you would have picked up by doing the hard thing - and maybe having a chat AI support you in a productive way - are really where the learning is.

If 5-year-olds can generate a university-level essay on the implications of thermodynamics on quantum processing using AI, that's fun, but does the 5-year-old even know if that's a coherent thesis? Does it imply anything about their understanding of these fields? Are they able to connect this information to other places?

Learning is an intrinsic task that's been turned into a commodity. Get a degree to show you can generate that thing your future boss wants you to generate. Knowing and understanding is secondary. This is the fear of generative AI - further losing sight of the fact that we learn through friction and the final output isn't everything. Note that this is coming from a professor who wants to mostly do away with grades, but recognizes larger systemic changes need to happen.

[–] [email protected] 1 points 1 year ago

I am very pro-learning, but I have also basically seen that our society doesn't value it. We're anti-expertise to our detriment. I like figuring things out and learning... but I am not sure that that's any more than an opinion I hold. If the learning doesn't help you in life, I have a hard time defending it as more than a preference.

I guess what I'm trying to say is - my values and motivations aren't the only ones, and I can't prove them as the right ones. If someone is primarily motivated by making money, learning is a little correlated with that, but not overwhelmingly so. More specifically - writing ChatGPT-style essays is something I believe plenty of people have lucrative careers without ever doing.

I'm not even convinced college has a positive ROI anymore. In that context, the output is the issue. In the context of most jobs it is also the issue.

Maybe this analogy will help - do you feel that all the people taking better pictures than ever, thanks to AI in their cellphone cameras and automatic post-processing, have missed an important skill in working out ISO, aperture, and shutter speed? Do you think they would mostly agree those skills are useful? Are there a lot of jobs for "camera technicians" where the manual settings are what they're hired for?

Now, I agree that in my analogy - if you know how the settings relate to freezing motion or background blur or whatever, you can take better pictures and likely have a higher hit rate. But I don't think the world prioritizes that, and in the bigger picture I am not sure it is wrong.