this post was submitted on 13 Aug 2023
926 points (97.7% liked)

Technology


College professors are going back to paper exams and handwritten essays to fight students using ChatGPT::The growing number of students using the AI program ChatGPT as a shortcut in their coursework has led some college professors to reconsider their lesson plans for the upcoming fall semester.

top 50 comments
[–] AlmightySnoo 132 points 1 year ago* (last edited 1 year ago) (1 children)

I think that's actually a good idea? Sucks for e-learning as a whole, but I always found online exams (and also online interviews) to be very easy to game.

[–] [email protected] 80 points 1 year ago (9 children)

Really sucks for people with disabilities and handwriting issues.

[–] [email protected] 75 points 1 year ago (1 children)

It's always sucked for them, and it always will. That's why we make accommodations for them, like extra time or a smaller/more private exam hall.

[–] [email protected] 19 points 1 year ago (1 children)

And readers/scribes! I’ve read and scribed for a friend who had dyslexia in one of her exams and it worked really well. She finished the exam with time to spare and got a distinction in the subject!

[–] [email protected] 13 points 1 year ago

Yep, my girlfriend acted as a scribe for disabled students at a university. She loved it, and the students were able to complete their written work and courses just fine as a result.

[–] [email protected] 19 points 1 year ago (4 children)

My handwriting has always been terrible. It was a big issue in school until I was able to turn in printed assignments.

Like with a lot of school things, they do a shit thing without thinking about negative effects. They always want a simple solution to a complex problem.

[–] [email protected] 12 points 1 year ago

My uni just had people with handwriting issues do the exam in a separate room with a scribe to whom they could narrate their answers.

People were going to universities for centuries before the advent of computers; we have lots of ways to help people with disabilities that don't require computers.

[–] HexesofVexes 129 points 1 year ago (88 children)

Prof here - take a look at it from our side.

Our job is to evaluate YOUR ability, and AI is a great way to mask poor ability. We have no way to determine whether you did the work or an AI did, and if called into court to certify your expertise we could not do so beyond a reasonable doubt.

I'm not arguing exams are perfect, mind, but I'd rather doubt a few students' ability on the day (maybe it was just a bad exam for them) than always doubt everything they submit (is any of this their own work?).

Case in point: ALL students on my course with low (<60%) attendance this year scored 70s and 80s on the coursework and 10s and 20s in the OPEN BOOK exam. I doubt those 70s and 80s are real reflections of those students' ability, but they do suggest the students can obfuscate AI work well.

[–] [email protected] 78 points 1 year ago (3 children)

They're about to find out that gen Z has horrible penmanship.

[–] Holyginz 25 points 1 year ago (12 children)

Millennial here, haven't had to seriously write out anything consistently in decades at this point. There's no way their handwriting can be worse than mine and still be legible lol.

[–] crwcomposer 17 points 1 year ago* (last edited 1 year ago)

As a millennial with gen Z teens, theirs is worse, though somehow not illegible, lol. They just write like literal 6 year olds.

[–] [email protected] 69 points 1 year ago (3 children)

has led some college professors to reconsider their lesson plans for the upcoming fall semester.

I'm sure they'll write exams that actually require an understanding of the material rather than regurgitating the seminar PowerPoint presentations as accurately as possible...

No? I'm shocked!

[–] [email protected] 48 points 1 year ago (12 children)

We get in trouble if we fail everyone because we made them do a novel synthesis, instead of just repeating what we told them.

Particularly for an intro course, remembering what you were told is good enough.

[–] zigmus64 22 points 1 year ago

The first step to understanding the material is exactly just remembering what the teacher told them.

[–] aulin 61 points 1 year ago (5 children)

There are places where analog exams went away? I'd say Sweden has always been at the forefront of technology, but our exams were always pen-and-paper.

[–] [email protected] 54 points 1 year ago (6 children)

This isn't exactly novel. Some professors allow a cheat sheet. But that just means that the exam will be harder.

Physics exam that allows a cheat sheet asks you to derive the law of gravity. Well, OK, you write the answer at the bottom, pulled from your cheat sheet. Now what? If you recall how it was originally derived, you probably write Newton's three laws at the top of your paper... and then start doing some math.

Calculus exam that lets you use Wolfram Alpha? Just a really hard exam where you must show all of your work.

Now, with ChatGPT, it's no longer enough to assign a take-home essay to force students to engage with the material, so you find new ways to do so. Written, in-person essays are certainly one way to do that.

[–] [email protected] 46 points 1 year ago (5 children)

Am I wrong in thinking students can still generate an essay and then copy it by hand?

[–] [email protected] 58 points 1 year ago (3 children)

Not during class. Most likely a proctored exam. No laptops, no phones, teacher or proctor watching.

[–] [email protected] 39 points 1 year ago (15 children)

When I was in College for Computer Programming (about 6 years ago) I had to write all my exams on paper, including code. This isn't exactly a new development.

[–] [email protected] 28 points 1 year ago (1 children)

So what you’re telling me is that written tests have, in fact, existed before?

What are you some kind of education historian?

[–] UsernameIsTooLon 34 points 1 year ago (1 children)

You can still have AI write the paper and copy it out by hand. If anything, this will make AI harder to detect, because it's now AI plus human error introduced during transcription rather than straight copying and pasting.

[–] [email protected] 32 points 1 year ago (8 children)

This thinking just feels like moving in the wrong direction. As an elementary teacher, I know that by next year all my assessments need to be practical or interview-based. LLMs are here to stay, and the quicker we learn to work with them the better off students will be.

[–] joel_feila 30 points 1 year ago (1 children)

Well, if I go back to school now I'm fucked; I can't read my own handwriting.

[–] [email protected] 29 points 1 year ago (4 children)

Can we just go back to calling this shit Algorithms and stop pretending it's actually Artificial Intelligence?

[–] [email protected] 36 points 1 year ago (23 children)

It actually is artificial intelligence. What are you even arguing against man?

Machine learning is a subset of AI, and neural networks are a subset of machine learning. Saying an LLM (based on neural networks for prediction) isn't AI because you don't like it is like saying rock and roll isn't music.

[–] joel_feila 14 points 1 year ago

But then the investors won't throw wads of money at these fancy tech companies.

[–] thedirtyknapkin 26 points 1 year ago

As someone with wrist and hand problems that make writing a lot by hand difficult, I'm so lucky I finished college in 2019.

[–] [email protected] 26 points 1 year ago (5 children)

Wouldn't it make more sense to find ways on how to utilize the tool of AI and set up criteria that would incorporate the use of it?

There could still be classes / lectures that cover the more classical methods, but I remember being told "you won't have a calculator in your pocket".

My point is, they should be prepping students for the skills to succeed with the tools they will have available, and then give them the education to cover the gaps that AI can't solve. For example, you basically need to review what the AI outputs for accuracy. So maybe a focus on reviewing output and better prompting techniques? Training on how to spot inaccuracies? Spotting possible bias in a system skewed by its training data?

[–] [email protected] 16 points 1 year ago

Training how to use "AI" (LLMs demonstrably possess zero actual reasoning ability) feels like it should be a separate pursuit from (or subset of) general education to me. In order to effectively use "AI", you need to be able to evaluate its output and reason for yourself whether it makes any sense or simply bears a statistical resemblance to human language. Doing that requires solid critical reasoning skills, which you can only develop by engaging personally with countless unique problems over the course of years and working them out for yourself. Even prior to the rise of ChatGPT and its ilk, there was emerging research showing diminishing reasoning skills in children.

Without some means of forcing students to engage cognitively, there's little point in education. Pen and paper seems like a pretty cheap way to get that done.

I'm all for tech and using the tools available, but without a solid educational foundation (formal or not), I fear we end up a society of snake-oil users in search of blinker fluid.

[–] [email protected] 14 points 1 year ago (8 children)

That's just what we tell kids so they'll learn to do basic math on their own. Otherwise you'll end up with people who can't even do 13+24 without having to use a calculator.

[–] settxy 13 points 1 year ago

There are some universities looking at AI from this perspective, finding ways to teach proper usage of AI. Then building testing methods around the knowledge of students using it.

Your point about checking for accuracy is spot on. AI doesn't always puke out good information, and ensuring students don't just blindly believe it NEEDS to be taught. Otherwise you end up like these guys... https://apnews.com/article/artificial-intelligence-chatgpt-courts-e15023d7e6fdf4f099aa122437dbb59b

[–] [email protected] 24 points 1 year ago

ChatGPT: answer this question, add 4 consistent typos. Then hand-transcribe it.

[–] SocialMediaRefugee 17 points 1 year ago (7 children)

Might as well go back to oral exams and ask the student questions on the spot.
