College professors are going back to paper exams and handwritten essays to fight students using ChatGPT
(www.businessinsider.com)
Ah the calculator fallacy; hello my old friend.
So, a calculator is a great shortcut, but it's useless for most mathematics (i.e. proofs!). A lot of people assume that having a calculator means they do not need to learn mathematics - a lot of people are dead wrong!
In terms of exams being about memory, I run mine open book (i.e. students can take pre-prepped notes in). Did you know some students still cram and forget right after the exam? And did you know they forget even faster with coursework?
Your argument is a good one, but let's take it further - let's rebuild education towards an employer centric training system, focusing on the use of digital tools alone. It works well, productivity skyrockets, for a few years, but the humanities die out, pure mathematics (which helped create AI) dies off, so does theoretical physics/chemistry/biology. Suddenly, innovation slows down, and you end up with stagnation.
Rather than moving us forward, such a system would lock us into place and likely create out-of-date workers.
At the end of the day, AI is a great tool, but so is a hammer and (like AI today), it was a good tool for solving many of the problems of its time. However, I wouldn't want to only learn how to use a hammer, otherwise how would I be replying to you right now?!?
I found this too generalizing. Yes, most people only ever need and use productivity skills in their working life. They do no fundamental research. Whether their education was this way or that way has no effect on the advancement of science in general, because these people don't do science in their career.
Different people with different goals will do science, and for them an appropriate education makes sense. It also makes sense to have everything in between.
I don't see how it helps the humanities and other sciences to teach skills which are never used. Or how it helps to teach a practice which no one applies in practice. How is it a threat to education when someone uses a new tool intelligently, so they can pass academic education exams? How does that make them any less valuable for working in that field? Assuming the exam reflects what working in that field actually requires.
I think we can also spin an argument in the opposite direction: More automation in education frees the students to explore side ideas, to actually study the field.
"I don't see how it helps the humanities and other sciences to teach skills which are never used." - I can offer an unusual counter here: you're assuming the knowledge will never be used, or that we should avoid teaching things that are unlikely to be used. Were this the case, the field of graph theory would have ceased to exist long before it became useful in mapping - indeed, Boole's algebra would never have led to the foundations of computer science and the machines we are using today.
"How is it a threat to education when someone uses a new tool intelligently, so they can pass academic education exams?" - Allow me to offer you the choice of two doctors, one of whom passed using AI, while the other passed a more traditional assessment. Which doctor would you choose and why? Surely the latter, since they would have also passed with AI, but the one who used AI might not have passed the more traditional route due to a lack of knowledge. It isn't a threat to education; it's adding further uncertainty as to the outcome of such an education (both doctors might have the same skill levels, but there is more room for doubt in the first case).
"Whether their education was this way or that way has no effect on the advancement of science in general, because these people don't do science in their career." - I strongly disagree! In an environment where knowledge for the sake of knowledge is not prized, a lie is easier to plant and nurture (take for example the antivax movement). Such people can be an active hindrance to the progress of knowledge - their misconceptions creating false leads and creating an environment that distrusts such sciences (we're predisposed to distrust what we do not understand).
Not exactly. What I meant to say is: some students will never use some of the knowledge they were taught. In the age of information explosion, there is practically unlimited knowledge 'available'. What part of this knowledge should be taught to students? For each bit of knowledge, we can make your hypothetical argument: it might become useful in the future; an entire important branch of science might be built on top of it.
So this on its own is not an argument. We need to argue why this particular skill or knowledge deserves the attention and focus to be studied. There is not enough time to teach everything. Which in turn can be used as an argument for more computer-assisted learning and teaching. For example, I found ChatGPT useful to explore topics. I would not have used it to cheat in exams, but probably to prepare for them.
Good point, but it depends on context. You assume the traditional doc would have passed with AI, but that is questionable. These are complex tools with often counterintuitive behaviour. They need to be studied and approached critically to be used well. For example, the traditional doc might not have spotted the AI hallucinating, because she wasn't aware of that possibility.
Further, it depends on their work environment. Do they treat patients with, or without AI? If the doc is integrated in a team of both human and artificial colleagues, I certainly would prefer the doc who practiced these working conditions, who proved in exams they can deliver the expected results this way.
I feel we left these lands in Europe when diplomas were abandoned for the bachelor/master system, 20 years ago. Academic education is streamlined, tailored to the needs of the industry. You can take a scientific route, but most students don't. The academia which you describe as if it were threatened by something new might exist, but it lives alongside a more functional academia where people learn things to apply them in our current reality.
It's quite a hot take to pin things like the antivax movement on academic education. For example, I question whether the people proposing and falling for these 'ideas' are academics in the first place.
Personally, I like learning knowledge for the sake of knowledge. But I need time and freedom to do so. When I was studying computer science with an overloaded schedule, my interest in toying with ideas and diving into backgrounds was extremely limited. I also was expected to finish in an unreasonably short amount of time. If I could have sped up some of the more tedious parts of the studies with the help of AI, this could have freed up resources and interest for the sake of knowledge.
Yes, within limits. Due to the information explosion, it became impossible to learn "everything". We need to make choices, prioritize.
How does your voting behaviour suffer because you lack understanding about how exactly potentiometers work, or how to express historic events in modern dance?
Both have inherent worth, but not the same for each person and context. We luckily live in a society of labor division. Not everyone has to know or like everything. While I absolutely admire science, not everyone has to be a scientist.
Because there is more knowledge available than we can ever teach a single person, it is entirely possible to spend a lifetime learning things with no use informing your ballot decision. I would much rather have students optimize some parts of their education with AI, to free up capacity for other important subjects which may seem less related to their discipline. For example, many of my fellow computer science students were completely unaware how it could be ethically questionable to develop pathfinding algorithms for military helicopters.
While all of what you say is true, we simply cannot teach everything since there is just too much knowledge and too little time in a human life.
And not everyone is equally interested or capable in learning everything.
This is necessarily the world we live in, even without adding capitalism or any evil intentions to the mix. Any education you can get or offer can only be a more or less well selected subset of the knowledge available.
In this light, I don't see it as a dramatic loss to remove educational emphasis from skills which can easily be replaced with modern technology. It would make sense to shift the focus to teaching a critical usage of said technology.
So ... I honestly think this is a problematic reply ... I think you're being defensive (and consequently maybe illogical), and, honestly, that would be the red flag I'd look for to indicate that there's something rotten in academia. Otherwise, there might be a bit of a disconnect here ... thoughts:
calculator
... was in reference to arithmetic and other basic operations and calculations using them ... not higher level (or actual) mathematics. I think that was pretty clear and I don't think there's any "fallacy" here, like at all.
value of learning (actual) mathematics
... is pretty obvious I'd say ... and was pretty much stated in my post about alternatives to emphasise. On which, getting back to my essential point ... how would one best learn and be assessed on their ability to construct proofs in mathematics? Are timed open book exams (and studying in preparation for them) really the best we've got!?
Still forgetting with open book exams
... seems like an obvious outcome, as the in-exam materials de-emphasise memory ... they probably never knew the things you claim they forget in the first place. Why? Because the exam only requires the students to be able to regurgitate in the exam, which is the essential problem, and for which in-exam materials are a perfect assistant. Really not sure what the relevance of this point is.
Forgetting after coursework
... how do you know this (genuinely curious)? Even so, coursework isn't the great opposite of exams. Under the time crunch of university, it is also often crammed, just not in an examination hall. The alternative forms of education/assessment I'm talking about are much more long-form and exploration- and depth-focused. The most I've ever remembered from a single-semester subject came from when I was allowed to pursue a single project for the whole subject. Also, I didn't mention ordinary general coursework in my post, as, again, it's pretty much the same paradigm of education as exams, just done at home for the most part.
Rebuilding education toward an employer-centric training system
... I ... ummm ... never suggested this ... I suggested the opposite ... only things that were far more "academic" than this and were never geared toward "productivity". This is a pretty bad straw man argument for a professor to be making, especially given that it seems constructed to conclude that the academy and higher learning are essential for the future success of the economy (which I don't disagree with or even question in my post).
OK Mr Socrates, how else would you assess whether a student has learned something?
Ha ... well if I had answers I probably wouldn't be here! But seriously, I do think this is a tough topic with lots of tangled threads linked to how our society functions. I'm not sure there are any easy "fixes", I don't think anyone who thinks there are can really be trusted, and it may very well turn out that I'm completely wrong and there is no "better way", as something flawed and problematic may just turn out to be what humanity needs.
A pretty minor example based on the whole thing of returning to paper exams. What happens when you start forcing students to be judged on their ability to do something alone, where they know very well that they can do better with an AI assistant? Like at a psychological and cultural level? I don't know; I'm not sure my generation (Xennial) or earlier ever had that. Even with calculators and arithmetic, it was always about laziness, or dealing with big numbers that were impossible for normal humans, or ensuring accuracy. It may not be the case that AI is at that level yet for many exams and students (I really don't know), but it might be, or might be soon. However valuable it is to force students to learn to do the task without the AI, there's gotta be some broad cultural effect in just ignoring the super useful machine.
Otherwise, my general ideas would be to emphasise longer form work (which AI is not terribly useful for). Work that requires creativity, thinking, planning, coherent understanding, human-to-human communication and collaboration. So research projects, actual practical work, debates, teaching as a form of assessment etc. In many ways, the idea of "having learned something" becomes just a baseline expectation. Exams, for instance, may still hold lots of value, but not as forms of objective assessment, but as a way of calibrating where you're up to on the basic requirements to start the real "assessment" and what you still need to work on.
Also ...
OK Mr Socrates
... is maybe not the most polite way of engaging here ... comes off as somewhat aggressive TBH.