College professors are going back to paper exams and handwritten essays to fight students using ChatGPT::The growing number of students using the AI program ChatGPT as a shortcut in their coursework has led some college professors to reconsider their lesson plans for the upcoming fall semester.
This thinking just feels like moving in the wrong direction. As an elementary teacher, I know that by next year all my assessments need to be practical or interview-based. LLMs are here to stay, and the quicker we learn to work with them, the better off students will be.
And forget about having any sort of integrity or explaining to kids why it’s important for them to know how to do shit themselves instead of being wholly dependent on corporate proprietary software whose accessibility can and will be manipulated to serve the ruling class on a whim 🤦
It’s insane talking to people who don’t do math.
You ask them any mundane question and they just shrug, and if you press them they pull out their phone to check.
It’s important that we do math so that we develop a sense of numeracy. By the same token it’s important that we write because it teaches us to organize our thoughts and communicate.
These tools will destroy the quality of education for the students who need it the most if we don’t figure out how to rein in their use.
If you want to plug your quarterly data into GPT to generate a projection report I couldn’t care less. But for your 8th grade paper on black holes, write it your damn self.
Putting quarterly data into ChatGPT is dangerous for companies, because that information is being fed into the AI and is accessible to its creators, which means you’re just giving away proprietary information and trade secrets. But do these chucklefucks give one single shit? No. Because they’re selfish, lazy assholes who want robots to do their thinking for them.
Good luck doing one-on-one assessments in a uni course of 300+
In what ways do you envision working with LLMs as an educator of children?
I have used ChatGPT to explain a number of fairly advanced technical and programming concepts to me; I work in Animation, which I got into through my own self-study and some good luck, so I’m constantly trying to up my skills in the math that relates to it. When I come up against a math or C++ term or concept that I don’t currently understand, I can generally get a pretty good conceptual understanding of it by working with ChatGPT.
At one point, for example, I wanted to understand what Linear Algebra specifically meant. Not all of it stuck, but I do remember asking it to expand on things it said that weren’t clear, and it was able to do so competently. By asking many questions I was able, I think, to get clearer on a number of things I doubt I ever would have learned otherwise, unless by luck I’d found someone who knows the math to teach me.
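To give a sense of the kind of concept I mean (just a quick sketch in C++ with names I made up for illustration, not something ChatGPT produced), in animation "linear algebra" mostly cashes out as things like multiplying a point by a rotation matrix:

```cpp
#include <cmath>
#include <cstdio>

// A 2D point: about the simplest object linear algebra deals with.
struct Vec2 { double x, y; };

// Rotate a point about the origin by `angle` radians.
// This is the 2x2 rotation matrix applied to the vector:
//   [ cos(a) -sin(a) ] [x]
//   [ sin(a)  cos(a) ] [y]
Vec2 rotate(Vec2 v, double angle) {
    const double c = std::cos(angle), s = std::sin(angle);
    return { c * v.x - s * v.y, s * v.x + c * v.y };
}

int main() {
    const double quarterTurn = 3.14159265358979323846 / 2.0;
    Vec2 p = rotate({1.0, 0.0}, quarterTurn);
    std::printf("(%.2f, %.2f)\n", p.x, p.y);  // roughly (0.00, 1.00)
}
```

That one little formula is what something like rotating a character’s arm boils down to, and being able to interrogate it interactively is what I found valuable.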
It also flubbed a lot of basic arithmetic, and I had to watch for and correct those mistakes myself.
This is useful to an autodidact like myself who has learned how to learn at a University level, to be sure.
I cannot, however, think of a single beneficial way to use this to educate small children, who have no such years of mental discipline, nor the ability to understand that their teacher is neither a genius nor a moron but a machine that pumps out florid expressions of data resembling other expressions of similar data.
Please, tell me one.
Devise a physical problem that can be tested, have everyone in class pull a ChatGPT answer to it, have them read the answers out loud and vote on which one is right, then apply it to the physical version and see it fail. Show them how tweaking the answer just a bit solves the problem.
Ta-da! Just taught them that without all your years.
Then you’re not a teacher. Please don’t ever teach small children.
Well, I suppose the education system gets the teachers it pays for…