Jimmy Leonard | April 29, 2023
This post may contain affiliate links. See my full disclosure here.
Remember the ‘90s when people were all like, “Hey, wouldn’t it be cool if a robot could do your homework for you?”
Welp. The future is now.
ChatGPT — basically the 21st-century Little Shop of Horrors remake developed by tech unicorn OpenAI — is the fastest-growing consumer application in history. With 100 million users in the first two months and the early signs that generative AI will soon be part of our regular online search experience, I’m already stoked for the inevitable congressional hearing with the same erudite interviewers who have apparently never used TikTok in their lives.
Anyway, if you don’t know, ChatGPT can code for you, prepare you for job interviews, write marketing content, and, to every student’s delight, write essays.
And yes, there’s a lot of criticism about its inaccuracies and stories of people stumping it with children’s riddles, but if you’ve used it, you have to admit that it’s pretty cool. It’s so fast, so smooth, and honestly quite accurate as long as you keep your questions on the level and don’t try to play mind games with the computer. I mean, recently ChatGPT passed a coding interview for a Level 3 engineer at Google, a position that pays $183,000 in annual salary. (Let’s be real, we already knew robots worked at Google. We just didn’t realize they got paid so well!)
But when it comes to education, the swirling question on every teacher's and parent's mind is: will students use this to cheat? When all you have to do is type "write a 5-paragraph essay on the causes of the American Revolution" and you instantly get a 5-paragraph essay about taxation without representation, tea parties, and the tell-all memoir "Spare" by Prince Harry, it's worth wondering whether any student will ever do homework again.
Now, generally speaking, the people concerned about this are the same types who decry Wikipedia and tell their kids to get off YouTube and go read a book. Most of what I've read on this topic repeats the same overarching criticisms. Large language models can't think for themselves, they don't understand what they're saying, they're full of factual errors, they carry a significant risk of spreading bias or one-sided narratives, and, if you ask the same question twice, you can get some conflicting information. They're wrong a lot, but they're confidently wrong, and if your own head isn't screwed on straight, these things can really lead you astray.
To which I say, welcome to the internet.
We’ve all heard this song before, right? Google Translate will ruin foreign language education. Graphing calculators make you forget how to do math. Kids these days don’t know how to read an analog clock because of all their gosh-darn-blasted digital watches.
For effective teachers, these are tools, not barriers. Integrating Google Translate can support foreign language learning without sacrificing academic rigor. Automating computations does not replace mathematical reasoning and proofs, and, the last time I checked, they still make Rolexes with little hands that spin around the circle.
(By the way, ask any elementary school math teacher, and they’ll tell you how useful analog clocks are for helping students conceptualize fractions. Ask those same elementary school math teachers how many Rolexes they own, and they’ll tell you they save that lesson for when they get to imaginary numbers.)
So back to the internet. For decades now, we’ve had kids in school with access to this instant wealth of information that is full of inaccuracies, one-sided, self-contradictory, and often spewing straight-up nonsense. I’m pretty sure those are the four pillars of Twitter, and that was before Elon changed the logo to a Shiba Inu. But for all its hype, ChatGPT isn’t doing anything that we haven’t already seen the internet do. It presents users with content, but users need to apply their own judgment.
Or, as Ian Bogost of The Atlantic observes, “[ChatGPT] doesn’t make accurate arguments or express creativity, but instead produces textual material in a form corresponding with the requester’s explicit or implicit intent, which might also contain truth under certain circumstances. That is, alas, an accurate account of textual matter of all kinds.”
In other words, just because it’s in a book, it doesn’t mean it’s accurate. Just because it’s in a magazine or a newspaper or an online forum or an AI chatbot, it doesn’t mean we should blindly trust it.
So let’s step back and reframe this discussion. If you’re a teacher, and your homework assignments are so easy that a language model could generate some random text and get a passing grade, you’re doing homework wrong.
Let’s break it down into a few categories. For the sake of this article, I’m assuming an audience of late middle school/early high school students, but many of these principles apply to college students as well. College students just tend to be more philosophically committed one way or the other already — they’re going to make a sincere effort because they want to learn or they’re going to find the lazy way out no matter what the instructor does. For elementary kids, let’s pretend for a moment that we don’t live in the dystopia where 9-year-olds have unrestricted access to the internet on their phones and are inexcusably unsupervised while completing copious amounts of homework for a class that has no bearing on their future careers as influencers. Anyway. Shall we begin?
What is the square root of 300? No — don’t you dare reach for a calculator.
You need to separate this bad boy into factors. 300 = 3 x 100, and the square root of 100 is 10, which you should have memorized somewhere between learning the digits of pi and the sum of all angles in a dodecahedron, so we have √300 = √100 x √3 = 10√3, which is … not helpful.
Let’s back this way up. In what far-fetched real-life scenario would you need to express the square root of 300 as a radical? How often do you need to know the square root of anything? And if you do, you use a calculator. (It’s 17.32 by the way. You’re welcome.)
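For the curious: the pull-out-the-perfect-square trick above is mechanical enough that a computer really should do it. Here's a quick sketch in Python (my own illustration, with a hypothetical helper name, not anything from a curriculum):

```python
import math

def simplify_sqrt(n: int) -> tuple[int, int]:
    """Return (a, b) such that sqrt(n) = a * sqrt(b), with b square-free."""
    a, b = 1, n
    f = 2
    while f * f <= b:
        # Divide out f^2 as many times as it fits, moving f outside the radical.
        while b % (f * f) == 0:
            b //= f * f
            a *= f
        f += 1
    return a, b

a, b = simplify_sqrt(300)
print(f"sqrt(300) = {a}*sqrt({b}) = {a * math.sqrt(b):.2f}")
# sqrt(300) = 10*sqrt(3) = 17.32
```

Which is exactly the point: the radical-simplification ritual is a rote procedure, and rote procedures are what computers are for.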
This is exactly why most chalk-dust-covered, sweater-vest-wearing purists will tell you that calculations are not mathematics. If you want to learn what a square root is, conceptually, I recommend a hands-on geometry lab with little tiles arranged in squares on a table. If you just need to know the answer, of course a computer should do it for you.
So a worksheet of 50 random computations is akin to manual data entry. If a kid cheats, good for them — these kinds of tasks should be automated. Meaningful mathematics is about applied problem-solving, pattern recognition, and deductive reasoning. Use story problems. Get interdisciplinary with science labs, architecture and design, or personal finance. Math should be mentally stimulating, not six sheets of notebook paper to “show your work” to prove that you didn’t use a calculator.
What is the major theme in John Bunyan's "The Pilgrim's Progress"?
Write a five-paragraph essay using quotes from the book. Cite the page numbers. Try not to fall asleep face-first on your keyboard because this essay is so boring.
One of the problems with traditional ELA — and I say this as an English teacher — is that these kinds of questions give the perception that there is a “right” answer. What is the theme? Yes, teachers will encourage students to choose their own paths and defend their ideas, but if you Google this, you will find an “answer.” (It’s “the burden of sin and salvation through Christ” according to LitCharts. Et voilà!)
So, from the student’s point of view, these essays are often seen as “read a book you don’t particularly enjoy, read my mind as to what it’s about, and then write a formulaic essay with no room for creativity.” Of course there’s a temptation to use a chatbot to write that essay.
Instead, I’m a fan of writing workshops. Give students projects and assignments that allow for real choice and authentic voice, then apply those literary techniques throughout the revision process. A teacher might give a mini-lesson on theme, then give the feedback that a student’s essay is a little bit confusing because it’s discussing too many themes at once. In other words, immediately show why the lesson matters in a personal creative context.
Everything from weighing in on current events to deconstructing their favorite YouTuber’s delivery (why are they funny? What techniques do they use?) is on the table. Class time should include multiple rounds of revisions and discussion — verbal processing is hugely underrated!
I'll put it another way. If ChatGPT can write a convincing B/B+ essay from the essay prompt alone, it's a bad essay prompt. When students find their voices and feel that their ideas matter, they won't want to cheat anyway.
If you’re a school teacher, it’s hard to change the whole system overnight. I get it. And if you’re a homeschool educator, some of this might seem out of your wheelhouse. There’s no quick fix that will suddenly make all kids love school, but I do know that we shouldn’t be afraid of AI advances, and we shouldn’t feel stuck in the same homework ruts year after year.
Whether you want to bounce some ideas off another human or you need help with curriculum planning, I’m ready to listen. Get in touch. I promise not to quiz you on mental math.
Jimmy Leonard helps brands tell stories and reach new audiences. He enjoys running and hiking in the mountains.