A Massachusetts couple claims that their son's high school attempted to derail his future by giving him detention and a bad grade on an assignment he wrote using generative AI.
An old and powerful force has entered the fraught debate over generative AI in schools: litigious parents angry that their child may not be accepted into a prestigious university.
In what appears to be the first case of its kind, at least in Massachusetts, a couple has sued their local school district after it disciplined their son for using generative AI tools on a history project. Dale and Jennifer Harris allege that the Hingham High School student handbook did not explicitly prohibit the use of AI to complete assignments and that the punishment visited upon their son for using an AI tool—he received Saturday detention and a grade of 65 out of 100 on the assignment—has harmed his chances of getting into Stanford University and other elite schools.
Looks like the handbook does explicitly mention it:
Academic Integrity: Cheating and Plagiarism
To cheat is to act dishonestly or unfairly in order to gain an advantage. In an academic setting, cheating consists of such acts as communicating with other student(s) by talking or writing during a test or quiz; unauthorized use of technology, including Artificial Intelligence (AI), during an assessment; or any other such action that invalidates the result of the assessment or other assignment. Plagiarism consists of the unauthorized use or close imitation of the language and thoughts of another author, including Artificial Intelligence, and the representation of such as one’s own work. Plagiarism and cheating in any form are considered disciplinary matters to be addressed by the school. A teacher apprehending one or more students cheating on any graded assignment, quiz or test will record a failing grade for that assignment for each student involved. The teacher will inform the parent(s) of the incident and assistant principal who will add the information to the student’s disciplinary file. The assistant principal may take further action if they deem it warranted. See Code of Discipline.
The way I see AI as a tool in a classroom or learning setting is that you should be punished if you willingly used it out of laziness, not understanding the coursework, or (I assume most likely) both. On its own it's not terrible (environment aside), but it's certainly not something I'd accept if I were a teacher grading homework.
What fucking snowflakes. When I was a kid, if you had someone write your paper for you, you got a 0 for the assignment. When you go to college, they'll fail you out of the course for that shit (because it's cheating).
The only ones harming this kid's future is the parents trying to coddle their kid and protect them from the (rather light) consequences of their actions.
I taught in Chinese universities for 16 years. Initially I liked it. The students were hard-working and respectful. Parents listened to teacher advice. If kids were caught cheating there was Hell to pay ... from the parents, not just the school.
Over that 16-year period, though, everything changed. Parents started showing up at middle schools, and their response to any misconduct was to privately donate red portraits of Chairman Mao to the school administrators, after which all records of misconduct suddenly went missing. Marks were "reassessed". This led to universities being flooded with the worst imaginable students, who'd never faced a negative consequence for any of their shenanigans in their entire lives.
But universities are a different world entirely. It takes a whole lot more red portraits of Chairman Mao to get misconduct erased at university. Way more such portraits than all but the top 0.1% could pay. So these poor kids, having slid through 12 years of no consequences, suddenly get hit square between the eyes with consequences that, for the first time in their lives, Daddy couldn't erase by waving said red portraits around.
Yes, they were little shits. Yes, I hated them as students. But I still felt bad for them as people because they were made monsters. They weren't born monsters.
Still didn't stop me from quitting teaching, though.
What would the parents' stance be if he'd asked someone else to write his assignment for him?
Same thing.
Dale and Jennifer Harris allege that the Hingham High School student handbook did not explicitly prohibit the use of AI to complete assignments
I'll bet you the student handbook doesn't explicitly prohibit taking a shit on his desk, but he'd sure as Hell be disciplined for doing it. This whole YOU DIDN'T EXPLICITLY PROHIBIT THIS SO IT'S FINE!!!111oneoneeleventy! thing that a certain class of people have is, to my mind, a clear sign of sociopathy.
Basically their stance is that the school policy didn't explicitly say he couldn't use AI, so perhaps the policy specifically mentions another person doing the assignment?
You know, now that I think about it, if I were in an admissions office I'd be keeping a quiet database of news stories like this so I know which people I would automatically reject no matter what their scores.
Reminds me of some bass-ackwards story I read about boardgames. A couple was saying "the rules don't forbid this" so they were putting pieces in the wrong places. What a nightmare that would have been.
People who do that at my games table get uninvited from games nights. I might also point out that the rules don't forbid me tossing my glass of baijiu into their faces but they're probably thankful I'm not doing it.
OK, the parents are suing. And the district already filed a motion to dismiss.
Please understand, the world isn't as nuts as the headlines tell us. Judges toss frivolous lawsuits all day long. We only hear about the nut cases because they're nut cases. Money says this case is never heard.
Considering how many kids get into Ivy League schools purely because of who their parents are and/or how much money they donate, you're most certainly right.
It's been a while since teachers were allowed to give out 0s in high school. When I taught 12 years ago, the lowest I was allowed to give was a 65. Even if nothing was turned in.
I imagine this must depend on the location of the school in question. I'm in my mid-20s, so my high school experience was more recent than 12 years ago, but I remember getting quite a few zeros. (I was an absolutely horrible procrastinator who would respond to the stress of an approaching due date by doing anything else to avoid thinking about the source of said stress, which led to a lot of simply not-turned-in schoolwork.)
Dude, the fact that the student has to use AI tools to get by does not mean he's going to be a success story in life. It just means he's going to find shortcuts and exploits to make things easier for himself while everyone else has to do things the natural way. This is no different than someone using a calculator on a math test where it's not allowed. This is no different than someone simply peeking over at another's work and copying it down. Using generative AI tools to gain an advantage is in the same ballpark.
So these entitled parents and that entitled student can go get fucked. I hope these universities see this and recognize that this student is a borderline cheater, and hopefully deny him anyway if the punishment gets overturned.
...if you get a tough job, one that is hard, and you haven’t got a way to make it easy, put a lazy man on it, and after 10 days he will have an easy way to do it, and you perfect that way and you will have it in pretty good shape.
I can't possibly see anything that could go wrong with this attitude. Nope. Nothing whatsoever could possibly go wrong. This is all perfectly normal and not even slightly destructively solipsistic.
When I was a kid, we had a period of some repetitive math work I got sick of. So I wrote a TI-84 program to automate it, even showing its work I would write down.
I wasn't really supposed to do that, but my teacher had no problem with it. I clearly understood the work, and it's not just punching the equation into WolframAlpha.
It would be awesome if there was an AI "equivalent" to that. Like some really primitive offline LLM you were allowed to use in school for basic automation and assistance, but which requires a lot of work to set up and is totally useless without that setup. I can already envision ways to set this up with BERT or Llama 3B.
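The TI-84 program described above isn't shown, but a rough analogue of the idea (this is a hypothetical sketch, not the commenter's actual code) is a solver that doesn't just spit out the answer but prints the intermediate work you'd copy down, e.g. for quadratics:

```python
import math

def solve_quadratic(a, b, c):
    """Solve ax^2 + bx + c = 0, printing each step of the work
    the way the described TI-84 program showed its steps."""
    print(f"Solve {a}x^2 + {b}x + {c} = 0")
    disc = b * b - 4 * a * c
    print(f"discriminant = ({b})^2 - 4({a})({c}) = {disc}")
    if disc < 0:
        print("no real roots")
        return None
    root = math.sqrt(disc)
    x1 = (-b + root) / (2 * a)
    x2 = (-b - root) / (2 * a)
    print(f"x = ({-b} ± {root}) / {2 * a}  ->  x1 = {x1}, x2 = {x2}")
    return x1, x2

solve_quadratic(1, -3, 2)  # x^2 - 3x + 2 = 0 has roots 2 and 1
```

Writing something like this yourself forces you to know the procedure cold, which is exactly the commenter's point about why the teacher didn't mind.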
I wasn't really supposed to do that, but my teacher had no problem with it. I clearly understood the work, and it's not just punching the equation into WolframAlpha.
This is the way it should be. If you created the program on your own, as opposed to copying it from elsewhere, you had to know how to do the work correctly in the first place. You've already demonstrated that you understand the process beyond just being able to solve a single equation. You then aren't wasting time "learning" something you've already learned just to finish an otherwise arbitrary number of problems.
If the specifics of the curriculum are too tedious, that’s on the school to address.
This! This right here. So many school curricula are designed by people who seem to despise children and want to make them suffer that I wonder why we bother with schools at all sometimes.
(Of course I also refer to Chinese high schools as institutionalized child abuse, so what do I know?)
To be fair, understanding something well enough to automate it probably requires learning it in the first place. Obviously an AI that just tells you the answer isn't going to get you anywhere, but it sounds more like the user you were replying to was suggesting an AI limited enough that it couldn't really tell you the answer to something unless you yourself went through the effort of teaching it that concept first.

I'm not sure how doable this is in practice. My suspicion is that to actually be useful in that regard, the AI would have to be fairly advanced and just pretend not to understand a concept until adequately "taught" by the student, if only so it could tell whether it was taught accurately and tell the student they got it wrong and need to try again, rather than reinforce an incomplete or wrong understanding. There's also a risk that current AI used this way could be "tricked" by clever wording into revealing answers it's supposed to act like it doesn't know yet (on top of the existing issues with AI spitting out false information by making associations it shouldn't actually make). But if someone actually made such a thing successfully, I could see it helping with some subjects.

I'm reminded of my college physics professors, who would let my class bring a full page of notes and the class textbook to refer to during tests, under the reasoning that a person who didn't understand how to use the formulas in the text wouldn't be able to actually apply them, while someone who did understand but misremembered a formula would have the ability to look it up again, just like in the real world. These were by far some of the toughest tests I ever had. Half of the credit also came from being given a copy of the test to redo over the following week as homework, where we were encouraged as a class to collaborate and teach each other how to solve the problems, again on the logic that explaining something to someone else helps teach the explainer too.