> The parents' case hangs largely on the student handbook's lack of a specific statement about AI, even though that same handbook bans unauthorized use of technology. "They told us our son cheated on a paper, which is not what happened," Jennifer Harris told WCVB last month. "They basically punished him for a rule that doesn't exist."
I'm going out on a limb here, but if this is the viewpoint of the people who raised him, then I'm not surprised he cheated.
If this was my son and the facts were the same, he'd be grounded in addition to whatever consequence the school deems fit.
What's the relevance? Are you going to embark on a "Let's Educate The Users" mission for parenting?
It would be futile. Parents and children are now united in not wanting to be educated.
What is unauthorized use of technology? Is the light I need to read not technology? Is using the internet to find more about a topic not technology? Where is the line that makes AI forbidden?
The lack of implicit or explicit authorization. As the school has lights, you may assume they are authorized implicitly.
This is unproductive and unenlightening pedantry.
I think the throwaway actually raises a valid point about the rule being an exceedingly broad catchall, the type primed for selective and weaponized enforcement.
That said, the kids clearly have no defense in this situation, both for the blatant plagiarism and for simply being factually incorrect in their report.
Have you actually read the piece? The answers to those questions are in the written policy the student was given. But even without the policy, it should be pretty clear that passing off others' work as your own (whether that of people or AI) is academic dishonesty.
As the judge put it, "[while] the emergence of generative AI may present some nuanced challenges for educators, the issue here is not particularly nuanced".
Is what I wrote here mine or not? I used the autocorrect suggestions almost exclusively, wrote few letters only.
Then, no. This isn’t text you generated. No one cares on Internet forums though.
In education, the goal is for the individual to internalize the knowledge required to bootstrap into a useful, contributing member of society: composing written works, organizing one's thoughts into communicable artifacts, doing basic mathematics, familiarity with the local and encompassing polity and its history, knowing how to navigate and use institutions of research (libraries), etc. Any technology employed that prevents or sidesteps that internalization is unauthorized.
It ain't that hard to connect the dots unless you're going out of your way to not connect the dots.
Don't think yourself into a hole there bud
The student was not punished for "using AI", but for plagiarism:
>The incident occurred in December 2023 when RNH was a junior. The school determined that RNH and another student "had cheated on an AP US History project by attempting to pass off, as their own work, material that they had taken from a generative artificial intelligence ('AI') application," Levenson wrote. "Although students were permitted to use AI to brainstorm topics and identify sources, in this instance the students had indiscriminately copied and pasted text from the AI application, including citations to nonexistent books (i.e., AI hallucinations)."
And for AP US History... a college level course. Yikes.
Probably an unpopular opinion here, but families, usually wealthy, that use the legal system like this to avoid consequences are parasites. It reveals not only a poor job of raising their children but also the poor character of the parents.
Glad the courts didn’t grant a similar “affluenza” ruling here. The student plagiarized, short and simple.
What's striking to me is that the parents sued. RNH passed off AI-generated text as their own when they knew how to cite AI generated works and were versed in academic integrity. It wouldn't occur to me to sue the school if this was my kid.
They're not optimizing for the kid's education. They're optimizing for the credentials the kid is able to get.
Filing the lawsuit is an asymmetric bet:
- win, and increase college admissions odds
- lose, and be no worse off than without the suit
> lose, and be no worse off than without the suit
This kid should change his name, given his initials, high school and parents’ names are public record next to a four brain cell cheating attempt.
Do you think college admissions officers follow the news and use what they learn to maintain a naughty list?
Perhaps a business idea?
Unless he has someone who is very sympathetic to his cause, the teacher/counselor recommendation will wreck him.
This guy needs to go to a JuCo that feeds into a decent state school — he’s screwed for competitive schools.
> Do you think college admissions officers follow the news and use what they learn to maintain a naughty list?
College admissions, no. College students and colleagues and employers, being able to use a search engine, absolutely.
If you search the student's name on Google, you probably won't find this lawsuit.
Admissions know his name and the name of the school, which helps find specific students.
It’s easy to miss, but I wouldn’t be surprised if a search for “Hingham High School Harris” brings up the relevant info. Further, his parents suing may be a larger issue for a college than his behavior.
Nope. I just replied above with a similar story from when I was in school. My classmate got expelled for cheating and sued the school. TV segment, articles about him, etc.
Zero effect on his college outcomes. Got into really good schools.
> lose, and be no worse off than without the suit
If I were in college admissions then I'd probably think twice about admitting the candidate with a widely reported history of trying to sue their school on frivolous grounds when things don't go their way.
> - win, and increase college admissions odds
Will it, though? Like if the college happens to know about this incident?
It does strike me that the purpose of attending college is the credential you get; education is a distant second.
It strikes me that this is a foolish take to adopt.
I saw lots of students acting a bit like this but I was grateful that I could dedicate myself primarily to my schooling and took as much advantage as I could to learn as much as I could.
The credential gets used as a heuristic for the learning you've done, but if you show up and don't have the knowledge, then everything is harder and your labor less fruitful.
I know some people don't care and that there are degenerate workplaces but you'll still be left with having been a lot less useful in your life than you were capable of being.
So what would you do in the parents' shoes?
Almost zero downside. I knew a student who plagiarized three times and got kicked out. His parents sued. It was even on the TV news because they were asking for hundreds of thousands in compensation. He lost, and the school kept him expelled.
I was expecting the bad press coverage to hurt his college chances since there were several articles online about him getting kicked out for cheating and then suing.
Nope! Dude got into a really good school. He even ended up texting me, asking for past essays I'd written so he could turn them in as his own in his college classes.
And the kicker was that he then transferred to one of the prestigious military academies that supposedly uphold honor and integrity.
So. There is almost zero downside for suing even if it gets you tons of negative publicity.
I don't think we can claim zero downside from one anecdote. There are always outliers that can occur from extenuating circumstances.
- The family potentially has the financial resources or possibly connections to 'make things happen'.
- Perhaps the student is especially charismatic and was able to somehow right the situation. Some people have that con-artist mindset where they're able to cheat/commit fraud through their life with seemingly minimal consequences.
- Perhaps they just got lucky and the administration didn't do their due diligence.
> Perhaps they just got lucky and the administration didn't do their due diligence.
Are universities supposed to google every applicant?
I mean I haven't been in academia for a decade, but back when I was I certainly never browsed a 17-year-old girl's instagram before making an admission decision.
> His parents sued. ...
> He even ended up texting asking me for past essays I wrote to turn in as his own ...
> he then transferred to one of the prestigious military academies ...
>> There is almost zero downside for suing even if it gets you tons of negative publicity.
Sounds like the caveat here should be, "when your parents/family is connected".
> What's striking to me is that the parents sued
And the kid was even offered a redo!
On the other hand, the school caved on National Honor Society after the parents filed. So maybe the best move would have been (tactically, not as a parent) to show the school the draft complaint but never file it.
Here are some of the case documents:
https://www.courtlistener.com/docket/69190839/harris-v-adams...
It would seem that what was put into the report is clearly wrong (in this case it came from generative AI, but it would be just as wrong regardless of where it came from), so it is legitimate to mark those parts as wrong. There are other things that can be called wrong too, whether or not this use of generative AI is permitted (and it probably makes sense not to permit the way it was used in this instance), so there are many reasons why the work should be marked wrong.
However, if the punishment is excessively severe, then the punishment would be wrong.
How do you get students to engage in creative writing assignments in age of AI?
How do you get them to dive into a subject and actually learn about it?
The same way you did so before LLMs existed - you rely on in-class assignments, or take-home assignments that can't be gamed.
Giving out purely take-home writing assignments with no in-class component (in an age where LLMs exist), is akin to giving out math assignments without a requirement to show your work (in an age where calculators exist).
Many years before LLMs were ever a thing, I recall being required to complete (and turn in) a lot of our research and outlining in class. A plain "go home and write about X topic" was not that common, out of fear of plagiarism.
I'm thirty-something. How did my teachers engage me in doing math? How did they get me to rote-memorize the multiplication tables when portable calculators were already a thing, operated by coin cells or little solar panels?
Part of teaching is getting kids to learn why and how things are done, even if they can be done better/faster/cheaper with new technology or large-scale industrial facilities. It's not easy, but I think it's the most important part of education: getting kids to understand the underlying abstract ideas behind what they're doing, and to learn that there's value in that understanding. I don't really want to dichotomize, but otherwise kids will just become non-curious users of magic black boxes (with black boxes being computers, societal systems, buildings, infrastructure, supply chains, etc.).
Invert the assignment: provide a prompt to feed to an essay-writing AI of the student's choice, but make the assignment a critique of the veracity and effectiveness of the generated essay.
The AI hallucinated citations to nonexistent publications.
In this case, the AI should publish the cited hallucinated works on Amazon to make them real.
Not that it would help us, but the AI will have its bases covered.
Then they could train the next generation of models on those works. Nothing to scrape or ingest, since they already have the text on hand!
Discussed before the ruling:
https://news.ycombinator.com/item?id=41861818
AI is the new calc
The parents seem absolutely unhinged.
Poor kid.
Yet another “affluenza” raised child joining the ranks of society. Probably will become a future C-level exec at an American company.
One of the hallucinated authors is literally named "Jane Doe". Our society is about to become powerfully stupid.
The citizens of the USA re-elected an idiot to the most powerful office in the free world. We are already “powerfully stupid”. His cabinet selections already indicate a gutting of federal offices and dismantling of DoE.
DoE is the department of energy. The department of education is ED.
I laughed out loud when I saw that McMahon was his pick. A fucking wrestling star for the department of education. This is Idiocracy.
Also I laughed because otherwise the fear takes over.
"Doe" is actually a real surname, with a few thousand of them in the US. I'd guess that there probably have been people actually named "Jane Doe". I wonder if that causes many problems for them?
In legal cases that is how one can choose to remain anonymous.
See, there's stuff even geniuses don't know.
Why do you think the previous poster found that name notable? Just because it's inherently funny sounding or something?
That's not relevant to this. It's a direct quote from the work the students handed in.
I just used ChatGPT to code an HTML/CSS/JavaScript solution in an hour for coworkers who were having trouble. They were like, wow, that was fast, we'd been trying to figure this out for a few days. I'm skilled / an expert, but that would've taken me many hours versus a few back-and-forths with GPT.
Overall, I feel my HTML/CSS/JavaScript skills aren't as valuable as they were.
I guess in this instance I cheated too, or is it that my developer peers haven't gotten into using GPT, or that they are more moral? Also, maybe this is just the new normal...
The rules for working are very very different from being at school.
No you were not cheating, you did what was expected from you. But you knew that.
How so? And/or, AI is changing the rules everywhere, no? Today it seems not good, yet tomorrow it's just how things are...
The goals are very different. It was like this also before AI.
The goal in school is to learn things. To learn to write you can't just copy an article from a paper and say it is yours. You have not learned.
At work, the goal is to get things done.
In our field you needed / need to learn new things to stay relevant, yet now the new thing does almost all of it for you.
Also, if one generation is using AI to get things done, why wouldn't a younger generation do the same? Do as I say, not as I do... that has never held up well over time.
Yeesh this is full of red flags…
What is... the new normal of using AI to do your job, or to help you get it done quicker? The comment above shows it could be the new normal...
No. This attitude of being better than coworkers, coming in and saving the day. It had nothing to do with using AI. It’s about “I am better than you” instead of helping people out, or teaching them these things you know.
It’s just a passing internet comment missing all the context, so what do I know.
My comments are meant to be controversial... to get people to think... What is the future with AI and with using it this way? If I told my coworkers how I achieved it, would they not think less of me today? What about in a few years or more, when it's the norm and mine and everyone's HTML, CSS, and JavaScript skills are less valuable? This example shows that AI will definitely take people's jobs, including my own if I don't ramp up my skills.
Ramping up your skills will do nothing for you if your job can simply be delegated to a machine, given the overhead of a human worker versus just owning a machine's output. Not having to negotiate is extremely valuable to a business owner. Mark my words: until people realize that the whole innovation around AI is to sidestep the labor class, things will continue getting much darker before they brighten.
And the saddest thing is, the fools think it'll work in their favor and won't blow back with massive unintended consequences.