35 comments

  • ndiddy 28 minutes ago

    The other day I read this piece on how AI is already being used in schools, and it left quite an impression on me. https://archive.is/IW4B3

    > The Chromebooks, which the students use in every class and for homework, came pre-installed with an all-ages version of Gemini, a suite of A.I. tools. When my daughter, who is in sixth grade, begins writing an essay, she gets a prompt: “Help me write.” If she is starting work on a slide-show presentation, the prompt is “Help me visualize.” She shoos away these interruptions, but they persist: “Help me edit.” “Beautify this slide.” The image generator is there, if she’d ever wish to pull the plug on her imagination. The Gemini chatbot is there, if she ever wants to talk to no one.

    I'm not as anti-AI as the author of the piece, and I think that AI could have a role as a teaching aid. It's infinitely patient and it's able to adapt to a student's needs better than a textbook. Still, I hate the idea of students being encouraged to entirely offload their cognitive work onto an online service rather than think for themselves. The point of making fifth graders write essays, make art, design presentations, etc. isn't the end product; it's that they come away with the experience of having done the work themselves. I would rather see students get taught how to think creatively, analyze a piece of writing, coherently explain an opinion, or draw a picture on their own, rather than giving this up in exchange for the nebulous skill of being "AI native" (aka being able to ask a computer to produce work for you).

    • NewsaHackO 5 minutes ago

      Yeah, I cannot imagine how anyone could learn anything well with access to AI. I am grateful that I finished my schooling before AI hit mainstream, because it is just too easy to turn your brain off and just AI a question before thinking about it. Great for getting things done, useless for learning. I guess hallucinations still keep us on our toes.

  • schnitzelstoat an hour ago

    It reminds me of the 'IT Literacy' classes we had when I was in high school where they just taught us to use Microsoft Office products.

    • Kadecgos an hour ago

      A lot of those were definitely sponsored by MS and co as well, but at least you did learn a practical, transferable skill. You'd come out of that with experience using the features and structures of a general-purpose OS, as well as (in some cases) the workflow of mode-based production software. Excel at least is also just such a powerful 'everything' tool that I'm not even that mad about it.

      'AI Literacy' is just very much not that at all and is just state-mandated brain rot.

    • jcgrillo 33 minutes ago

      Mavis Beacon Teaches Typing

    • lloydatkinson 9 minutes ago

      I had the same experience in the UK around 2005 to 2011, I wonder if it's the same everywhere?

      I feel that my experience was far worse, bordering on the absurd and bureaucratic. We spent years following instructions: taking screenshots of ourselves opening specific windows and dialogs in Office, saving all those screenshots into a Word document, and then printing the document.

      To be clear, it was every single action you took. Moved the mouse to "Insert"? Don't click it yet, take a screenshot of your mouse on the "Insert" button, and then click it, and take a screenshot of the menu that opened. Then, take more screenshots of moving your mouse to buttons and lists in dialogs that opened. Then, take a screenshot of the document with the thing you just inserted.

      Now, write several paragraphs describing in detail what you just did. Print everything, meaning both the document you just created for the exercise and the document describing the exercise, with all its dozens of screenshots.

      Each individual printed piece of paper had to be kept in a plastic wallet, which was then kept in a document folder. By the end we had several of these folders, which were without a doubt a complete waste of paper and time.

      The argument was that it was needed in case the exam board decided to double-check the teachers' scores, which as far as I know never happened once. No reason was ever given for why each individual piece of paper needed its own plastic wallet.

      This was during a period when CS education had essentially vanished from the school curriculum for decades; it was only added back after I'd finished school.

      Words cannot describe how much I despised the entire ordeal: an IT class requiring screenshots of button clicks, all printed onto paper, is absurd beyond description.

      While the teacher was explaining how to add PowerPoint transitions, I was writing scripts to fetch currency conversion rates and graph them, because I was that bored. At one point I wrote a terrible "chat" system for a few class friends to talk across the room, using some free shared HTML/PHP hosting and meta-tag-based auto-refreshing of the chat history.
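      That meta-refresh trick is worth spelling out: the browser reloads the page on a timer, so a shared message log plus plain HTML gives you a crude chat with no JavaScript at all. Here's a hypothetical minimal sketch of the idea (in Python rather than the original PHP; the function names and in-memory log are invented for illustration):

```python
# Sketch of a meta-refresh "chat": each page load shows the shared history,
# and a <meta http-equiv="refresh"> tag makes the browser re-fetch the page
# every few seconds. No JavaScript needed.
import html

chat_history = []  # shared message log; the real thing would live in a file on the host


def post_message(user, text):
    chat_history.append((user, text))


def render_chat_page(refresh_seconds=5):
    # The meta tag is the entire "real-time" mechanism: the browser reloads
    # the page on a timer and picks up any new messages.
    lines = "\n".join(
        f"<p><b>{html.escape(u)}:</b> {html.escape(t)}</p>"
        for u, t in chat_history
    )
    return (
        "<html><head>"
        f'<meta http-equiv="refresh" content="{refresh_seconds}">'
        "</head><body>\n" + lines + "\n</body></html>"
    )


post_message("alice", "hello <world>")
page = render_chat_page()
```

      Escaping the message text matters even in a toy like this, since classmates will absolutely try to inject HTML.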

  • marricks 28 minutes ago

    > Young people increasingly hate AI[1], and children already struggle with AI-enabled harassment that traumatizes them and disrupts their learning. And studies show kids are offloading learning onto AI models, undermining their education and social development.

    [1] https://www.theverge.com/ai-artificial-intelligence/920401/g...

    The coyote is already running beyond the cliff, so indoctrinating kids won't save them from an AI winter 6-18 months away.

    • bigyabai 21 minutes ago

      I swear that I read this same "6-18 months" timeframe 3 years ago.

  • saidinesh5 10 minutes ago

    Putting all the cynicism aside, it's amazing how much the way we deal with information has changed within our lifetimes.

    When I was younger, to solve a problem you had to memorize a large amount of information. Or know someone who had. Or visit libraries and pray they had a book on what you needed.

    Then came the internet. All of that memorizing was replaced by web searches. You just focus on solving the problem, figuring out what you don't know and searching for that.

    Now, it feels like we're automating the searching, the connecting of the dots, and most of the problem solving itself. We focus on the high-level problem description and on verifying the results.

    I wonder what they'd be adding to this curriculum.


  • fantasizr an hour ago

    This is a step beyond the drug dealers who give you the first sample for free. It's an attempt at legally mandated injection sites.

    • mghackerlady an hour ago

      Not the first time this has happened. There was a big push for schools to teach Windows and Microsoft Office while conveniently ignoring that other things exist. Nowadays some have moved to the Google office suite, which isn't much better.

      • fantasizr 30 minutes ago

        The textbook companies give the hard sell too, but it's more honorable, with traditional palm greasing and whatnot.

  • rebolek an hour ago

    Of course they will back it up. Nice source of income.

  • wiseowise 13 minutes ago

    Got to train serfs early!

  • slopinthebag 31 minutes ago

    Gotta get em hooked while they're young.

  • kmeisthax 32 minutes ago

    If by "AI literacy" they mean "learning how AI works and how to use it effectively", then this probably would wind up backfiring. Because when you improve people's AI literacy, they use it less. They don't swear off it, but because they know what it is and is not good for, they are way more cautious in their application of AI.

    Of course, they probably plan to do to education what iPads did to education: deskill children. Apple successfully obliterated the concept of a file from a generation of students by making them do their computing in a straitjacket. I can only imagine how an AI-first or AI-only educational curriculum could make kids even worse at using computers.

  • HomeDeLaPot 43 minutes ago

    Maybe a more general focus on getting students to practice critical thinking and fact-checking would be better. AI could be addressed as a small part of that, since chatbots are everywhere and students need to know how to filter out their BS.

    But are NSF grants really necessary for this? To what degree is this funneling taxpayer money to buy ChatGPT subscriptions and advertise to students by getting them to use AI in the classroom?

  • nalekberov 31 minutes ago

    What is ‘AI Literacy’? How to prepare a prompt for maximum token efficiency?

    • wiseowise 10 minutes ago

      Where to buy the subscription, how to convince your parents to buy Pro instead of Plus, and how to prevent original thought as early as possible so they stay addicted (sorry, I meant empowered).

  • nathan_compton 17 minutes ago

    This is the reason I recently ran for my kids' school board. I use AI every day and I think there is a lot of utility there, but I don't want it anywhere near my kids' school. Honestly, I don't think kids even need to lay eyes on a screen until they're in high school.

  • hsuduebc2 29 minutes ago

    Of course they do, when it must be taught on their products, which will hook users over time and make them some money.

  • cavino 43 minutes ago

    The thing about AI is it'll teach you how to use it (aka 'AI literacy').

  • sublinear an hour ago

    It will be interesting to see the backlash to this one.

  • righthand an hour ago

    I thought AI was so easy to use that no one would need to be trained? Are they going to teach the kids to steal copyrighted data? And write AI slop articles? And to evangelize useless side projects as time savings?

    • SpicyLemonZest a few seconds ago

      [delayed]

    • zamadatix 8 minutes ago

      It's K-12, so I'm honestly not going to take that dunk, as satisfying as it'd be: plenty of things that seem blazingly obvious and intuitive to adults are a complete mystery to kids for whom reading to learn (instead of the other way around) is a recent development.

      Unfortunately, the AI literacy big tech companies want to push won't align very well with the AI literacy kids need. It'd be like ad literacy for K12 being pushed by Google - obviously what's delivered would not match what the kids actually needed.

    • frangonf an hour ago

      Kids don't need to be trained in AI but the models do need to be trained by kids.

    • noobermin an hour ago

      The drug dealers get to get them hooked young.

      • spwa4 42 minutes ago

        Come on, AI can work both ways. It's easy to use AI to greatly increase your knowledge of a subject. It's also easy to use AI to prevent yourself from having to learn anything.

        Both kinds of students will exist.

        • kerkeslager 29 minutes ago

          > It's easy to use AI to greatly increase your knowledge of a subject.

          It's actually not.

          It's easy to get an AI to say a lot about a subject, but that doesn't mean anything the AI said was true. There's a significant risk that the AI has simply hallucinated the information, and now you "know" a bunch of false ideas about the subject, which is worse than not knowing anything about it.

        • xienze 40 minutes ago

          > Both kinds of students will exist.

          Yeah, and I'm betting there are gonna be a whole lot more "press the button to have all your work done for you" students than "work hard" students. FFS, even before all this there's been an alarming number of students arriving at college needing remedial classes.

  • charcircuit 44 minutes ago

    The entirety of school should eventually be replaced with just this one class. AI is able to teach people anything they may want or need to know and it can design effective ways for people to study. Being able to use, interpret, and work together with AI is going to be one of the most important skills of the 21st century.

    • wiseowise 7 minutes ago

      > Being able to use, interpret, and work together with AI is going to be one of the most important skills of the 21st century.

      But I thought the models are so good we don’t need humans anymore?

    • AnimalMuppet 37 minutes ago

      Maybe so. Still, learning how to tell when the AI is blowing smoke is going to be an important skill, and I'm not sure that AIs are going to be great at teaching that to you.

      And learning when other people (AI salespeople, say) are blowing smoke is also an important skill. Again, I'm not sure that AIs are great at teaching that.

    • armitron 27 minutes ago

      This level of naivety is characteristic of certain SV types where wishful thinking is the order of the day. We're already living through the disastrous effects of the "social media" revolution and this is going to be much more of the same, with even worse negative effects on society.

      Just imagine what this will do to critical thinking, interpersonal relationships and family dynamics in a country where illiteracy is rapidly climbing. I don't think it's a stretch to write that if the unrestrained capitulation in terms of societal costs towards big tech continues, we're setting ourselves up for {generational, class-based} conflict that will rip our country to pieces.