I’m sorry, I’m just so fucking angry. Students with sources that don’t exist. Students with sources that exist, but then the quotation doesn’t exist.

I’m so fucking mad, because it’s extra work for me (that I’m sure as hell not getting compensated for), and it also entirely defeats the purpose of the fucking class (it’s writing/research, so like, engaging in a discipline and looking at what’s been written before on your topic, etc.)

Kill me please. Comrades, I’m so tired. I just want to teach writing. I want to give students a way to exercise agency in the world – to both see bad arguments and make good ones. They don’t care. I’m so tired.

BTW, I took time to look up some of these sources my student used and couldn’t find the quotes they cited, so I told them the paper is an “A” if they can show me every quotation, and failing otherwise. Does this seem like a fair policy? (My thought is that, no matter the method, fabrication of evidence is justification for failing work.)

foucault-madness agony-shivering allende-rhetoric

  • sewer_rat_420 [he/him, any]@hexbear.net

    Your policy is fair, because then your students would hypothetically have to actually use their damn brains a tiny bit, which is what school should be about.

    I would also posit that any false quotation could just be docked one letter grade, so 5 of them is an F

  • SovietBeerTruckOperator@hexbear.net

    I’m getting my Master’s right now at a college that has a really big foreign student population, particularly Desi people. I was in a class with this one kid who straight up did not speak English, and not like he spoke it poorly or with a heavy accent; he just did not speak it at all. He had ChatGPT on his laptop every class. At the end of the semester we had student presentations, and this kid presented a slide show that was clearly made by AI, bad images and all, and kept trying to speak English phonetically. It was awkward as fuck.

  • Seasonal_Peace [he/him]@hexbear.net

    I tried using AI to help find sources for my partner’s thesis. It’s a niche topic on body phenomenology and existentialism in pregnancy and birth. Instead of finding real sources, it cited Heidegger books that don’t even exist. A colleague had recommended it, but honestly, you would have to be insane to rely on this.

    • fox [comrade/them]@hexbear.net

      I get so annoyed when people tell me to ask an AI something. It has no knowledge and no capacity for reason. The only thing it can do is produce an output that an inexpert human could potentially accept as true because the underlying statistics favour sequences of characters that, when converted to text and read by a human, appear to have a confident tone. People talk about AI hallucinating wrong answers and that’s giving it too much credit; either everything it outputs is a hallucination that’s accepted more often than not, or nothing it outputs is a hallucination because it’s not conscious and can’t hallucinate, it’s just printing sequential characters.
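
      To make the “printing sequential characters” point concrete, here’s a toy sketch (hypothetical, word-level Python, nothing like the scale of a real LLM, but the gist is the same): tally which words tend to follow which, then emit whatever the weights favour. Nothing in it knows or checks anything.

          import random

          # Toy word-level "model": counts of which word tends to follow which,
          # as if tallied from some training text (the numbers are made up here).
          follow_counts = {
              "power": {"grows": 8, "is": 2},
              "grows": {"out": 9, "slowly": 1},
              "out": {"of": 10},
              "of": {"the": 7, "a": 3},
              "the": {"barrel": 4, "people": 6},
          }

          def babble(word, steps=5):
              # Emit a confident-looking sequence by sampling each next word
              # from the observed frequencies. No facts or reasoning involved.
              out = [word]
              for _ in range(steps):
                  options = follow_counts.get(out[-1])
                  if not options:
                      break
                  words, weights = zip(*options.items())
                  out.append(random.choices(words, weights=weights)[0])
              return " ".join(out)

          print(babble("power"))  # e.g. "power grows out of the people"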

  • Mardoniush [she/her]@hexbear.net

    Pass/fail verbal exams. Seriously, just ask them a random, very simple question about their paper, and if they can’t answer it, then 0%.

    This bypasses the AI problem because, if they’ve gone to all the trouble of making a fake paper and then learning its bullshit by heart, then hopefully they’ve learned something about how papers are written, even if by accident.

  • I was in conversation with a friend who works in tech, and we were talking about a thing we wanted to find some science on. So I found a paper on it and started to read, but before I got done he replied by sending me a ChatGPT summary of the paper. Even from the little I’d read myself, I could already tell it wasn’t correct. What I really wanted to say to him was that I’d rather think for myself, tyvm.

    If students are all doing this now, nobody will think anything through themselves anymore or form an actual deep understanding of anything. Not really. Anyone who has done the reading knows how that shapes us and how it can develop into something deeper. It’s the f’n “innovation” these same tech types go on and on about that dies with this tech.

  • Philosoraptor [he/him, comrade/them]@hexbear.net

    > told them the paper is an “A” if they can show me every quotation, and failing otherwise. Does this seem like a fair policy? (My thought is that, no matter the method, fabrication of evidence is justification for failing work.)

    If the policy for plagiarism at your school is an F on the assignment, that seems fair to me. Asking LLMs to do your work is plagiarism.

    • ChestRockwell [comrade/them, any]@hexbear.net (OP)

      I mean, I could go that route, but I figure that, as a writer, fabricating quotations and evidence is fundamentally failing work.

      I’m trying to give the student the chance to save themselves too. If they had just cited (for instance) the quotation about “all great historical figures appear twice” as coming from The German Ideology instead of The 18th Brumaire, that’s not a problem: the quotation exists; the student was simply sloppy with documentation.

      However, to claim that someone stated something they didn’t – that’s just fundamentally failing work (it would be like going online and saying Mao said that “power grows out of the hands of the peasantry” instead of “power grows out of the barrel of a gun”).

      I should note - my class has a policy that students can use AI as long as they clear it with me. However, they’re responsible for their work, and I won’t accept work with fake quotes. That’s dogshit writing.

  • GoodGuyWithACat [he/him]@hexbear.net

    > BTW, I took time to look up some of these sources my student used and couldn’t find the quotes they cited, so I told them the paper is an “A” if they can show me every quotation, and failing otherwise. Does this seem like a fair policy? (My thought is that, no matter the method, fabrication of evidence is justification for failing work.)

    Most class syllabi I’ve seen put LLM use in the same category as plagiarism: an automatic failure on the assignment, and sometimes failure of the class.

  • sgtlion [any]@hexbear.net

    Seems a fair policy. I like to imagine that if you stress this policy up front, students might actually check and verify all their own sources (and thus actually do their own research, even with AI stuff).

  • plinky [he/him]@hexbear.net

    If two students are using the exact same (unneeded) variable in a script, did they use a similar prompt, or do they talk to each other? saruman-orb

    Fucking hate this shit; something that could be done in like 20 lines of code is like 200. I don’t particularly have to care, cause I’m not teaching programming, but jesus christ.
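
    To illustrate the tell (a made-up example, not from any actual submission): the giveaway is usually a pointlessly named intermediate variable that two different students both “happen” to produce verbatim, in code that takes ten lines to do what one line would.

        # Hypothetical LLM-flavoured version of "average a list of numbers":
        def calculate_average_of_values(list_of_numbers):
            # Initialize a variable to store the running total of the values
            total_sum_of_values = 0
            # Iterate over each value and add it to the running total
            for current_value in list_of_numbers:
                total_sum_of_values = total_sum_of_values + current_value
            # Determine the total number of values in the list
            total_number_of_values = len(list_of_numbers)
            # Compute the final calculated average value
            final_calculated_average = total_sum_of_values / total_number_of_values
            return final_calculated_average

        # versus what most people would write on their own:
        def average(numbers):
            return sum(numbers) / len(numbers)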

  • Beaver [he/him]@hexbear.net

    > (it’s writing/research, so like, engaging in a discipline and looking at what’s been written before on your topic, etc.)

    > BTW, I took time to look up some of these sources my student used and couldn’t find the quotes they cited, so I told them the paper is an “A” if they can show me every quotation, and failing otherwise. Does this seem like a fair policy? (My thought is that, no matter the method, fabrication of evidence is justification for failing work.)

    I think they will learn an important life lesson: if they’re going to cheat, then they have to, at a minimum, be sure that they’re “getting the right answer”. The tide of AI dystopia is unstoppable, but you can at least teach them that they can’t completely shut their brains off to the point of presenting entirely fabricated research and factual claims.

  • TrustedFeline [she/her, comrade/them]@hexbear.net

    Trash Future repeatedly makes the point that AI chat bots are the inverse of the printing press. The printing press created a way for information to be reliably stored, retrieved, and exchanged. It created a sort of ecosystem where ideas (including competing ideas) could circulate in society.

    Chat bots do the opposite. They basically destroy the reliable transmission of information and ideas. Instead of creating reliable records of human thought (models, stories, theories, etc.), they’re a black box that randomly messes with averages. It’s so fucking harmful.

  • Esoteir [he/him]@hexbear.net

    Cheating in education in general, AI or not, is caused mostly by the financial and systemic repercussions of failing. When these students fail a class, it’s often another few thousand dollars they don’t have down the drain, and if they fail too many classes, it locks them out of higher education entirely.

    Failure is one of the biggest drivers of true learning, and the educational system directly discourages it.

    • ChestRockwell [comrade/them, any]@hexbear.net (OP)

      Oh I get that – the financial reality is there for sure, and I recognize they have other classes, etc. Don’t get me wrong, I know who the “true” villain is.

      Doesn’t mean I can’t be mad at these AI companies for unleashing this on us. It actively makes teaching the skills to understand writing harder, since students can get close to “good” writing with these machines, but the writing they produce crumbles under the slightest scrutiny. We’re actively harming thought and understanding with them.

  • DinosaurThussy [they/them]@hexbear.net

    Using AI to write papers for a writing class is like using speech-to-text for a touch-typing course. You’re bypassing the exercises that would actually provide the value you’re paying for.