The summer is over, schools are back, and the data is in: ChatGPT is mainly a tool for cheating on homework.

ChatGPT traffic dropped when summer began and schools closed. Now students are back, and usage of the AI tool is climbing again.

  • @ShittyRedditWasBetter@lemmy.world · 2 points · 2 years ago

    Egh. Most adults are ignorant of how it works and how to prompt well, and they’re scared of using it at work.

    Kids are going to gobble that shit up. It’ll be used for cheating until those kids get into the workforce.

    • @naught@sh.itjust.works · 2 points · 2 years ago (edited)

      Yeah, shitty article. AI is the same as calculators. So what? Kids don’t have to waste mental cycles on your tedious and borderline sadistic amounts of homework. Go fly a kite.

      edit: clarification

    • @Solumbran@lemmy.world · 3 points · 2 years ago

      If anything, people who like ChatGPT are the ones ignorant of how it works (spoiler: it doesn’t).

      And kids, whose understanding of technology is limited to YouTube and TikTok, have no clue what an AI is. They see it, like most people do, as a magic black box that is incredibly smart. Apart from the black box part, none of that is true.

      • @astronaut_sloth@mander.xyz · -1 point · 2 years ago

        It works well if you know how to prompt it well. LLMs can do a lot, but the user needs to know how to use them correctly. The technology is still in its infancy, so it’s a bit difficult to use well.

        • @Solumbran@lemmy.world · 3 points · 2 years ago

          It does not work. At most it looks like it works.

          A human brain is able to understand and process information. An AI simply calculates a mathematical function. There is no reasoning and no understanding of anything; all ChatGPT does is try to look like a human. And by “try to look like a human”, I mean “generate sentences that could believably, in form, have been written by a human”.

          If you ask it to calculate 2+2 and then tell it that it equals 5, it won’t see any problem, because it doesn’t understand any of it. But it will give you answers that are, grammatically speaking, reasonably human.
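
          To make that concrete, here is a toy sketch (purely illustrative, not how ChatGPT is actually implemented): a language model is just a function from a context to a probability distribution over possible next tokens, and it continues “2+2=” with whatever token is statistically likely, not with the result of a calculation. The tiny bigram table and its numbers below are made up for the example.

```python
import random

# Made-up "training statistics": how often each token followed each context.
# A real LLM replaces this lookup table with a huge neural network, but the
# output is still just a probability distribution over next tokens.
bigram_counts = {
    "2+2=": {"4": 80, "5": 15, "7": 5},
    "The sky is ": {"blue": 90, "falling": 10},
}

def next_token(context: str) -> str:
    """Sample a next token in proportion to how often it followed this context."""
    counts = bigram_counts[context]
    tokens = list(counts)
    weights = [counts[t] for t in tokens]
    return random.choices(tokens, weights=weights, k=1)[0]

# The "answer" is only the statistically likely continuation; no arithmetic is
# performed. Feed it lots of text saying "2+2=5" and the table (or a real
# network's weights) shifts, and so does the answer.
print("2+2=" + next_token("2+2="))
```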

          If I ask a rock what 2+2 is, throw it, and it bounces 4 times, that doesn’t mean the rock knows how to count. ChatGPT and similar tools are just better illusions, nothing more.