It is objectively a lot more male than Reddit or other social media. Reddit has many issues, but lack of women is not one of them.

  • Zak@lemmy.world · 12 hours ago

    Both use a lot of energy, but operation accounts for the majority, not training.

    Running a (relatively) large model on your own PC’s GPU is energy-intensive compared to typical household electronics, but not compared to driving a car. People don’t usually object to someone playing an AAA game at 2K/240 Hz, which burns energy just as fast as running inference on the same GPU.

    A typical prompt and response uses maybe a quarter to half a watt-hour. That’s like running an LED light bulb for a few minutes; it’s the scale that makes these things problematic.
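    A quick sanity check of that comparison (the bulb wattage and per-prompt figure here are illustrative assumptions, not measurements):

    ```python
    # Rough check: how long would an LED bulb run on one prompt's energy?
    # Both numbers below are assumptions for illustration.
    prompt_wh = 0.5      # upper estimate from above: Wh per prompt + response
    led_bulb_w = 8.0     # a typical household LED bulb draws roughly 8 W

    minutes = prompt_wh / led_bulb_w * 60
    print(f"{minutes:.1f} minutes of LED light")  # ~3.8 minutes
    ```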

      • Zak@lemmy.world · 10 hours ago

        To a datacenter, tens or hundreds of thousands, which is my point about scale. One person using an LLM isn’t wasting any more power than they would be gaming on a PC, but a lot more people are using LLMs at any given time than are gaming.
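        The scale argument can be sketched with the same per-prompt figure (the aggregate prompt rate below is a made-up number purely to show how per-prompt energy compounds):

        ```python
        # Scale-up sketch: one prompt is cheap, but millions per minute add up.
        # prompts_per_minute is a hypothetical rate, not a real datacenter figure.
        prompt_wh = 0.5                  # Wh per prompt + response (upper estimate)
        prompts_per_minute = 1_000_000   # hypothetical aggregate rate

        mwh_per_hour = prompt_wh * prompts_per_minute * 60 / 1_000_000
        print(f"{mwh_per_hour:.0f} MWh per hour")  # 30 MWh/h, i.e. ~30 MW continuous
        ```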