kersploosh@sh.itjust.works to Programmer Humor@programming.dev · 2 days ago
It works tho (video)
Jankatarch@lemmy.world · 2 days ago
I am actually pretty OK with this type of "messing around" usage. On the condition they also stop killing the environment to train and run these stupid things.

entropicdrift@lemmy.sdf.org · 2 days ago
Yeah, if they were just running it locally off a GPU it would be cooler.

psud@aussie.zone · 2 days ago
Running an LLM isn't expensive, whether locally or in the cloud; all the cost is in the training.
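(For reference, a minimal sketch of what "running it locally off a GPU" can look like, using the Hugging Face transformers library. The model name here is just an example; any small causal LM you have the hardware for works the same way.)

```python
# Minimal local LLM inference sketch with Hugging Face transformers.
# Assumes `transformers`, `torch`, and `accelerate` are installed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # example small model

tokenizer = AutoTokenizer.from_pretrained(model_name)
# device_map="auto" places the model on the GPU if one is available.
model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto")

inputs = tokenizer("Tell me a programmer joke:", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```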