Who reads this anyway? Nobody, that’s… Oh wait. Some people actually do. I guess I should put something worth reading in here then. Err… Let’s go with lorem ipsum for the time being.

Lorem ipsum dolor sit amet, consectetur adipiscing elit. Nam eu libero vitae augue pretium sollicitudin…

  • 1 Post
  • 2 Comments
  • Joined 3 years ago
  • Cake day: June 5th, 2023


  • Here’s the interesting part (a minimal VAE sketch follows the quotes below).

    “We didn’t do any LLMs. There is significant interest in that. There are lots of people trying those ideas out, but I think they’re still in the exploratory phase,” Desai told El Reg.

    As it turned out, the researchers didn’t need them. “We used a simpler model called a variational auto encoder (VAE). This model was established in 2013. It’s one of the early generative models,” Desai said.

    By sticking with domain-specific models based on more mature architectures, Sandia also avoided hallucinations – the errors that arise when AI makes stuff up – which have become one of the biggest headaches associated with deploying generative AI.

    “Hallucinations were not that big a concern here because we build a generative model that is tailored for this very specific task,” Desai explained.
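
    For anyone who hasn’t met the architecture Desai is describing: a VAE (Kingma & Welling, 2013) pairs an encoder that maps data to a latent Gaussian distribution with a decoder that reconstructs data from samples of it, trained on reconstruction error plus a KL penalty. Below is a minimal, made-up NumPy sketch of that structure; the dimensions and weights are placeholders for illustration only, not anything from Sandia’s actual model.

    # Minimal sketch of a variational autoencoder (VAE) forward pass, NumPy only.
    # Purely illustrative: layer sizes, weights, and data are invented placeholders,
    # NOT Sandia's model. It just shows the encode / reparameterize / decode
    # structure and the ELBO-style loss that defines the 2013 VAE architecture.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy dimensions (assumptions for illustration only)
    x_dim, h_dim, z_dim = 16, 32, 4

    # Randomly initialised weights standing in for a trained model
    W_enc    = rng.normal(scale=0.1, size=(x_dim, h_dim))
    W_mu     = rng.normal(scale=0.1, size=(h_dim, z_dim))
    W_logvar = rng.normal(scale=0.1, size=(h_dim, z_dim))
    W_dec1   = rng.normal(scale=0.1, size=(z_dim, h_dim))
    W_dec2   = rng.normal(scale=0.1, size=(h_dim, x_dim))

    def encode(x):
        """Map input x to the parameters of a diagonal Gaussian over latent z."""
        h = np.tanh(x @ W_enc)
        return h @ W_mu, h @ W_logvar          # mean, log-variance

    def reparameterize(mu, logvar):
        """Sample z = mu + sigma * eps so the sampling step stays differentiable."""
        eps = rng.normal(size=mu.shape)
        return mu + np.exp(0.5 * logvar) * eps

    def decode(z):
        """Map a latent sample z back to a reconstruction of x."""
        return np.tanh(z @ W_dec1) @ W_dec2

    def elbo_loss(x):
        """Reconstruction error plus KL divergence to the standard-normal prior."""
        mu, logvar = encode(x)
        z = reparameterize(mu, logvar)
        x_hat = decode(z)
        recon = np.mean((x - x_hat) ** 2)
        kl = -0.5 * np.mean(1 + logvar - mu**2 - np.exp(logvar))
        return recon + kl

    x = rng.normal(size=(8, x_dim))            # a toy batch of 8 samples
    print("ELBO-style loss on random weights:", elbo_loss(x))

    Training would simply update those weights by gradient descent on that loss; the reparameterisation trick is what keeps the sampling step differentiable, and the tight, task-specific decoder is why a model like this doesn’t “hallucinate” in the way a general-purpose LLM can.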