• @lugal@sopuli.xyz
    5 months ago

    But does it work to tell it not to hallucinate? And does it work the other way around too?

    • @orcaA
      5 months ago

      It’s honestly a gamble in my experience. Instructions I’ve given ChatGPT have worked for a while, only to be mysteriously abandoned for no apparent reason. From the research I’ve done, telling the AI not to hallucinate is apparently common practice.
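
      For anyone curious what “telling it not to hallucinate” looks like in practice, here’s a minimal sketch assuming the official OpenAI Python client; the model name and the exact instruction wording are placeholders, not anything from the comments above.

      ```python
      # Minimal sketch: put the standing instruction in the system message.
      # Model name and wording are placeholders, not a recommendation.
      from openai import OpenAI

      client = OpenAI()  # reads OPENAI_API_KEY from the environment

      response = client.chat.completions.create(
          model="gpt-4o-mini",  # placeholder; any chat model works here
          messages=[
              {
                  "role": "system",
                  "content": (
                      "Answer only from information you are confident about. "
                      "If you are unsure, say you don't know instead of guessing."
                  ),
              },
              {"role": "user", "content": "Who won the 1987 Tour de France?"},
          ],
      )

      print(response.choices[0].message.content)
      ```

      Whether the model actually follows that instruction is exactly the gamble described above; the system message only nudges the output, it doesn’t enforce anything.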