• vnshng@lemmy.world
      1 year ago

      I asked Perplexity that same question. It did somewhat better: it made no errors in temperatures like the others do, it just left those details out initially. After follow-up questions it answered correctly, but it also gave some unnecessary and unrelated information.

      I didn’t use any of the prompts; I was asking about saggar firing processes and temps, so my questions were just ceramics-related.

    • Wholesalechicken@lemmy.world
      1 year ago

      It’s hard to trust something that feels like it’s lying to you all the time. I asked it about a topic I’m involved in and have a website about, and it told me the website was hypothetical. It got that wrong twice: even after it agreed it was wrong, it told me the wrong thing again.

      Is this what they consider hallucinations?