I asked Perplexity that same question. It did somewhat better; it made no errors in temperatures like the others do. It just left those details out initially. After follow-up questions it answered correctly, but also gave some unnecessary and unrelated information.
I didn’t use any of the prompts; I was asking about saggar firing processes and temps. The prompts were just ceramics-related.
Hard to trust something that feels like it’s lying to you all the time. I asked it about a topic I work in and have a website about, and it told me the website was hypothetical. It got it wrong twice: even after it agreed it was wrong, it repeated the same wrong answer.
Just tried it out, with some questions about ceramic firing in an electric kiln. It seems to have similar accuracy to ChatGPT, maybe closer to GPT-4.
It’s not clear when using it what version it’s on, so this may have been Claude 1; I’m not sure where to check.
deleted by creator
deleted by creator
Is this what they consider hallucinations?