@someone
2025-05-03 16:36:40
What did you say? It is a weird model. Sometimes it says it doesn't know, or that nobody knows. For example, I asked "what does Nostr stand for?" and it said "nobody knows". Somehow it estimates that it doesn't know the answer? That kind of answer never happened with Llama 3; it always hallucinated before admitting it didn't know.