
@ Bartosz Milewski
2025-03-04 08:10:54
nostr:nprofile1qy2hwumn8ghj7un9d3shjtnddaehgu3wwp6kyqpqcfvhv2j38hxlyrracfgzt2yf8xzwaxulxdujcm5m54gkew4vprtqx9q7zp nostr:nprofile1qy2hwumn8ghj7un9d3shjtnddaehgu3wwp6kyqpqhpy6zyt0pa05nezps2x8hhrntua2hng8t495ya4rnyj7ne83ey9qn39gmq nostr:nprofile1qy2hwumn8ghj7un9d3shjtnddaehgu3wwp6kyqpqwrpqzd5elcczztl3tzs29kat0gy7e0js88cvshcvjrtj2483l6nqp45a92
I agree. I'm studying the LLM architecture and it's clear that it's just Eliza on steroids.
On the other hand, Google isn't reliable either. We don't say that Google is hallucinating answers--it's humanity that's hallucinating.