in terms of communication utility, it’s also a very accurate term.
when WE hallucinate, it’s because our internal predictive models fly off the rails, filling in the blanks based on assumptions rather than concrete sensory information, and generating results that conflict with reality.
when AIs hallucinate, it’s because their predictive models generate results that don’t align with reality; they fly off the rails presuming what was calculated to be likely rather than referencing verified information.
it’s the same song, but played on a different instrument.
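a toy sketch of what “flying off the rails on likelihood” looks like in code. this is purely illustrative, nothing like a real model: just bigram counts over a tiny made-up corpus, always picking the statistically most likely next word with no notion of truth:

```python
# Toy illustration (NOT a real language model): a "model" that only knows
# which words tend to follow which words, with no grounding in facts.
from collections import Counter

corpus = (
    "the eiffel tower is in paris . "
    "the leaning tower is in pisa . "
    "the eiffel tower is tall . "
).split()

# Count which word follows each word in the corpus.
follows = {}
for a, b in zip(corpus, corpus[1:]):
    follows.setdefault(a, Counter())[b] += 1

def continue_text(prompt, steps=3):
    """Greedily extend the prompt with the most likely next word each step."""
    words = prompt.split()
    for _ in range(steps):
        options = follows.get(words[-1])
        if not options:
            break
        # Always pick what was calculated to be likely --
        # never what is verified to be true.
        words.append(options.most_common(1)[0][0])
    return " ".join(words)

print(continue_text("the leaning tower is"))
# → "the leaning tower is in paris ." -- fluent, confident, and wrong,
# because "in paris" is simply the most frequent continuation of "is in".
```

the output reads perfectly plausibly while being factually false, which is the whole point: nothing in the mechanism checks against reality, only against likelihood.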
Is it really? You make it sound like this is a proven fact.
Can we fucking stop anthropomorphising software?
“Hallucinate” is the standard term used to explain the GenAI models coming up with untrue statements
I believe that’s where the scientific community is moving, based on watching this Kyle Hill video.
Here is an alternative Piped link(s):
this Kyke Hill video
Piped is a privacy-respecting open-source alternative frontend to YouTube.
I’m open-source; check me out at GitHub.
I know I’m responding to a bot, but… how does a PipedLinkBot turn “Kyle Hill” into “Kyke Hill”? More AI hallucinations?
OP has a pencil icon in the top right; looks like the comment was edited.
What standard is that? I’d like a reference.
https://en.m.wikipedia.org/wiki/Hallucination_(artificial_intelligence)
It’s as much a “Hallucination” as Tesla’s Autopilot is an Autopilot
https://en.m.wikipedia.org/wiki/Tesla_Autopilot
I don’t propagate techbro “AI” bullshit peddled by companies trying to make a quick buck
Also, in the world of science and technology a “Standard” means something. Something that’s not a link to a Wikipedia page.
It’s still anthropomorphising software and it’s fucking cringe.