But how do we ground our relations to hallucination?

Reading Klosterman's "But What If We're Wrong?" while pursuing "hallucination". Reflecting on @leahu2016ontological, @munk2022thick, and @rettberg2022algorithmic, and trying to "see the world we already inhabit" [@powles2018seductive].

Added 2023-08-04 12:07:04:

@goodside via Twitter on Aug 4, 2023:

> "we can't trust LLMs until we can stop them from hallucinating" says the species that literally dies if you don't let them go catatonic for hours-long hallucination sessions every night