John Schinnerer
Aug 25, 2023

--

"...remember, for every question it sidesteps, there’s a chance someone will turn to less reliable sources."

There's nothing 'reliable' about an LLM when it comes to factual information. The model is not 'aware' of fact versus fiction and does no "fact-checking"; it predicts statistically plausible next tokens, not verified claims. The "hallucinations" it spits out are not an 'anomaly'; they are a direct consequence of how LLMs operate.

As to the degradation of output, you might want to look into what happens when a recursive system gets most of its input from its own prior output. Errors are magnified internally over successive iterations and the system ends up in a destabilizing feedback loop. Without external stabilizing influences (outside input that counters the internally generated errors and degradation) it's downhill all the way...
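
The mechanism is easy to see in a toy simulation (my own sketch, not anything from the article): the "model" is just a Gaussian fit, refit each generation on samples drawn from the previous generation's fit. The names (fit, generate, run) and the fresh_fraction parameter are illustrative choices; fresh_fraction stands in for the external stabilizing input mentioned above.

```python
import random
import statistics

def fit(samples):
    # "Train" the toy model: estimate mean and spread from its training data.
    return statistics.mean(samples), statistics.pstdev(samples)

def generate(mean, std, n):
    # "Generate" n outputs from the fitted model.
    return [random.gauss(mean, std) for _ in range(n)]

def run(generations=50, n=100, fresh_fraction=0.0):
    # Repeatedly refit the model on its own output. fresh_fraction is the
    # share of each generation's training data drawn from the original
    # (external) source; 0.0 means the loop only ever sees itself.
    true_mean, true_std = 0.0, 1.0
    mean, std = fit(generate(true_mean, true_std, n))  # generation 0: real data
    for _ in range(generations):
        own = generate(mean, std, round(n * (1 - fresh_fraction)))
        fresh = generate(true_mean, true_std, n - len(own))
        mean, std = fit(own + fresh)
    return mean, std

random.seed(0)
print("closed loop, no external input:", run(fresh_fraction=0.0))
print("30% external input each round: ", run(fresh_fraction=0.3))
```

With fresh_fraction = 0.0 the estimates tend to wander away from the original distribution, because nothing corrects the sampling error that accumulates each generation; even a modest share of external data keeps them anchored.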

--
