Among the marvels of the human brain is its ability to generalize. We see an object, like a chair, and we know it's a chair, ...
According to Meta, memory layers may be the answer to LLM hallucinations, as they don't require huge compute resources at inference time.