ChatGPT is impressive, but it’s missing a vital component. That’s according to Ehud Karpas, a team lead at a company that develops generative AI for text.

To that end, three of the 12 identified shortcomings concern facts: statistical facts, historical facts and facts about nature.

“We think that even the best model will have weaknesses, just because it’s one model that’s good in some stuff, but it has flaws,” Karpas said.

One way to address that, the paper proposes, is through Retrieval Augmented Language Modeling (RALM), which grounds the language model “during generation by conditioning on relevant documents retrieved from an external knowledge source.”
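To make the idea concrete, here is a minimal sketch of retrieval-augmented generation: relevant documents are fetched from an external corpus and prepended to the prompt so the model can condition on them. The corpus, the word-overlap scoring, and the prompt format are illustrative assumptions, not AI21’s actual implementation.

```python
import re

def tokenize(text):
    """Lowercase word set for naive overlap scoring (illustrative only)."""
    return set(re.findall(r"\w+", text.lower()))

def retrieve(query, corpus, k=2):
    """Rank documents by word overlap with the query; return the top k."""
    q = tokenize(query)
    scored = sorted(corpus, key=lambda doc: -len(q & tokenize(doc)))
    return scored[:k]

def build_prompt(query, docs):
    """Ground generation by conditioning on the retrieved documents."""
    context = "\n".join(f"- {d}" for d in docs)
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

# Toy external knowledge source (hypothetical data).
corpus = [
    "The Eiffel Tower is 330 metres tall.",
    "Mount Everest is the highest mountain on Earth.",
    "Paris is the capital of France.",
]

query = "How tall is the Eiffel Tower?"
prompt = build_prompt(query, retrieve(query, corpus))
```

In a real RALM system the overlap scorer would be replaced by a dense or sparse retriever, and `prompt` would be passed to a language model; the grounding mechanism, however, is the same: the answer is conditioned on retrieved evidence rather than on parametric memory alone.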
