Emergence
Capabilities that appear at scale but were never explicitly trained for, such as arithmetic or multi-step reasoning emerging from text prediction.
Emergence describes the phenomenon where foundation models develop unexpected abilities as they scale up. A model trained only to predict the next token might spontaneously develop the ability to do arithmetic, write code, or reason through complex problems. Stanford's Center for Research on Foundation Models (CRFM) identifies emergence as a defining characteristic of foundation models, and the prospect of further emergent abilities helped drive the scaling hypothesis that dominated AI strategy from 2020 through 2024.
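The hallmark of an emergent capability is its benchmark curve: performance hovers near the chance baseline across smaller models, then climbs sharply once scale crosses some threshold. The toy sketch below illustrates that shape only; the threshold, sharpness, and parameter counts are made-up numbers, not fit to any real benchmark.

```python
import math

def toy_emergence_curve(params_log10, chance=0.25, threshold=10.5, sharpness=4.0):
    """Toy model of an emergent-capability curve (hypothetical numbers).

    Accuracy sits near the chance baseline (e.g. 0.25 on a 4-way
    multiple-choice task) for small models, then rises sharply once
    log10(parameter count) crosses `threshold`. Illustrative only.
    """
    gain = 1.0 / (1.0 + math.exp(-sharpness * (params_log10 - threshold)))
    return chance + (1.0 - chance) * gain

# Sweep model scale from ~100M to ~1T parameters.
for p in [8, 9, 10, 11, 12]:
    print(f"10^{p} params -> accuracy ~ {toy_emergence_curve(p):.2f}")
```

Running the sweep shows accuracy staying near 0.25 through 10^9 parameters and jumping above 0.9 by 10^11, the discontinuous-looking transition that distinguishes emergent abilities from smooth scaling improvements.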
Also known as
emergent capabilities, emergent abilities, emergent behavior