01
The Model as a Reasoning Substrate
The first shift was not better autocomplete. It was general reasoning with context.
Large language models arrived as something genuinely different: not just search, not just retrieval, but systems that could take context and produce useful reasoning. Early adoption framed them as assistants. The deeper implication was that a model able to reason about a problem could also reason about what to do next.