Small Language Models: Architecture, Evolution, and the Future of Artificial Intelligence
Published in Preprints, 2026
This work surveys the landscape of Small Language Models (SLMs), defined here as models with fewer than 15 billion parameters. We introduce a multi-axis taxonomy that categorizes these models by their genesis, architecture, and optimization goals. We show that state-of-the-art SLMs match or exceed larger models in specialized domains such as mathematical reasoning and code generation, and we argue that the future of AI lies in hybrid ecosystems in which specialized SLMs handle most tasks locally and escalate complex queries to cloud-based LLMs.
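The hybrid local/cloud ecosystem mentioned above can be pictured as a simple confidence-based router: a local SLM answers first, and only low-confidence responses are escalated to a cloud LLM. The sketch below is illustrative only and not taken from the paper; every name in it (LocalSLM, CloudLLM, CONFIDENCE_THRESHOLD, Answer) is a hypothetical stand-in.

```python
# Illustrative sketch (not from the paper): confidence-based routing in which a
# local SLM handles most queries and escalates uncertain ones to a cloud LLM.
from dataclasses import dataclass

CONFIDENCE_THRESHOLD = 0.7  # assumed cutoff for escalation


@dataclass
class Answer:
    text: str
    confidence: float  # estimated confidence in [0, 1]
    source: str        # "slm" or "llm"


class LocalSLM:
    """Stand-in for an on-device small language model."""

    def generate(self, query: str) -> Answer:
        # A real implementation would run local inference; here we fake a
        # confidence score that drops for longer, more complex queries.
        confidence = 0.9 if len(query.split()) < 12 else 0.4
        return Answer(f"[SLM answer to: {query}]", confidence, "slm")


class CloudLLM:
    """Stand-in for a remote large language model endpoint."""

    def generate(self, query: str) -> Answer:
        return Answer(f"[LLM answer to: {query}]", 0.95, "llm")


def route(query: str, slm: LocalSLM, llm: CloudLLM) -> Answer:
    """Try the local SLM first; escalate to the cloud LLM when confidence is low."""
    answer = slm.generate(query)
    if answer.confidence >= CONFIDENCE_THRESHOLD:
        return answer
    return llm.generate(query)


if __name__ == "__main__":
    slm, llm = LocalSLM(), CloudLLM()
    print(route("Summarize this note.", slm, llm).source)  # handled locally
    print(route("Derive a closed-form solution for this multi-step constrained optimization problem.", slm, llm).source)  # escalated
```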
Recommended citation: @article{shah2026small, title={Small Language Models: Architecture, Evolution, and the Future of Artificial Intelligence}, author={Shah, Ankit Parag and Hosseini, Mohammad-Parsa and Park, Su Min and Miao, Connie and Wei, Wei}, journal={Preprints}, year={2026}, doi={10.20944/preprints202601.0973} }
Download Paper