A compact, open-source LLaMA-based language model (~1.1B parameters) pretrained on 3 trillion tokens and released under the Apache 2.0 license.
TinyLLaMA is an open-source language model from jzhang38's team, designed as a lightweight yet capable alternative to larger LLaMA models. Its 1.1B base model is trained on an impressive corpus of 3 trillion tokens and follows the original LLaMA architecture and tokenizer. The project includes fully reproducible checkpoints, a chat-finetuned variant, and shared evaluation benchmarks. As a lightweight model, TinyLLaMA serves as a practical alternative to larger models like LLaMA-3.1 or GPT-NeoX when computational resources are limited, without sacrificing strong performance.
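Because TinyLLaMA reuses the original LLaMA architecture and tokenizer, it loads with standard LLaMA tooling. Below is a minimal sketch of querying the chat-finetuned variant through Hugging Face transformers; the repo id TinyLlama/TinyLlama-1.1B-Chat-v1.0 and the presence of a chat template in the tokenizer config are assumptions, not details from this page.

```python
# Minimal sketch: chatting with TinyLLaMA via Hugging Face transformers.
# The repo id below is an assumption about where the chat checkpoint lives.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # assumed Hugging Face repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # 1.1B params in fp16 fits on modest hardware
    device_map="auto",
)

# The chat variant is assumed to ship a chat template in its tokenizer config.
messages = [{"role": "user", "content": "Explain TinyLLaMA in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=64, do_sample=False)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

At roughly a tenth the size of a 13B model, this should run on a single consumer GPU, and CPU-only inference is plausible for short generations.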
Key features include:
- Compact 1.1B-parameter base model trained on a 3 trillion token corpus
- Same architecture and tokenizer as the original LLaMA, so it works with existing LLaMA tooling
- Fully reproducible training checkpoints and shared evaluation benchmarks
- Chat-finetuned variant for dialogue applications
- Permissive Apache 2.0 license
Use cases include:
- Running LLM workloads where computational resources are limited
- On-device and edge deployment
- Chat assistants built on the chat-finetuned variant
- Research and experimentation that needs a small, fully open LLaMA-style model