
TinyLlama

1.1B-parameter Llama-architecture model trained on ~3T tokens—Apache-2.0 weights for fast experiments and teaching.

Why it is included

Tiny open checkpoint for CI, education, and edge prototypes without huge VRAM.

Best for

Students and engineers validating pipelines before scaling up to larger models.

Strengths

  • Small memory and compute footprint
  • Permissive Apache-2.0 license
  • Works with Llama-compatible tooling (see the sketch after this list)
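
A minimal sketch of that compatibility is shown below. It assumes the Hugging Face transformers library and the TinyLlama/TinyLlama-1.1B-Chat-v1.0 checkpoint (neither is named in this entry); because TinyLlama follows the Llama architecture, swapping model_id for a larger Llama-family checkpoint is the only change needed when scaling up.

    # Minimal sketch: load TinyLlama with Hugging Face transformers and
    # generate a short completion. The checkpoint name is an assumption,
    # not something stated in this catalog entry.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # assumed checkpoint
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    # Tokenize a prompt and generate a few tokens (CPU is enough at 1.1B).
    inputs = tokenizer("TinyLlama is", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=20)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))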

Limitations

  • Lower capability ceiling than 7B+ models

Good alternatives

SmolLM · Phi · Gemma
