BLOOMZ (Hugging Face)
BigScience's instruction-tuned derivatives of BLOOM (checkpoints from BLOOMZ-560M up to the full 176B) for multilingual zero-shot instruction following, hosted on the Hugging Face Hub.
Why it is included
BLOOMZ remains widely downloaded as the instruction-tuned sibling of BLOOM and a standard baseline in multilingual benchmarks.
Best for
Multilingual instruction-following research and baseline comparisons against newer open instruction-tuned model lines.
Strengths
- Strong multilingual instruction-tuning heritage
- Pairs with existing BLOOM tooling and tokenizer
Limitations
- Weights carry BigScience RAIL license obligations; the larger checkpoints are costly to run
Good alternatives
BLOOM · Llama · Aya
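As a minimal sketch of what "zero-shot instruction following on the Hub" looks like in practice: the snippet below prompts a BLOOMZ checkpoint through the Transformers `pipeline` API. The checkpoint names are the instruction-tuned sizes published under the `bigscience` org (parameter counts approximate); the helpers `pick_checkpoint` and `zero_shot` are illustrative, not part of any library.

```python
# Illustrative helpers (not from the BLOOMZ docs) for zero-shot prompting
# BLOOMZ via Hugging Face Transformers.
BLOOMZ_CHECKPOINTS = {
    "bigscience/bloomz-560m": 0.56,   # sizes in billions of parameters
    "bigscience/bloomz-1b7": 1.7,
    "bigscience/bloomz-3b": 3.0,
    "bigscience/bloomz-7b1": 7.1,
    "bigscience/bloomz": 176.0,       # full 176B model; multi-GPU territory
}

def pick_checkpoint(max_params_b: float) -> str:
    """Largest BLOOMZ checkpoint at or under a parameter budget (billions)."""
    fits = {m: p for m, p in BLOOMZ_CHECKPOINTS.items() if p <= max_params_b}
    if not fits:
        raise ValueError(f"no BLOOMZ checkpoint fits {max_params_b}B parameters")
    return max(fits, key=fits.get)

def zero_shot(prompt: str, max_params_b: float = 1.0) -> str:
    """Run a zero-shot instruction through the largest checkpoint that fits.

    Requires `pip install transformers torch` and downloads weights on first
    use, so the heavy import is kept out of module load.
    """
    from transformers import pipeline  # lazy import: heavy dependency
    pipe = pipeline("text-generation", model=pick_checkpoint(max_params_b))
    return pipe(prompt, max_new_tokens=40)[0]["generated_text"]
```

Under the default 1.0B budget, `zero_shot("Translate to French: I am happy.")` would load `bigscience/bloomz-560m`, the smallest and cheapest checkpoint to experiment with.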
Related tools
AI & Machine Learning
BLOOM
BigScience's 176B-parameter multilingual causal LM, a landmark collaborative open training effort on the Jean Zay supercomputer (weights under the BigScience Responsible AI License).
AI & Machine Learning
Hugging Face Transformers
State-of-the-art pretrained models for PyTorch, TensorFlow, and JAX.
AI & Machine Learning
Axolotl
YAML-configured fine-tuning for LLMs: LoRA, QLoRA, FSDP, and many architectures on top of Hugging Face trainers.
AI & Machine Learning
Qwen
Alibaba’s Qwen family (dense and MoE) with strong multilingual and coding variants; weights and code on Hugging Face under the license stated for each release.
AI & Machine Learning
Yi
01.AI's Yi open-weight bilingual models (English/Chinese focus), released on Hugging Face under Apache-2.0 or the Yi license depending on the checkpoint.
AI & Machine Learning
SmolLM
Hugging Face's SmolLM small-model family (135M–1.7B parameters) with Apache-2.0 weights, aimed at strong quality per parameter for on-device and edge use.
