OpenCatalog · curated by FLOSSK
AI & Machine Learning

Mistral AI (open models)

Mistral’s open-weight checkpoints (e.g. the 7B-era models and Mixtral MoE) and Apache-2.0-licensed **code** sit alongside proprietary flagship lines; verify the license of each checkpoint before use.

Why it is included

A European-led open-weights effort with strong instruction-tuned and MoE releases used widely across the stack.

Best for

Teams that want permissively licensed smaller models, or MoE checkpoints released under clear Apache-2.0 weights where offered.

Strengths

  • Mixtral MoE
  • Strong OSS tooling alignment
  • Strong Hugging Face presence

Limitations

  • Model mix is not uniformly open; read each model card
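The per-checkpoint license check above can be sketched as a tiny filter over a hand-maintained map of checkpoint ids to license strings. The entries below are illustrative: Mistral-7B-v0.1 and Mixtral-8x7B-v0.1 are published under Apache-2.0, while `example-org/flagship-model` is a hypothetical stand-in for a non-open line.

```python
def apache_only(checkpoints):
    """Return checkpoint ids whose recorded license is Apache-2.0, sorted."""
    return sorted(
        repo for repo, lic in checkpoints.items()
        if lic.lower() == "apache-2.0"
    )

# Illustrative map; always confirm against the actual model card.
# "example-org/flagship-model" is a hypothetical non-open entry.
CHECKPOINTS = {
    "mistralai/Mistral-7B-v0.1": "apache-2.0",
    "mistralai/Mixtral-8x7B-v0.1": "apache-2.0",
    "example-org/flagship-model": "proprietary",
}

print(apache_only(CHECKPOINTS))
# → ['mistralai/Mistral-7B-v0.1', 'mistralai/Mixtral-8x7B-v0.1']
```

In practice the license string comes from the checkpoint’s model card on Hugging Face, not a local table; the point is that the filter runs per checkpoint, since Mistral’s lineup mixes open and proprietary models.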

Good alternatives

Meta Llama · Qwen · Gemma

Related tools