OpenCatalog, curated by FLOSSK
AI & Machine Learning

Ollama

Local LLM runner and model library with simple CLI and API for workstation inference.
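Since the entry highlights Ollama's API, here is a minimal sketch of a request body for its documented `/api/generate` HTTP endpoint (served locally on port 11434 by default). The model name `llama3` and the prompt are illustrative placeholders, not part of this entry.

```python
import json

# Build a request payload for Ollama's local /api/generate endpoint.
# "llama3" is an example model name; any locally pulled model works.
payload = {
    "model": "llama3",
    "prompt": "Why is the sky blue?",
    "stream": False,  # ask for a single JSON response instead of a stream
}

# Serialize to the JSON body you would POST to http://localhost:11434/api/generate
body = json.dumps(payload)
print(body)
```

With a running Ollama instance, the same request can be issued from the CLI with `curl` or via the `ollama run` command.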

Why it is included

Lowers friction for privacy-preserving inference and offline experimentation on open weights.

Best for

Developers and power users testing models without cloud API bills.

If you use Windows, Mac, or paid tools

A local-model alternative to the ChatGPT, Claude, and Gemini cloud APIs for on-machine experimentation.

Strengths

  • Simple UX
  • Model packaging
  • Local-first

Limitations

  • Constrained by local hardware (RAM/VRAM)
  • Model licenses must be verified separately

Good alternatives

llama.cpp · vLLM
