OpenCatalog, curated by FLOSSK

Browse & filter

Filter by platform, license text, maturity, maintenance cadence, and editorial tags like privacy-focused or self-hosted. Search matches names, summaries, tags, and use cases.
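
As a minimal sketch (not the catalog’s actual implementation), the matching rule described above can be expressed as: a query matches a tool when it appears in the name, summary, tags, or use cases. The `Tool` record and the sample entries below are hypothetical.

```python
from dataclasses import dataclass, field


@dataclass
class Tool:
    # Hypothetical catalog record; the fields mirror what the search covers.
    name: str
    summary: str
    tags: list[str] = field(default_factory=list)
    use_cases: list[str] = field(default_factory=list)


def matches(tool: Tool, query: str) -> bool:
    # Case-insensitive substring match over name, summary, tags, and use cases.
    q = query.lower()
    searchable = [tool.name, tool.summary, *tool.tags, *tool.use_cases]
    return any(q in text.lower() for text in searchable)


catalog = [
    Tool("Llama", "Meta's open-weights LLM family", ["llm", "open-weights"], ["chat"]),
    Tool("Falcon", "TII's open model line", ["llm", "apache-2"], ["fine-tuning"]),
]
print([t.name for t in catalog if matches(t, "apache")])  # -> ['Falcon']
```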

7 tools match your filters

Meta’s Llama family of open **weights** (subject to the Llama license) with reference code, tooling, and downloads via Hugging Face under the meta-llama org (loading sketch below).

llm · open-weights · meta · foundation-model
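
A minimal loading sketch, assuming the `transformers` library and access to a gated meta-llama repo (the Llama license must be accepted on Hugging Face first); the model id is only an illustrative placeholder.

```python
# Minimal sketch: download and run a Llama checkpoint with transformers.
# The repo id is illustrative; gated Llama repos require accepting the Llama
# license on Hugging Face and authenticating (e.g. `huggingface-cli login`).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-3.1-8B-Instruct"  # illustrative checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Open weights, not open source:", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```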

Mistral’s open-weight checkpoints (e.g. the Mistral 7B era and Mixtral MoE) and Apache-2.0–licensed **code**, published alongside proprietary flagship lines; verify the license of each checkpoint before use (see the sketch below).

llm · open-weights · moe · europe · foundation-model
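
One way to do that per-checkpoint verification, sketched with the `huggingface_hub` client; the repo id is illustrative, and the `license:` tag is the generic Hugging Face convention for a repo’s declared license, not anything Mistral-specific.

```python
# Sketch: read the declared license of a specific checkpoint before relying on it.
# The repo id is illustrative; Apache-2.0 checkpoints sit alongside differently
# licensed artifacts, so each repo should be checked individually.
from huggingface_hub import model_info

repo_id = "mistralai/Mixtral-8x7B-v0.1"  # illustrative checkpoint
info = model_info(repo_id)

# Hugging Face exposes the declared license as a repo tag such as "license:apache-2.0".
license_tags = [tag for tag in info.tags if tag.startswith("license:")]
print(repo_id, license_tags or "no license tag declared")
```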

Alibaba’s Qwen family (dense and MoE) with strong multilingual and coding variants; weights and code are on Hugging Face under the license stated for each release (download sketch below).

llm · open-weights · coding · multilingual · foundation-model
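
A download-only sketch using `huggingface_hub.snapshot_download`; the repo id and file patterns are illustrative, and the applicable terms are whatever license the chosen release states.

```python
# Sketch: fetch a Qwen release's weights and config locally without loading them.
# The repo id is illustrative; read the license stated in that repo first.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    "Qwen/Qwen2.5-7B-Instruct",                           # illustrative release
    allow_patterns=["*.safetensors", "*.json", "*.txt"],  # weights plus tokenizer/config files
)
print("downloaded to:", local_dir)
```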

DeepSeek’s open-weight models (e.g. the V3/R1 lineage), released under MIT or custom terms depending on the release; high-capability coding and reasoning checkpoints.

llm · open-weights · reasoning · coding · foundation-model

Google’s smaller open-**weights** Gemma line (Gemma 2/3, etc.) under the Gemma license terms, plus `gemma.cpp` for lightweight CPU inference.

llm · open-weights · google · edge · foundation-model

Technology Innovation Institute’s Falcon open weights (7B–180B era), with many releases under Apache-2.0; a landmark UAE-led open model line.

llm · open-weights · apache-2 · foundation-model

Google Research’s pretrained time-series foundation model for forecasting, with open Apache-2.0 code and checkpoints.

time-series · forecasting · foundation-model · taaft-repositories