OpenCatalog, curated by FLOSSK
AI & Machine Learning

MNN

Alibaba’s lightweight inference engine for mobile and edge devices, used to run on-device LLMs and classic CV models with aggressive optimization.

Why it is included

Listed on TAAFT’s machine-learning / LLM repository feeds as a major Apache-2.0-licensed on-device inference stack.

Best for

Teams shipping neural networks on phones, embedded Linux, and constrained ARM SoCs.

Strengths

  • Strong edge performance
  • Small binary footprint
  • On-device LLM execution paths

Limitations

  • Smaller ecosystem than ONNX Runtime for general-purpose server inference

Good alternatives

ONNX Runtime · TensorFlow Lite · ExecuTorch
