llm-ls
A Rust LSP server that brings LLM-backed completions to editors, designed to pair with local or API-hosted models.
Why it is included
Listed in TAAFT’s #llm repository listings as Hugging Face’s Apache-2.0-licensed LLM language-server experiment.
Best for
Editor and tooling hackers adding LLM completion to LSP-compatible IDEs.
Strengths
- LSP-native: works with any editor that speaks the Language Server Protocol
- Local-model friendly: pairs with locally hosted backends as well as hosted APIs
- Small core: a lean, focused Rust codebase
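The “LSP-native” point means llm-ls is driven like any other language server: a client launches it and exchanges JSON-RPC messages framed with a Content-Length header. A minimal sketch of that wire format, assuming only the standard LSP base protocol (the helper name is illustrative, not part of llm-ls):

```python
import json

def frame_lsp_message(payload: dict) -> bytes:
    """Frame a JSON-RPC payload with the LSP Content-Length header."""
    body = json.dumps(payload).encode("utf-8")
    header = f"Content-Length: {len(body)}\r\n\r\n".encode("ascii")
    return header + body

# A minimal `initialize` request, as any LSP client would send
# to a server such as llm-ls over stdio.
init_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {"processId": None, "rootUri": None, "capabilities": {}},
}
msg = frame_lsp_message(init_request)
print(msg.decode("utf-8").splitlines()[0])  # the Content-Length header line
```

In practice an editor plugin (for example a Neovim or VS Code LSP client) handles this framing for you; the sketch only shows what travels over the pipe.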
Limitations
- Younger ecosystem than full-featured IDE extensions such as Continue
Good alternatives
Continue · Tabby · Copilot-style hosts
Related tools
IDEs & Editors
Continue
⏩ Source-controlled AI checks, enforceable in CI. Powered by the open-source Continue CLI
IDEs & Editors
Tabby
Self-hosted AI coding assistant
AI & Machine Learning
llama.cpp
Plain C/C++ inference for LLaMA-class models with broad community backends.
AI & Machine Learning
MNN
Alibaba’s lightweight inference engine for mobile and edge, used for on-device LLMs and classic CV models with aggressive optimization.
AI & Machine Learning
rtp-llm
Alibaba’s high-performance LLM inference engine (CUDA-focused) for production serving of diverse decoder architectures.
AI & Machine Learning
KVPress
NVIDIA research-oriented toolkit for LLM KV-cache compression to stretch context within fixed VRAM budgets.
