AI / LLMs — privacy-respecting options » Local Runtime
B LM Studio
● up · checked 26m ago · verified 1 day ago
Desktop GUI for local LLMs. Drag-and-drop GGUF models, OpenAI-compatible local API, no cloud.
At a glance
- no-KYC signup
- Non-custodial — you hold keys
- Self-hostable
Review
Cross-platform desktop app (macOS / Windows / Linux) that wraps llama.cpp in a polished UI. Discover and download quantised GGUF models from Hugging Face, chat with them locally, and expose an OpenAI-compatible API on localhost for other apps to consume. The UI is closed-source, but the underlying runtime is open. Best on-ramp for non-engineers.
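The localhost API mentioned above speaks the standard OpenAI chat-completions wire format, so any OpenAI-style client can talk to it. A minimal stdlib-only sketch, assuming LM Studio's default port 1234 and a model already loaded in the app (both are assumptions; check your server settings):

```python
import json
import urllib.request

# Assumed default endpoint; LM Studio lets you change the port in its UI.
API_URL = "http://localhost:1234/v1/chat/completions"

def build_request(prompt: str, model: str = "local-model") -> urllib.request.Request:
    """Build an OpenAI-style chat request for the local server.

    "local-model" is a placeholder name; the local server routes the
    request to whatever model is currently loaded.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

if __name__ == "__main__":
    req = build_request("Say hello in one word.")
    # Requires LM Studio running with its local server enabled.
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

No API key or signup is involved; the request never leaves the machine.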
Fees
Free for personal use · macOS/Windows/Linux · localhost API
Links
Why B? — rubric definition
Solid pick. Verified working but with a meaningful caveat (UX rough, smaller market, intermediate trust step, partial coverage). Listed because the trade-off is sometimes worth it. Full rubric + worked example at /methodology; the curator's reasoning for this specific listing is in the audit log.
Audit trail — receipts for the editorial claim
- Upstream up · HTTP 200 · 351ms · checked 26m ago
- No .onion mirror listed
- Last manual verification 2026-05-13 (<7d)
- See curator log for LM Studio
Reviews — moderated · rules
No approved reviews yet. Be the first.
Add a review
Honest, brand-neutral feedback welcome. A curator approves before it appears here. No JS, no signup required.