LocalAI
Drop-in OpenAI-compatible API server for running LLMs, image, and audio models locally.
Open Source · Self Hosted · Offline Capable
About
LocalAI is a self-hosted, OpenAI API-compatible server for running LLMs and generating images, audio, and embeddings locally. It supports GGUF, GPTQ, and other model formats, requires no GPU, ships as a Docker image, and is MIT-licensed.
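Because LocalAI mirrors the OpenAI REST API, existing OpenAI clients work by pointing their base URL at the local server. A minimal sketch using only the Python standard library; the endpoint (`http://localhost:8080/v1`) and model name below are assumptions to adjust for your own instance and loaded model:

```python
import json
import urllib.request

# Assumed defaults: change to match your LocalAI host and loaded model.
BASE_URL = "http://localhost:8080/v1"

def build_chat_request(prompt, model="ggml-gpt4all-j", temperature=0.7):
    """Build an OpenAI-style chat completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

def chat(prompt, **kwargs):
    """Send a chat completion request to a local LocalAI instance."""
    payload = build_chat_request(prompt, **kwargs)
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

The same endpoint also accepts official OpenAI SDKs by setting their base URL to the local server, which is what "drop-in compatible" means in practice.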
Details
- Category: LLM Inference & Serving
- Price: Free
- Platform: Local/Desktop
- Difficulty: Easy (2/5)
- License: MIT
- Added: Apr 3, 2026
Similar Tools
Desktop application for discovering, downloading, and running local LLMs.
Self Hosted · Offline · Beginner
Open-source ChatGPT alternative that runs 100% offline on your computer.
Open Source · Self Hosted · Offline · Beginner
Open-source ecosystem for running LLMs locally on consumer hardware.
Open Source · Self Hosted · Offline · Beginner