llama-cpp-python
Python bindings for llama.cpp with OpenAI-compatible API server.
Open Source · Self Hosted · Offline Capable
About
llama-cpp-python provides Python bindings for llama.cpp, enabling easy LLM inference from Python. It includes an OpenAI-compatible API server, function-calling support, and GPU acceleration, and installs with a single pip command. MIT license.
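A minimal sketch of the workflow the description outlines. The model path and prompt are placeholders; it assumes llama-cpp-python is installed and a GGUF model file has already been downloaded:

```python
from llama_cpp import Llama

# Load a GGUF model; n_gpu_layers=-1 offloads all layers to the GPU
# when a GPU-enabled build is installed. Path is a placeholder.
llm = Llama(model_path="./models/model.gguf", n_gpu_layers=-1)

# OpenAI-style chat completion through the Python API.
response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Name the capital of France."}],
    max_tokens=32,
)
print(response["choices"][0]["message"]["content"])
```

The same interface is also exposed over HTTP: installing the server extra (`pip install "llama-cpp-python[server]"`) and running `python -m llama_cpp.server --model ./models/model.gguf` serves an OpenAI-compatible `/v1/chat/completions` endpoint that standard OpenAI client libraries can target.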
Reviews (0)
No reviews yet.
Details
- Category: LLM Inference & Serving
- Price: Free
- Platform: Local/Desktop
- Difficulty: Easy (2/5)
- License: MIT
- Added: Apr 3, 2026
Similar Tools
- Desktop application for discovering, downloading, and running local LLMs. (Self Hosted · Offline · Beginner)
- Open-source ChatGPT alternative that runs 100% offline on your computer. (Open Source · Self Hosted · Offline · Beginner)
- Open-source ecosystem for running LLMs locally on consumer hardware. (Open Source · Self Hosted · Offline · Beginner)