MLC LLM
Universal LLM deployment engine for native apps on any hardware.
Open Source · Self Hosted · Offline Capable
About
MLC LLM enables native deployment of large language models on diverse hardware (phones, laptops, browsers) through hardware-accelerated runtimes. It compiles models for optimal performance on each target platform and supports iOS, Android, the web, and more. Released under the Apache 2.0 license.
Details
- Category: LLM Inference & Serving
- Price: Free
- Platform: Local/Desktop
- Difficulty: Intermediate (3/5)
- License: Apache-2.0
- Added: Apr 3, 2026
Similar Tools
- Desktop application for discovering, downloading, and running local LLMs. (Self Hosted, Offline; Beginner)
- Open-source ChatGPT alternative that runs 100% offline on your computer. (Open Source, Self Hosted, Offline; Beginner)
- Open-source ecosystem for running LLMs locally on consumer hardware. (Open Source, Self Hosted, Offline; Beginner)