MiniCPM

Compact edge-deployable LLM family by OpenBMB with strong performance per parameter.

Open Source · Self-Hosted · Offline Capable

About

MiniCPM, by OpenBMB, is a family of compact language models (1.2B and 2.4B parameters) that achieve performance comparable to much larger models, making them well suited to edge and mobile deployment. The MiniCPM-V variants add multimodal (vision) input. Released under the Apache 2.0 license.
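As a rough back-of-the-envelope check on edge fit, weight memory is approximately parameter count × bits per weight. The sketch below (an illustration, not from the listing) ignores activations and KV cache, so real-world usage will be somewhat higher:

```python
def model_memory_gb(n_params: float, bits_per_param: int) -> float:
    """Approximate weight-only memory footprint in GB (1 GB = 1e9 bytes)."""
    return n_params * bits_per_param / 8 / 1e9

# MiniCPM sizes from the listing, at common precision levels
for n in (1.2e9, 2.4e9):
    for bits in (16, 8, 4):
        print(f"{n / 1e9:.1f}B @ {bits}-bit: {model_memory_gb(n, bits):.2f} GB")
```

By this estimate the 2.4B model needs about 4.8 GB at fp16 but only ~1.2 GB at 4-bit quantization, which is why quantized builds are the usual choice on phones and other memory-constrained devices.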


Details

Price: Free
Platform: Local/Desktop
Difficulty: Easy (2/5)
License: Apache-2.0
Added: Apr 3, 2026

Similar Tools

Featured

Open-weight LLM by Meta available in 8B, 70B, and 405B parameter sizes.

Open Source · Self-Hosted · Offline · GPU 8GB+
Intermediate
Featured

Recent Llama model family by Meta with a Mixture-of-Experts architecture.

Open Source · Self-Hosted · Offline · GPU 16GB+
Advanced
Featured

High-performance open-weight LLMs by Mistral AI with a Mixture-of-Experts (MoE) architecture.

Open Source · Self-Hosted · Offline · GPU 8GB+
Intermediate