DeepSeek-Coder

Open-weight code LLM trained on 2 trillion tokens of code and natural language.

Open Source · Self Hosted · Offline Capable · GPU Required (8GB+ VRAM)

About

DeepSeek-Coder is a code-focused LLM trained on 2 trillion tokens comprising 87% code and 13% natural language. It is available in 1.3B, 6.7B, and 33B parameter sizes, supports a 16K context window, and offers fill-in-the-blank (infilling) and cross-file code completion. Released under the DeepSeek License.
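The fill-in-the-blank mode mentioned above wraps the code before and after the cursor in sentinel tokens so the model generates the missing middle. A minimal sketch of building such a prompt, assuming the sentinel strings documented in the DeepSeek-Coder repository (verify them against your model version):

```python
# Sentinel tokens for DeepSeek-Coder's fill-in-the-blank (infilling) format.
# These strings are taken from the model's documentation; treat them as an
# assumption and check them against the tokenizer you actually load.
FIM_BEGIN = "<｜fim▁begin｜>"
FIM_HOLE = "<｜fim▁hole｜>"
FIM_END = "<｜fim▁end｜>"


def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Wrap the code before/after the cursor so the model fills the hole."""
    return f"{FIM_BEGIN}{prefix}{FIM_HOLE}{suffix}{FIM_END}"


# Example: ask the model to complete the partition step of a quicksort.
prompt = build_fim_prompt(
    prefix="def quicksort(arr):\n    if len(arr) <= 1:\n        return arr\n",
    suffix="    return quicksort(left) + [pivot] + quicksort(right)\n",
)
```

The resulting string is passed to the model as an ordinary prompt; the completion that comes back is the text that belongs between the prefix and the suffix.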


Details

Price: Free
Platform: Local/Desktop
Difficulty: Intermediate (3/5)
License: DeepSeek License
Minimum VRAM: 8 GB
Added: Apr 3, 2026

Similar Tools

- Open-weight LLM by Meta, available in 8B, 70B, and 405B parameter sizes. Open Source · Self Hosted · Offline · GPU 8GB+ · Intermediate
- Latest Llama model family by Meta with a Mixture-of-Experts architecture. Open Source · Self Hosted · Offline · GPU 16GB+ · Advanced
- High-performance open-weight LLMs by Mistral AI with a MoE architecture. Open Source · Self Hosted · Offline · GPU 8GB+ · Intermediate