Featured Tool
DeepSeek-V3
High-performance open-weight MoE LLM with 671B total parameters.
Open Source · Self Hosted · Offline Capable · GPU Required (24GB+ VRAM)
About
DeepSeek-V3 is an open-weight Mixture-of-Experts LLM with 671B total parameters, of which 37B are active per token. It achieves performance competitive with GPT-4o and Claude 3.5 Sonnet on benchmarks and was trained efficiently using FP8 mixed precision. It is released under the DeepSeek License.
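For a sense of the self-hosted workflow, here is a minimal sketch of querying a locally served DeepSeek-V3 through an OpenAI-compatible endpoint. The vLLM server, the port, and the model ID `deepseek-ai/DeepSeek-V3` are illustrative assumptions, not part of this listing; adapt them to your deployment.

```python
# Minimal sketch: chat with a self-hosted DeepSeek-V3 via an
# OpenAI-compatible server, e.g. one started with:
#   vllm serve deepseek-ai/DeepSeek-V3
# Server choice, port, and model ID are assumptions for illustration.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # local inference server, not api.openai.com
    api_key="EMPTY",                      # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="deepseek-ai/DeepSeek-V3",
    messages=[
        {"role": "user", "content": "Summarize Mixture-of-Experts in one sentence."}
    ],
    max_tokens=128,
)
print(response.choices[0].message.content)
```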
Details
- Category: Large Language Models (LLMs)
- Price: Free
- Platform: Local/Desktop
- Difficulty: Advanced (4/5)
- License: DeepSeek License
- Minimum VRAM: 24 GB (see the sizing sketch below)
- Added: Apr 3, 2026
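As a point of reference for the VRAM figure above, here is a back-of-the-envelope sketch of the weight-only memory footprint at common precisions, computed from the parameter count on this page. It is a rule of thumb, not this listing's sizing guidance; the 24 GB minimum presumably assumes aggressive quantization and CPU/NVMe offloading, which this estimate does not model.

```python
# Back-of-the-envelope sketch: weight-only memory at common precisions.
# Real deployments also need KV cache and activation memory; quantization
# and offloading are what make smaller-VRAM setups feasible.
# Note: only ~37B parameters are active per token (compute cost), but all
# 671B must remain addressable for expert routing (memory cost).
GIB = 1024**3
TOTAL_PARAMS = 671e9  # total parameters, all experts resident

for label, bytes_per_param in [("FP16", 2.0), ("FP8", 1.0), ("4-bit", 0.5)]:
    print(f"{label}: ~{TOTAL_PARAMS * bytes_per_param / GIB:,.0f} GiB of weights")
```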
Similar Tools
Featured
Llama 3.1
Open-weight LLM by Meta available in 8B, 70B, and 405B parameter sizes.
Open Source · Self Hosted · Offline · GPU 8GB+
Intermediate
Featured
Llama 4
Latest Llama model family by Meta with a Mixture-of-Experts architecture.
Open Source · Self Hosted · Offline · GPU 16GB+
Advanced
Featured
Mixtral
High-performance open-weight LLMs by Mistral AI with a MoE architecture.
Open Source · Self Hosted · Offline · GPU 8GB+
Intermediate