RWKV
RNN-based language model with transformer-level performance and linear scaling.
Open Source · Self Hosted · Offline Capable · GPU Required (4 GB+ VRAM)
About
RWKV is a language model architecture that combines the inference efficiency of an RNN with transformer-level performance. Memory and compute scale linearly in sequence length, enabling very long contexts. Models are available in sizes from 0.1B to 14B parameters, released by BlinkDL under the Apache-2.0 license.
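The linear scaling comes from RWKV's time-mixing step, which replaces attention with a decaying weighted average (the "WKV" operator) that can be computed recurrently with constant-size state. A minimal scalar sketch of that recurrence (real models operate on per-channel vectors and use a numerically stabilized form; the decay `w` and bonus `u` values here are purely illustrative):

```python
import math

def wkv_sequence(ks, vs, w=0.5, u=0.3):
    """Simplified scalar WKV recurrence from RWKV time-mixing.

    ks, vs: per-token key and value scalars.
    w: decay rate (> 0); u: bonus applied to the current token.
    The state is just two scalars (num, den) no matter how long the
    sequence is, which is why compute and memory scale linearly.
    """
    num, den = 0.0, 0.0
    out = []
    for k, v in zip(ks, vs):
        # Output: weighted average of past values plus the current
        # token, which gets an extra "bonus" weight e^(u + k).
        e_uk = math.exp(u + k)
        out.append((num + e_uk * v) / (den + e_uk))
        # Update state: decay the past by e^(-w), fold in this token.
        e_k = math.exp(k)
        num = math.exp(-w) * num + e_k * v
        den = math.exp(-w) * den + e_k
    return out
```

Because each output is a convex combination of the values seen so far, every element of the result stays within the range of the inputs, and the first output equals the first value.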
Details
- Category: Large Language Models (LLMs)
- Price: Free
- Platform: Local/Desktop
- Difficulty: Intermediate (3/5)
- License: Apache-2.0
- Minimum VRAM: 4 GB
- Added: Apr 3, 2026
Similar Tools
Open-weight LLM by Meta available in 8B, 70B, and 405B parameter sizes.
Open Source · Self Hosted · Offline · GPU 8GB+
Intermediate
Latest Llama model family by Meta with Mixture-of-Experts architecture.
Open Source · Self Hosted · Offline · GPU 16GB+
Advanced
High-performance open-weight LLMs by Mistral AI with MoE architecture.
Open Source · Self Hosted · Offline · GPU 8GB+
Intermediate