Distil-Whisper

Distilled version of Whisper that is 6x faster with minimal accuracy loss.

Open Source · Self-Hosted · Offline Capable

About

Distil-Whisper is a distilled version of OpenAI's Whisper, developed by Hugging Face. It achieves roughly 6x faster inference while staying within 1% word error rate (WER) of the original model, making it well suited to production deployments where speed matters. Released under the MIT license.
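A minimal sketch of using Distil-Whisper for transcription through the Hugging Face `transformers` pipeline. The checkpoint name (`distil-whisper/distil-small.en`) and the silent stand-in audio are illustrative assumptions; in practice you would pass a real audio file or waveform.

```python
# Minimal sketch: transcribing audio with a Distil-Whisper checkpoint
# via the Hugging Face transformers ASR pipeline.
from transformers import pipeline
import numpy as np

asr = pipeline(
    "automatic-speech-recognition",
    # Assumed checkpoint: the small English-only distillation; larger
    # checkpoints trade speed and memory for accuracy.
    model="distil-whisper/distil-small.en",
    chunk_length_s=30,  # long-form audio is processed in 30 s chunks
)

# One second of silence at 16 kHz as a stand-in input waveform.
audio = np.zeros(16000, dtype=np.float32)

result = asr(audio)
print(result["text"])
```

Because the distilled model keeps Whisper's architecture and tokenizer, it is a drop-in replacement wherever the original Whisper checkpoints are loaded through `transformers`.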

Details

Price: Free
Platform: Local/Desktop
Difficulty: Easy (2/5)
License: MIT
Added: Apr 3, 2026

Similar Tools

General-purpose speech recognition model by OpenAI trained on 680K hours of multilingual audio.

Open Source · Self-Hosted · Offline · Easy
High-performance C/C++ port of Whisper for CPU-based speech recognition.

Open Source · Self-Hosted · Offline · Easy

Offline speech recognition toolkit supporting 20+ languages with small models.

Open Source · Self-Hosted · Offline · Easy