KServe

Kubernetes serverless inference platform for deploying ML models.

Open Source · Self Hosted · Offline Capable

About

KServe (formerly KFServing) provides a Kubernetes Custom Resource for serverless ML model inference. It supports autoscaling (including scale-to-zero), canary rollouts, and multi-framework serving (TensorFlow, PyTorch, ONNX, and others). Apache 2.0 license.
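To give a feel for the Custom Resource, here is a minimal sketch of an InferenceService manifest, assuming a cluster with KServe already installed; the service name and storage URI are illustrative placeholders, not production values:

```yaml
# Minimal KServe InferenceService (v1beta1 API).
# "sklearn-iris" and the storageUri below are placeholders for illustration.
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: sklearn-iris
spec:
  predictor:
    model:
      modelFormat:
        name: sklearn          # framework hint; tensorflow, pytorch, onnx, etc. also work
      storageUri: "gs://example-bucket/models/sklearn/model"  # placeholder model location
```

Applied with `kubectl apply -f inference.yaml`, KServe provisions an HTTP inference endpoint behind this resource and scales it with request traffic.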


Details

Price
Free
Platform
Kubernetes (self-hosted)
Difficulty
Advanced (4/5)
License
Apache-2.0
Added
Apr 3, 2026

Similar Tools

Open-source ML deployment platform for Kubernetes.
Open Source · Self Hosted · Offline · Intermediate

Framework for building production-ready AI application services.
Open Source · Self Hosted · Offline · Easy

NVIDIA inference serving platform for deploying AI models at scale.
Open Source · Self Hosted · Offline · GPU 8GB+ · Advanced