ZenML 🙏: One AI Platform from Pipelines to Agents. https://zenml.io.
npx skills add https://github.com/zenml-io/zenml --skill zenml-backport
Use the CLI to install this skill and start using the SKILL.md workflow in your workspace.
Projects •
Roadmap •
Changelog •
Report Bug •
Sign up for ZenML Pro •
Blog •
Docs
🎉 For the latest release, see the changelog.
ZenML is built for ML and AI engineers working on traditional ML use cases, LLM workflows, or agents in a company setting.
At its core, ZenML lets you write workflows (pipelines) that run on any infrastructure backend (stacks). You can embed any Pythonic logic in these pipelines, such as training a model or running an agentic loop. ZenML then operationalizes your application by:
...amongst many other features.
ZenML is used by thousands of companies to run their AI workflows. Here are some featured ones:
(please email [email protected] if you want to be featured)
# Install ZenML with server capabilities
pip install "zenml[server]" # pip install zenml will install a slimmer client
# Initialize your ZenML repository
zenml init
# Start local server or connect to a remote one
zenml login
You can then explore any of the examples in this repo. We recommend starting with the quickstart, which demonstrates core ZenML concepts: pipelines, steps, artifacts, snapshots, and deployments.
ZenML uses a client-server architecture with an integrated web dashboard (zenml-io/zenml-dashboard):
pip install "zenml[local]" - runs both client and server locally
pip install zenml + zenml login <server-url> - connects the client to a remote server
Here is a short demo:
The best way to learn about ZenML is through our comprehensive documentation and tutorials:
Stop clicking through dashboards to understand your ML workflows. The ZenML MCP Server lets you query your pipelines, analyze runs, and trigger deployments using natural language through Claude Desktop, Cursor, or any MCP-compatible client.
💬 "Which pipeline runs failed this week and why?"
📊 "Show me accuracy metrics for all my customer churn models"
🚀 "Trigger the latest fraud detection pipeline with production data"
Quick Setup:
Download the .dxt file from zenml-io/mcp-zenml.
The MCP (Model Context Protocol) integration transforms your ZenML metadata into conversational insights, making pipeline debugging and analysis as easy as asking a question. It's perfect for teams who want to democratize access to ML operations without requiring dashboard expertise.
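For MCP clients that read a JSON configuration instead of a .dxt bundle, an entry along these lines is typical (the command and script path below are placeholders, not verbatim from zenml-io/mcp-zenml; `ZENML_STORE_URL` and `ZENML_STORE_API_KEY` are the standard ZenML connection variables):

```json
{
  "mcpServers": {
    "zenml": {
      "command": "uv",
      "args": ["run", "path/to/zenml_mcp_server.py"],
      "env": {
        "ZENML_STORE_URL": "https://your-zenml-server.example.com",
        "ZENML_STORE_API_KEY": "<your-api-key>"
      }
    }
  }
}
```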
Building AI agents that need to survive crashes, pause for human approval, or run on cloud infrastructure? Kitaru is our open-source sister project for making Python agents durable.
Built on the same infrastructure that powers ZenML. Two decorators (@flow + @checkpoint) and you're done.
pip install kitaru
ZenML is featured in these comprehensive guides to production AI systems.
Contribute:
Start with issues labeled good-first-issue.
Stay Updated:
Q: "Do I need to rewrite my agents or models to use ZenML?"
A: No. Wrap your existing code in a @step. Keep using scikit-learn, PyTorch, LangGraph, LlamaIndex, or raw API calls. ZenML orchestrates your tools, it doesn't replace them.
Q: "How is this different from LangSmith/Langfuse?"
A: They provide excellent observability for LLM applications. We orchestrate the full MLOps lifecycle for your entire AI stack. With ZenML, you manage both your classical ML models and your AI agents in one unified framework, from development and evaluation all the way to production deployment.
Q: "Can I use my existing MLflow/W&B setup?"
A: Yes! ZenML integrates with both MLflow and Weights & Biases. Your experiments, our pipelines.
Q: "Is this just MLflow with extra steps?"
A: No. MLflow tracks experiments. We orchestrate the entire development process – from training and evaluation to deployment and monitoring – for both models and agents.
Q: "How do I configure ZenML with Kubernetes?"
A: ZenML integrates with Kubernetes through the native Kubernetes orchestrator, Kubeflow, and other K8s-based orchestrators. See our Kubernetes orchestrator guide and Kubeflow guide, plus deployment documentation.
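As a rough sketch, registering the native Kubernetes orchestrator looks like this (component names and the kubeconfig context are placeholders; see the linked guides for the flags your setup needs):

```shell
# Register a Kubernetes orchestrator pointing at your cluster's kubeconfig context
zenml orchestrator register k8s_orchestrator \
    --flavor=kubernetes \
    --kubernetes_context=my-cluster-context

# Add it to a stack alongside an artifact store, then activate the stack
zenml stack register k8s_stack -o k8s_orchestrator -a default
zenml stack set k8s_stack
```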
Q: "What about cost? I can't afford another platform."
A: ZenML's open-source version is free forever. You likely already have the required infrastructure (like a Kubernetes cluster and object storage). We just help you make better use of it for MLOps.
Manage pipelines directly from your editor:
Install from VS Code Marketplace.
ZenML is distributed under the terms of the Apache License Version 2.0. See
LICENSE for details.
