The durable runtime
for workflows and AI agents

Stop stitching together retries, state, and compensation by hand. Conductor gives your workflows a durable runtime.

Write workers in any language
Java

@WorkerTask("charge_payment")
public ChargeResult charge(OrderInput input) {
    var chargeId = paymentService.charge(input.orderId, input.amount);
    return new ChargeResult(chargeId);
}

Python

@worker_task(task_definition_name="charge_payment")
def charge_payment(order_id: str, amount: float) -> dict:
    charge_id = payment_service.charge(order_id, amount)
    return {"chargeId": charge_id}

Go

func ChargePayment(input *OrderInput) (*ChargeResult, error) {
    chargeId, err := paymentService.Charge(input.OrderId, input.Amount)
    if err != nil {
        return nil, err
    }
    return &ChargeResult{ChargeId: chargeId}, nil
}

JavaScript

worker.register("charge_payment", async ({ orderId, amount }) => {
    const chargeId = await paymentService.charge(orderId, amount);
    return { chargeId };
});

C#

[WorkerTask("charge_payment")]
public ChargeResult ChargePayment(OrderInput input) {
    var chargeId = _paymentService.Charge(input.OrderId, input.Amount);
    return new ChargeResult(chargeId);
}
Guaranteed at-least-once task delivery
Workers in any language
Millions of concurrent workflows
Billions of workflow executions at internet scale

Built for workflows that can't afford to fail.

Core

Durable execution by default

Workflow state is persisted at every step. Survive server restarts, worker crashes, and network failures. Durable execution with at-least-once task delivery, configurable retries, timeouts, and compensation flows. Build durable agents that never lose progress.
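As an illustration of the configurable-retry idea described above, here is a minimal sketch in plain Python. This is generic code, not the Conductor SDK; the `retry_count` and `backoff_seconds` parameters are hypothetical stand-ins for engine-side task configuration, and a real engine persists state between attempts rather than looping in-process.

```python
import time

def execute_with_retries(task_fn, retry_count=3, backoff_seconds=0.0):
    """Run a task, retrying on failure up to retry_count additional times.

    Sketches the retry semantics described above; the real engine
    persists workflow state between attempts so progress is never lost.
    """
    last_error = None
    for attempt in range(1 + retry_count):
        try:
            return task_fn()
        except Exception as err:  # a real engine matches specific failure types
            last_error = err
            time.sleep(backoff_seconds * attempt)
    raise last_error

# Example: a flaky task that succeeds on its third attempt.
attempts = {"n": 0}

def flaky_task():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("transient failure")
    return "charged"

print(execute_with_retries(flaky_task))  # succeeds after two retries
```

Exhausting the retry budget re-raises the last error, which is where timeout policies and compensation flows would take over.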

Failure semantics →
Format

JSON native — deterministic by default

JSON definitions separate orchestration from implementation — no side effects, no hidden state, every run is deterministic. Generate workflows at runtime with LLMs, modify per-execution, and use dynamic forks, dynamic tasks, and dynamic sub-workflows for more flexibility than code-based engines. Code via SDKs when you need it.
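As a sketch of what such a definition looks like, here is a minimal single-task workflow in Conductor's JSON format. The workflow and task names are illustrative, and the field set shown is a simplified subset of the full schema:

```json
{
  "name": "order_fulfillment",
  "version": 1,
  "tasks": [
    {
      "name": "charge_payment",
      "taskReferenceName": "charge_payment_ref",
      "type": "SIMPLE",
      "inputParameters": {
        "orderId": "${workflow.input.orderId}",
        "amount": "${workflow.input.amount}"
      }
    }
  ]
}
```

The definition names what runs and in what order; the worker code that implements `charge_payment` lives entirely outside it.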

Why JSON wins →
Primitives

Pause, Resume, Replay, Restart

Pause workflows on time, external signals, webhooks, or human approval. Resume safely after minutes, hours, or days. Replay any workflow from the beginning, from a specific task, or retry just the failed step — even months later. Full execution history is always preserved.

How it works →
AI

AI agent orchestration & LLM orchestration

Orchestrate AI agents with 14+ native LLM providers (Anthropic, OpenAI, Gemini, Bedrock, Mistral, and more), MCP tool calling, function calling, human-in-the-loop approval, and structured output. Built-in vector database support (Pinecone, pgvector, MongoDB Atlas) for RAG pipelines.

AI Cookbook →
Workers

Polyglot workers

Write task workers in any language. Workers poll for tasks, execute your logic, and report results—run them anywhere.

Java Python Go C# JavaScript Ruby Rust
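The poll, execute, report cycle described above can be sketched in plain Python. This is a generic illustration of the loop, not the SDK; the `task_queue` and `handlers` objects are stand-ins for the server's task queues and your registered worker functions.

```python
def worker_loop(task_queue, handlers, max_iterations):
    """One worker's poll -> execute -> report cycle.

    task_queue: list of dicts like {"taskType": ..., "input": {...}}
    handlers:   maps a task type to a callable(input) -> output
    Results are reported back so the orchestrator can advance the workflow.
    """
    results = []
    for _ in range(max_iterations):
        task = task_queue.pop(0) if task_queue else None        # poll
        if task is None:
            break
        handler = handlers[task["taskType"]]
        try:
            output = handler(task["input"])                     # execute
            results.append({"status": "COMPLETED", "output": output})  # report
        except Exception as err:
            results.append({"status": "FAILED", "error": str(err)})
    return results

queue = [{"taskType": "charge_payment", "input": {"orderId": "o-1", "amount": 9.99}}]
handlers = {"charge_payment": lambda inp: {"chargeId": "ch_" + inp["orderId"]}}
print(worker_loop(queue, handlers, max_iterations=10))
```

Because the worker only polls and reports over HTTP, the same loop can be written in any language and run anywhere with network access to the server.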
Reliability

Saga pattern & compensation

Model distributed transactions as sagas. When a step fails, Conductor automatically runs undo logic in reverse order—no manual intervention.
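The reverse-order compensation behavior described above can be sketched in a few lines of Python. This is an illustration of the saga pattern itself, not Conductor's implementation; the step names are hypothetical.

```python
def run_saga(steps):
    """Execute (name, action, compensation) triples; on failure, undo in reverse.

    Each completed step records its compensation; when a later step
    fails, the compensations run in reverse order of completion.
    """
    completed = []
    log = []
    try:
        for name, action, compensate in steps:
            action()
            log.append(f"done:{name}")
            completed.append((name, compensate))
    except Exception:
        for name, compensate in reversed(completed):
            compensate()
            log.append(f"undone:{name}")
    return log

def fail():
    raise RuntimeError("shipment unavailable")

log = run_saga([
    ("charge_payment", lambda: None, lambda: None),
    ("reserve_inventory", lambda: None, lambda: None),
    ("create_shipment", fail, lambda: None),
])
print(log)  # charge and reserve are undone, most recent first
```

In an orchestrated saga the engine, not the worker, tracks which steps completed, so compensation still runs even if the process that started the transaction has crashed.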

Error handling →

Multiple ways to run.

CLI recommended

# Install CLI
npm install -g @conductor-oss/conductor-cli
# Start the server
conductor server start

Docker

docker run -p 8080:8080 conductoross/conductor:latest

Frequently asked questions.

Is Conductor open source?

Yes. Conductor is a fully open source workflow engine, Apache 2.0 licensed. You can self-host it on your own infrastructure with no vendor lock-in. It supports 8+ persistence backends, 6 message brokers, and runs anywhere Docker runs.

Is this the same as Netflix Conductor?

Yes. Conductor OSS is the continuation of the original Netflix Conductor repository after Netflix handed the project over to the open-source community.

Is this project actively maintained?

Yes. Orkes is the primary maintainer of this repository and offers an enterprise SaaS platform for Conductor across all major cloud providers.

Can Conductor scale to handle my workload?

Conductor was built at Netflix to handle massive scale and has been battle-tested in production environments processing millions of workflows. It scales horizontally to meet virtually any demand.

Does Conductor support durable execution?

Yes. Conductor pioneered durable execution patterns, ensuring workflows and durable agents complete reliably even in the face of infrastructure failures, process crashes, or network issues.

Can I replay a workflow after it completes or fails?

Yes. Conductor preserves full execution history indefinitely. You can restart from the beginning, rerun from any specific task, or retry just the failed step — even months later. Use the API (/restart, /rerun, /retry) or the UI.

Are workflows always asynchronous?

No. While Conductor excels at asynchronous orchestration, it also supports synchronous workflow execution when immediate results are required.

Do I need to use a Conductor-specific framework?

No. Conductor is language and framework agnostic. Use your preferred language and framework—SDKs provide native integration for Java, Python, JavaScript, Go, C#, and more.

Isn't JSON too limited for complex workflows?

The opposite. JSON separates orchestration from implementation, making every workflow deterministic by construction — no side effects, no hidden state. Dynamic forks, dynamic tasks, and dynamic sub-workflows let you build workflows that are more flexible than code-based engines. JSON is also AI-native: LLMs can generate and modify workflow definitions at runtime without a compile/deploy cycle. Code-based engines require redeployment for every change.

Is Conductor a low-code/no-code platform?

No. Conductor is designed for developers who write code. While workflows can be defined in JSON, the power comes from building workers and tasks in your preferred programming language.

Can Conductor handle complex workflows?

Conductor was specifically designed for complex orchestration. It supports advanced patterns including nested loops, dynamic branching, sub-workflows, and workflows with thousands of tasks.

Is Netflix Conductor abandoned?

No. The original Netflix repository has transitioned to Conductor OSS, which is the new home for the project. Active development and maintenance continues here.

Is Orkes Conductor compatible with Conductor OSS?

100% compatible. Orkes Conductor is built on top of Conductor OSS, ensuring full compatibility between the open-source version and the enterprise offering.

Can Conductor orchestrate AI agents and LLMs?

Yes. Conductor provides AI agent orchestration and LLM orchestration as native capabilities. 14+ LLM providers (Anthropic, OpenAI, Azure OpenAI, Google Gemini, AWS Bedrock, Mistral, Cohere, HuggingFace, Ollama, and more), MCP tool calling and function calling (LIST_MCP_TOOLS, CALL_MCP_TOOL), vector database integration (Pinecone, pgvector, MongoDB Atlas) for RAG, and content generation (image, audio, video, PDF). All with the same durability guarantees as any other workflow task.
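For a rough sense of how the MCP task types mentioned above slot into a workflow, here is an illustrative JSON fragment. Only the task type names (LIST_MCP_TOOLS, CALL_MCP_TOOL) come from this page; the inputParameters shown are assumptions for illustration, not the documented schema:

```json
{
  "name": "lookup_inventory",
  "taskReferenceName": "lookup_inventory_ref",
  "type": "CALL_MCP_TOOL",
  "inputParameters": {
    "server": "inventory-tools",
    "tool": "check_stock",
    "arguments": { "sku": "${workflow.input.sku}" }
  }
}
```

Because the call is just another task, it gets the same retries, timeouts, and persisted state as any other step.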

How does Conductor compare to other workflow engines?

Conductor is the only open source workflow engine with native LLM task types for 14+ providers, built-in MCP integration, and vector database support. Combined with durable execution, 7+ language SDKs (Java, Python, Go, JavaScript, C#, Ruby, Rust), 6 message brokers, 8+ persistence backends, and battle-tested scale at Netflix, Tesla, LinkedIn, and JP Morgan, Conductor provides the most complete workflow orchestration platform available. Unlike Temporal, Step Functions, or Airflow, Conductor is fully self-hosted, supports both code-first and JSON workflow definitions, and provides native AI agent orchestration out of the box.

Trusted by engineering teams at

Netflix Tesla LinkedIn JP Morgan Freshworks American Express Redfin VMware Coupang Swiggy

Open source workflow engine. Community driven.

Apache-2.0 licensed. Self-hosted, no vendor lock-in. Originally created at Netflix, now maintained by the community.