Docker Compose Gets Agent‑Ready
Docker just overhauled Compose so a single YAML file can define your models, tools, and agent frameworks; one command then launches the whole stack:

```bash
docker compose up
```
Below is the marketer‑centric rundown: what shipped, how it pairs with Dagger’s agent containers, and why paid‑search and creative teams should care.
What shipped
| Feature | Why it matters |
|---|---|
| Agent definitions in compose.yaml | List open‑weight models, LangGraph flows, Vercel AI SDK functions, CrewAI toolchains, etc.; Compose networks them automatically. |
| Serverless targets | docker compose up --provider gcr deploys the stack straight to Google Cloud Run (or Azure Container Apps). |
| Docker Offload (beta) | 300 free minutes to push GPU‑hungry steps to a managed cluster while iterating locally; ideal for fine‑tuning or RAG indexing. |
| MCP Catalog + Model Runner | Pull Llama 3, Mixtral, or Phi‑3 from Docker Hub and run them as OpenAI‑compatible endpoints with zero code (see the sketch below the table). |
Source → https://www.docker.com/blog/build-ai-agents-with-docker-compose/
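For the Model Runner row, a minimal session looks roughly like this. It is a sketch assuming Docker Desktop’s Model Runner is enabled; the exact model tags in Docker Hub’s ai/ namespace may differ:

```bash
# Pull an open-weight model from Docker Hub's ai/ namespace (tag is illustrative)
docker model pull ai/llama3.2

# One-off prompt straight from the CLI
docker model run ai/llama3.2 "Write three ad headlines for trail-running shoes"
```

Because the runner speaks the OpenAI API, any SDK that accepts a custom base URL can point at it without code changes.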
Enter Dagger agent containers
Solomon Hykes (Docker’s founder) launched Dagger to treat CI/CD pipelines as code. The latest release adds agent containers: each task (embed, recall, generate) becomes a micro‑service you can orchestrate.
https://dagger.io/blog/agent-container-use
Compose + Dagger lets you:
- Describe a multi‑LLM workflow in Dagger’s pipeline syntax.
- Reference those containers in compose.yaml (sketched below).
- Ship to Cloud Run in one click, letting Offload handle the GPUs.
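A minimal sketch of that second step; the image names are hypothetical stand‑ins for whatever your Dagger pipeline builds and pushes, not real registry paths:

```yaml
services:
  embed:
    image: registry.example.com/agents/embed:latest      # published by a Dagger step
  generate:
    image: registry.example.com/agents/generate:latest   # likewise Dagger-built
    depends_on: [embed]
```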
Why this matters for digital‑marketing agents
- Context engineering becomes reproducible: store your RAG setup (ingest → chunk → vector store → generate) in YAML, debug‑friendly and version‑controlled.
- Campaign orchestration as code: spin up a nightly Budget agent, schedule it via Cloud Run (see the scheduler sketch after this list), and swap models without touching infra.
- Faster experimentation: free Offload minutes let you benchmark Mixtral vs. Llama 3 on bid modelling before committing GPU spend.
- Tool interchangeability: CrewAI for persona planning today, Vercel AI SDK tomorrow; Compose only cares that each service exposes an endpoint.
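For the scheduling step, a Cloud Scheduler job can invoke the deployed agent on a cron. This is a minimal sketch assuming the Budget agent is already deployed as a Cloud Run service named budget-agent with a /run endpoint; the names and URL are illustrative:

```bash
# Trigger the (hypothetical) Budget agent every night at 02:00
gcloud scheduler jobs create http budget-agent-nightly \
  --schedule="0 2 * * *" \
  --uri="https://budget-agent-xyz-uc.a.run.app/run" \
  --http-method=POST
```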
Quick‑start compose.yaml template
```yaml
services:
  vector-db:
    image: chromadb/chroma
    ports: ["8000:8000"]               # publish Chroma's HTTP API on the host
  model-runner:
    image: ghcr.io/huggingface/text-generation-inference:0.9
    environment:
      MODEL_ID: mistralai/Mixtral-8x22B-Instruct-v0.1
  langgraph:
    image: langchain/langgraph:latest
    volumes:
      - ./flows:/flows                 # mount your graph definitions
    depends_on: [vector-db, model-runner]
  agent:
    image: vercel/ai-sdk:latest
    environment:
      OPENAI_BASE_URL: http://model-runner:8000/v1
    depends_on: [langgraph]
```
Launch locally:

```bash
docker compose up
```

Deploy to Cloud Run:

```bash
docker compose up --provider gcr
```
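Once the stack is up, you can smoke‑test the runner. A sketch assuming the runner actually serves the OpenAI‑compatible /v1 route that OPENAI_BASE_URL points at, and that the agent image ships curl:

```bash
# Call model-runner by service name from inside the Compose network
docker compose exec agent curl -s http://model-runner:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "mistralai/Mixtral-8x22B-Instruct-v0.1",
       "messages": [{"role": "user", "content": "Draft one PPC headline for running shoes"}]}'
```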
“The future of software is agentic.” – Mark Cavage & Tushar Jain, Docker
With Compose and Dagger abstracting away GPUs and deployment targets, the bottleneck shifts from DevOps to prompt design and data quality, exactly where marketing teams can add unique value.