Building AI Workflows: From Local Experiments To Serving Users

A presentation at AI_Dev in August 2025 in Amsterdam, Netherlands, by Oleg Šelajev

Slide 1

Building AI workflows: from local experiments to serving users
Oleg Šelajev, Docker

Slide 2

Slide 3

Agentic applications need three things: models, tools, and code.

Slide 4

Diagram: user input or an event enters the agent application, which calls a large language model and a set of tools (a GitHub tool, a Notion tool, and a SQL tool backed by an internal DB) to produce the response.
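
To make that flow concrete, here is a minimal, self-contained Python sketch of the loop the diagram implies: application code routes between a model and tools until a response is ready. Every name here (ModelReply, fake_model, run_agent) is hypothetical and stands in for the boxes above; a real agent would call an LLM API and real GitHub, Notion, or database tools.

from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class ModelReply:
    text: str
    tool_name: Optional[str] = None
    tool_args: Optional[str] = None

def fake_model(messages: list[dict]) -> ModelReply:
    # Stand-in for the large language model: request the SQL tool once,
    # then answer using the tool output fed back into the conversation.
    if not any(m["role"] == "tool" for m in messages):
        return ModelReply(text="", tool_name="sql",
                          tool_args="SELECT count(*) FROM users")
    return ModelReply(text=f"The database says: {messages[-1]['content']}")

def run_agent(user_input: str, tools: dict[str, Callable[[str], str]]) -> str:
    messages = [{"role": "user", "content": user_input}]
    while True:
        reply = fake_model(messages)
        if reply.tool_name is None:
            return reply.text  # the model produced the final response
        # The model requested a tool: run it and feed the result back.
        output = tools[reply.tool_name](reply.tool_args)
        messages.append({"role": "tool", "content": output})

print(run_agent("How many users do we have?", {"sql": lambda q: "42 rows"}))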

Slide 5

The Docker Model Runner
Run models next to your other containerized services, using the tools you're already using.

compose.yaml:

models:
  gemma3:
    model: ai/gemma3:4B-F16

services:
  app:
    models:
      gemma3:
        endpoint_var: OPENAI_BASE_URL
        model_var: OPENAI_MODEL
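
For reference, a minimal sketch of how the app service might consume the variables Compose injects per the mapping above. It assumes the app uses the OpenAI Python client (not shown on the slide) and that Model Runner's OpenAI-compatible endpoint accepts a dummy API key; the prompt is illustrative.

import os

from openai import OpenAI

# Compose injects these per the models: mapping in compose.yaml above.
client = OpenAI(
    base_url=os.environ["OPENAI_BASE_URL"],  # Model Runner's OpenAI-compatible endpoint
    api_key="not-needed",  # assumption: Model Runner ignores the key; the client requires one
)

response = client.chat.completions.create(
    model=os.environ["OPENAI_MODEL"],  # e.g. ai/gemma3:4B-F16
    messages=[{"role": "user", "content": "Summarize this incident report."}],
)
print(response.choices[0].message.content)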

Slide 6

The MCP Catalog
Run MCP servers using containers, without worrying about runtimes or installs.

Slide 7

The MCP Gateway
Run containerized MCP servers safely and securely, directly in your application stack.

compose.yaml:

services:
  mcp-gateway:
    image: docker/mcp-gateway:latest
    use_api_socket: true
    command:
      - --transport=sse
      - --servers=duckduckgo
      - --tools=search,fetch_content
  app:
    …
    environment:
      MCP_ENDPOINT: http://mcp-gateway:8811/sse
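
To show the wiring from the app side, here is a hedged sketch using the MCP Python SDK over SSE. The endpoint comes from MCP_ENDPOINT in the compose.yaml above; the tool name "search" matches the --tools flag, but the argument name "query" is an assumption for illustration.

import asyncio
import os

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main() -> None:
    # MCP_ENDPOINT is set in compose.yaml above, e.g. http://mcp-gateway:8811/sse
    endpoint = os.environ["MCP_ENDPOINT"]
    async with sse_client(endpoint) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])
            # "search" is exposed per --tools above; the "query" key is assumed.
            result = await session.call_tool("search", {"query": "docker mcp gateway"})
            print(result.content)

asyncio.run(main())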

Slide 8

Diagram: the same architecture, now with the MCP Gateway between the agent application and the tools. User input or an event enters the agent application, which calls the large language model and, via the MCP Gateway, the GitHub, Notion, and SQL tools (the SQL tool backed by an internal DB) to produce the response.

Slide 9

Compose for agents
Build and run AI agents using Docker Compose.

Slide 10

Cloud Run and Docker Compose
Deploy your compose.yaml directly to Cloud Run.
cloud.google.com/blog/products/serverless/cloud-run-and-docker-collaboration