# Key Concepts
This page explains the fundamental concepts of the Kubiya SDK and how they work together to create a powerful ecosystem for AI-powered automation.
## Tools: The Building Blocks
Tools in Kubiya are self-contained, Docker-based components that provide specific functionality. They're designed to be:
- Stateless: Each execution is independent
- Docker-backed: Built on existing Docker images
- Reusable: Can be consumed by teammates, workflows, and external applications alike
- Portable: Run on any infrastructure supporting Docker
A key advantage is that you don't need to write complex business logic from scratch. Instead, you leverage existing Docker images as building blocks.
```python
from kubiya_sdk import tool

@tool(image="bitnami/kubectl:latest")
def deploy_to_kubernetes(namespace: str, deployment_yaml: str) -> dict:
    """Deploy an application to Kubernetes."""
    # The tool runs inside the kubectl container
    pass
```
## Teammates: Self-Contained AI Agents
Teammates are self-contained AI agents that consume tools to perform complex tasks. Each teammate:
- Has access to a set of tools it can use
- Manages credentials securely
- Can utilize dynamic infrastructure (primarily Kubernetes in production)
- Makes intelligent decisions about when and how to use tools
```python
from kubiya_sdk import Teammate, tool

@tool(image="python:3.12-slim", requirements=["pandas", "matplotlib"])
def analyze_data(data: list) -> dict:
    """Analyze data and generate statistics."""
    # Data analysis code here
    pass

# Create a specialized AI agent (teammate)
data_analyst = Teammate(
    id="data-analyst",
    description="Analyzes data and generates insights",
    tools=[analyze_data],
    credentials={"database": "${DB_CREDENTIALS}"},
)
```
## Workflows: Orchestrating Tools and Teammates
Workflows allow you to chain tools and teammates together for complex, predictable scenarios:
- Define sequences of tool/teammate executions
- Handle conditional logic and branching
- Pass data between steps
- Manage errors and retries
```python
from kubiya_sdk import Workflow, tool

@tool(image="python:3.12-slim", requirements=["requests"])
def fetch_data(url: str) -> list:
    """Fetch data from an API."""
    import requests

    response = requests.get(url)
    return response.json()

@tool(image="python:3.12-slim", requirements=["pandas"])
def process_data(data: list) -> dict:
    """Process the data."""
    import pandas as pd
    # Processing code
    return {"result": "processed"}

# Create a workflow that chains tools together
data_workflow = Workflow(
    id="data-pipeline",
    description="Fetch and process data",
    tools=[fetch_data, process_data],
)
```
## Model Context Protocol (MCP)
Tools can be consumed not only by teammates but also by any LLM application through the Model Context Protocol (MCP):
- Standard interface for tool invocation
- Language-agnostic integration
- Makes tools accessible to any LLM-powered system
This means your tools can be used by:

- Kubiya teammates
- External LLM applications
- Third-party AI systems
- Custom integrations
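At the wire level, MCP is built on JSON-RPC 2.0: a client invokes a tool by sending a `tools/call` request naming the tool and its arguments. A minimal sketch of that request shape (the tool name and argument values here are illustrative, reusing the example tool from above):

```python
import json

# An MCP client invokes a tool with a JSON-RPC 2.0 `tools/call` request.
# Any LLM application that speaks MCP can construct this, regardless of
# the language or framework the tool itself was built with.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "deploy_to_kubernetes",
        "arguments": {
            "namespace": "production",
            "deployment_yaml": "apiVersion: apps/v1 ...",
        },
    },
}

print(json.dumps(request, indent=2))
```

Because the interface is just structured JSON over a standard transport, the same tool definition serves teammates and external systems without modification.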
## Runners and Infrastructure
Kubiya provides flexible execution environments through runners:
- Local Runner: Executes tools on your local Docker environment
- Kubernetes Runner: Scales tools on Kubernetes clusters
- Serverless Runner: Runs tools on serverless platforms
Tools specify their infrastructure requirements, and runners manage the execution:
```python
from kubiya_sdk import tool
from kubiya_sdk.infrastructure import KubernetesConfig

# Configure Kubernetes execution
k8s_config = KubernetesConfig(
    namespace="ai-tools",
    service_account="tool-runner",
)

@tool(
    image="python:3.12-slim",
    requirements=["scikit-learn", "pandas"],
    infrastructure=k8s_config,  # Specify Kubernetes execution
)
def train_model(dataset_url: str) -> dict:
    """Train a machine learning model on Kubernetes."""
    # ML training code
    pass
```
## Kubiya CLI
The Kubiya CLI is a command-line tool that helps you build, test, and manage your tool sources:
- Build: Package tools for deployment
- Test: Verify tool functionality
- Deploy: Deploy tools to production
- Manage: Handle tool versions and dependencies
```bash
# Test a tool locally
kubiya tool test my-tool --param name=value

# Deploy tools to production
kubiya deploy --env production

# List available tools
kubiya tools list
```
## The Complete Ecosystem
The power of Kubiya comes from how these components work together:
- Build Docker-based tools that leverage existing solutions
- Create teammates (AI agents) that use these tools
- Design workflows for complex, predictable processes
- Deploy to infrastructure that matches your needs
- Integrate with LLM applications via MCP
- Manage everything through the Kubiya CLI
This creates a flexible, extensible ecosystem for AI-powered automation that can run anywhere Docker is supported.