# Tool Decorator
The @kubiya.tool decorator is the primary way to create Docker-based tools with Kubiya SDK. This decorator transforms simple Python functions into containerized tools that can be executed independently or as part of a workflow.
## Basic Usage
The simplest way to create a tool is to use the decorator on a Python function:
```python
from kubiya_sdk import kubiya

@kubiya.tool(description="Process text data")
def process_text(text: str) -> str:
    """
    Convert input text to uppercase

    Args:
        text: The text to process

    Returns:
        The processed text in uppercase
    """
    return text.upper()

# Run the tool
result = process_text("hello world")  # Returns: "HELLO WORLD"
```
When you apply the @kubiya.tool decorator, the function is:
- Automatically wrapped in a Docker container (Python 3.12-slim by default)
- Registered with Kubiya for use in workflows and through the CLI
- Documented based on its docstring and type annotations
- Executed in an isolated environment with its own dependencies
## Decorator Parameters
The @kubiya.tool decorator accepts many parameters to customize your tool:
| Parameter | Type | Description |
|---|---|---|
| `name` | `str` | Custom name for the tool (defaults to function name) |
| `description` | `str` | Tool description (defaults to function docstring) |
| `image` | `str`, `Dict` | Docker image or image profile to use |
| `requirements` | `List[str]` | Python packages to install in the container |
| `env` | `List[str]` | Environment variables to pass to the container |
| `secrets` | `List[str]` | Secret names to make available to the container |
| `content` | `str` | Custom content for the container entrypoint |
| `on_build` | `str` | Shell commands to run during container build |
| `with_files` | `List[FileSpec]` | Files to include in the container |
| `required_configs` | `List[str]` | Names of required configuration schemas |
| `optional_configs` | `List[str]` | Names of optional configuration schemas |
| `config` | `str` | Single configuration schema to use |
| `on_start` | `str` | Shell commands to run when the container starts |
| `on_complete` | `str` | Shell commands to run when the container completes |
## Advanced Examples

### Custom Docker Image

```python
@kubiya.tool(
    name="python_processor",
    description="Process Python code with advanced formatting",
    image="python:3.11-slim",
    requirements=["black"]  # Black must be installed in the container
)
def format_python_code(code: str) -> str:
    """Format Python code using Black formatter"""
    import black

    try:
        return black.format_str(code, mode=black.Mode())
    except Exception as e:
        return f"Error formatting code: {str(e)}"
```
### With Requirements

```python
@kubiya.tool(
    description="Generate images with DALL-E",
    requirements=["openai<1.0"]  # This example uses the legacy (pre-1.0) OpenAI API
)
def generate_image(prompt: str, size: str = "1024x1024") -> str:
    """
    Generate an image using DALL-E

    Args:
        prompt: Text description of the image
        size: Image size (1024x1024, 512x512, or 256x256)

    Returns:
        URL to the generated image
    """
    import openai

    response = openai.Image.create(
        prompt=prompt,
        n=1,
        size=size
    )
    return response['data'][0]['url']
```
### Using Image Profiles

```python
@kubiya.tool(
    description="Perform data analysis on a dataset",
    image=kubiya.Image.Python.DATA_SCIENCE
)
def analyze_data(data_url: str, columns: list = None) -> dict:
    """
    Analyze data from a CSV file

    Args:
        data_url: URL to a CSV dataset
        columns: Optional list of columns to analyze

    Returns:
        Statistical summary of the data
    """
    import pandas as pd

    # Download and load data
    df = pd.read_csv(data_url)

    # Filter columns if specified
    if columns:
        df = df[columns]

    # Generate stats
    return {
        "shape": df.shape,
        "summary": df.describe().to_dict(),
        "missing_values": df.isna().sum().to_dict(),
        "column_types": {col: str(dtype) for col, dtype in df.dtypes.items()}
    }
```
### With File Handling

```python
from kubiya_sdk.tools.models import FileSpec

@kubiya.tool(
    description="Process image files",
    requirements=["pillow"],
    with_files=[
        FileSpec(name="images", mount_path="/app/images", required=True)
    ]
)
def resize_images(width: int = 800, height: int = 600) -> dict:
    """
    Resize all images in the mounted directory

    Args:
        width: New width for the images
        height: New height for the images

    Returns:
        Summary of processed images
    """
    import os
    from PIL import Image

    image_dir = "/app/images"
    processed = []
    skipped = []

    for filename in os.listdir(image_dir):
        try:
            if filename.lower().endswith(('.png', '.jpg', '.jpeg')):
                img_path = os.path.join(image_dir, filename)
                img = Image.open(img_path)
                img = img.resize((width, height), Image.LANCZOS)
                new_path = os.path.join(image_dir, f"resized_{filename}")
                img.save(new_path)
                processed.append(filename)
            else:
                skipped.append(filename)
        except Exception as e:
            skipped.append(f"{filename} (error: {str(e)})")

    return {
        "processed": processed,
        "skipped": skipped,
        "total_processed": len(processed),
        "total_skipped": len(skipped)
    }
```
### With Environment Variables and Secrets

```python
@kubiya.tool(
    description="Connect to a database",
    requirements=["psycopg2-binary"],
    env=["DB_HOST", "DB_PORT", "DB_NAME"],
    secrets=["DB_USER", "DB_PASSWORD"]
)
def query_database(query: str) -> list:
    """
    Execute a SQL query on a PostgreSQL database

    Args:
        query: SQL query to execute

    Returns:
        Query results as a list of dictionaries
    """
    import os
    import psycopg2
    import psycopg2.extras

    # Get connection details from environment variables and secrets
    db_params = {
        "host": os.environ.get("DB_HOST"),
        "port": os.environ.get("DB_PORT", "5432"),
        "dbname": os.environ.get("DB_NAME"),
        "user": os.environ.get("DB_USER"),
        "password": os.environ.get("DB_PASSWORD")
    }

    # Connect to the database
    with psycopg2.connect(**db_params) as conn:
        with conn.cursor(cursor_factory=psycopg2.extras.DictCursor) as cursor:
            cursor.execute(query)
            results = cursor.fetchall()
            return [dict(row) for row in results]
```
### Custom On-Build Script

```python
@kubiya.tool(
    description="Run Node.js scripts",
    image="node:18-slim",
    on_build="""
    # Install global Node.js tools
    npm install -g typescript@4.9.5
    npm install -g prettier

    # Create project directory with package.json
    mkdir -p /app/project
    cd /app/project
    npm init -y
    npm install lodash axios
    """
)
def run_typescript(code: str) -> dict:
    """
    Compile and run TypeScript code

    Args:
        code: TypeScript code to execute

    Returns:
        Compilation and execution results
    """
    import subprocess

    # Write TypeScript code to a file
    with open("/app/project/script.ts", "w") as f:
        f.write(code)

    # Compile the TypeScript code
    compile_result = subprocess.run(
        ["tsc", "/app/project/script.ts", "--outDir", "/app/project"],
        capture_output=True,
        text=True
    )

    # Run the compiled JavaScript if compilation succeeded
    if compile_result.returncode == 0:
        run_result = subprocess.run(
            ["node", "/app/project/script.js"],
            capture_output=True,
            text=True
        )
        return {
            "success": True,
            "stdout": run_result.stdout,
            "stderr": run_result.stderr,
            "exit_code": run_result.returncode
        }
    else:
        return {
            "success": False,
            "error": "Compilation failed",
            "stderr": compile_result.stderr
        }
```
## Type Handling
The tool decorator automatically converts Python type annotations to the appropriate types in the tool's schema:
| Python Type | JSON Schema Type |
|---|---|
| `str` | `"string"` |
| `int` | `"integer"` |
| `float` | `"number"` |
| `bool` | `"boolean"` |
| `list` | `"array"` |
| `dict` | `"object"` |
| Other | `"string"` (default) |
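The conversion can be sketched as a simple lookup over annotations. This is only an illustration of the table above, not the SDK's actual implementation; `schema_type` and `argument_types` are hypothetical helper names:

```python
import inspect

# Illustrative mapping from the table above (not the SDK's internal code).
TYPE_MAP = {str: "string", int: "integer", float: "number",
            bool: "boolean", list: "array", dict: "object"}

def schema_type(annotation) -> str:
    """Return the JSON Schema type for a Python annotation, defaulting to "string"."""
    return TYPE_MAP.get(annotation, "string")

def argument_types(func) -> dict:
    """Map each parameter of func to its JSON Schema type."""
    sig = inspect.signature(func)
    return {name: schema_type(p.annotation) for name, p in sig.parameters.items()}

def resize(width: int, ratio: float, keep_aspect: bool, tags: list) -> dict:
    ...

print(argument_types(resize))
# {'width': 'integer', 'ratio': 'number', 'keep_aspect': 'boolean', 'tags': 'array'}
```

Unannotated parameters fall through to the `"string"` default, matching the "Other" row above.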
## Function Signature Effects
The function signature and docstring determine how your tool will appear to users:
- Function name becomes the tool name (unless overridden)
- Function docstring becomes the tool description (unless overridden)
- Parameter names become argument names
- Parameter type hints become argument types
- Default parameter values become default argument values
- Parameter docstrings become argument descriptions
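These conventions can be checked with the standard `inspect` module. The sketch below shows the kind of metadata a decorator can read from a plain function; it does not use the SDK itself:

```python
import inspect

def generate_image(prompt: str, size: str = "1024x1024") -> str:
    """Generate an image using DALL-E"""
    ...

sig = inspect.signature(generate_image)

name = generate_image.__name__                 # becomes the tool name
description = inspect.getdoc(generate_image)   # becomes the tool description
defaults = {n: p.default for n, p in sig.parameters.items()
            if p.default is not inspect.Parameter.empty}

print(name)         # generate_image
print(description)  # Generate an image using DALL-E
print(defaults)     # {'size': '1024x1024'}
```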
## Running Tools Locally vs. Remotely

When you call a tool function directly in your code, the tool runs by default in a Docker container on your local machine. When the same code is deployed to the Kubiya platform, the tools execute automatically in the cloud environment instead; no changes to your code are required.