Daita Create
Generate new agents and workflows from templates. This command creates simple, ready-to-use components that you can customize for your use case.
Syntax
```bash
daita create COMPONENT_TYPE NAME
```
Component Types
| Component | Description | Command |
|---|---|---|
| agent | Create a new AI agent | `daita create agent NAME` |
| workflow | Create a new multi-agent workflow | `daita create workflow NAME` |
Requirements
- Must be run inside a Daita project directory (contains a `.daita/` folder)
- Project must have a valid `daita-project.yaml` configuration file
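A check equivalent to the two requirements above can be sketched in Python (an illustrative helper, not the CLI's actual implementation):

```python
from pathlib import Path


def in_daita_project(root: Path = Path(".")) -> bool:
    """Return True when both requirements are met: a .daita/ folder
    and a daita-project.yaml file exist in the project root."""
    return (root / ".daita").is_dir() and (root / "daita-project.yaml").is_file()
```

Running `daita create` outside a directory that satisfies this check fails with the "Not in a Daita project" error shown under Common Issues.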
daita create agent
Create a new AI agent from a simple template.
Syntax
```bash
daita create agent NAME
```
Arguments
| Argument | Type | Required | Description |
|---|---|---|---|
| NAME | string | Yes | Name of the agent (use snake_case or kebab-case, e.g., `my_agent`) |
Examples
```bash
# Create a basic agent
daita create agent data_processor

# Create agent with underscores
daita create agent user_profile_analyzer

# Create agent with hyphens (converted to underscores)
daita create agent sentiment-analyzer
```
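As the last example notes, hyphens in NAME are converted to underscores for the generated file name. Conceptually, the normalization looks like this (a hypothetical helper for illustration, not the CLI's actual code):

```python
def normalize_name(name: str) -> str:
    """Convert a kebab-case NAME to the snake_case form used for the file."""
    return name.strip().lower().replace("-", "_")


print(normalize_name("sentiment-analyzer"))  # → sentiment_analyzer
```

So `daita create agent sentiment-analyzer` produces `agents/sentiment_analyzer.py`.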
What Gets Created
When you create an agent:
- Agent file - created at `agents/{name}.py`
- Display name prompt - you'll be asked for a display name for deployment
- Configuration updated - the agent is added to `daita-project.yaml`

```
agents/
└── {name}.py          # New agent file
daita-project.yaml     # Updated with agent entry
```
Agent Template
The generated agent file (agents/{name}.py) contains:
"""
DataProcessor Agent
Replace this with your own agent logic.
"""
from daita import SubstrateAgent
def create_agent():
"""Create the agent instance using direct SubstrateAgent pattern."""
# Option 1: Simple instantiation (uses defaults)
agent = SubstrateAgent(name="DataProcessor")
# Option 2: Direct LLM configuration (uncomment and modify as needed)
# import os
# agent = SubstrateAgent(
# name="DataProcessor",
# llm_provider="openai",
# model="gpt-4",
# api_key=os.getenv("OPENAI_API_KEY")
# )
# Optional: Add plugins
# from daita.plugins import postgresql
# agent.add_plugin(postgresql(host="localhost", database="mydb"))
return agent
if __name__ == "__main__":
import asyncio
async def main():
agent = create_agent()
result = await agent.process("test_task", "Hello, world!")
print(result)
asyncio.run(main())
Testing the Agent
```bash
# Test the agent directly
python agents/data_processor.py

# Or use the test command
daita test data_processor
```
daita create workflow
Create a new workflow from a simple template.
Syntax
```bash
daita create workflow NAME
```
Arguments
| Argument | Type | Required | Description |
|---|---|---|---|
| NAME | string | Yes | Name of the workflow (use snake_case or kebab-case) |
Examples
```bash
# Create a basic workflow
daita create workflow data_pipeline

# Create workflow with hyphens
daita create workflow sentiment-analysis-pipeline
```
What Gets Created
When you create a workflow:
- Workflow file - created at `workflows/{name}.py`
- Display name prompt - you'll be asked for a display name for deployment
- Configuration updated - the workflow is added to `daita-project.yaml`
Workflow Template
The generated workflow file (workflows/{name}.py) contains:
"""
DataPipeline Workflow
Replace this with your own workflow logic.
"""
from daita import SubstrateAgent, Workflow
class DataPipeline:
"""A simple workflow."""
def __init__(self):
self.workflow = Workflow("data_pipeline")
# Add your agents here
# agent = SubstrateAgent(name="Agent")
# self.workflow.add_agent("agent", agent)
async def run(self, data=None):
"""
Run the workflow.
Replace this with your own logic.
"""
try:
await self.workflow.start()
# Your workflow logic here
result = f"Workflow DataPipeline processed: {data}"
return {
'status': 'success',
'result': result
}
finally:
await self.workflow.stop()
def create_workflow():
"""Create the workflow instance."""
return DataPipeline()
if __name__ == "__main__":
import asyncio
async def main():
workflow = DataPipeline()
result = await workflow.run("test data")
print(result)
asyncio.run(main())
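The template's `run()` method wraps its work in try/finally so the workflow is always stopped, even if processing raises. That lifecycle can be illustrated with plain asyncio, independent of the daita API (an illustrative sketch, not daita code):

```python
import asyncio


class Pipeline:
    """Minimal start/process/stop lifecycle, mirroring the template's
    try/finally structure. Illustrative only; no daita dependency."""

    def __init__(self):
        self.started = False

    async def start(self):
        self.started = True

    async def stop(self):
        self.started = False

    async def run(self, data=None):
        try:
            await self.start()
            # Work happens here; the finally block guarantees cleanup.
            return {"status": "success", "result": f"processed: {data}"}
        finally:
            await self.stop()


pipeline = Pipeline()
result = asyncio.run(pipeline.run("test data"))
print(result["status"])  # → success
```

Because `stop()` runs in the finally block, resources held by the workflow's agents are released on both success and failure.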
Testing the Workflow
```bash
# Test the workflow directly
python workflows/data_pipeline.py

# Or use the test command
daita test data_pipeline
```
Configuration Updates
When you create an agent or workflow, daita-project.yaml is automatically updated:
```yaml
agents:
  - name: data_processor
    display_name: "Data Processor"
    type: substrate
    created_at: '2025-01-15T10:00:00'

workflows:
  - name: data_pipeline
    display_name: "Data Pipeline"
    type: basic
    created_at: '2025-01-15T10:00:00'
```
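For illustration, an entry of the shape shown above could be built in Python like this (a hypothetical helper whose field names mirror the YAML; the timestamp is ISO 8601):

```python
from datetime import datetime


def make_agent_entry(name: str, display_name: str) -> dict:
    """Build a dict matching one item under 'agents:' in
    daita-project.yaml. Illustrative helper, not part of the CLI."""
    return {
        "name": name,
        "display_name": display_name,
        "type": "substrate",
        "created_at": datetime.now().isoformat(timespec="seconds"),
    }


entry = make_agent_entry("data_processor", "Data Processor")
```

Workflow entries have the same shape with `type: basic` instead of `type: substrate`.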
Common Issues
Agent Already Exists
```
❌ Agent data_processor already exists
```
Solution: Choose a different name or delete the existing agent file.
Not in Project Directory
```
❌ Not in a Daita project. Run 'daita init' first.
```
Solution: Navigate to your project directory or initialize a new project:
```bash
daita init my-project
cd my-project
daita create agent my_agent
```
Next Steps
After creating components:
- Test Components - Test your agents and workflows
- Configure Settings - Add retry policies, webhooks, schedules
- Deploy - Deploy to production
Best Practices
Naming Conventions
- Use snake_case for file names: `data_processor`, `sentiment_analyzer`
- Use descriptive names that indicate purpose
- Avoid generic names like `agent1` or `test`
Organization
- Keep related agents in the same project
- Use workflows to orchestrate multiple agents
- Add comments explaining agent purpose
Development Workflow
```bash
# Create agent
daita create agent my_agent

# Edit the agent file
# agents/my_agent.py

# Test locally
daita test my_agent

# Repeat until working
daita test --watch

# Deploy
daita push
```