# Project Configuration
Project configuration in Daita is managed through the `daita-project.yaml` file. This file defines your agents, workflows, webhooks, schedules, and deployment settings.
## Quick Start
When you run `daita init`, a basic `daita-project.yaml` file is created:
```yaml
name: my_project
version: 1.0.0
description: A Daita AI agent project
created_at: '2025-01-15T10:00:00'
agents: []
workflows: []
```

As you create agents and workflows, they're automatically added to this configuration file.
## Agent Configuration
Define agents in your project with their settings and optional features:
```yaml
agents:
  - name: data_processor
    description: 'Processes incoming data'
    file: agents/data_processor.py
    enabled: true

  - name: content_generator
    description: 'Generates content from templates'
    file: agents/content_generator.py
    enabled: true

    # Optional: Retry configuration
    enable_retry: true
    retry_policy:
      max_retries: 3
      base_delay: 1.0
      max_delay: 60.0
      strategy: exponential  # fixed, exponential, or linear

    # Optional: Custom settings
    settings:
      temperature: 0.7
      max_tokens: 1000
```

### Retry Policies
Configure how agents handle failures:
```yaml
agents:
  - name: resilient_agent
    enable_retry: true
    retry_policy:
      max_retries: 5         # Maximum retry attempts
      base_delay: 1.0        # Initial delay in seconds
      max_delay: 60.0        # Maximum delay cap
      strategy: exponential  # exponential, fixed, or linear
      jitter: true           # Add randomness to prevent thundering herd
```

**Retry Strategies:**

- `exponential`: Delays increase exponentially (1s, 2s, 4s, 8s...)
- `fixed`: Same delay between retries (1s, 1s, 1s...)
- `linear`: Delays increase linearly (1s, 2s, 3s...)
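To make the strategies concrete, here is a small illustrative Python sketch of how such delays are typically computed. It is not Daita's internal retry implementation, just generic backoff math that reuses the parameter names from the policy above (`base_delay`, `max_delay`, `jitter`):

```python
import random

def retry_delay(attempt: int, strategy: str = "exponential",
                base_delay: float = 1.0, max_delay: float = 60.0,
                jitter: bool = False) -> float:
    """Illustrative backoff calculation; attempt is 1-based."""
    if strategy == "exponential":
        delay = base_delay * (2 ** (attempt - 1))   # 1s, 2s, 4s, 8s...
    elif strategy == "linear":
        delay = base_delay * attempt                # 1s, 2s, 3s...
    else:  # "fixed"
        delay = base_delay                          # 1s, 1s, 1s...
    delay = min(delay, max_delay)                   # never exceed max_delay
    if jitter:
        delay *= random.uniform(0.5, 1.5)           # spread retries out
    return delay

# Delays for the first five attempts of an exponential policy (no jitter)
print([retry_delay(n, "exponential") for n in range(1, 6)])
# -> [1.0, 2.0, 4.0, 8.0, 16.0]
```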
## Workflow Configuration
Define workflows that orchestrate multiple agents:
```yaml
workflows:
  - name: data_pipeline
    description: 'Complete data processing pipeline'
    file: workflows/data_pipeline.py
    type: standard
    enabled: true

  - name: content_workflow
    description: 'Content generation and review workflow'
    file: workflows/content_workflow.py
    type: standard
    enabled: true
```

## Webhook Configuration
Configure webhooks to trigger agents or workflows from external services (GitHub, Slack, custom APIs, etc.):
```yaml
agents:
  - name: github_processor
    file: agents/github_processor.py
    webhooks:
      - slug: 'github-push'
        field_mapping:
          'repository.name': 'repo_name'
          'pusher.name': 'author'
          'commits[0].message': 'commit_message'
          'commits[0].url': 'commit_url'
          'ref': 'branch'
      - slug: 'slack-mention'
        field_mapping:
          'event.user': 'user_id'
          'event.text': 'message_text'
          'event.channel': 'channel_id'
```

### Webhook URL Format
After deployment, webhooks are available at:
```text
https://api.daita-tech.io/api/v1/webhooks/trigger/{org_id}/{webhook_slug}
```

List your webhook URLs:

```bash
daita webhook list
```
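Once you have a webhook URL, any HTTP client can trigger it. The sketch below is illustrative only: it uses Python's `requests` library, the org ID and slug are placeholders (use the real URLs from `daita webhook list`), and it assumes the endpoint accepts a plain JSON POST; your deployment may require additional authentication headers.

```python
import requests

# Hypothetical org ID and slug -- substitute the URL from `daita webhook list`.
url = "https://api.daita-tech.io/api/v1/webhooks/trigger/my-org-id/github-push"

# A minimal GitHub-style push payload matching the github-push mapping above.
payload = {
    "repository": {"name": "my-repo"},
    "pusher": {"name": "john"},
    "commits": [{"message": "Fix bug", "url": "https://github.com/..."}],
    "ref": "refs/heads/main",
}

response = requests.post(url, json=payload, timeout=30)
response.raise_for_status()
print(response.status_code)
```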
### Field Mapping

The `field_mapping` transforms incoming webhook payloads into agent inputs:
```yaml
field_mapping:
  'source.path': 'target_name'    # Simple field mapping
  'array[0].field': 'first_item'  # Array indexing
  'nested.deep.value': 'value'    # Nested object access
```

Example: GitHub push webhook payload:
```json
{
  "repository": { "name": "my-repo" },
  "pusher": { "name": "john" },
  "commits": [
    {
      "message": "Fix bug",
      "url": "https://github.com/..."
    }
  ]
}
```

Maps to agent input:
```json
{
  "repo_name": "my-repo",
  "author": "john",
  "commit_message": "Fix bug",
  "commit_url": "https://github.com/..."
}
```
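To show what a dot-path mapping like this does, here is an illustrative Python sketch that resolves `a.b` and `items[0].field` paths against a payload dictionary. It mirrors the behaviour described above but is not Daita's actual mapping code:

```python
import re

def resolve_path(payload: dict, path: str):
    """Follow a dot path like 'commits[0].message' through nested dicts/lists."""
    value = payload
    for part in path.split("."):
        # Split an optional '[index]' suffix off the key, e.g. 'commits[0]'.
        match = re.fullmatch(r"(\w+)(?:\[(\d+)\])?", part)
        key, index = match.group(1), match.group(2)
        value = value[key]
        if index is not None:
            value = value[int(index)]
    return value

def apply_field_mapping(payload: dict, field_mapping: dict) -> dict:
    """Build the agent input by renaming each mapped payload field."""
    return {target: resolve_path(payload, source)
            for source, target in field_mapping.items()}

payload = {
    "repository": {"name": "my-repo"},
    "pusher": {"name": "john"},
    "commits": [{"message": "Fix bug", "url": "https://github.com/..."}],
}
mapping = {
    "repository.name": "repo_name",
    "pusher.name": "author",
    "commits[0].message": "commit_message",
}
print(apply_field_mapping(payload, mapping))
# -> {'repo_name': 'my-repo', 'author': 'john', 'commit_message': 'Fix bug'}
```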
## Schedule Configuration (Cron Jobs)

Schedule agents and workflows to run automatically using cron expressions:
```yaml
schedules:
  agents:
    data_processor:
      cron: '0 */6 * * *'  # Every 6 hours
      enabled: true
      timezone: 'UTC'
      description: 'Process accumulated data'
      data:
        batch_size: 1000
    report_generator:
      cron: '0 9 * * MON'  # Every Monday at 9 AM
      enabled: true
      timezone: 'America/New_York'
      description: 'Generate weekly reports'
  workflows:
    backup_workflow:
      cron: '0 0 * * *'    # Daily at midnight
      enabled: true
      timezone: 'UTC'
      description: 'Daily backup'
```

### Cron Expression Format
```text
┌───────────── minute (0 - 59)
│ ┌───────────── hour (0 - 23)
│ │ ┌───────────── day of month (1 - 31)
│ │ │ ┌───────────── month (1 - 12)
│ │ │ │ ┌───────────── day of week (0 - 6) (Sunday to Saturday)
│ │ │ │ │
* * * * *
```

**Common Patterns:**

- `"*/15 * * * *"` - Every 15 minutes
- `"0 */2 * * *"` - Every 2 hours
- `"0 9 * * MON-FRI"` - Weekdays at 9 AM
- `"0 0 1 * *"` - First day of month at midnight
- `"0 0 * * SUN"` - Every Sunday at midnight
### Enabling/Disabling Schedules
Control whether schedules are active:
```yaml
schedules:
  agents:
    data_processor:
      cron: '0 */6 * * *'
      enabled: true   # Active schedule
    backup_agent:
      cron: '0 0 * * *'
      enabled: false  # Disabled (paused)
```

## Complete Example
Here's a comprehensive `daita-project.yaml` with all features:
```yaml
name: my_production_app
version: 2.1.0
description: Production AI agent application

agents:
  # GitHub webhook processor
  - name: github_processor
    description: 'Processes GitHub webhook events'
    file: agents/github_processor.py
    enabled: true
    enable_retry: true
    retry_policy:
      max_retries: 3
      base_delay: 1.0
      strategy: exponential
    webhooks:
      - slug: 'github-push'
        field_mapping:
          'repository.name': 'repo_name'
          'commits[0].message': 'commit_message'

  # Scheduled data processor
  - name: data_processor
    description: 'Processes accumulated data'
    file: agents/data_processor.py
    enabled: true
    settings:
      batch_size: 1000

  # Report generator
  - name: report_generator
    description: 'Generates reports'
    file: agents/report_generator.py
    enabled: true

workflows:
  - name: data_pipeline
    description: 'Complete data processing pipeline'
    file: workflows/data_pipeline.py
    enabled: true

# Schedule automated tasks
schedules:
  agents:
    data_processor:
      cron: '0 */6 * * *'  # Every 6 hours
      enabled: true
      timezone: 'UTC'
      description: 'Process accumulated data'
    report_generator:
      cron: '0 9 * * MON'  # Monday mornings
      enabled: true
      timezone: 'America/New_York'
      description: 'Generate weekly reports'
  workflows:
    data_pipeline:
      cron: '0 2 * * *'    # Daily at 2 AM
      enabled: true
      timezone: 'UTC'
```

## Working with the Configuration
### View Current Configuration
```bash
# View your project configuration
cat daita-project.yaml

# Test deployment (dry run)
daita push --dry-run
```

### Update Configuration
Edit `daita-project.yaml` manually or let the CLI update it:
```bash
# Create new agent (auto-adds to config)
daita create agent new_processor

# Create new workflow (auto-adds to config)
daita create workflow new_pipeline
```

### Deploy Configuration
```bash
# Deploy with your configuration
daita push

# List webhook URLs
daita webhook list

# View deployment status
daita status
```

## Configuration Best Practices
### 1. Version Your Configuration
Always update the `version` field when making significant changes:
```yaml
name: my_app
version: 2.1.0  # Increment on changes
```

### 2. Keep Webhooks Organized
Group related webhooks under the same agent:
```yaml
agents:
  - name: integration_agent
    webhooks:
      - slug: 'github-push'
        field_mapping: ...
      - slug: 'github-pr'
        field_mapping: ...
      - slug: 'slack-mention'
        field_mapping: ...
```

### 3. Document Your Schedules
Always include descriptions for scheduled tasks:
```yaml
schedules:
  agents:
    processor:
      cron: '0 2 * * *'
      description: 'Daily data processing at 2 AM UTC'  # Clear description
```

### 4. Test Configuration Changes
Use dry-run before deploying:
```bash
# See what will be deployed
daita push --dry-run

# Then deploy
daita push
```
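Beyond `--dry-run`, you can also run a quick local sanity check of the configuration before deploying. The sketch below is illustrative and not part of the Daita CLI: it loads `daita-project.yaml` with the third-party PyYAML package and verifies that each agent and workflow file exists and that every cron expression has five fields.

```python
from pathlib import Path

import yaml  # third-party: pip install pyyaml

config = yaml.safe_load(Path("daita-project.yaml").read_text())
problems = []

# Every agent and workflow should point at a file that exists.
for entry in config.get("agents", []) + config.get("workflows", []):
    if not Path(entry["file"]).is_file():
        problems.append(f"{entry['name']}: missing file {entry['file']}")

# Every schedule should have a five-field cron expression.
for group in config.get("schedules", {}).values():
    for name, schedule in group.items():
        if len(schedule.get("cron", "").split()) != 5:
            problems.append(f"{name}: cron {schedule.get('cron')!r} is not 5 fields")

print("\n".join(problems) or "configuration looks sane")
```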
## Next Steps

- Create agents and configure them in `daita-project.yaml`
- Set up authentication for LLM providers
- Deploy to the cloud with your configuration using `daita push`