
Project Commands

Scaffold, build, test, and deploy Daita projects from the command line.

The project commands cover the local development lifecycle: scaffolding a new project, adding components, running them locally, and pushing to the cloud. These commands require daita-agents to be installed in addition to daita-cli.

#init

Scaffold a new Daita project in the current directory.

bash
daita init [project_name]

If project_name is omitted, you are prompted for it interactively.

Options:

Flag        Default   Description
--type      basic     Project template: basic, analysis, or pipeline
--force               Overwrite an existing project without confirmation

Example:

bash
daita init my-data-pipeline --type pipeline

This creates:

text
my-data-pipeline/
├── .daita/
├── agents/
│   └── my_agent.py         # starter agent with factory function
├── workflows/
│   └── my_workflow.py      # starter workflow
├── data/
├── tests/
│   └── test_agents.py
├── daita-project.yaml
├── requirements.txt
└── .gitignore

Each generated agent and workflow includes a create_agent() / create_workflow() factory function that daita test and daita push use to load your components.
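Conceptually, a generated agents/my_agent.py is shaped like the sketch below. The class name, method name, and return shape here are illustrative assumptions, not the actual daita-agents API; the scaffold in your project is the authoritative template. What matters is the module-level create_agent() factory that the CLI calls.

```python
# agents/my_agent.py -- illustrative sketch; the real scaffold's base
# class and method names come from daita-agents and may differ.

class MyAgent:
    """Starter agent: echoes its input back in a result dict."""

    def run(self, data):
        return {"input": data, "result": "processed"}


def create_agent():
    """Factory function that `daita test` and `daita push` call to
    obtain a configured agent instance."""
    return MyAgent()
```

Keeping construction behind a factory lets the CLI instantiate your component without knowing its constructor arguments.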

#create

Add a new agent or workflow to the current project.

bash
daita create agent <name>
daita create workflow <name>

Example:

bash
daita create agent customer_support
daita create workflow ingestion_pipeline

A new Python file is created under agents/ or workflows/ with a boilerplate class and factory function. The component is also registered in daita-project.yaml.
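After running the two example commands above, the registry entries written to daita-project.yaml might look like the following excerpt. The key names are illustrative assumptions; check the file generated in your own project for the actual schema.

```yaml
# daita-project.yaml (illustrative excerpt -- actual keys may differ)
agents:
  customer_support:
    path: agents/customer_support.py
workflows:
  ingestion_pipeline:
    path: workflows/ingestion_pipeline.py
```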

#test

Run agents and workflows locally using your current environment.

bash
daita test [target]

If target is omitted, all agents and workflows in the project are tested.

Options:

Flag           Description
--data <file>  JSON or text file to use as input data
--watch        Watch for file changes and re-run automatically (requires watchdog)

Example:

bash
# Test everything
daita test
 
# Test a specific agent with custom input
daita test my_agent --data sample.json
 
# Watch mode during development
daita test --watch

The CLI dynamically imports each agent or workflow via importlib, calls its factory function, and runs it. Timing and token metrics are printed on success.

#push

Package and deploy the current project to the Daita cloud.

bash
daita push

Options:

Flag       Description
--force    Skip the confirmation prompt
--dry-run  Show what would be packaged and deployed without actually deploying

Requirements: DAITA_API_KEY must be set.

Example:

bash
daita push --dry-run    # preview the deployment
daita push              # deploy for real

Push creates a zip package from your project, uploads it, and deploys it. The following paths are excluded from the package:

  • .daita/, .git/, .pytest_cache/, .mypy_cache/
  • __pycache__/, venv/, env/, .venv/, node_modules/
  • tests/, data/

A deployment ID is returned on success. Use daita status or daita deployments show <id> to monitor the rollout.