## [0.11.0] - 2026-03-17

This release introduces three major capabilities: Agent.from_db() for zero-config database agents, the @agent.watch() system for real-time data monitoring, and a new suite of data plugins (SQLite, Transformer, Data Quality). The agent core has been significantly restructured for clarity, and the graph backend now supports custom distributed implementations.
### Added
- **`Agent.from_db()` — Database Agent Factory**
  - Create a fully configured agent from a connection string or plugin instance with a single `await` call
  - Automatically connects, discovers the schema, generates a context-aware system prompt, and registers the appropriate query tools
  - Infers the data domain from table names to produce a more natural default agent name and prompt context
  - Numeric column sampling provides the LLM with value distributions; PII columns are redacted by default (`redact_pii_columns=True`)
  - Schema discovery results are cacheable with a configurable TTL (`cache_ttl=`) and drift-detected on subsequent calls
  - Opt-in `lineage=`, `memory=`, and `history=` parameters auto-create and attach the corresponding plugins
  - `read_only=True` (default) restricts the agent to query tools only
  - `toolkit="analyst"` (default) registers six analytical tools: pivot table, correlation, anomaly detection, entity comparison, similar-record search, and trend forecasting
  - Supports PostgreSQL, MySQL, SQLite, and MongoDB connection strings
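  The connect → discover → prompt flow can be illustrated with a minimal stand-in over stdlib `sqlite3`. The `discover_and_prompt` helper and the prompt wording below are hypothetical, not the library's implementation; `Agent.from_db()` itself is async and adds tool registration on top of this idea.

  ```python
  import sqlite3

  def discover_and_prompt(conn: sqlite3.Connection) -> str:
      """Discover table names and build a context-aware system prompt."""
      tables = [row[0] for row in conn.execute(
          "SELECT name FROM sqlite_master WHERE type = 'table'")]
      # Crude stand-in for domain inference: the prompt simply lists tables.
      return ("You are a read-only data analyst. "
              f"Available tables: {', '.join(sorted(tables))}.")

  conn = sqlite3.connect(":memory:")
  conn.execute("CREATE TABLE orders (id INTEGER, total REAL)")
  conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
  prompt = discover_and_prompt(conn)
  ```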
- **Analytical Toolkit (`daita.agents.db.tools`)**
  - Six plug-and-play tools available via `Agent.from_db()` or manual registration:
    - `pivot_table` — group and aggregate tabular results in-memory
    - `correlate` — compute Pearson correlations between numeric columns
    - `detect_anomalies` — flag statistical outliers using IQR
    - `compare_entities` — diff two entities row-by-row across a key column
    - `find_similar` — rank records by similarity to a reference entity
    - `forecast_trend` — linear trend projection over a time-series column
  - All tools work on any tabular data returned by a previous query tool
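  The IQR approach behind `detect_anomalies` can be sketched in a few lines. This is a simplified stand-in for the technique, not the plugin's code:

  ```python
  import statistics

  def detect_anomalies_iqr(values: list[float], k: float = 1.5) -> list[float]:
      """Flag values outside [Q1 - k*IQR, Q3 + k*IQR]."""
      q1, _, q3 = statistics.quantiles(values, n=4)  # quartile cut points
      iqr = q3 - q1
      lo, hi = q1 - k * iqr, q3 + k * iqr
      return [v for v in values if v < lo or v > hi]

  outliers = detect_anomalies_iqr([1, 2, 3, 4, 5, 6, 7, 8, 9, 100])  # → [100]
  ```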
- **`@agent.watch()` — Real-Time Data Monitoring**
  - New decorator for defining data-driven event handlers on a running agent
  - Supports polling sources (database plugins, async callables) on configurable intervals (`"30s"`, `"5m"`, `"1h"`, `"2d"`)
  - Condition evaluation via SQL `WHERE` strings or Python predicates; threshold callbacks for numeric comparisons
  - `on_resolve=True` fires a second event when a triggered condition clears
  - Streaming source protocol (`WatchSource`) for push-based integrations
  - `WatchEvent` carries the current value, previous value, timestamp, source type, and resolution flag
  - Exported from the top-level package: `from daita import WatchEvent`
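  The documented interval strings map naturally onto a seconds multiplier. `parse_interval` below is a hypothetical helper showing one way such strings could be parsed; daita's actual parsing may differ:

  ```python
  import re

  _UNITS = {"s": 1, "m": 60, "h": 3600, "d": 86400}

  def parse_interval(spec: str) -> int:
      """Convert an interval string like "30s", "5m", "1h", or "2d" to seconds."""
      m = re.fullmatch(r"(\d+)([smhd])", spec)
      if not m:
          raise ValueError(f"bad interval: {spec!r}")
      return int(m.group(1)) * _UNITS[m.group(2)]
  ```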
- **SQLite Plugin**
  - New `SqlitePlugin` with full async support via `aiosqlite`
  - Tools: `sqlite_query`, `sqlite_list_tables`, `sqlite_get_schema`, `sqlite_inspect`, `sqlite_execute`
  - Supports in-memory and file-based databases
  - Compatible with Focus DSL pushdown and `Agent.from_db()`
  - Install with: `pip install 'daita-agents[sqlite]'`
- **Transformer Plugin**
  - New `TransformerPlugin` for declarative in-memory data transformations
  - Tools: filter rows, rename columns, cast types, aggregate, pivot, join, deduplicate, and compute derived columns
  - Operates on data returned by other tools without additional database access
  - Install with: `pip install 'daita-agents[data]'`
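  These transformations operate purely on in-memory rows. A plain-Python illustration of the kinds of steps the plugin chains (filter → cast → rename → deduplicate) — not the plugin's API:

  ```python
  rows = [
      {"id": 1, "name": "Ada", "amt": "10"},
      {"id": 2, "name": "Bob", "amt": "25"},
      {"id": 2, "name": "Bob", "amt": "25"},  # duplicate row
  ]

  # filter rows, cast types, rename columns, deduplicate — all in memory
  filtered = [r for r in rows if int(r["amt"]) > 5]
  cast = [{**r, "amt": float(r["amt"])} for r in filtered]
  renamed = [{("amount" if k == "amt" else k): v for k, v in r.items()}
             for r in cast]
  seen, deduped = set(), []
  for r in renamed:
      key = (r["id"], r["name"], r["amount"])
      if key not in seen:
          seen.add(key)
          deduped.append(r)
  ```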
- **Data Quality Plugin**
  - New `DataQualityPlugin` for validating datasets against configurable rule sets
  - Built-in rules: null checks, uniqueness, range bounds, pattern matching, and referential integrity
  - Returns a structured report with per-column pass/fail counts and failing row samples
  - Install with: `pip install 'daita-agents[data]'`
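  A per-column pass/fail report with failing-row samples might look like the sketch below. The rule functions and report keys are assumptions for illustration, not `DataQualityPlugin`'s actual format:

  ```python
  def null_check(rows, column):
      """Rule: column must not contain nulls."""
      failures = [r for r in rows if r.get(column) is None]
      return {"rule": "not_null", "column": column,
              "passed": len(rows) - len(failures),
              "failed": len(failures), "samples": failures[:5]}

  def range_check(rows, column, lo, hi):
      """Rule: non-null values must fall within [lo, hi]."""
      failures = [r for r in rows if r.get(column) is not None
                  and not (lo <= r[column] <= hi)]
      return {"rule": "range", "column": column,
              "passed": len(rows) - len(failures),
              "failed": len(failures), "samples": failures[:5]}

  rows = [{"age": 34}, {"age": None}, {"age": 210}]
  report = [null_check(rows, "age"), range_check(rows, "age", 0, 130)]
  ```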
- **Custom Graph Backends**
  - `register_backend_factory()` allows replacing the default `LocalGraphBackend` with any implementation that satisfies the `GraphBackend` protocol
  - Enables distributed or cloud-backed graph storage for the Lineage and Catalog plugins in multi-agent deployments
  - The `GraphBackend` protocol is `@runtime_checkable`; the local NetworkX/JSON backend remains the zero-config default
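  Because the protocol is `@runtime_checkable`, any class with the right methods passes an `isinstance` check without inheriting from anything. The method names below (`add_node`, `add_edge`) are assumptions standing in for `GraphBackend`'s real surface:

  ```python
  from typing import Protocol, runtime_checkable

  @runtime_checkable
  class GraphBackendLike(Protocol):
      """Structural stand-in for the GraphBackend protocol (names hypothetical)."""
      def add_node(self, node_id: str, **attrs) -> None: ...
      def add_edge(self, src: str, dst: str, **attrs) -> None: ...

  class InMemoryBackend:
      """Custom backend: no inheritance, just matching methods."""
      def __init__(self):
          self.nodes, self.edges = {}, []
      def add_node(self, node_id, **attrs):
          self.nodes[node_id] = attrs
      def add_edge(self, src, dst, **attrs):
          self.edges.append((src, dst, attrs))

  backend = InMemoryBackend()
  ok = isinstance(backend, GraphBackendLike)  # structural check at runtime
  ```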
- **Connection Retry & Observability**
  - `BaseAgent` gains a configurable connection retry loop with exponential backoff
  - Per-retry events are emitted to the trace system, surfacing retry counts and failure reasons in the dashboard
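  A minimal sketch of retry-with-exponential-backoff plus a per-retry hook for trace events. The function name, defaults, and `on_retry` hook are assumptions, not `BaseAgent`'s actual interface:

  ```python
  import time

  def connect_with_retry(connect, max_retries=3, base_delay=0.1, on_retry=None):
      """Call `connect` until it succeeds, doubling the delay after each failure."""
      for attempt in range(max_retries + 1):
          try:
              return connect()
          except ConnectionError as exc:
              if attempt == max_retries:
                  raise  # out of retries: surface the last failure
              delay = base_delay * (2 ** attempt)
              if on_retry:  # hook where a trace event would be emitted
                  on_retry(attempt + 1, exc, delay)
              time.sleep(delay)

  # Simulate a connection that fails twice, then succeeds.
  attempts = []
  responses = iter([ConnectionError("down"), ConnectionError("down"), "connected"])
  def connect():
      nxt = next(responses)
      if isinstance(nxt, Exception):
          raise nxt
      return nxt

  result = connect_with_retry(connect, base_delay=0.01,
                              on_retry=lambda n, e, d: attempts.append(n))
  ```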
### Changed
- **Agent Core Restructured**
  - `agent.py` refactored: initialization, tool dispatch, plugin management, and execution are now in clearly separated internal layers
  - `BaseAgent` slimmed down — lifecycle hooks and retry infrastructure consolidated; watch startup is deferred until after `_running=True`
  - Plugin `connect()` lifecycle formalized: plugins now expose a richer interface including health-check and disconnect hooks
  - `AgentConfig` extended with retry policy fields surfaced in the constructor
- **Focus System: SQL Backend Improvements**
  - SQL pushdown backend rewritten to handle a broader set of DSL constructs natively
  - `GROUP BY` with aggregates now pushes down correctly for all supported dialects
  - Dot-notation field access and unsupported expressions fall back gracefully to the Python evaluator
  - `evaluate_remaining` now skips projection when aggregate columns are absent from result rows, preventing empty-result regressions
- **Catalog Plugin: Distributed Consistency**
  - Internal graph writes use the registered backend factory, so multi-node catalog deployments share a single graph store
  - Schema comparison and diagram export paths updated to work through the `GraphBackend` protocol
### Fixed
- **Agent `**kwargs` Forwarding**
  - Extra keyword arguments passed to `Agent()` are now forwarded correctly to the underlying LLM provider, fixing silent drops of provider-specific options
### Removed
- **`run_detailed()` — Replaced by `run(detailed=True)`**
  - `agent.run_detailed(prompt)` has been removed
  - Use `agent.run(prompt, detailed=True)` to get the full execution dict (result, tool_calls, iterations, tokens, cost, elapsed time)
  - Plain `agent.run(prompt)` continues to return the result string as before
RowAssertion- Removed
RowAssertionfromdaita.core.assertionsand from the top-level package exports - Use column-level assertions (
NotNullAssertion,UniqueAssertion,RangeAssertion, etc.) orDataQualityPluginfor row-level validation
- Removed