
# SQLite Plugin

File-based and in-memory async SQLite access with WAL mode support, built on `aiosqlite`.

## Installation

```bash
pip install 'daita-agents[sqlite]'
```

## Quick Start

```python
from daita import Agent
from daita.plugins import sqlite

# Create plugin
db = sqlite(path="./app.db")

# Agent uses database tools autonomously
agent = Agent(
    name="DB Agent",
    prompt="You are a database analyst. Help users query and analyze data.",
    tools=[db]
)

await agent.start()
result = await agent.run("Show me all orders from this week")
```

## Direct Usage

The plugin can be used directly, without agents, for programmatic access. Its main value, however, is agent integration: enabling LLMs to autonomously query and analyze SQLite data.

## Connection Parameters

```python
sqlite(
    path: str = ":memory:",
    wal_mode: bool = True,
    timeout: float = 5.0,
)
```

### Parameters

- `path` (str): Path to the SQLite database file, or `":memory:"` for an in-process database discarded on close. Defaults to `":memory:"`.
- `wal_mode` (bool): Enable WAL journal mode for better concurrent read performance. Ignored for in-memory databases. Defaults to `True`.
- `timeout` (float): Seconds to wait when the database is locked before raising an error. Defaults to `5.0`.
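As an illustration of what `wal_mode` and `timeout` correspond to at the SQLite level, here is a minimal sketch using Python's standard `sqlite3` module (not the plugin itself): WAL is a `PRAGMA journal_mode` setting, and the timeout is the busy-wait applied when the file is locked.

```python
import os
import sqlite3
import tempfile

# Sketch of the underlying SQLite settings, using the stdlib sqlite3 module.
path = os.path.join(tempfile.mkdtemp(), "demo.db")

# timeout: how many seconds to wait on a locked database before erroring
conn = sqlite3.connect(path, timeout=5.0)

# wal_mode=True corresponds to switching the journal mode to WAL
mode = conn.execute("PRAGMA journal_mode=WAL").fetchone()[0]
print(mode)  # "wal" for file-based databases; in-memory databases ignore it
conn.close()
```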

## Connection Methods

File-based or in-memory:

```python
from daita.plugins import sqlite

# File-based database (persists across sessions)
async with sqlite(path="./data.db") as db:
    rows = await db.query("SELECT * FROM users")

# In-memory database (default — discarded on close)
async with sqlite() as db:
    rows = await db.query("SELECT 1 AS n")

# File-based with custom timeout
async with sqlite(path="./analytics.db", timeout=30.0) as db:
    results = await db.query("SELECT * FROM events")
```

## Query Operations

For SELECT queries, use `query()`; it returns a list of dictionaries:

```python
from daita.plugins import sqlite

async with sqlite(path="./app.db") as db:
    # Simple query
    users = await db.query("SELECT id, name, email FROM users")
    # Returns: [{"id": 1, "name": "Alice", "email": "alice@example.com"}, ...]

    # Parameterized query (use ? placeholders)
    active_users = await db.query(
        "SELECT * FROM users WHERE status = ? AND created_at > ?",
        ["active", "2024-01-01"]
    )
```

Parameter syntax: SQLite uses `?` placeholders (not `$1` or `%s`).
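The placeholder style can be tried with the stdlib `sqlite3` module directly; SQLite binds each `?` positionally and never interpolates the values into the SQL string, which is what makes parameterized queries injection-safe:

```python
import sqlite3

# "?" placeholders are bound positionally; values are never spliced into the SQL text
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, status TEXT)")
conn.execute("INSERT INTO users (status) VALUES (?)", ("active",))

rows = conn.execute(
    "SELECT id FROM users WHERE status = ?", ("active",)
).fetchall()
print(rows)  # [(1,)]
conn.close()
```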

For INSERT, UPDATE, and DELETE statements, use `execute()`; it returns the affected row count:

```python
async with sqlite(path="./app.db") as db:
    # Insert
    await db.execute(
        "INSERT INTO users (name, email) VALUES (?, ?)",
        ["Jane Doe", "jane@example.com"]
    )

    # Update
    await db.execute(
        "UPDATE users SET last_login = datetime('now') WHERE id = ?",
        [user_id]
    )

    # Bulk insert
    users_data = [
        {"name": "Alice", "email": "alice@example.com"},
        {"name": "Bob", "email": "bob@example.com"},
    ]
    count = await db.insert_many("users", users_data)
```
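A bulk insert like `insert_many()` typically maps to SQLite's `executemany`, which binds the same statement once per row. A stdlib sketch of that pattern (illustrative, not the plugin's actual implementation):

```python
import sqlite3

# Bulk insert: one prepared statement, bound once per row
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")

users_data = [
    ("Alice", "alice@example.com"),
    ("Bob", "bob@example.com"),
]
cur = conn.executemany("INSERT INTO users (name, email) VALUES (?, ?)", users_data)
count = cur.rowcount  # total rows affected across the whole batch
print(count)  # 2
conn.close()
```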

For multi-statement scripts such as schema setup and migrations, use `execute_script()`:

```python
async with sqlite(path="./app.db") as db:
    await db.execute_script("""
        CREATE TABLE IF NOT EXISTS users (
            id    INTEGER PRIMARY KEY AUTOINCREMENT,
            name  TEXT NOT NULL,
            email TEXT UNIQUE
        );
        CREATE TABLE IF NOT EXISTS orders (
            id      INTEGER PRIMARY KEY AUTOINCREMENT,
            user_id INTEGER REFERENCES users(id),
            total   REAL
        );
    """)
```
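Running several semicolon-separated statements in one call is what the stdlib exposes as `executescript`; a small sketch, which also shows how the created tables can be verified via the `sqlite_master` catalog:

```python
import sqlite3

# Multi-statement script: both CREATE TABLE statements run in one call
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER REFERENCES users(id));
""")

# sqlite_master is SQLite's built-in schema catalog
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name"
)]
print(tables)  # ['orders', 'users']
conn.close()
```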

## Schema Introspection

```python
async with sqlite(path="./app.db") as db:
    # List all user tables
    tables = await db.tables()
    # Returns: ["orders", "products", "users"]

    # Column info for a table
    columns = await db.describe("users")
    # Returns: [{"column_name": "id", "data_type": "INTEGER", "is_nullable": "NO",
    #            "default_value": None, "is_primary_key": True}, ...]

    # Read/set PRAGMA values
    journal_mode = await db.pragma("journal_mode")
    await db.pragma("cache_size", -64000)   # 64 MB cache
```
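The column metadata that `describe()` returns can be derived from SQLite's `PRAGMA table_info`. Here is an illustrative stdlib sketch of that mapping (the plugin's exact output shape may differ):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT UNIQUE)")

# PRAGMA table_info yields (cid, name, type, notnull, dflt_value, pk) per column
cols = []
for _cid, name, dtype, notnull, default, pk in conn.execute("PRAGMA table_info(users)"):
    cols.append({
        "column_name": name,
        "data_type": dtype,
        # INTEGER PRIMARY KEY columns are implicitly NOT NULL in SQLite
        "is_nullable": "NO" if (notnull or pk) else "YES",
        "default_value": default,
        "is_primary_key": bool(pk),
    })
print(cols[0]["column_name"])  # id
conn.close()
```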

## Using with Agents

```python
from daita import Agent
from daita.plugins import sqlite
import os

# Create database plugin
db = sqlite(path=os.getenv("SQLITE_PATH", "./data.db"))

# Pass plugin to agent — agent can use database tools autonomously
agent = Agent(
    name="Data Analyst",
    prompt="You are a database analyst. Help users query and analyze data.",
    llm_provider="openai",
    model="gpt-4",
    tools=[db]
)

await agent.start()

# Agent autonomously uses SQLite tools to answer questions
result = await agent.run("Show me the top 10 products by revenue")

# The agent will autonomously:
# 1. Use sqlite_inspect to explore tables and columns
# 2. Use sqlite_query to fetch and aggregate data
# 3. Analyze and present results in natural language

await agent.stop()
```

## Available Tools

The SQLite plugin exposes these tools to LLM agents:

| Tool | Description | Key Parameters |
|------|-------------|----------------|
| `sqlite_query` | Run a SELECT query | `sql` (required), `params`, `limit`, `columns`, `focus` |
| `sqlite_list_tables` | List all tables | None |
| `sqlite_get_schema` | Get column info for a table | `table_name` (required) |
| `sqlite_inspect` | List all tables + schemas in one call | `tables` (optional filter) |
| `sqlite_execute` | Run INSERT/UPDATE/DELETE | `sql` (required), `params`; omitted in read-only mode |

Tool category: `database`. Tool source: plugin.

`sqlite_execute` is only registered when writes are allowed, which is the default; pass `read_only=True` to the constructor to restrict the agent to read-only tools.

## Error Handling

```python
from daita.plugins import sqlite

try:
    async with sqlite(path="./mydb.db") as db:
        results = await db.query("SELECT * FROM users WHERE id = ?", [user_id])
except ImportError:
    print("Install aiosqlite: pip install 'daita-agents[sqlite]'")
except RuntimeError as e:
    if "locked" in str(e).lower():
        print(f"Database locked — increase timeout: {e}")
    else:
        print(f"Error: {e}")
```

## Best Practices

Connection Management:

- Use context managers (`async with`) for automatic cleanup
- For in-memory databases, keep the connection open for the lifetime of your data
- WAL mode (`wal_mode=True`) is recommended for file-based databases with concurrent reads

Security:

- Use `?` placeholders for all parameterized queries to prevent SQL injection
- Prefer `read_only=True` in agent configurations to prevent accidental writes

Performance:

- Use `execute_script()` for schema migrations rather than individual `execute()` calls
- Use `insert_many()` for bulk inserts instead of loops
- Use `sqlite_inspect` instead of separate `sqlite_list_tables` + `sqlite_get_schema` calls

## Troubleshooting

| Issue | Solution |
|-------|----------|
| `aiosqlite is required` | `pip install 'daita-agents[sqlite]'` |
| Database locked | Increase `timeout`; check for long-running queries holding locks |
| `no such table` | Verify `path` points to the correct file; use `db.tables()` to inspect |
| Data not persisting | Check you're using a file path, not `":memory:"` |
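To see how "database locked" arises, and why the `timeout` parameter matters, here is a stdlib reproduction: one connection holds a write transaction while a second connection, configured with a zero busy-timeout, tries to start its own write transaction:

```python
import os
import sqlite3
import tempfile

# Reproduce "database is locked": the second writer gives up immediately (timeout=0)
path = os.path.join(tempfile.mkdtemp(), "lock.db")

writer = sqlite3.connect(path)
writer.execute("CREATE TABLE t (x INTEGER)")
writer.execute("BEGIN IMMEDIATE")  # take and hold the write lock

other = sqlite3.connect(path, timeout=0)  # 0 seconds: fail instead of waiting
try:
    other.execute("BEGIN IMMEDIATE")  # conflicts with the held write lock
    locked = False
except sqlite3.OperationalError:
    locked = True

print(locked)  # True; a larger timeout would wait for the lock instead
writer.rollback()
writer.close()
other.close()
```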

## Next Steps