## [0.6.2] - 2026-01-19

### Fixed
- **Token Usage Tracking**
  - Fixed bug where `get_token_stats()` only reported tokens from the last LLM call
  - Token usage now correctly accumulates across all calls during autonomous execution (see the sketch below)
  - Particularly impacts tool-calling scenarios with multiple LLM iterations
  - Cost estimation now reflects actual total usage instead of partial usage
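A minimal sketch of the accumulation behavior described above. The `TokenStats` class and `usage` dictionary shape are illustrative assumptions, not the library's actual internals; the point is that usage is summed across every LLM call instead of being overwritten by the last one:

```python
from dataclasses import dataclass

@dataclass
class TokenStats:
    input_tokens: int = 0
    output_tokens: int = 0

    def add(self, usage: dict) -> None:
        # Accumulate rather than replace, so multi-iteration tool-calling
        # runs report totals across all calls, not just the final one.
        self.input_tokens += usage.get("input_tokens", 0)
        self.output_tokens += usage.get("output_tokens", 0)

stats = TokenStats()
# Two LLM iterations within a single autonomous run (illustrative numbers).
for usage in [{"input_tokens": 1200, "output_tokens": 300},
              {"input_tokens": 900, "output_tokens": 250}]:
    stats.add(usage)

print(stats)  # TokenStats(input_tokens=2100, output_tokens=550)
```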
### Changed
- **Anthropic Provider - Timeout Configuration**
  - Added configurable timeout support for Anthropic API calls (see the sketch below)
  - Timeout now properly passed through `Agent` kwargs to provider initialization
  - Default timeout set to 60 seconds, with separate read/write/connect timeouts
  - Resolves timeout issues with long-running schema introspection operations
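A sketch of how a configurable timeout with separate connect/read/write limits might be wired into the provider. The `make_anthropic_client` helper and its defaults are assumptions for illustration; `anthropic.Anthropic(timeout=...)` and `httpx.Timeout` are the real SDK and httpx interfaces:

```python
import httpx
import anthropic

def make_anthropic_client(timeout: float = 60.0) -> anthropic.Anthropic:
    # Hypothetical helper: 60s default with explicit connect/read/write limits,
    # so long-running schema introspection calls are not cut off early.
    # Reads ANTHROPIC_API_KEY from the environment.
    return anthropic.Anthropic(
        timeout=httpx.Timeout(timeout, connect=5.0, read=timeout, write=timeout)
    )

# Callers can raise the limit for slow operations.
client = make_anthropic_client(timeout=120.0)
```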
- **Agent - LLM Provider Initialization**
  - Agent now properly stores and passes through kwargs to the LLM provider (see the sketch below)
  - Enables configuration options like `timeout` and `max_tokens` to reach providers
  - Improved flexibility for provider-specific parameter configuration
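A minimal sketch of the kwargs pass-through pattern described above. The `Agent` and `AnthropicProvider` classes, their parameters, and the model string are illustrative assumptions, not the library's actual API:

```python
class AnthropicProvider:
    def __init__(self, model: str, timeout: float = 60.0, max_tokens: int = 1024):
        # Provider-specific options received from the Agent.
        self.model = model
        self.timeout = timeout
        self.max_tokens = max_tokens

class Agent:
    def __init__(self, model: str, **kwargs):
        # Store provider kwargs and forward them at initialization, so
        # settings like timeout and max_tokens actually reach the provider.
        self.provider_kwargs = kwargs
        self.provider = AnthropicProvider(model, **kwargs)

agent = Agent("example-model", timeout=120.0, max_tokens=2048)
print(agent.provider.timeout, agent.provider.max_tokens)  # 120.0 2048
```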