Builtin Tools
Overview
Upsonic’s built-in tools are pre-configured, production-ready tools that provide essential capabilities for AI agents without requiring custom development. These tools are designed to work seamlessly with the framework and offer both general-purpose functionality and model provider-specific optimizations.
Key Features
- Zero Configuration: Ready to use out of the box with sensible defaults
- Provider Integration: Native support for model provider-specific tools (OpenAI, Anthropic, Google)
- Web Capabilities: Built-in web search and content reading functionality
- Code Execution: Native code execution capabilities through model providers
- Error Handling: Robust error handling and retry mechanisms
- Performance Optimized: Optimized for speed and reliability
Tool Categories
- Function-Based Tools: General-purpose tools executed by the Upsonic framework
  - WebSearch: DuckDuckGo-powered web search
  - WebRead: Web content extraction and reading
- Model Provider Tools: Native tools executed by model provider infrastructure
  - WebSearchTool: Advanced web search with provider-specific features
  - CodeExecutionTool: Sandboxed code execution
  - UrlContextTool: Direct URL content access (Google only)
When to Use Built-in Tools
- Quick Prototyping: Get started immediately without tool development
- Web Research: Search and read web content for information gathering
- Code Execution: Run code snippets and calculations
- Production Applications: Use proven, tested tools for critical functionality
- Cross-Platform Compatibility: Ensure consistent behavior across different environments
WebSearch Function
The WebSearch function allows you to search the web using DuckDuckGo and return formatted results.
WebSearch Parameters
- query: The search query string
- max_results: Maximum number of results to return (default: 10)
Example Usage
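A minimal sketch of WebSearch attached to a task. The upsonic.tools import path and the Agent/Task call shapes shown here are assumptions; check your installed version for the exact names.

```python
from upsonic import Agent, Task
from upsonic.tools import WebSearch  # import path may differ in your version

# Attach the built-in WebSearch function to a task like any custom tool.
task = Task(
    "Find three recent articles about AI agent frameworks",
    tools=[WebSearch],
)

agent = Agent("Research Agent", model="openai/gpt-4o")
result = agent.do(task)  # the framework executes WebSearch(query=..., max_results=...)
print(result)
```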
WebRead Function
The WebRead function allows you to read and extract text content from web pages.
WebRead Parameters
- url: The URL to read from
Example Usage
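A matching sketch for WebRead, under the same import and call-shape assumptions:

```python
from upsonic import Agent, Task
from upsonic.tools import WebRead  # import path may differ in your version

# Ask the model to read one specific page rather than search for it.
task = Task(
    "Read https://example.com and summarize what the page is about",
    tools=[WebRead],
)

agent = Agent("Research Agent", model="openai/gpt-4o")
print(agent.do(task))  # WebRead(url=...) is called by the framework as needed
```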
Model Provider Built-in Tools
These tools are designed to work with specific model providers and provide native capabilities.

WebSearchTool
A built-in tool that allows models to search the web for information. Supported by Anthropic, OpenAI Responses API, and Google.

CodeExecutionTool
A built-in tool that allows models to execute code. Supported by Anthropic, OpenAI Responses API, and Google.

UrlContextTool
Allows models to access contents from URLs. Supported by Google only.
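The sketch below uses the provider-executed tools in a single task. The tool class names come from this page; the import path, the instantiation style, and the Agent/Task shapes are assumptions.

```python
from upsonic import Agent, Task
from upsonic.tools import WebSearchTool, CodeExecutionTool  # paths may differ

# Provider-executed tools: search and code execution run on the model
# provider's infrastructure rather than inside the Upsonic framework.
task = Task(
    "Search for Japan's current population, then compute what share of it "
    "lives in Tokyo",
    tools=[WebSearchTool(), CodeExecutionTool()],
)

# Note the Responses API model string: plain openai/gpt-4o does not support
# these provider-native tools (see Model Provider Compatibility below).
agent = Agent("Analyst", model="openai-responses/gpt-4o")
print(agent.do(task))
```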
Combining Built-in Tools
You can combine multiple built-in tools in a single task, as shown in the sketch below.
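A sketch combining both function-based tools in one task (same import and call-shape assumptions as above):

```python
from upsonic import Agent, Task
from upsonic.tools import WebSearch, WebRead  # paths may differ

# With both tools attached, the model can search first and then read the
# most promising result in a follow-up tool call within the same task.
task = Task(
    "Search for the Upsonic documentation, open the best match, and list "
    "the built-in tools it describes",
    tools=[WebSearch, WebRead],
)

agent = Agent("Research Agent", model="openai/gpt-4o")
print(agent.do(task))
```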
Custom Wrappers for Built-in Tools
You can create custom wrappers around built-in tools to add specific functionality, for example:
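The wrapper below is hypothetical: a thin function that biases WebSearch toward recent news. It assumes WebSearch can be called directly with the query and max_results parameters documented above.

```python
from upsonic import Agent, Task
from upsonic.tools import WebSearch  # path may differ

def news_search(query: str, max_results: int = 5) -> str:
    """Built-in WebSearch, but biased toward recent news coverage."""
    return WebSearch(query=f"{query} latest news", max_results=max_results)

# The wrapper is registered exactly like any custom function tool.
task = Task("What happened in AI agent tooling this week?", tools=[news_search])
agent = Agent("News Agent", model="openai/gpt-4o")
print(agent.do(task))
```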
Best Practices
- Choose the right tool: Use function-based tools (WebSearch, WebRead) for general use, and model provider tools for specific capabilities
- Handle rate limits: Use caching for frequently accessed content
- Configure appropriately: Set appropriate limits and filters for web search tools
- Error handling: Built-in tools handle errors gracefully, but you can add custom error handling in wrappers
Error Handling
Built-in tools include error handling out of the box; if you need behavior beyond the defaults, you can layer your own handling in a wrapper, as sketched below.
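A hypothetical sketch, assuming WebRead can be called directly with the url parameter documented above:

```python
from upsonic.tools import WebRead  # path may differ

def safe_web_read(url: str) -> str:
    """Read a page, but degrade gracefully instead of failing the task."""
    try:
        content = WebRead(url=url)
    except Exception as exc:  # the built-in tool already retries; this is a last resort
        return f"Could not read {url}: {exc}"
    return content or f"No readable content found at {url}"
```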
Performance Considerations
Built-in Function Tools (WebSearch, WebRead)
- Caching: Use cache_results=True for frequently accessed content
- Rate limiting: Manual implementation required (no automatic rate limiting)
- Timeout: Configurable via the timeout parameter (default: 30 seconds)
- Parallel execution: ✅ Supported (can run in parallel with other tools)
Built-in Model Provider Tools (WebSearchTool, CodeExecutionTool, UrlContextTool)
- Caching: Provider-managed (not configurable by framework)
- Rate limiting: Automatic (handled by model providers)
- Timeout: Provider-managed (not configurable by framework)
- Parallel execution: ❌ Always sequential (executed by model provider)
Model Provider Compatibility
Built-in tools work with specific model providers:
- WebSearchTool: ✅ OpenAI Responses API, Anthropic, Google
- CodeExecutionTool: ✅ OpenAI Responses API, Anthropic, Google
- UrlContextTool: ✅ Google only
- For OpenAI: Use openai-responses/gpt-4o (not openai/gpt-4o)
Key Differences from Custom Function Tools
Built-in tools come in two types, each with different characteristics.
Type 1: Built-in Function Tools (WebSearch, WebRead)
These are regular Python functions executed by the Upsonic framework:
- Execution Model: Executed by the Upsonic framework (like custom function tools)
- Configuration Options: Full configuration support (caching, hooks, retries, etc.)
- Performance Characteristics: Can be parallelized, framework-managed retries and caching
- Error Handling: Framework-managed error handling with configurable retries
Type 2: Built-in Model Provider Tools (WebSearchTool, CodeExecutionTool, UrlContextTool)
These are model provider-specific tools executed by the provider’s infrastructure:
- Execution Model: Executed by the model provider’s infrastructure
- Configuration Options: Limited configuration (provider-specific parameters only)
- Performance Characteristics: Always sequential, provider-managed rate limiting
- Error Handling: Provider-managed error handling and retries
Comparison Table
| Feature | Custom Function Tools | Built-in Function Tools | Built-in Model Provider Tools |
|---|---|---|---|
| Execution | Upsonic Framework | Upsonic Framework | Model Provider Infrastructure |
| Configuration | Full Support | Full Support | Limited (Provider-specific) |
| Parallelization | ✅ Supported | ✅ Supported | ❌ Always Sequential |
| Caching | ✅ Configurable | ✅ Configurable | ❌ Provider-managed |
| Retries | ✅ Configurable | ✅ Configurable | ❌ Provider-managed |
| Hooks | ✅ Supported | ✅ Supported | ❌ Not Supported |
| Rate Limiting | ❌ Manual | ❌ Manual | ✅ Automatic |
Best Practices
When to Use Each Type
- Use Built-in Function Tools (WebSearch, WebRead) for:
  - Simple web operations that need framework features (caching, retries, hooks)
  - When you want full control over execution and error handling
  - Cross-provider compatibility (works with any model)
- Use Built-in Model Provider Tools (WebSearchTool, CodeExecutionTool, UrlContextTool) for:
  - Advanced web search with provider-specific features (location, domain filtering)
  - Code execution with provider-managed sandboxing
  - URL context access (Google only)
  - When you want provider-optimized performance
- Use Custom Function Tools for:
  - Custom business logic and data processing
  - External API integrations
  - Complex workflows requiring framework features
Combination Strategies
- Hybrid Approach: Use model provider tools for core capabilities and custom tools for business logic
- Fallback Strategy: Use built-in function tools as fallbacks when model provider tools aren’t supported (see the sketch after this list)
- Model Selection: Always use the correct model provider for model provider tools
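A hedged sketch of the fallback strategy: pick the provider-native search tool when the model supports it and fall back to the framework-executed WebSearch otherwise. The model-string prefixes and the decision logic are illustrative, not a framework API.

```python
from upsonic.tools import WebSearch, WebSearchTool  # paths may differ

def pick_search_tools(model: str) -> list:
    """Prefer provider-native search when supported, else fall back."""
    provider_native_prefixes = ("openai-responses/", "anthropic/", "google")
    if model.startswith(provider_native_prefixes):
        return [WebSearchTool()]
    return [WebSearch]

tools = pick_search_tools("openai/gpt-4o")             # -> [WebSearch] (fallback)
tools = pick_search_tools("openai-responses/gpt-4o")   # -> [WebSearchTool()]
```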
Troubleshooting
Common Issues
“WebSearchTool is not supported with OpenAIChatModel”
- Solution: Use openai-responses/gpt-4o instead of openai/gpt-4o
- Solution: Use the google provider
- Check: Ensure you’re using the correct model provider
- Check: Verify the tool is properly imported and added to the task
- Check: Confirm you’re running a recent framework version; built-in tool support landed in a recent update
Debugging Tips
- Enable debug mode: Use debug=True in agent calls to see detailed execution logs (see the sketch after this list)
- Check tool registration: Built-in tools should appear in the tool usage summary
- Verify model compatibility: Each built-in tool has specific model provider requirements
- Test with simple examples: Start with basic configurations before adding advanced parameters
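A sketch of the debug workflow, under the same import-path assumptions as above; whether debug=True belongs on the Agent constructor or on the call itself may depend on your framework version.

```python
from upsonic import Agent, Task
from upsonic.tools import WebSearch  # path may differ

task = Task("Search for Upsonic release notes", tools=[WebSearch])

# debug=True surfaces tool registration and every tool call in the logs,
# which makes model-compatibility problems easy to spot.
agent = Agent("Debug Agent", model="openai/gpt-4o", debug=True)
print(agent.do(task))
```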