# LLM Support

Use various LLMs to power your agents and tasks.
## Overview
The Upsonic framework reads LLM credentials from a .env file or from environment variables. Once you provide the keys for the services you use, you can select an LLM with the model parameter on your agents.
The supported LLMs are:
- OpenAI
  - `openai/gpt-4o`
  - `openai/gpt-4.5-preview`
  - `openai/o3-mini`
  - `openai/gpt-4o-mini`
- Azure
  - `azure/gpt-4o`
  - `azure/gpt-4o-mini`
- Anthropic
  - `claude/claude-3-5-sonnet`
  - `claude/claude-3-7-sonnet`
- AWS Bedrock
  - `bedrock/claude-3-5-sonnet`
- DeepSeek
  - `deepseek/deepseek-chat`
- Google Gemini
  - `gemini/gemini-2.0-flash`
  - `gemini/gemini-1.5-pro`
  - `gemini/gemini-1.5-flash`
- Ollama
  - `ollama/llama3.2`
  - `ollama/llama3.1:70b`
  - `ollama/llama3.1`
  - `ollama/llama3.3`
  - `ollama/qwen2.5`
- OpenRouter
  - `openrouter/anthropic/claude-3.5-sonnet`
  - `openrouter/meta-llama/llama-3-70b`
  - `openrouter/google/gemini-pro`
  - See the note below for more information about OpenRouter's gateway functionality.
Note: OpenRouter acts as a gateway to access various LLM models from different providers. You can use any model available on OpenRouter by following the pattern `openrouter/{provider}/{model-name}`. This allows you to access models that might not be directly supported through their native APIs.
## Setting up .env

To use these LLMs, create a .env file like the example below. Adjust the variables for the LLM you want to use and fill in only the ones required. The .env file must be located in your working directory.
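A minimal sketch of such a file follows. The variable names below are the typical key names for each provider and are an assumption; the exact names expected may differ between Upsonic versions, so check the configuration reference for your version.

```bash
# Fill in only the keys for the providers you plan to use.

# OpenAI
OPENAI_API_KEY=sk-...

# Azure OpenAI
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
AZURE_OPENAI_API_KEY=...

# Anthropic
ANTHROPIC_API_KEY=...

# AWS Bedrock
AWS_ACCESS_KEY_ID=...
AWS_SECRET_ACCESS_KEY=...
AWS_REGION=us-east-1

# DeepSeek
DEEPSEEK_API_KEY=...

# Google Gemini
GOOGLE_API_KEY=...

# OpenRouter
OPENROUTER_API_KEY=...

# Ollama runs locally and typically needs no API key.
```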
## Using a specific model in Agent

With the model parameter, you can select which LLM each agent uses. Refer to the example below:
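A minimal sketch of per-agent model selection, assuming the Agent, Task, and print_do API shown in the Upsonic quickstart; exact signatures may vary by version.

```python
from upsonic import Agent, Task

# Each agent can be pinned to its own LLM via the model parameter.
coder = Agent("Coder", model="openai/gpt-4o")
reviewer = Agent("Reviewer", model="claude/claude-3-5-sonnet")

task = Task("Write a Python function that reverses a string.")

# The task runs on whichever model the executing agent was given.
coder.print_do(task)
```

Because the model is set per agent, a multi-agent workflow can mix providers, for example a cheap model for drafting and a stronger one for review.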
## Using a specific model in Direct

As with Agent, you can specify the model with the model parameter:
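A minimal sketch, assuming Direct accepts the same model identifiers and the same call style as Agent; verify the exact constructor and call signature against your Upsonic version.

```python
from upsonic import Direct, Task

task = Task("Summarize the supported LLM providers in one sentence.")

# Direct calls the model without an agent persona; the model parameter
# takes the same identifiers listed above. (Assumed signature; check
# your Upsonic version's docs.)
direct = Direct(model="openai/gpt-4o-mini")
direct.print_do(task)
```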