LLM Support
Use various LLMs to handle your agents and tasks.
Overview
The Upsonic framework can read LLM credentials from a .env file or from environment variables. Once you provide the keys for the services you want to use, you can select the LLM for each agent with the model parameter.
The supported LLMs are:
- OpenAI
  - openai/gpt-4o
  - openai/o3-mini
- Azure
  - azure/gpt-4o
- Anthropic
  - claude/claude-3-5-sonnet
- AWS Bedrock
  - bedrock/claude-3-5-sonnet
- DeepSeek
  - deepseek/deepseek-chat
Setting up .env
To use these LLMs, create a .env file like the example below. Adjust the variables to match the LLM you want to use and fill in only the keys you need. The .env file must be located in your working directory.
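A sketch of such a file, using conventional API-key variable names for each provider. The exact variable names are assumptions here; check which variables your Upsonic version reads and keep only the providers you actually use:

```bash
# OpenAI (assumed variable name)
OPENAI_API_KEY="sk-..."

# Azure OpenAI (assumed variable names)
AZURE_OPENAI_ENDPOINT="https://<your-resource>.openai.azure.com/"
AZURE_OPENAI_API_VERSION="..."
AZURE_OPENAI_API_KEY="..."

# Anthropic (assumed variable name)
ANTHROPIC_API_KEY="..."

# AWS Bedrock (assumed variable names)
AWS_ACCESS_KEY_ID="..."
AWS_SECRET_ACCESS_KEY="..."
AWS_REGION="us-east-1"

# DeepSeek (assumed variable name)
DEEPSEEK_API_KEY="..."
```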
Using a specific model in an Agent
Using the model parameter, you can easily select which LLM each agent will use. You can refer to the example below:
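A minimal sketch in Python. It assumes the Agent and Task classes from the upsonic package and the print_do call that appear in the framework's examples; the agent names, task text, and exact call names are illustrative, so verify them against your installed version:

```python
from upsonic import Agent, Task

# Two agents backed by different providers; the model string follows the
# "provider/model" format listed above.
coder = Agent("Coder Agent", model="openai/gpt-4o")
reviewer = Agent("Reviewer Agent", model="claude/claude-3-5-sonnet")

task = Task("Write a short Python function that reverses a string.")

# The agent runs the task on whichever LLM its model parameter points to.
coder.print_do(task)
```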