Overview

The Upsonic framework’s Agent system optimizes tasks by creating a simulated corporate environment where agents operate with defined roles and responsibilities. By using job title parameters, the system sets decision-making frameworks and performance objectives for each agent to streamline workflows.

Agents provide a reusable architecture for better efficiency and flexibility. For example, when analyzing servers, a software engineer agent can be dynamically assigned to the relevant tasks. This approach promotes code reuse and optimizes system performance through smart agent allocation.

Creating an Agent

The AgentConfiguration class is a fundamental component for reliable task execution. Given its central role, take the time to configure and fine-tune it properly, and test it across the configuration scenarios your application will actually encounter.

from upsonic import Agent

agent = Agent(
    name="Product Manager",

    company_url="https://upsonic.ai",
    company_objective="Developing an AI Agent Framework",
)

Changing LLM Model

To specify which LLM the agent uses, set the model parameter directly. See the LLM support section for the full list of supported models.

from upsonic import Agent

agent = Agent(  
    name="Product Manager",
    model="openai/gpt-4o" # Set the LLM model
)

Agent Attributes

Agents are equipped with supplementary features designed to enhance their performance and increase success rates during task execution. These configurable capabilities can be dynamically adjusted throughout the task lifecycle, allowing for real-time optimization of the agent’s processing capacity. By fine-tuning these features, users can significantly improve the probability of successful task completion while meeting specific operational requirements.

| Attribute | Parameter | Type | Description |
| --- | --- | --- | --- |
| job_title | job_title | str | The job title of the agent. |
| company_url (Optional) | company_url | str | The URL of your company. |
| company_objective (Optional) | company_objective | str | The objective of your company. |
| system_prompt (Optional) | system_prompt | str | Overrides the default characterization. |
| Name (Optional) | name | str | The name of the human the agent represents. |
| Contact (Optional) | contact | str | The contact info of the human the agent represents. |
| Memory (Optional) | memory | boolean | Persistent memory keyed by the agent ID (Default: False). |
| Reflection (Optional) | reflection | boolean | Reflection mode for the agent (Default: False). |
| Compress Context (Optional) | compress_context | boolean | Compresses the context to fit the LLM context length (Default: True). |
| model (Optional) | model | str | The LLM model for the agent (Default: openai/gpt-4o). |

Memory

Memory management plays a crucial role in maintaining contextual continuity across distributed tasks and timeframes for agent operations. The framework implements a disk-based persistence mechanism that associates memory storage with unique agent identifiers (IDs). To enable persistent memory functionality, developers must explicitly define and consistently maintain agent IDs across all agent definitions within their implementation.

from upsonic import Agent


agent = Agent(
    name="Marketing Manager",
    memory=True, # Enable persistent memory
)
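
Under the hood, persistence of this kind can be modeled as a small key-value store on disk, keyed by the agent ID. The sketch below is illustrative only; the MemoryStore class and its file layout are assumptions for explanation, not Upsonic's actual implementation:

```python
import json
from pathlib import Path

class MemoryStore:
    """Illustrative disk-based memory keyed by agent ID (not Upsonic's real storage)."""

    def __init__(self, root: str = ".agent_memory"):
        self.root = Path(root)
        self.root.mkdir(exist_ok=True)  # one directory holds all agents' histories

    def _path(self, agent_id: str) -> Path:
        # Each agent ID maps to its own JSON file on disk.
        return self.root / f"{agent_id}.json"

    def load(self, agent_id: str) -> list:
        path = self._path(agent_id)
        return json.loads(path.read_text()) if path.exists() else []

    def append(self, agent_id: str, entry: dict) -> None:
        history = self.load(agent_id)
        history.append(entry)
        self._path(agent_id).write_text(json.dumps(history))

store = MemoryStore()
store.append("marketing-manager", {"task": "Plan Q3 campaign", "result": "done"})
history = store.load("marketing-manager")  # survives process restarts
```

Because the store is keyed by the ID alone, the same ID must be used consistently across agent definitions for the history to be found again, which is why the framework asks you to maintain stable agent IDs.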

Reflection

During task execution, agents may occasionally generate inaccurate results or misinterpret task objectives, which can significantly impact system stability and output quality, particularly when critical sub-tasks are involved. To address this challenge, the framework implements a sophisticated reflection feature that enables continuous self-monitoring and quality assurance.

from upsonic import Agent

agent = Agent(  
    name="Marketing Manager",
    reflection=True, # Enable reflection
)

Compress Context

One of the main limitations of LLMs is the Context Length Limit, which affects how much data the model can process at once when generating outputs. This directly impacts how well-referenced and accurate the results will be. The Upsonic framework handles this limitation by automatically summarizing resources when they exceed the context limit. When your input data is too large, the system compresses it while keeping the important information, allowing the LLM to continue functioning normally. This removes the need to manually calculate and manage context limits, letting you focus on your actual work.

from upsonic import Agent

agent = Agent(  
    name="Marketing Manager",
    compress_context=True, # Enable context compression
)
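
The idea behind automatic compression can be sketched as a pre-processing step that shrinks the input only when it exceeds the model's window. The token counting and summarization below are deliberately naive stand-ins; a real implementation would use the model's tokenizer and an LLM-based summarizer rather than word splitting:

```python
def count_tokens(text: str) -> int:
    # Naive stand-in: real frameworks count with the model's tokenizer.
    return len(text.split())

def summarize(text: str, target_tokens: int) -> str:
    # Naive stand-in: keep the head and tail, where key facts often live.
    words = text.split()
    half = target_tokens // 2
    return " ".join(words[:half] + ["..."] + words[-half:])

def compress_context(text: str, limit: int = 128_000) -> str:
    if count_tokens(text) <= limit:
        return text  # already fits; pass through unchanged
    return summarize(text, limit)  # too large; compress before the LLM call
```

The pass-through branch is the important design point: inputs that already fit are never touched, so compression only costs accuracy when the alternative would be a hard context-length failure.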