Overview

Groq provides ultra-fast inference through its Language Processing Unit (LPU) technology. The Groq integration gives you access to open-source models with industry-leading speed and built-in web search capabilities.

Model Class: GroqModel

Authentication

export GROQ_API_KEY="gsk_..."
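Since the key is read from the environment, it can help to fail fast before constructing a model. A minimal sketch, assuming standard environment-variable auth (the helper name and the prefix check are illustrative, not part of the Upsonic API):

```python
import os

def get_groq_api_key() -> str:
    """Read GROQ_API_KEY from the environment and fail fast if it looks wrong.

    Groq API keys start with the 'gsk_' prefix, as shown above.
    """
    key = os.environ.get("GROQ_API_KEY", "")
    if not key.startswith("gsk_"):
        raise RuntimeError(
            "GROQ_API_KEY is missing or malformed; Groq keys start with 'gsk_'."
        )
    return key
```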

Examples

from upsonic import Agent, Task
from upsonic.models.groq import GroqModel

# Create a Groq-backed model and attach it to an agent
model = GroqModel(model_name="openai/gpt-oss-120b")
agent = Agent(model=model)

# Run a simple task and print the response
task = Task("Hello, how are you?")
result = agent.do(task)
print(result)

Model Settings

You can set model parameters in two ways: on the model or on the Agent.

On the model:
from upsonic import Agent, Task
from upsonic.models.groq import GroqModel, GroqModelSettings

model = GroqModel(
    model_name="openai/gpt-oss-120b",
    settings=GroqModelSettings(max_tokens=1024, temperature=0.7)
)
agent = Agent(model=model)

On the Agent:
from upsonic import Agent, Task
from upsonic.models.groq import GroqModelSettings

agent = Agent(
    model="groq/llama-3.3-70b-versatile",
    settings=GroqModelSettings(max_tokens=1024)
)

Parameters

| Parameter | Type | Description | Default | Source |
|---|---|---|---|---|
| max_tokens | int | Maximum tokens to generate | 1024 | Base |
| temperature | float | Sampling temperature (0.0-2.0) | 1.0 | Base |
| top_p | float | Nucleus sampling | 1.0 | Base |
| seed | int | Random seed | None | Base |
| stop_sequences | list[str] | Stop sequences | None | Base |
| presence_penalty | float | Token presence penalty | 0.0 | Base |
| frequency_penalty | float | Token frequency penalty | 0.0 | Base |
| parallel_tool_calls | bool | Allow parallel tool calls | True | Base |
| timeout | float | Request timeout (seconds) | 600 | Base |
| groq_reasoning_format | 'hidden' \| 'raw' \| 'parsed' | Reasoning output format | None | Specific |
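The parameters above map onto GroqModelSettings fields. A sketch combining several base parameters with the Groq-specific reasoning format (the exact field values are illustrative, and whether groq_reasoning_format applies depends on the chosen model):

```python
from upsonic import Agent
from upsonic.models.groq import GroqModel, GroqModelSettings

# Field names follow the parameter table above; groq_reasoning_format
# is the only Groq-specific setting, the rest are base settings.
settings = GroqModelSettings(
    max_tokens=2048,
    temperature=0.3,
    top_p=0.9,
    stop_sequences=["\n\n"],
    parallel_tool_calls=True,
    timeout=120,
    groq_reasoning_format="parsed",  # one of 'hidden' | 'raw' | 'parsed'
)

model = GroqModel(model_name="openai/gpt-oss-120b", settings=settings)
agent = Agent(model=model)
```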