
Overview

Heroku provides an OpenAI-compatible inference API. The default base URL is https://us.inference.heroku.com (the client appends /v1). You can override it by setting the HEROKU_INFERENCE_URL environment variable or by passing base_url to HerokuProvider.

Model class: HerokuModel

Authentication

export HEROKU_INFERENCE_KEY="..."
# Optional: custom base URL (without /v1)
export HEROKU_INFERENCE_URL="https://us.inference.heroku.com"

Examples

from upsonic import Agent, Task
from upsonic.models.heroku import HerokuModel

model = HerokuModel(model_name="claude-sonnet-4-6")
agent = Agent(model=model)

task = Task("Hello, how are you?")
result = agent.do(task)
print(result)

Model Settings

You can set model parameters in two ways: on the model or on the Agent.

On the model:
from upsonic import Agent, Task
from upsonic.models.heroku import HerokuModel, HerokuModelSettings

model = HerokuModel(
    model_name="claude-sonnet-4-6",
    settings=HerokuModelSettings(max_tokens=1024, temperature=0.7)
)
agent = Agent(model=model)

On the Agent:
from upsonic import Agent, Task
from upsonic.models.heroku import HerokuModelSettings

agent = Agent(
    model="heroku/claude-sonnet-4-6",
    settings=HerokuModelSettings(max_tokens=1024, temperature=0.7)
)

Parameters

| Parameter | Type | Description | Default | Source |
| --- | --- | --- | --- | --- |
| max_tokens | int | Maximum tokens to generate | Model default | Base |
| temperature | float | Sampling temperature (0.0-2.0) | 1.0 | Base |
| top_p | float | Nucleus sampling | 1.0 | Base |
| seed | int | Random seed | None | Base |
| stop_sequences | list[str] | Stop sequences | None | Base |
| presence_penalty | float | Token presence penalty | 0.0 | Base |
| frequency_penalty | float | Token frequency penalty | 0.0 | Base |
| parallel_tool_calls | bool | Allow parallel tools | True | Base |
| timeout | float | Request timeout (seconds) | 600 | Base |
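Since the API is OpenAI-compatible, the parameters above end up in an OpenAI-style chat-completions payload; parameters left unset fall back to the model defaults. The sketch below illustrates that shape with a hypothetical build_payload helper (not part of the library), using the parameter names from the table; the exact wire-level field names may differ.

```python
from typing import Any

# Settings from the table above; anything left at None is omitted
# so the model's own default applies.
ALLOWED_SETTINGS = {
    "max_tokens", "temperature", "top_p", "seed", "stop_sequences",
    "presence_penalty", "frequency_penalty", "parallel_tool_calls",
}

def build_payload(model: str, messages: list[dict[str, str]],
                  **settings: Any) -> dict[str, Any]:
    """Assemble an OpenAI-style chat-completions request body."""
    payload: dict[str, Any] = {"model": model, "messages": messages}
    for key, value in settings.items():
        if key in ALLOWED_SETTINGS and value is not None:
            payload[key] = value
    return payload
```

For example, passing max_tokens=1024 and temperature=0.7 (as in the settings examples above) adds exactly those two fields to the request, while seed, top_p, and the rest stay at their server-side defaults.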