Overview

LM Studio lets you run large language models locally behind an OpenAI-compatible API. It's useful for development, testing, and privacy-sensitive use cases.

Model Class: OpenAIChatModel (OpenAI-compatible API)

Authentication

export LMSTUDIO_BASE_URL="http://localhost:1234/v1"  # Required
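The value shown is LM Studio's default local server address. As a minimal sketch of how such a base URL is typically resolved (the `resolve_base_url` helper and fallback logic here are illustrative, not Upsonic's actual internals):

```python
import os

# LM Studio's local server listens here by default; the environment
# variable overrides it. This resolution helper is hypothetical.
DEFAULT_BASE_URL = "http://localhost:1234/v1"

def resolve_base_url() -> str:
    """Return the LM Studio base URL, preferring LMSTUDIO_BASE_URL."""
    return os.environ.get("LMSTUDIO_BASE_URL", DEFAULT_BASE_URL)

print(resolve_base_url())
```

Make sure the LM Studio local server is actually running (and has a model loaded) at that address before sending requests.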

Examples

from upsonic import Agent, Task
from upsonic.models.lmstudio import LMStudioModel

model = LMStudioModel(model_name="local-model")
agent = Agent(model=model)

task = Task("Hello, how are you?")
result = agent.do(task)
print(result)

Model Settings

You can set model parameters in two ways: on the model or on the Agent.

On the model:
from upsonic import Agent, Task
from upsonic.models.lmstudio import LMStudioModel, LMStudioModelSettings

model = LMStudioModel(
    model_name="local-model",
    settings=LMStudioModelSettings(max_tokens=1024, temperature=0.7)
)
agent = Agent(model=model)

On the Agent:
from upsonic import Agent, Task
from upsonic.models.lmstudio import LMStudioModelSettings

agent = Agent(
    model="lmstudio/local-model",
    settings=LMStudioModelSettings(max_tokens=1024, temperature=0.7)
)

Parameters

Parameter      | Type      | Description                | Default       | Source
max_tokens     | int       | Maximum tokens to generate | Model default | Base
temperature    | float     | Sampling temperature       | 0.8           | Base
top_p          | float     | Nucleus sampling           | 0.9           | Base
seed           | int       | Random seed                | None          | Base
stop_sequences | list[str] | Stop sequences             | None          | Base
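Since the model class speaks an OpenAI-compatible API, these settings ultimately land in a /v1/chat/completions request body. The sketch below shows one plausible mapping; the `build_payload` helper and the exact field mapping are assumptions for illustration, not Upsonic's actual implementation:

```python
def build_payload(model_name: str, messages: list[dict], settings: dict) -> dict:
    """Sketch: map the settings above onto an OpenAI-style request body."""
    payload = {"model": model_name, "messages": messages}
    # Table parameter -> OpenAI-compatible request field (assumed mapping).
    mapping = {
        "max_tokens": "max_tokens",
        "temperature": "temperature",
        "top_p": "top_p",
        "seed": "seed",
        "stop_sequences": "stop",  # the OpenAI API calls this field "stop"
    }
    for key, field in mapping.items():
        if settings.get(key) is not None:  # unset parameters are omitted
            payload[field] = settings[key]
    return payload

payload = build_payload(
    "local-model",
    [{"role": "user", "content": "Hello"}],
    {"max_tokens": 1024, "temperature": 0.7, "stop_sequences": None},
)
```

Parameters left at None are simply omitted, so the server (here, LM Studio) falls back to its own defaults.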