Chat settings

Chat settings are reusable configuration objects that control which model provider is used for chats and workflows, along with generation parameters (temperature, max tokens, etc.).

These settings are stored in the chat service and referenced by other parts of the platform (for example, a workflow can carry a chatSettingsId).

API endpoints

All endpoints require Authorization: Bearer <ACCESS_TOKEN>.

  • GET /api/chat/settings — list settings
  • POST /api/chat/settings — create settings
  • HEAD /api/chat/settings/{id} — check existence (200 if exists, 204 if not)
  • GET /api/chat/settings/{id} — fetch by id
  • PUT /api/chat/settings/{id} — update
  • DELETE /api/chat/settings/{id} — delete (204)
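
As a quick illustration, the sketch below calls the list and existence-check endpoints with fetch (Node 18+ or a browser). The base URL and the CHAT_API_URL / CHAT_API_TOKEN environment variables are assumptions for this example, not part of the service; substitute your own values.

// Hypothetical client sketch; only the paths, methods, and Authorization header
// come from the endpoint list above.
const baseUrl = process.env.CHAT_API_URL ?? "http://localhost:8080"; // assumed base URL
const token = process.env.CHAT_API_TOKEN;                            // assumed token source

async function listChatSettings(): Promise<unknown[]> {
  const res = await fetch(`${baseUrl}/api/chat/settings`, {
    headers: { Authorization: `Bearer ${token}` },
  });
  if (!res.ok) throw new Error(`Listing settings failed: ${res.status}`);
  return res.json();
}

async function chatSettingsExist(id: string): Promise<boolean> {
  // HEAD returns 200 when the settings exist, 204 when they do not (see the list above)
  const res = await fetch(`${baseUrl}/api/chat/settings/${encodeURIComponent(id)}`, {
    method: "HEAD",
    headers: { Authorization: `Bearer ${token}` },
  });
  return res.status === 200;
}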

Shape of a chat settings object

Chat settings are typed by their provider: the provider object carries a discriminator field _t that selects the provider type.

Common fields:

  • name (string)
  • default (boolean)
  • provider (object)
  • model (string, optional)
  • embeddingModel (string, optional)
  • temperature (number, optional)
  • topK (number, optional)
  • maxTokens (number, optional)
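
For orientation, here is a hedged TypeScript sketch of that shape; optionality follows the list above, and the provider union is collapsed to a generic object because its fields vary by provider type.

// Sketch only; the authoritative schema lives in the chat service.
interface ChatSettings {
  name: string;
  default: boolean;
  provider: { _t: string; [key: string]: unknown }; // provider-specific fields, selected by _t
  model?: string;
  embeddingModel?: string;
  temperature?: number;
  topK?: number;
  maxTokens?: number;
}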

Provider types

The chat service supports multiple LLM providers. The provider type is selected by setting provider._t.

Examples include:

  • azure-openai
  • openai
  • ollama
  • azure-ai-inference
  • anthropic
  • cohere
  • deepseek
  • groq
  • mistral-ai
  • google-gen-ai
  • huggingface
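
Switching providers is a matter of changing provider._t and supplying that provider's fields. In the hedged sketch below, only the _t value comes from the list above; the endpoint field name and the local Ollama URL are illustrative assumptions (the azure-openai field set confirmed by the integration tests is shown in the create example that follows).

// Hypothetical provider object; verify field names against your chat service version.
const ollamaProvider = {
  _t: "ollama",                       // discriminator value from the list above
  endpoint: "http://localhost:11434", // assumed field name and default Ollama address
};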

Create example

This example mirrors the chat-service integration tests.

{
  "name": "My chat settings",
  "default": false,
  "provider": {
    "_t": "azure-openai",
    "endpoint": "https://YOUR-RESOURCE.openai.azure.com",
    "apiKey": "YOUR_AZURE_OPENAI_API_KEY",
    "apiVersion": "2024-02-01",
    "deployment": "gpt-4",
    "embeddingDeployment": "text-embedding-3-large"
  },
  "model": "gpt-4",
  "embeddingModel": "text-embedding-3-large",
  "temperature": 0.7,
  "topK": 5,
  "maxTokens": 4096
}
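
The payload above can be created with a single POST to the endpoint listed earlier. The following is a minimal sketch; the base URL and token handling are assumptions, and only the path, method, headers, and body shape are taken from this page.

// Hypothetical create call using fetch (Node 18+ or a browser).
const baseUrl = process.env.CHAT_API_URL ?? "http://localhost:8080"; // assumed
const token = process.env.CHAT_API_TOKEN;                            // assumed

async function createChatSettings(payload: object): Promise<unknown> {
  const res = await fetch(`${baseUrl}/api/chat/settings`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(payload),
  });
  if (!res.ok) throw new Error(`Creating settings failed: ${res.status}`);
  return res.json(); // assumed: the service returns the created object
}
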
note

For some providers, the backend can also read credentials from environment variables (for example AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_KEY). If you omit fields in the payload, make sure the service is configured accordingly.
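
If the service is configured through those environment variables, a reduced payload can omit the corresponding provider fields. The sketch below is an assumption to verify against your deployment; which fields may be omitted depends on the provider and on how the service is configured.

// Hedged sketch: relies on AZURE_OPENAI_ENDPOINT and AZURE_OPENAI_API_KEY being set
// on the chat service, so endpoint and apiKey are left out of the payload.
const minimalSettings = {
  name: "Env-configured settings",
  default: false,
  provider: {
    _t: "azure-openai",
    apiVersion: "2024-02-01",
    deployment: "gpt-4",                            // deployment name is illustrative
    embeddingDeployment: "text-embedding-3-large",  // illustrative as well
  },
  model: "gpt-4",
  embeddingModel: "text-embedding-3-large",
};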