Chat settings
Chat settings are reusable configuration objects that control which model provider is used for chats and workflows, along with generation parameters (temperature, max tokens, etc.).
These settings are stored in the chat service and referenced by other parts of the platform (for example workflows can carry a chatSettingsId).
API endpoints
All endpoints require `Authorization: Bearer <ACCESS_TOKEN>`.
- `GET /api/chat/settings` — list settings
- `POST /api/chat/settings` — create settings
- `HEAD /api/chat/settings/{id}` — check existence (`200` if exists, `204` if not)
- `GET /api/chat/settings/{id}` — fetch by id
- `PUT /api/chat/settings/{id}` — update
- `DELETE /api/chat/settings/{id}` — delete (`204`)
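As a minimal sketch, the endpoints above can be called with Python's standard library. The base URL and token are placeholders for your deployment; `build_request` is an illustrative helper, not part of the platform.

```python
import json
import urllib.request

BASE_URL = "https://your-platform.example.com"  # hypothetical host

def build_request(method, path, token, body=None):
    """Build an authenticated request for a chat-settings endpoint."""
    data = json.dumps(body).encode() if body is not None else None
    req = urllib.request.Request(BASE_URL + path, data=data, method=method)
    req.add_header("Authorization", f"Bearer {token}")
    if data is not None:
        req.add_header("Content-Type", "application/json")
    return req

# List all chat settings; urllib.request.urlopen(req) would send it.
req = build_request("GET", "/api/chat/settings", "ACCESS_TOKEN")
```

The same helper covers the other verbs, e.g. `build_request("DELETE", "/api/chat/settings/123", token)`.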
Shape of a chat settings object
Chat settings are typed by their provider using the discriminator field `_t`.
Common fields:
- `name` (string)
- `default` (boolean)
- `provider` (object)
- `model` (string, optional)
- `embeddingModel` (string, optional)
- `temperature` (number, optional)
- `topK` (number, optional)
- `maxTokens` (number, optional)
Provider types
The chat service supports multiple LLM providers. The provider type is selected by setting `provider._t`.
Examples include:
- `azure-openai`
- `openai`
- `ollama`
- `azure-ai-inference`
- `anthropic`
- `cohere`
- `deepseek`
- `groq`
- `mistral-ai`
- `google-gen-ai`
- `huggingface`
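A small client-side check of the `_t` discriminator can catch typos before a request is sent. This is an illustrative helper, not part of the service; the provider names mirror the list above.

```python
# Known provider discriminators, taken from the list above.
KNOWN_PROVIDERS = {
    "azure-openai", "openai", "ollama", "azure-ai-inference", "anthropic",
    "cohere", "deepseek", "groq", "mistral-ai", "google-gen-ai", "huggingface",
}

def provider_type(provider):
    """Return the provider's "_t" discriminator, raising on unknown values."""
    t = provider.get("_t")
    if t not in KNOWN_PROVIDERS:
        raise ValueError(f"unknown provider type: {t!r}")
    return t

provider_type({"_t": "ollama", "endpoint": "http://localhost:11434"})  # → "ollama"
```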
Create example
This example mirrors the chat-service integration tests.
```json
{
  "name": "My chat settings",
  "default": false,
  "provider": {
    "_t": "azure-openai",
    "endpoint": "https://YOUR-RESOURCE.openai.azure.com",
    "apiKey": "YOUR_AZURE_OPENAI_API_KEY",
    "apiVersion": "2024-02-01",
    "deployment": "gpt-4",
    "embeddingDeployment": "text-embedding-3-large"
  },
  "model": "gpt-4",
  "embeddingModel": "text-embedding-3-large",
  "temperature": 0.7,
  "topK": 5,
  "maxTokens": 4096
}
```
For some providers, the backend can also read credentials from environment variables (for example `AZURE_OPENAI_ENDPOINT` and `AZURE_OPENAI_API_KEY`). If you omit those fields from the payload, make sure the service is configured accordingly.
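One way to combine explicit payload fields with the environment-variable fallback is to build the provider block from the environment first and then overlay any explicit values. The merge logic below is an illustrative sketch, not the service's actual code; only the variable names `AZURE_OPENAI_ENDPOINT` and `AZURE_OPENAI_API_KEY` come from the note above.

```python
import os

def azure_openai_provider(overrides=None):
    """Build an azure-openai provider block, falling back to env vars."""
    provider = {
        "_t": "azure-openai",
        "endpoint": os.environ.get("AZURE_OPENAI_ENDPOINT"),
        "apiKey": os.environ.get("AZURE_OPENAI_API_KEY"),
    }
    provider.update(overrides or {})
    # Drop fields left unset so the service can fall back to its own config.
    return {k: v for k, v in provider.items() if v is not None}
```

For example, `azure_openai_provider({"apiVersion": "2024-02-01"})` yields a provider block whose endpoint and key come from the environment when they are not passed explicitly.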