POST /api/v2/workspace
curl --request POST \
  --url https://api.usedatabrain.com/api/v2/workspace \
  --header 'Authorization: Bearer dbn_live_abc123...' \
  --header 'Content-Type: application/json' \
  --data '{
    "name": "Sales Analytics",
    "connectionType": "DATASOURCE",
    "datasourceName": "postgres-main"
  }'
{
  "data": {
    "name": "Sales Analytics"
  },
  "error": null
}
Create workspaces to organize your analytics environment by connecting datasources or datamarts. Workspaces serve as containers for dashboards and metrics, providing structured access to your data.
Choose a connectionType that matches how this workspace will use data. For DATASOURCE or DATAMART, the corresponding datasourceName or datamartName must already exist in your organization. For MULTI_DATASOURCE or MULTI_DATAMART, those name fields are not required in the request body (see connectionType below).
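
The four connection types imply different minimal request bodies. As a sketch (the workspace, datasource, and datamart names below are placeholders):

```javascript
// Minimal request bodies per connectionType, following the rules above.
const bodies = {
  DATASOURCE: {
    name: "Sales Analytics",
    connectionType: "DATASOURCE",
    datasourceName: "postgres-main", // must already exist in your organization
  },
  DATAMART: {
    name: "Sales Analytics",
    connectionType: "DATAMART",
    datamartName: "sales-mart", // must already exist in your organization
  },
  // Multi-connection workspaces omit the datasourceName/datamartName fields.
  MULTI_DATASOURCE: { name: "Sales Analytics", connectionType: "MULTI_DATASOURCE" },
  MULTI_DATAMART: { name: "Sales Analytics", connectionType: "MULTI_DATAMART" },
};
```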

Authentication

This endpoint requires a service token in the Authorization header. Service tokens differ from data app API keys and provide organization-level permissions. To access your service token:
  1. On the Settings page, navigate to the Service Tokens section.
  2. Click the “Generate Token” button to create a new service token if you don’t have one already.
Use this token as the Bearer value in your Authorization header.

Headers

Authorization
string
required
Bearer token for API authentication. Use your service token.
Authorization: Bearer dbn_live_abc123...
Content-Type
string
required
Must be set to application/json for all requests.
Content-Type: application/json
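
The two required headers can be built programmatically; a minimal sketch (the token value is a placeholder for your real service token):

```javascript
// Build the headers required by every /api/v2/workspace request.
function buildHeaders(serviceToken) {
  return {
    Authorization: `Bearer ${serviceToken}`,
    "Content-Type": "application/json",
  };
}
```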

Request Body

name
string
required
Name of the workspace to create. Must be unique within your organization.
connectionType
string
required
Type of connection for the workspace. Must be one of: DATASOURCE, DATAMART, MULTI_DATASOURCE, or MULTI_DATAMART.
datasourceName
string
Name of the datasource to connect to this workspace. Required when connectionType is DATASOURCE.
datamartName
string
Name of the datamart to connect to this workspace. Required when connectionType is DATAMART.
llmName
string
Optional primary LLM name for workspace-level AI features. Must match an existing LLM configured in your organization.
aiCopilotLlms
array
Optional list of LLM names available for AI Copilot in this workspace. Every value must match an existing organization LLM name.
isEnableMetricSuggestions
boolean
Optional flag to enable AI-powered metric suggestions in this workspace. Defaults to false when omitted.
isEnableMetricSummary
boolean
Optional flag to enable AI-generated metric summaries in this workspace. Defaults to false when omitted.
summaryType
string
Summary mode used when metric summaries are enabled. Must be one of: technicalAndInsightSummary, forecastAndTrendAnalysis, comparativeAndAnomalyDetection, custom. Required when isEnableMetricSummary is true.
customSummaryPrompt
string
Custom summary instruction prompt for AI-generated summaries. Required when summaryType is custom.
themeName
string
Optional workspace theme name. Must match an existing theme configured in your organization.
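
The conditional rules above (datasourceName/datamartName per connectionType, summaryType when summaries are enabled, customSummaryPrompt for the custom mode) can be sketched as a client-side pre-check. This is an illustration of the documented rules, not the API's actual server-side validation:

```javascript
// Pre-validate a workspace request body against the field rules above.
function validateWorkspaceBody(body) {
  const errors = [];
  const types = ["DATASOURCE", "DATAMART", "MULTI_DATASOURCE", "MULTI_DATAMART"];
  if (!body.name) errors.push("name is required");
  if (!types.includes(body.connectionType)) {
    errors.push("connectionType must be one of " + types.join(", "));
  }
  if (body.connectionType === "DATASOURCE" && !body.datasourceName) {
    errors.push("datasourceName is required when connectionType is DATASOURCE");
  }
  if (body.connectionType === "DATAMART" && !body.datamartName) {
    errors.push("datamartName is required when connectionType is DATAMART");
  }
  if (body.isEnableMetricSummary && !body.summaryType) {
    errors.push("summaryType is required when isEnableMetricSummary is true");
  }
  if (body.summaryType === "custom" && !body.customSummaryPrompt) {
    errors.push("customSummaryPrompt is required when summaryType is custom");
  }
  return errors; // empty array means the body passes these checks
}
```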

Response

data
object
Contains the created workspace information on success.
error
null | object
Error object if the request failed, otherwise null for successful requests.
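Since error is null on success and an object on failure, responses can be unwrapped with a single check. A sketch (the error object's shape is assumed to be JSON-serializable, per the error codes listed below):

```javascript
// Unwrap the { data, error } envelope returned by this endpoint.
function unwrapWorkspaceResponse(body) {
  if (body.error !== null) {
    throw new Error(`Workspace creation failed: ${JSON.stringify(body.error)}`);
  }
  return body.data; // e.g. { name: "Sales Analytics" }
}
```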

Examples

HTTP Status Code Summary

Status Code  Description
200          OK - Workspace created successfully
400          Bad Request - Invalid request parameters
401          Unauthorized - Invalid or missing API key
500          Internal Server Error - Server error occurred

Possible Errors

Error Code                     HTTP Status  Description
INVALID_REQUEST_BODY           400          Missing or invalid parameters
WORKSPACE_NAME_ALREADY_EXISTS  400          Workspace name already exists
INVALID_DATASOURCE_NAME        400          Datasource not found
INVALID_DATAMART_NAME          400          Datamart not found
INVALID_LLM_NAME               400          Invalid LLM name provided
INVALID_AI_COPILOT_LLMS        400          Invalid AI Copilot LLM list
INVALID_THEME_NAME             400          Invalid theme name provided
INVALID_DATA_APP_API_KEY       401          Invalid API key
INTERNAL_SERVER_ERROR          500          Server error

Quick Start Guide

1. Verify prerequisites

Before creating a workspace, ensure you have:
  • A valid service token (not a data app API key - see Authentication above)
  • Either a datasource or datamart already configured
  • The exact name of your datasource or datamart (case-sensitive)
2. Choose your connection type

Decide which connection type fits your use case:
  • DATASOURCE: For connecting to a single data source
  • DATAMART: For connecting to a pre-configured datamart with table/column configurations
  • MULTI_DATASOURCE: For workspaces that need access to multiple datasources
  • MULTI_DATAMART: For workspaces that support multiple datamarts (same API shape as multi-datasource: only name and connectionType in the minimal body)
Start with DATASOURCE for simple use cases. Use DATAMART when you need structured access with tenancy settings. For multi-datamart setup in the product UI, see Multi Datamart Workspace.
3. Create your workspace

Make the API call with your chosen configuration:
curl --request POST \
  --url https://api.usedatabrain.com/api/v2/workspace \
  --header 'Authorization: Bearer dbn_live_abc123...' \
  --header 'Content-Type: application/json' \
  --data '{
    "name": "My Analytics Workspace",
    "connectionType": "DATASOURCE",
    "datasourceName": "postgres-main"
  }'
A successful response returns the workspace name in the data object. Save it for use in embed configurations.
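
The same call can be made from JavaScript. A sketch that separates building the request from sending it (token and names are placeholders; fetch requires Node 18+ or a browser):

```javascript
// Build the URL and fetch() init for the create-workspace call.
function buildCreateWorkspaceRequest(serviceToken, body) {
  return {
    url: "https://api.usedatabrain.com/api/v2/workspace",
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${serviceToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify(body),
    },
  };
}

// Usage sketch:
// const { url, init } = buildCreateWorkspaceRequest("dbn_live_abc123...", {
//   name: "My Analytics Workspace",
//   connectionType: "DATASOURCE",
//   datasourceName: "postgres-main",
// });
// const { data, error } = await (await fetch(url, init)).json();
```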
4. Use your workspace

Reference your workspace in dashboards and metrics:
const embedConfig = {
  workspaceName: 'My Analytics Workspace',
  // ... other configuration
};