Example request:

curl --request PUT \
  --url https://api.usedatabrain.com/api/v2/datasource \
  --header 'Authorization: Bearer dbn_live_abc123...' \
  --header 'Content-Type: application/json' \
  --data '{
    "datasourceType": "postgres",
    "credentials": {
      "name": "production-postgres",
      "host": "new-db.example.com",
      "port": 5432,
      "username": "dbuser",
      "password": "newpassword",
      "database": "analytics",
      "schema": "public"
    }
  }'

Example response:

{
  "name": "production-postgres"
}
Update an existing datasource’s credentials or configuration. The API validates the new credentials, tests the connection, and automatically refreshes the cached schema.
You can only update datasources that already exist in your organization. The datasource to update is identified by the name field inside credentials.

Endpoint

PUT https://api.usedatabrain.com/api/v2/datasource

Self-hosted Databrain Endpoint

PUT <SELF_HOSTED_URL>/api/v2/datasource

Authentication

This endpoint requires a service token in the Authorization header. Service tokens differ from data app API keys and provide organization-level permissions. To access your service token:
  1. Go to your Databrain dashboard and open Settings.
  2. Find the Service Tokens section.
  3. Click the “Generate Token” button to generate a new service token if you don’t have one already.
  4. Use this token as the Bearer value in your Authorization header.

Headers

Authorization
string
required
Bearer token for API authentication. Use your organization’s service token, not a data app API key.
Authorization: Bearer dbn_live_abc123...
Content-Type
string
required
Must be set to application/json for all requests.
Content-Type: application/json

Request Body

datasourceType
string
required
The type of datasource. Must match the existing datasource type. See Create Datasource for supported types.
credentials
object
required
Updated connection credentials for the datasource.
The credentials.name field must match the exact name of the existing datasource you want to update; this name identifies which datasource to update.
credentials.name
string
required
The name of the existing datasource to update. Must match exactly as it was created.
tenancySettings
object
Multi-tenant configuration for the datasource, defining how data is isolated between tenants/clients. Optional; if omitted, the existing tenancy settings remain unchanged.
tenancySettings.tenancyLevel
string
The level at which tenant isolation occurs. Must be one of: TABLE or DATABASE.
  • TABLE: Client mapping is stored in a specific table (most common)
  • DATABASE: Each client has a separate database instance
Required when tenancySettings is provided.
tenancySettings.clientColumnType
string
Data type of the client identifier column. Must be either NUMBER or STRING. Required when tenancyLevel is TABLE.
tenancySettings.schemaName
string
Schema name where the client mapping table is located. Required when tenancyLevel is TABLE.
tenancySettings.tableName
string
Name of the table that contains client mapping information. Required when tenancyLevel is TABLE.
tenancySettings.tableClientNameColumn
string
Column name in the mapping table that stores the client identifier. Required when tenancyLevel is TABLE.
tenancySettings.tablePrimaryKeyColumn
string
Primary key column of the client mapping table. Required when tenancyLevel is TABLE.
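
Putting the TABLE-level fields together, the tenancySettings fragment of a request body might look like the following sketch (the schema, table, and column names are hypothetical):

```python
import json

# Hypothetical TABLE-level tenancy settings: client rows are mapped
# through a "customers" table in the "public" schema.
tenancy_settings = {
    "tenancyLevel": "TABLE",        # TABLE or DATABASE
    "clientColumnType": "STRING",   # NUMBER or STRING
    "schemaName": "public",         # schema holding the mapping table
    "tableName": "customers",       # client mapping table
    "tableClientNameColumn": "client_id",
    "tablePrimaryKeyColumn": "id",
}

# The fields after tenancyLevel are only required when tenancyLevel is TABLE.
assert tenancy_settings["tenancyLevel"] in ("TABLE", "DATABASE")

payload_fragment = json.dumps({"tenancySettings": tenancy_settings}, indent=2)
print(payload_fragment)
```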

Datasource-Specific Credentials

When updating credentials, provide the complete credential structure for the datasource type: include every required field, even if only some values are changing. Partial updates are not supported.
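
Since partial updates are rejected, a client typically re-sends the complete credential object with only the changed values swapped in. A minimal sketch in Python (all values are placeholders):

```python
import json

# Full credential structure currently stored on your side (placeholders).
current = {
    "name": "production-postgres",
    "host": "old-db.example.com",
    "port": 5432,
    "username": "dbuser",
    "password": "oldpassword",
    "database": "analytics",
    "schema": "public",
}

# Only the host and password are changing, but the PUT body must still
# carry every required field for the datasource type.
updated = {**current, "host": "new-db.example.com", "password": "newpassword"}

body = json.dumps({"datasourceType": "postgres", "credentials": updated})
print(body)
```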
credentials
object
Datasource-specific credential fields. The required fields depend on the datasourceType.
Snowflake

credentials.host
string
required
Snowflake account hostname (e.g., your-account.snowflakecomputing.com)
credentials.username
string
required
Snowflake username
credentials.role
string
required
Snowflake role to use
credentials.warehouse
string
required
Snowflake warehouse name
credentials.database
string
required
Snowflake database name
credentials.schema
string
required
Snowflake schema name
credentials.credentials
string
required
Authentication method: "username/password" or "Key-pair authentication"
credentials.password
string
Password (required if credentials is "username/password")
credentials.privateKey
string
Private key (required if credentials is "Key-pair authentication")
credentials.privateKeyPass
string
Passphrase for the private key (optional)
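
The credentials.credentials value selects the authentication method, which in turn determines whether password or privateKey is required. A hypothetical local pre-flight check (values are placeholders):

```python
# Placeholder Snowflake credentials using key-pair authentication.
creds = {
    "name": "analytics-snowflake",
    "host": "your-account.snowflakecomputing.com",
    "username": "dbuser",
    "role": "ANALYST",
    "warehouse": "COMPUTE_WH",
    "database": "ANALYTICS",
    "schema": "PUBLIC",
    "credentials": "Key-pair authentication",
    "privateKey": "-----BEGIN PRIVATE KEY-----\n...",
}

def missing_auth_field(c):
    """Return the name of the auth field required by the chosen
    authentication method when it is absent, else None."""
    if c["credentials"] == "username/password" and "password" not in c:
        return "password"
    if c["credentials"] == "Key-pair authentication" and "privateKey" not in c:
        return "privateKey"
    return None

print(missing_auth_field(creds))  # None: privateKey is present
```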
PostgreSQL

credentials.host
string
required
Database hostname or IP address
credentials.port
number
required
Database port number (1-65535)
credentials.username
string
required
Database username
credentials.password
string
required
Database password
credentials.database
string
required
Database name
credentials.schema
string
required
Schema name
credentials.sslMode
boolean
Enable SSL mode (optional)
credentials.sshTunnel
string
SSH tunnel setting: "enable" or "disable" (optional)
credentials.sshHost
string
SSH server hostname (required if sshTunnel is "enable")
credentials.sshPort
number
SSH server port (required if sshTunnel is "enable")
credentials.sshUsername
string
SSH username (required if sshTunnel is "enable")
credentials.sshPrivateKey
string
SSH private key (required if sshTunnel is "enable")
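
When sshTunnel is "enable", all four SSH fields become required. A sketch of a tunneled connection payload (hostnames and key material are placeholders):

```python
# Placeholder connection that tunnels through a bastion host.
creds = {
    "name": "production-postgres",
    "host": "10.0.0.5",          # private DB address, reached via the tunnel
    "port": 5432,
    "username": "dbuser",
    "password": "secret",
    "database": "analytics",
    "schema": "public",
    "sshTunnel": "enable",
    "sshHost": "bastion.example.com",
    "sshPort": 22,
    "sshUsername": "tunnel-user",
    "sshPrivateKey": "-----BEGIN OPENSSH PRIVATE KEY-----\n...",
}

# The SSH fields are only mandatory when the tunnel is enabled.
if creds.get("sshTunnel") == "enable":
    required = ("sshHost", "sshPort", "sshUsername", "sshPrivateKey")
    missing = [f for f in required if f not in creds]
    assert not missing, f"missing SSH fields: {missing}"
```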
CockroachDB

credentials.host
string
required
CockroachDB hostname or IP address
credentials.port
number
required
Database port number (1-65535)
credentials.username
string
required
Database username
credentials.password
string
required
Database password
credentials.database
string
required
Database name
credentials.schema
string
required
Schema name
credentials.sslMode
boolean
Enable SSL mode (optional)
credentials.sshTunnel
string
SSH tunnel setting: "enable" or "disable" (optional)
credentials.sshHost
string
SSH server hostname (required if sshTunnel is "enable")
credentials.sshPort
number
SSH server port (required if sshTunnel is "enable")
credentials.sshUsername
string
SSH username (required if sshTunnel is "enable")
credentials.sshPrivateKey
string
SSH private key (required if sshTunnel is "enable")
BigQuery

credentials.credentials_json
string
required
JSON string containing Google Cloud service account credentials
credentials.project_id
string
required
Google Cloud project ID
credentials.dataset_location
string
required
BigQuery dataset location (e.g., "US", "EU")
credentials.dataset_id
string
BigQuery dataset ID (optional)
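
Note that credentials_json is a JSON string, not a nested object, so the service-account key must be serialized before it is placed in the body. For example (the service-account content is a truncated placeholder):

```python
import json

# Placeholder service-account key as a Python dict (abbreviated).
service_account = {
    "type": "service_account",
    "project_id": "my-gcp-project",
    "private_key_id": "abc123",
}

creds = {
    "credentials_json": json.dumps(service_account),  # a string, not an object
    "project_id": "my-gcp-project",
    "dataset_location": "US",
}

# The value round-trips: decoding the string yields the original dict.
assert json.loads(creds["credentials_json"]) == service_account
```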
credentials.host
string
required
Database hostname or IP address
credentials.port
number
required
Database port number (1-65535)
credentials.user
string
required
Database username. Note: Uses user not username for these datasource types.
credentials.password
string
required
Database password
SQL Server (MSSQL)

credentials.server
string
required
SQL Server hostname or IP address. Note: Uses server not host for MSSQL.
credentials.port
number
required
Database port number (1-65535)
credentials.user
string
required
Database username. Note: Uses user not username for MSSQL.
credentials.password
string
required
Database password
credentials.database
string
Database name (optional)
credentials.isDisableDatabase
boolean
Disable database selection (optional)
credentials.host
string
required
Database hostname or IP address
credentials.port
number
required
Database port number (1-65535)
credentials.user
string
required
Database username
credentials.password
string
required
Database password
credentials.database
string
required
Database name
Databricks

credentials.serverHostname
string
required
Databricks server hostname
credentials.httpPath
string
required
Databricks HTTP path
credentials.token
string
required
Databricks access token
Elasticsearch / OpenSearch

credentials.server_type
string
required
Server type: "elastic-cloud", "open-cloud", or "self-managed"
credentials.cloud_id
string
Cloud ID (required if server_type is "elastic-cloud" or "open-cloud")
credentials.server_url
string
Server URL (required if server_type is "self-managed")
credentials.username
string
Username (required unless disableAuth is true)
credentials.password
string
Password (required unless disableAuth is true)
credentials.disableAuth
boolean
Disable authentication (optional, default false). Only valid when server_type is "self-managed".
credentials.isIgnoreVerifyCert
boolean
Ignore certificate verification (optional)
credentials.isIgnoreSsl
boolean
Ignore SSL (optional)
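
The server_type value decides which of the remaining fields are required. Those rules can be sketched as a local validator (credential values are placeholders):

```python
# Placeholder self-managed deployment with authentication disabled.
creds = {
    "server_type": "self-managed",
    "server_url": "https://es.internal.example.com:9200",
    "disableAuth": True,
}

def check(c):
    """Return a list of rule violations for the conditional fields."""
    problems = []
    if c["server_type"] in ("elastic-cloud", "open-cloud") and "cloud_id" not in c:
        problems.append("cloud_id required for cloud deployments")
    if c["server_type"] == "self-managed" and "server_url" not in c:
        problems.append("server_url required for self-managed")
    if c.get("disableAuth"):
        if c["server_type"] != "self-managed":
            problems.append("disableAuth only valid for self-managed")
    elif "username" not in c or "password" not in c:
        problems.append("username/password required unless disableAuth is true")
    return problems

print(check(creds))  # []
```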
Firebolt

credentials.client_id
string
required
Firebolt client ID
credentials.client_secret
string
required
Firebolt client secret
credentials.account
string
required
Firebolt account name
credentials.database
string
required
Database name
credentials.engineName
string
required
Engine name
credentials.schema
string
Schema name (optional)
Athena

credentials.database
string
required
Athena database name
credentials.outputBucket
string
required
S3 output bucket for query results
credentials.s3AccessKeyId
string
required
AWS access key ID
credentials.s3Region
string
required
AWS region
credentials.s3SecretAccessKey
string
required
AWS secret access key
credentials.datasourceId
string
Datasource ID (optional)
Trino

credentials.host
string
required
Trino hostname or IP address
credentials.port
number
required
Database port number (1-65535)
credentials.catalog
string
required
Trino catalog name
credentials.schema
string
required
Schema name
credentials.username
string
required
Username
credentials.password
string
required
Password
credentials.sshTunnel
string
SSH tunnel setting: "enable" or "disable" (optional)
credentials.sshHost
string
SSH host (required if sshTunnel is "enable")
credentials.sshPort
number
SSH port (required if sshTunnel is "enable")
credentials.sshUsername
string
SSH username (required if sshTunnel is "enable")
credentials.sshPrivateKey
string
SSH private key (required if sshTunnel is "enable")
CSV (Amazon S3)

credentials.name
string
required
Name for the CSV datasource
credentials.bucketName
string
required
S3 bucket name
credentials.datasetPath
string
Path within the bucket (optional, can be empty string)
credentials.s3Region
string
required
AWS region (e.g., "us-east-1")
credentials.tableLevel
string
Table level: "File" or "Folder" (optional)
credentials.s3AccessKeyId
string
required
AWS access key ID
credentials.s3SecretAccessKey
string
required
AWS secret access key

Response

name
string
The name of the updated datasource (same as credentials.name).
error
null
Error details; omitted from successful responses.

Error Codes

Error Code | HTTP Status | Description
INVALID_REQUEST_BODY | 400 | Missing required fields or invalid credential structure
DATASOURCE_NAME_ERROR | 400 | Datasource not found
CREDENTIAL_TEST_FAILED | 400 | Connection test failed
AUTHENTICATION_ERROR | 401 | Invalid or missing service token
SCHEMA_CACHE_FAILED | 500 | Schema caching failed
DATASOURCE_NOT_FOUND | 404 | The specified datasource does not exist
TENANCY_SETTINGS_CREATE_FAILED | 500 | Failed to create tenancy settings for the datasource
TENANCY_SETTINGS_UPDATE_FAILED | 500 | Failed to update existing tenancy settings
INTERNAL_SERVER_ERROR | 500 | Server error occurred
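
Client code can branch on the documented error codes rather than the HTTP status alone, for instance treating the 500-level codes as transient. A sketch of that policy (the retry grouping is an assumption, not API guidance):

```python
# Map documented error codes to a coarse retry decision. The 500-level
# codes are treated as transient; the 4xx codes need a fixed request,
# corrected credentials, or a valid service token.
RETRYABLE = {
    "SCHEMA_CACHE_FAILED",
    "TENANCY_SETTINGS_CREATE_FAILED",
    "TENANCY_SETTINGS_UPDATE_FAILED",
    "INTERNAL_SERVER_ERROR",
}

def should_retry(error_code):
    """True when retrying the same request might succeed."""
    return error_code in RETRYABLE

print(should_retry("CREDENTIAL_TEST_FAILED"))  # False: fix the credentials
print(should_retry("INTERNAL_SERVER_ERROR"))   # True
```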

Next Steps