
Common Issues

Cause: Your service token isn’t set or isn’t being read by the MCP server.

Fix:
  1. Check that DATABRAIN_SERVICE_TOKEN is in the env block of your MCP config (not in your shell environment — MCP clients read from the config file)
  2. Verify the token value is correct — no extra spaces or quotes
  3. Restart your AI client after changing the config
```json
{
  "mcpServers": {
    "databrain": {
      "command": "npx",
      "args": ["@databrainhq/mcp-server"],
      "env": {
        "DATABRAIN_SERVICE_TOKEN": "your-actual-token-here"
      }
    }
  }
}
```
If the token is set correctly and you still see this error, try regenerating it in Settings → Service Tokens in the Databrain UI.
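The steps above can be spot-checked from a terminal. A minimal sketch, assuming your client stores its config at the Claude Desktop path on macOS (the path is an assumption; substitute your own client’s config location). The check_token helper is defined here for illustration, not part of the MCP server:

```shell
# Assumed config location (Claude Desktop on macOS); adjust for your client.
CONFIG="$HOME/Library/Application Support/Claude/claude_desktop_config.json"

# Helper: read the token straight from the config file, the way the MCP
# client will, and report whether it is present and free of stray whitespace.
check_token() {
  node -e '
    const cfg = JSON.parse(require("fs").readFileSync(process.argv[1], "utf8"));
    const env = ((cfg.mcpServers || {}).databrain || {}).env || {};
    const token = env.DATABRAIN_SERVICE_TOKEN;
    if (!token) { console.error("token missing from env block"); process.exit(1); }
    if (token !== token.trim()) console.error("warning: token has surrounding whitespace");
    console.log("token present (" + token.length + " chars)");
  ' "$1"
}

if [ -f "$CONFIG" ]; then
  check_token "$CONFIG"
else
  echo "config not found at $CONFIG (adjust the path for your client)"
fi
```

This reads the same file the client reads, so it catches tokens that are set in your shell but missing from the config.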
Cause: An embed operation was attempted without an active API token. This is normal during first-time setup.

Fix: The assistant should automatically create one via create_api_token. If the error persists, ask:
“Create an API token for my data app”
You can also pre-configure an API token in your MCP config:
```json
{
  "env": {
    "DATABRAIN_SERVICE_TOKEN": "<your-service-token>",
    "DATABRAIN_API_TOKEN": "<your-api-token>"
  }
}
```
API tokens are scoped to a single data app. If you work with multiple data apps, let the MCP server create tokens automatically rather than pre-configuring one.
Cause: No workspace was selected, or the selected workspace has no dashboards.

Fix: Ask the assistant to list workspaces first, then show dashboards:
“List my workspaces and then show dashboards”
Dashboards are scoped to workspaces. If you see workspaces but no dashboards, verify that dashboards exist in the Databrain UI for that workspace.
Cause: Usually a token, ID, or security mismatch.

Fix:
  1. Guest token generated server-side? Never expose API tokens in frontend code. Guest tokens must be generated from your backend.
  2. IDs match? Verify the embed ID and dashboard ID are correct.
  3. Domain whitelisted? Check that your frontend’s domain is allowed in the Data App settings.
  4. Run a health check: Ask the assistant:
“Run a health check on my embed”
This triggers diagnose_embed_health, which checks for malformed metadata and validates runtime readiness.
Cause: Existing guest tokens still carry the old configuration.

Fix: After updating theme or options via customize_embed_theme or configure_embed:
  1. Regenerate your guest token by calling generate_guest_token again
  2. Reload your application to use the new token
Guest tokens encode embed configuration at generation time. Any changes made after token creation require a new token.
Cause: Usually a Node.js version issue.

Fix: The MCP server requires Node.js 18 or higher:

```shell
node --version
```
If you see a version below 18, update Node.js. We recommend using nvm to manage Node.js versions:
```shell
nvm install 18
nvm use 18
```
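The version check above can also be scripted so setup fails loudly instead of silently running on an old runtime; a small sketch using only node itself:

```shell
# Exit non-zero if the installed Node.js is older than 18.
node -e '
  const major = Number(process.versions.node.split(".")[0]);
  if (major < 18) {
    console.error("Node " + process.version + " is too old; the MCP server needs 18+");
    process.exit(1);
  }
  console.log("Node " + process.version + " OK");
'
```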
Cause: The semantic layer is empty or incomplete.

Fix: The semantic layer provides the metadata that helps the AI understand your column names and relationships. Check its status:
“Check the semantic layer quality for my datamart”
If the completion score is low, populate it:
“Add descriptions and synonyms to improve query accuracy”
The assistant will auto-generate descriptions from table and column names and push them via update_semantic_layer. Adding synonyms for business terms (e.g., “revenue” = “total_sales”) significantly improves accuracy.
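As an illustration only, a synonym entry in a semantic layer might look something like this (the field names are a hypothetical sketch, not the actual update_semantic_layer schema):

```json
{
  "table": "orders",
  "columns": [
    {
      "name": "total_sales",
      "description": "Gross order value in USD, before refunds",
      "synonyms": ["revenue", "sales amount"]
    }
  ]
}
```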
Cause: npm/npx not installed, or network issues fetching the package.

Fix:
  1. Ensure npm is installed: npm --version
  2. Try running the package directly to check for errors:

```shell
npx @databrainhq/mcp-server
```

  3. If it hangs, you may have a network/proxy issue. Try installing globally first:

```shell
npm install -g @databrainhq/mcp-server
```
Then update your config to use the global binary:
```json
{
  "mcpServers": {
    "databrain": {
      "command": "databrain-mcp-server",
      "env": {
        "DATABRAIN_SERVICE_TOKEN": "<your-service-token>"
      }
    }
  }
}
```
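After a global install, you can confirm the databrain-mcp-server binary actually resolves on PATH before pointing the config at it. The check_binary helper below is a small sketch for illustration, not part of the package:

```shell
# Helper: report where a binary resolves on PATH, if anywhere.
check_binary() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "found: $(command -v "$1")"
  else
    echo "not found on PATH: $1" >&2
    return 1
  fi
}

check_binary databrain-mcp-server || echo "install it first with: npm install -g @databrainhq/mcp-server"
```

If the binary is missing even after a global install, check that npm’s global bin directory is on your PATH.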

Diagnostic Tools

The MCP server includes a built-in diagnostic tool. Ask your assistant:
“Run a health check on my embed”
This calls diagnose_embed_health, which checks:
  • Embed metadata structure (access settings, datamart references)
  • Required fields are present and valid
  • Runtime readiness when guest token info is provided
  • Permission configuration consistency

Known Limitations (v0.2.0)

This is the initial public release. A few things to be aware of:
| Area | Limitation | Workaround |
| --- | --- | --- |
| Infrastructure creation | Cannot create datasources, datamarts, or workspaces via MCP | Create these in the Databrain UI, then use MCP for everything else |
| Transport | stdio only — no HTTP/SSE transport yet | Use npx with your MCP client’s config file |
| Batch operations | No bulk embed creation in a single call | Create embeds one at a time via create_embed, or ask the assistant to loop |
| Dashboard creation | Cannot create dashboards or metrics via MCP | Build dashboards in the Databrain UI, then embed them via MCP |
| Semantic layer | Auto-generated descriptions are a starting point | Review and refine descriptions manually via update_semantic_layer for best query accuracy |
We’re actively expanding the MCP server’s capabilities. If there’s a specific operation you’d like supported, let us know.

Getting Help

  • Quickstart: start over with setup
  • Client Setup: verify your client config
  • Support: contact our team for help