Athena

This guide will walk you through the steps to connect your Athena database to Databrain.

Requirements:

  • An active AWS account with Athena and S3 access.

  • Proper IAM permissions to access Athena and the related S3 buckets.

  • A Databrain Workspace to which this data source will be added.

Setup Guide:

1. Ensure Athena and S3 Accessibility:

  • Make sure your Athena database is active and queries can write results to the designated S3 bucket.

  • Ensure that both Athena and S3 are accessible using the credentials you plan to use with Databrain (a quick connectivity check is sketched after this list).

  • Your S3 bucket should be in a region supported by Athena (us-east-1 is preferred).
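
To confirm accessibility before adding the source in Databrain, you can run a small test query with the AWS SDK. Below is a minimal boto3 sketch, assuming the example database (iceberg_db), result bucket (output-bucket-name), and region (us-east-1) used later in this guide; replace them with your own values.

```python
import time

import boto3  # AWS SDK for Python; credentials come from your environment or AWS profile

# Example values from this guide -- replace with your own.
DATABASE = "iceberg_db"
OUTPUT_LOCATION = "s3://output-bucket-name/athena-results/"
REGION = "us-east-1"

athena = boto3.client("athena", region_name=REGION)

# Start a trivial query; Athena must be able to write its result file to S3.
run = athena.start_query_execution(
    QueryString="SELECT 1",
    QueryExecutionContext={"Database": DATABASE},
    ResultConfiguration={"OutputLocation": OUTPUT_LOCATION},
)

# Poll until the query finishes. SUCCEEDED means both Athena and the
# result bucket are reachable with these credentials.
state = "RUNNING"
while state in ("QUEUED", "RUNNING"):
    time.sleep(1)
    status = athena.get_query_execution(QueryExecutionId=run["QueryExecutionId"])
    state = status["QueryExecution"]["Status"]["State"]

print("Test query finished with state:", state)
```

If the query ends in FAILED, the status response also carries a StateChangeReason that usually points at a missing permission or an unreachable result bucket.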

2. Grant Necessary Permissions:

Assign the appropriate IAM policies to the user or role whose credentials you will give to Databrain (a minimal policy sketch follows this list). These should include:

  • Access to Athena for querying.

  • Access to the AWS Glue Data Catalog, if used.

  • Access to S3 for reading/writing query results.
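
What "appropriate" looks like depends on your catalog and bucket layout, but a minimal inline policy covering the three items above might resemble the sketch below. The user name, policy name, and bucket ARNs are illustrative; scope the resources to your own environment.

```python
import json

import boto3

# Illustrative inline policy: Athena querying, Glue Data Catalog reads,
# and read/write access to the query result bucket.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "athena:StartQueryExecution",
                "athena:GetQueryExecution",
                "athena:GetQueryResults",
                "athena:ListDatabases",
                "athena:GetWorkGroup",
            ],
            "Resource": "*",
        },
        {
            "Effect": "Allow",
            "Action": [
                "glue:GetDatabase",
                "glue:GetDatabases",
                "glue:GetTable",
                "glue:GetTables",
                "glue:GetPartitions",
            ],
            "Resource": "*",
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetBucketLocation",
                "s3:ListBucket",
                "s3:GetObject",
                "s3:PutObject",
            ],
            "Resource": [
                "arn:aws:s3:::output-bucket-name",    # example result bucket from this guide
                "arn:aws:s3:::output-bucket-name/*",
            ],
        },
    ],
}

iam = boto3.client("iam")

# Attach the policy inline to the IAM user whose access keys Databrain will use.
iam.put_user_policy(
    UserName="databrain-athena-user",        # hypothetical user name
    PolicyName="databrain-athena-access",
    PolicyDocument=json.dumps(policy),
)
```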

3. Fill In the Connection Info:

Provide the following fields in Databrain to configure your Athena source:

  • Destination Name: A custom name to identify this Athena connection in Databrain. Example: Athena Destination Spec

  • S3 Region: The AWS region where your Athena query result bucket is located. Example: us-east-1

  • S3 Access Key ID: Your AWS Access Key ID for authentication.

  • S3 Secret Access Key: Your AWS Secret Access Key associated with the Access Key ID.

  • Database: The name of the Athena database you want to connect to. Example: iceberg_db

  • S3 Bucket Name: The name of the S3 bucket where Athena stores query results. Example: output-bucket-name

Locating the Configuration Details in AWS

1. Destination Name:

Choose any descriptive name to label your Athena connection in Databrain. This does not affect AWS resources.

2. S3 Region:

  • Log in to the AWS Management Console and open the S3 service.

  • Select the bucket used by Athena for output.

  • Then, find the Region under the bucket’s Properties tab.
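
If you prefer the SDK to the console, the same value can be read programmatically. A minimal boto3 sketch, using the example bucket name from this guide:

```python
import boto3

s3 = boto3.client("s3")

# get_bucket_location returns None for us-east-1 and the region name otherwise.
resp = s3.get_bucket_location(Bucket="output-bucket-name")  # example bucket from this guide
print("S3 Region:", resp["LocationConstraint"] or "us-east-1")
```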

3. S3 Access Key ID & Secret Access Key:

  • Open the IAM console and select the desired IAM User or Role.

  • Then, go to the Security Credentials tab and create or retrieve Access Keys for use in Databrain.
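
If you manage IAM programmatically, a key pair can also be created with boto3. A sketch assuming a hypothetical databrain-athena-user; AWS returns the secret only once, so store both values securely:

```python
import boto3

iam = boto3.client("iam")

# Create a new access key pair for the user Databrain will authenticate as.
resp = iam.create_access_key(UserName="databrain-athena-user")  # hypothetical user name
print("S3 Access Key ID:    ", resp["AccessKey"]["AccessKeyId"])
print("S3 Secret Access Key:", resp["AccessKey"]["SecretAccessKey"])
```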

4. Database:

  • Open the Athena service in the AWS console and select your desired database from the left sidebar (e.g., iceberg_db).
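
The available databases can also be listed through the Athena API. A minimal sketch, assuming the default AwsDataCatalog and the us-east-1 region used in this guide:

```python
import boto3

athena = boto3.client("athena", region_name="us-east-1")

# List databases in the default Glue catalog; use one of these names
# (e.g., iceberg_db) as the Database field in Databrain.
resp = athena.list_databases(CatalogName="AwsDataCatalog")
for database in resp["DatabaseList"]:
    print(database["Name"])
```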

5. S3 Bucket Name:

  • In the Athena console, go to Settings.

  • Then, find the Query result location (S3 URI like s3://your-bucket-name/path).

  • Use the bucket name from this URI (e.g., your-bucket-name).
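
The same query result location is stored on the Athena workgroup, so it can also be fetched via the API. A sketch assuming the default primary workgroup and that a result location has been configured on it:

```python
import boto3

athena = boto3.client("athena", region_name="us-east-1")

# Read the workgroup's configured query result location.
wg = athena.get_work_group(WorkGroup="primary")
uri = wg["WorkGroup"]["Configuration"]["ResultConfiguration"]["OutputLocation"]

print("Query result location:", uri)  # e.g. s3://your-bucket-name/path/
print("S3 Bucket Name:", uri.removeprefix("s3://").split("/")[0])
```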
