Amazon S3

Getting Started with S3 Source Configuration

Requirements:

  • Active AWS account with S3 access.

  • Appropriate IAM permissions to access the desired S3 buckets.

  • A Databrain Workspace to which you want to connect the data.

Setup Guide:

  1. Ensure Bucket Accessibility:

    • Make sure your S3 bucket is active and accessible from Databrain.

    • This depends on your AWS account settings and bucket permissions.

  2. Grant Necessary Permissions:

    • Read Access on Buckets and Objects: Grant read permissions on the S3 buckets and objects you want to sync (see the Permissions section below).

  3. Fill In the Connection Info:

    • Provide the following information to connect to your S3 bucket:

      • Destination Name: A custom name to identify this connection in Databrain.

      • S3 Region: The AWS region where your S3 bucket is located (e.g., us-east-1).

      • S3 Access Key ID: Your AWS Access Key ID for authentication.

      • S3 Secret Access Key: Your AWS Secret Access Key associated with the Access Key ID.

      • S3 Bucket Dataset Folder Path: The specific folder path within your bucket (e.g., awss3_folder_test_less/).

      • S3 Bucket Name: The name of your S3 bucket (e.g., databrain-s3-test-csv).

      • Table Level: Select whether to interpret data at the Folder or File level.
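
Before saving the connection, you can sanity-check these values outside Databrain. The sketch below is only an illustration using the boto3 library; the bucket name, region, folder path, and credentials are the placeholder examples from the list above and should be replaced with your own.

```python
import boto3

# Placeholder values mirroring the example fields above; replace with your own.
S3_REGION = "us-east-1"
S3_BUCKET_NAME = "databrain-s3-test-csv"
S3_FOLDER_PATH = "awss3_folder_test_less/"

s3 = boto3.client(
    "s3",
    region_name=S3_REGION,
    aws_access_key_id="YOUR_ACCESS_KEY_ID",          # placeholder
    aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",  # placeholder
)

# List the objects under the dataset folder path to confirm that the bucket
# name, region, folder path, and credentials all line up.
response = s3.list_objects_v2(Bucket=S3_BUCKET_NAME, Prefix=S3_FOLDER_PATH)
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])
```

If this call succeeds and prints the files you expect to sync, the same values should work in the connection form.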

Permissions:

  • Permission to list bucket contents.

  • Permission to read objects from the specified bucket.

  • If using KMS encryption, permission to use the KMS key for decryption.
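
For reference, the permissions above roughly correspond to an IAM policy like the following sketch. The bucket name, KMS key ARN, IAM user name, and policy name are placeholders, and the kms:Decrypt statement is only needed if your objects are encrypted with a customer-managed KMS key.

```python
import json
import boto3

BUCKET = "databrain-s3-test-csv"  # placeholder bucket name

policy = {
    "Version": "2012-10-17",
    "Statement": [
        # Permission to list bucket contents.
        {"Effect": "Allow", "Action": "s3:ListBucket",
         "Resource": f"arn:aws:s3:::{BUCKET}"},
        # Permission to read objects from the specified bucket.
        {"Effect": "Allow", "Action": "s3:GetObject",
         "Resource": f"arn:aws:s3:::{BUCKET}/*"},
        # Only needed when objects are encrypted with a customer-managed KMS key.
        {"Effect": "Allow", "Action": "kms:Decrypt",
         "Resource": "arn:aws:kms:us-east-1:123456789012:key/EXAMPLE-KEY-ID"},
    ],
}

# Attach the policy inline to the IAM user whose access keys you will use.
iam = boto3.client("iam")
iam.put_user_policy(
    UserName="databrain-s3-reader",          # placeholder user name
    PolicyName="databrain-s3-read-access",   # placeholder policy name
    PolicyDocument=json.dumps(policy),
)
```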

Locating the Configuration Details in AWS S3

  1. Destination Name:

    • Choose a descriptive name for this connection within Databrain.

  2. S3 Region:

    • Log in to the AWS Management Console and open the S3 service.

    • Select your bucket, and find the region information in the bucket's "Properties" tab.

  3. S3 Access Key ID & Secret Access Key:

    • Generated in the IAM (Identity and Access Management) section of AWS.

    • Navigate to IAM, select the desired user, go to the "Security credentials" tab, and create or manage access keys (a quick verification sketch follows this list).

  4. S3 Bucket Dataset Folder Path:

    • Navigate to your bucket in the S3 console and note the specific folder path you wish to sync.

  5. S3 Bucket Name:

    • This is the name of your S3 bucket, visible in the S3 dashboard of the AWS Management Console.

  6. Table Level:

    • Determine whether your data should be interpreted at the folder level or file level based on your S3 bucket structure and data organization.
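
Once you have an access key pair, you can double-check items 2 and 3 above without going back to the console. The sketch below is illustrative; the credentials and bucket name are placeholders.

```python
import boto3

session = boto3.Session(
    aws_access_key_id="YOUR_ACCESS_KEY_ID",          # placeholder
    aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",  # placeholder
)

# Confirm which IAM identity the access keys belong to.
print(session.client("sts").get_caller_identity()["Arn"])

# Confirm the bucket's region; a None LocationConstraint means us-east-1.
location = session.client("s3").get_bucket_location(Bucket="databrain-s3-test-csv")
print(location["LocationConstraint"] or "us-east-1")
```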
