dbt writing into different schema

ruoyuqian
New Contributor II

I have a Unity Catalog with two schemas, `catalogname.schemaname1` and `catalogname.schemaname2`, and I am trying to write tables into schemaname2 with dbt. The current setup in my dbt profiles.yml is:

prj_dbt_databricks:
  outputs:
    dev:
      catalog: catalogname
      host: adb-xxxx.azuredatabricks.net
      http_path: /sql/1.0/warehouses/xxx
      schema: schemaname1
      threads: 1
      token: "{{ env_var('DATABRICKS_TOKEN') }}"
      type: databricks
  target: dev

So in profiles.yml the default schema is set to schemaname1. In my dbt_project.yml the settings are below:

name: 'prj_dbt_databricks'
version: '1.0.0'

# This setting configures which "profile" dbt uses for this project.
profile: 'prj_dbt_databricks'

model-paths: ["models"]
analysis-paths: ["analyses"]
test-paths: ["tests"]
seed-paths: ["seeds"]
macro-paths: ["macros"]
snapshot-paths: ["snapshots"]

clean-targets:         # directories to be removed by `dbt clean`
  - "target"
  - "dbt_packages"

# Configuring models
# Full documentation: https://docs.getdbt.com/docs/configuring-models


models:
  prj_dbt_databricks:
    +persist_docs:
      relation: true
      columns: true
    +copy_grants: true
    schemaname1:
      +schema: schemaname1
      materialized: table
    schemaname2:
      +schema: schemaname2
      materialized: table

# Configuring seeds
seeds:
  prj_dbt_databricks:
    +persist_docs:
      relation: true     # Persist descriptions for seeds
      columns: true      # Persist descriptions for seed columns
    schemaname1:  # Applies to seeds under this directory
      +schema: 'schemaname1'
      +persist_docs:
        relation: true   # Persist descriptions for seeds
        columns: true    # Persist descriptions for seed columns


I put my SQL files in the corresponding folders for each schema under the models folder in dbt. However, when I run the dbt models for schemaname2, they keep writing into schemaname1. I am not sure why this happens with the Databricks catalog; it works fine with Snowflake.

1 REPLY

Kaniz_Fatma
Community Manager

Hi @ruoyuqian, a few things to check:

- Ensure the models for `schemaname2` are in the correct folder, as configured in `dbt_project.yml`.
- Explicitly set the schema in the model files with `{{ config(schema='schemaname2') }}` (see the sketch below).
- Verify the `profiles.yml` configuration.
- Run `dbt debug` to check connectivity.
- Specify the models explicitly when running dbt commands.
- Ensure compatibility with your dbt version.
- Check the Unity Catalog configuration and permissions.

Let me know if you need further assistance or if there's anything else I can help with!
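
For illustration, here is a minimal sketch of the model-level override suggested above; the file path, model name, and body are hypothetical placeholders, not taken from the thread:

-- models/schemaname2/example_model.sql (hypothetical path)
-- The config() block pins this model to schemaname2, taking precedence
-- over the folder-level +schema setting in dbt_project.yml.
{{ config(schema='schemaname2', materialized='table') }}

select 1 as id

You can then build just that folder with `dbt run --select models/schemaname2` and verify which schema the table lands in.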

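One more thing worth checking, given that the same setup works on Snowflake: dbt's built-in generate_schema_name macro appends a custom schema to the target schema rather than replacing it, so `+schema: schemaname2` with a target schema of schemaname1 normally resolves to schemaname1_schemaname2. If the Snowflake project overrides this macro and the Databricks project does not, that could explain the difference in behavior. A common override, placed in a file such as macros/generate_schema_name.sql (a sketch under that assumption, not a confirmed fix for this thread):

{% macro generate_schema_name(custom_schema_name, node) -%}
    {#- Use the custom schema name verbatim when one is configured; -#}
    {#- otherwise fall back to the target schema from profiles.yml. -#}
    {%- if custom_schema_name is none -%}
        {{ target.schema }}
    {%- else -%}
        {{ custom_schema_name | trim }}
    {%- endif -%}
{%- endmacro %}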