Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

AlexSantiago
by New Contributor II
  • 4682 Views
  • 16 replies
  • 4 kudos

spotify API get token - raw_input was called, but this frontend does not support input requests.

Hello everyone, I'm trying to use Spotify's API to analyze my music data, but I'm getting an error during authentication, specifically when I try to get the token; my code is below. Is it a Databricks bug? pip install spotipy from spotipy.oauth2 import SpotifyO...

15 More Replies
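The "raw_input was called" error usually means the chosen spotipy OAuth flow is trying to prompt for a pasted redirect URL on stdin, which Databricks notebooks do not support. For data that doesn't require user consent, the Client Credentials flow avoids any prompt. A minimal standard-library sketch of building that token request (client ID and secret are placeholders; spotipy's SpotifyClientCredentials wraps this same flow):

```python
import base64
import urllib.parse
import urllib.request

def build_token_request(client_id: str, client_secret: str) -> urllib.request.Request:
    """Build (but do not send) a Spotify Client Credentials token request.

    This flow never prompts for input, so it works in non-interactive
    frontends such as Databricks notebooks.
    """
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    body = urllib.parse.urlencode({"grant_type": "client_credentials"}).encode()
    return urllib.request.Request(
        "https://accounts.spotify.com/api/token",
        data=body,
        headers={
            "Authorization": f"Basic {creds}",
            "Content-Type": "application/x-www-form-urlencoded",
        },
        method="POST",
    )

# Placeholder credentials; a real call would pass this to urlopen().
req = build_token_request("my-client-id", "my-client-secret")
print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` returns a JSON body containing `access_token`, with no console interaction at any point.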
xhead
by New Contributor II
  • 21900 Views
  • 14 replies
  • 3 kudos

Does "databricks bundle deploy" clean up old files?

I'm looking at this page (Databricks Asset Bundles development work tasks) in the Databricks documentation. When repo assets are deployed to a Databricks workspace, it is not clear whether "databricks bundle deploy" will remove files from the target wo...

Data Engineering
bundle
cli
deploy
Latest Reply
ganapati
Visitor
  • 3 kudos

@JamesGraham this issue is related to the "databricks bundle deploy" command itself. When it runs inside a CI/CD pipeline, I am still seeing old configs in bundle.tf.json. Ideally it should be updated with the changes from the previous run, but I am still seeing er...

13 More Replies
DanielW
by New Contributor II
  • 480 Views
  • 12 replies
  • 3 kudos

Resolved! Databricks Rest api swagger definition not handling bigint or integer

I want to test creating a custom connector in a Power App that connects to a table in Databricks. The issue is any columns typed int or bigint: no matter what I define in the response in my swagger definition (see below), it is not the correct type...

DanielW_0-1747312458356.png DanielW_0-1747313218694.png
Latest Reply
DanielW
New Contributor II
  • 3 kudos

Hi @LRALVA This might warrant another post to keep the conversation focused, but I found a couple of things with the custom connector that make it a bit cumbersome to use. 1) I don't seem to be able to have two POST operations under /statements, so I c...

11 More Replies
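A likely root cause, offered here as an assumption rather than a confirmed diagnosis: OpenAPI integer values are handled as IEEE-754 doubles by many clients (Power Apps included), and a double cannot represent a full 64-bit bigint exactly. A pure-Python illustration of the precision loss and of the common cast-to-string workaround:

```python
# A 64-bit value near the bigint maximum.
big = 9223372036854775807  # 2**63 - 1

# Clients that store numbers as IEEE-754 doubles round it to the
# nearest representable double, silently changing the value.
as_double = float(big)
print(int(as_double))    # rounds up to 9223372036854775808
print(as_double == big)  # False: precision was lost

# Workaround: serialize the column as a string (e.g. CAST(col AS STRING)
# on the Databricks side) and parse it client-side.
as_string = str(big)
print(int(as_string) == big)  # True: the exact value round-trips
```

This is why declaring the swagger field as `type: string` (and converting in the app) is often the pragmatic fix for bigint columns.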
chexa_Wee
by New Contributor III
  • 148 Views
  • 2 replies
  • 1 kudos

How to Implement Incremental Loading in Azure Databricks for ETL

Hi everyone, I'm currently working on an ETL process using Azure Databricks (Standard Tier) where I load data from Azure SQL Database into Databricks. I run a notebook daily to extract, transform, and load the data for Power BI reports. Right now, the ...

Latest Reply
-werners-
Esteemed Contributor III
  • 1 kudos

In case you do not want to use DLT (and there are reasons not to), you can also check the docs for Auto Loader and MERGE notebooks. These two do basically the same as DLT but without the extra cost and with more control. You have to write more code, though. For...

1 More Replies
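The Auto Loader + MERGE approach suggested above reduces to watermark logic: remember the highest modification timestamp already loaded, pull only newer rows, and upsert them by key. A runnable pure-Python sketch of that logic (the column names are hypothetical; on Databricks the filter becomes a pushed-down JDBC query and the upsert a Delta MERGE):

```python
from datetime import datetime

def incremental_load(source_rows, target, last_watermark):
    """Pull rows modified after the watermark and upsert them by key.

    source_rows: list of dicts with 'id', 'value', 'modified' keys.
    target: dict keyed by 'id' (stands in for the Delta table).
    Returns the new watermark to persist for the next run.
    """
    new_rows = [r for r in source_rows if r["modified"] > last_watermark]
    for r in new_rows:            # MERGE semantics: update or insert by key
        target[r["id"]] = r
    if new_rows:
        return max(r["modified"] for r in new_rows)
    return last_watermark

source = [
    {"id": 1, "value": "a",  "modified": datetime(2025, 5, 1)},
    {"id": 2, "value": "b",  "modified": datetime(2025, 5, 2)},
    {"id": 1, "value": "a2", "modified": datetime(2025, 5, 3)},  # later update wins
]
target = {}
wm = incremental_load(source, target, datetime(2025, 4, 30))
print(len(target), wm)
```

Persisting the watermark (e.g. in a small Delta table) between daily notebook runs is what turns the full reload into an incremental one.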
Heman2
by Valued Contributor II
  • 17539 Views
  • 5 replies
  • 22 kudos

Resolved! How to export the output data in the Excel format into the dbfs location

Is there any way to export the output data in Excel format into DBFS? I'm only able to do it in CSV format.

Latest Reply
haidereli
Visitor
  • 22 kudos

As shared above, I tested it and it worked fine for loading, updating, and saving:
import openpyxl
wb = openpyxl.load_workbook('Test.xlsx')
ws = wb.active
for row in ws.iter_rows():
    print([col.value for col in row])  # show all data
ws['A1'] = 'Data'
wb.save('Tes...

4 More Replies
oneill
by New Contributor
  • 160 Views
  • 1 replies
  • 0 kudos

SQL - Dynamic overwrite + overwrite schema

Hello, let's say we have an empty table S that represents the schema we want to keep: columns A, B, C, D, E. We have another table T, partitioned by column A, with a schema that depends on the file we have loaded into it, say columns A, B, C, F with rows (1, b1, c1, f1) and (2, b2, c2, f2). Now to make T have the same schema...

Latest Reply
nikhilj0421
Databricks Employee
  • 0 kudos

Hi @oneill, please check this if it helps for your use case: https://docs.databricks.com/aws/en/delta/selective-overwrite#dynamic-partition-overwrites  

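For context, the dynamic partition overwrite described in the linked page replaces only the partitions present in the incoming data and leaves every other partition untouched. A small pure-Python simulation of that semantics (rows modeled as dicts; in Spark this corresponds to writing with spark.sql.sources.partitionOverwriteMode=dynamic):

```python
def dynamic_overwrite(table, incoming, partition_col):
    """Replace only the partitions present in `incoming`.

    Mirrors Spark's dynamic partition overwrite: partitions not touched
    by the incoming data survive unchanged.
    """
    touched = {row[partition_col] for row in incoming}
    kept = [row for row in table if row[partition_col] not in touched]
    return kept + incoming

t = [{"A": 1, "B": "b1"}, {"A": 2, "B": "b2"}]
new = [{"A": 2, "B": "b2-new"}]
result = dynamic_overwrite(t, new, "A")
print(result)  # partition A=1 kept, partition A=2 replaced
```

Note that dynamic overwrite only swaps partition contents; aligning T's columns with S's schema additionally needs a schema-evolving write (`overwriteSchema`), which is a separate knob.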
Einsatz
by New Contributor II
  • 165 Views
  • 1 replies
  • 0 kudos

Dataframe getting updated after creating temporary view

I'm observing different behavior between Databricks Runtime versions when working with DataFrames and temporary views, and would appreciate any clarification. In both environments, I performed the following steps in a notebook (each connected to its o...

Latest Reply
nikhilj0421
Databricks Employee
  • 0 kudos

Hi @Einsatz, this is expected in DBR version 14.3 and above, since we don't have a Spark context there; it happens due to cache invalidation. To resolve the issue, please use a dynamic name for the view each time.

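A sketch of the suggested workaround: generate a fresh view name per run so a stale cached plan is never reused under a repeated name. The naming helper is hypothetical, and the Spark calls are commented out because they need a notebook session:

```python
import uuid

def unique_view_name(prefix: str) -> str:
    """Append a random hex suffix so each run registers a fresh temp view."""
    return f"{prefix}_{uuid.uuid4().hex[:8]}"

name = unique_view_name("my_view")
# df.createOrReplaceTempView(name)           # in a Databricks notebook
# spark.sql(f"SELECT * FROM {name}")
print(name)
```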
mstfkmlbsbdk
by New Contributor II
  • 84 Views
  • 1 replies
  • 0 kudos

Analyzing Serverless SQL Warehouse Cost Projection Using System Tables

Hello everyone, I'm working on analyzing cost projections for Serverless SQL Warehouses using system tables, and I'd like to share a visualization approach I'm using to highlight some key differences between classic and serverless SQL warehouses. (Loo...

Screenshot 2025-05-21 at 11.51.03.png
Latest Reply
nikhilj0421
Databricks Employee
  • 0 kudos

Hi @mstfkmlbsbdk, great analysis. However, as you mentioned, the classic warehouse is not terminating even though there are no active queries on it. The reason is that when a classic warehouse is created, by default it takes 45 minutes to termina...

laus
by New Contributor III
  • 9287 Views
  • 6 replies
  • 6 kudos

Resolved! How to sort widgets in a specific order?

I'd like to have a couple of widgets, one for the start date and another for the end date. I want them to appear in that order, but when I run the code below, the end date shows up before the start date. How can I order them the way I desired? dbutils.widgets.text("s...

Latest Reply
markok
New Contributor II
  • 6 kudos

Doing it manually is not optimal. It should be possible to do this automatically, by creation date or with an extra function to sort widgets.

5 More Replies
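Databricks has historically laid widgets out alphabetically by name, which is why "end_date" sorts before "start_date". The common workaround, sketched below, is a numeric prefix on the widget name combined with a readable label (the dbutils calls are commented out since they need a notebook session):

```python
# Widgets render alphabetically by name, so a numeric prefix fixes the
# order while the third argument (label) keeps the display text readable.
widgets = [
    ("01_start_date", "2025-01-01", "Start date"),
    ("02_end_date",   "2025-12-31", "End date"),
]
# for name, default, label in widgets:
#     dbutils.widgets.text(name, default, label)

ordered = sorted(w[0] for w in widgets)
print(ordered)  # ['01_start_date', '02_end_date'] – start before end
```

Reading a value back then uses the prefixed name, e.g. `dbutils.widgets.get("01_start_date")`.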
chexa_Wee
by New Contributor III
  • 69 Views
  • 1 replies
  • 0 kudos

How to Implement Incremental Loading in Azure Databricks for ETL

Hi everyone, I'm currently working on an ETL process using Azure Databricks (Standard Tier) where I load data from Azure SQL Database into Databricks. I run a notebook daily to extract, transform, and load the data for Power BI reports. Right now, the ...

Latest Reply
nikhilj0421
Databricks Employee
  • 0 kudos

Hi @chexa_Wee, you can leverage the DLT feature to do so. Please check: https://docs.databricks.com/aws/en/dlt/transform and https://docs.databricks.com/aws/en/dlt/stateful-processing Here is the step-by-step tutorial: https://docs.databricks.com/aws/en/dlt...

BF7
by New Contributor III
  • 63 Views
  • 1 replies
  • 0 kudos

Migrating DLT tables from TEST to PROD catalogs

Why can't we just copy all the DLT tables and materialized views from one UC catalog to another to get the historical data in place, and then run the DLT pipelines on those UC tables? We are migrating many very large tables from our TEST catalog to our...

Data Engineering
Delta Live Tables
Unity Catalog
Latest Reply
nikhilj0421
Databricks Employee
  • 0 kudos

Hi @BF7, we do not support moving streaming tables yet. If we clone a DLT streaming table, it will be converted into a normal Delta table instead of a streaming table. In that case, we need to go with the "full refresh all" option and inde...

eballinger
by Contributor
  • 428 Views
  • 4 replies
  • 0 kudos

List all users groups and the actual users in them in sql

We have a bunch of cloud AD groups in Databricks, and I can see which users are in each group by using the user interface: Manage Account -> Users and groups -> Groups. I would like to be able to produce this full list in SQL. I have found the below code ...

Latest Reply
BigRoux
Databricks Employee
  • 0 kudos

Ok. Well, my last suggestion is to have a look at the SCIM Users API and the SCIM Groups API. You should be able to make the API calls right in the notebook. Cheers, Lou.

3 More Replies
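The SCIM Groups API suggested above returns JSON with a Resources array, each entry carrying a displayName and a members list. Once fetched (a real call needs the Groups endpoint and an auth token), flattening it to (group, user) rows makes it straightforward to register as a DataFrame and query in SQL. A sketch against a hypothetical sample response:

```python
import json

# Hypothetical sample shaped like a SCIM Groups API response;
# a real call would GET the Groups endpoint with a bearer token.
sample = json.loads("""
{"Resources": [
  {"displayName": "data-engineers",
   "members": [{"display": "alice@example.com"},
               {"display": "bob@example.com"}]},
  {"displayName": "analysts",
   "members": [{"display": "carol@example.com"}]}
]}
""")

# Flatten groups to (group, user) pairs; groups with no members yield nothing.
rows = [(g["displayName"], m["display"])
        for g in sample["Resources"]
        for m in g.get("members", [])]
for group, user in rows:
    print(group, user)
```

In a notebook, `spark.createDataFrame(rows, ["group_name", "user_name"]).createOrReplaceTempView("group_members")` would then expose the list to plain SQL.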
rajib76
by New Contributor II
  • 2795 Views
  • 2 replies
  • 2 kudos

Resolved! DBFS with Google Cloud Storage(GCS)

Does DBFS support GCS?

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 2 kudos

Yes, you just need to create a service account for Databricks and then assign the Storage Admin role on the bucket. After that you can mount GCS the standard way:
bucket_name = "<bucket-name>"
mount_name = "<mount-name>"
dbutils.fs.mount("gs://%s" % bucket_name, "/m...

1 More Replies
Abishrp
by Contributor
  • 1094 Views
  • 5 replies
  • 1 kudos

Issue in getting system.compute.warehouses table in some workspaces

In some workspaces I can see the system.compute.warehouses table, but in other workspaces it is not available. How can I enable it? Both are in the same account but assigned to different metastores.

Abishrp_0-1737014513631.png Abishrp_1-1737014632007.png
Latest Reply
aranjan99
New Contributor III
  • 1 kudos

I disabled the compute schema and then enabled it again

4 More Replies
