Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

PawanB
by New Contributor
  • 720 Views
  • 1 replies
  • 0 kudos

Facing issue while accessing DLT pipeline event logs.

While accessing event logs created by a DLT pipeline, I am getting the below error: [INTERNAL_ERROR] The Spark SQL phase analysis failed with an internal error. You hit a bug in Spark or the Spark plugins you use. Please, report this bug to the correspondin...

Latest Reply
Brahmareddy
Esteemed Contributor

Hi Pawan, how are you doing today? As per my understanding, this error likely indicates an internal issue with Databricks, Delta Live Tables (DLT), or a Spark plugin. First, check if there are any known issues in Databricks release notes or community ...
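For reference, a minimal sketch (not part of the original reply) of querying a DLT pipeline's event log directly with the event_log() table-valued function; the pipeline ID is a placeholder you would copy from the pipeline UI.

# Assumes a Databricks notebook where `spark` is available; the pipeline ID is hypothetical.
pipeline_id = "<your-pipeline-id>"

events = spark.sql(f"""
    SELECT timestamp, event_type, level, message
    FROM event_log('{pipeline_id}')
    ORDER BY timestamp DESC
    LIMIT 50
""")
events.show(truncate=False)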

  • 0 kudos
Sridhark
by New Contributor
  • 873 Views
  • 1 replies
  • 0 kudos

How to upload an Excel file from Power Apps to a Databricks notebook folder

Hi All, a user uploads an Excel file from the Power Apps interface; the Excel file has to be stored in a Databricks folder and the Excel data inserted into a SQL table. Storing the Excel file in Databricks and inserting the data has to go through Power Automate. Please suggest ...

Latest Reply
Brahmareddy
Esteemed Contributor

Hi @Sridhark, how are you doing today? As per my understanding, you can achieve this by setting up a Power Automate flow that triggers when a user uploads an Excel file in Power Apps, storing it in OneDrive or SharePoint first. Then, use Power Automat...
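To illustrate the Databricks end of that flow, here is a hypothetical sketch: once Power Automate has landed the Excel file in a Unity Catalog volume (or another workspace path), a notebook can parse it and append the rows to a table. The volume path and table name are placeholders, and reading .xlsx files with pandas requires the openpyxl package.

# Hypothetical Databricks-side step; path and table name are placeholders.
import pandas as pd

excel_path = "/Volumes/main/raw/uploads/user_upload.xlsx"  # placeholder volume path

pdf = pd.read_excel(excel_path)      # parse the Excel sheet (needs openpyxl installed)
sdf = spark.createDataFrame(pdf)     # convert to a Spark DataFrame

# Append into a target table; it is created on the first write if it does not exist.
sdf.write.mode("append").saveAsTable("main.raw.user_uploads")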

  • 0 kudos
Viren123
by Contributor
  • 4017 Views
  • 5 replies
  • 2 kudos

User entry via portal/UI

Hello Experts, we have Databricks on Azure. We need to provide a user interface so that some customizing table entries can be entered by end users, which in turn are saved in a Delta table. Is there any feature in Databricks or tools that w...

Latest Reply
RajeevKum
New Contributor II

I achieved this using a textbox widget: the user is asked to enter values and run the dashboard, and the Python code validates and inserts the data in Databricks. As a second method, I created an Excel VBA application that uses an ODBC connection to insert, delete, and update data in a Delta tab...
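For the first approach, here is a rough, hypothetical sketch of a widget-driven entry with basic validation before writing to a Delta table; the widget names, columns, and table name are all made up for illustration.

# Hypothetical widget-based entry form in a Databricks notebook.
dbutils.widgets.text("customer_name", "")
dbutils.widgets.text("credit_limit", "")

name = dbutils.widgets.get("customer_name").strip()
limit_raw = dbutils.widgets.get("credit_limit").strip()

# Validate the input before touching the Delta table.
if not name:
    raise ValueError("customer_name must not be empty")
try:
    limit = float(limit_raw)
except ValueError:
    raise ValueError("credit_limit must be numeric")

# Append the validated entry to a placeholder Delta table.
spark.createDataFrame([(name, limit)], "customer_name string, credit_limit double") \
    .write.mode("append").saveAsTable("config.customizing_entries")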

  • 2 kudos
4 More Replies
vidya_kothavale
by Contributor
  • 623 Views
  • 1 replies
  • 0 kudos

Keywords and Functions supported in Vertica SQL but not in Databricks SQL.

I have to convert Vertica queries into Databricks SQL so that I can run them in the Databricks environment. So I want to know the list of all keywords, functions, or anything else that is different in Databricks SQL.

Latest Reply
Alberto_Umana
Databricks Employee

Hello @vidya_kothavale, Please refer to:  https://docs.databricks.com/aws/en/sql/language-manual https://docs.databricks.com/aws/en/sql/language-manual/sql-ref-reserved-words

  • 0 kudos
brianr
by New Contributor II
  • 1521 Views
  • 2 replies
  • 0 kudos

Databricks Apps - Streamlit Performance Hangups

Hi All, I have a Streamlit app running via Databricks Apps. The app is fairly simple and displays data from a small handful of lightweight database queries (each running in less than 1 second). As of a few days ago, this app was running great. But as o...

Latest Reply
Alberto_Umana
Databricks Employee

Hello @brianr, have you validated the SQL warehouse you are referring to in your code? It would be useful to identify the delayed queries and check their status from "Query History" in the Databricks UI.

  • 0 kudos
1 More Replies
rafaelcavalcant
by New Contributor II
  • 1259 Views
  • 2 replies
  • 0 kudos

Resolved! Auto Loader with CDF not ignoring reserved columns

Hi, I'm using the medallion architecture and the bronze layer (Auto Loader with output mode append) has the full history. So I decided to use the silver zone to dedup the bronze using the change data feed. But when I tried to do the upsert I got the messag...

Latest Reply
Brahmareddy
Esteemed Contributor

Hi @rafaelcavalcant, how are you doing today? As per my understanding, the error happens because _change_type is appearing twice in your schema, likely due to how you're processing the Change Data Feed (CDF). This can happen if SELECT * is pulling in ...
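To make that concrete, here is a short sketch (with placeholder table names) of reading the change feed and dropping the reserved CDF columns before doing the upsert, so they never collide in the silver schema.

# Placeholder table names; adjust the version range to what you need.
from pyspark.sql.functions import col

changes = (
    spark.read.format("delta")
        .option("readChangeFeed", "true")
        .option("startingVersion", 1)
        .table("bronze.events")
        # keep only the rows that should land in silver
        .filter(col("_change_type").isin("insert", "update_postimage"))
        # drop the reserved CDF columns so they are not written again
        .drop("_change_type", "_commit_version", "_commit_timestamp")
)

# `changes` can now be deduplicated and merged into the silver table.
changes.createOrReplaceTempView("bronze_changes")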

  • 0 kudos
1 More Replies
amitDE
by New Contributor
  • 874 Views
  • 1 replies
  • 0 kudos

Cloning to Databricks from Azure DevOps

I created an ADF pipeline with a Databricks notebook activity that uses Python code. I have a scenario where I need to use some classes from an Azure DevOps repo. As part of that, I have to clone the repo into the Databricks workspace. I have used managed identity to c...

Latest Reply
Alberto_Umana
Databricks Employee

Hi @amitDE, could you please review this documentation and make sure the setup looks fine? https://learn.microsoft.com/en-us/azure/databricks/dev-tools/ci-cd/use-ms-entra-sp-with-devops

  • 0 kudos
mattmunz
by New Contributor III
  • 32120 Views
  • 6 replies
  • 0 kudos

How can I resolve this SSL error which occurs when calling databricks-sql-connector/databricks.sql.connect() from my Python app?

Error: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:997)

> python --version
Python 3.10.4

This error seems to be coming from the Thrift backend. I suspect but have not confirmed that t...

Latest Reply
Hardy_M
New Contributor II

You can set up an SSL context that skips certificate verification with the following commands:

import ssl
ssl._create_default_https_context = ssl._create_unverified_context

I have followed some steps from this source.
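For context, a hypothetical end-to-end sketch that combines the workaround above with databricks-sql-connector; the hostname, HTTP path, and token are placeholders. Note that disabling certificate verification is insecure and is mainly useful to confirm that an intercepting proxy or self-signed certificate is the root cause.

# Placeholders throughout; disabling verification is insecure, use only for diagnosis.
import ssl
from databricks import sql

ssl._create_default_https_context = ssl._create_unverified_context

with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",  # placeholder
    http_path="/sql/1.0/warehouses/abcdef1234567890",              # placeholder
    access_token="dapiXXXXXXXXXXXXXXXX",                           # placeholder
) as conn:
    with conn.cursor() as cur:
        cur.execute("SELECT 1")
        print(cur.fetchall())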

  • 0 kudos
5 More Replies
AKon
by New Contributor
  • 428 Views
  • 1 replies
  • 0 kudos

Incremental load from a SQL table

Hi, is there a way to read a SQL table using JDBC as a streaming source? If not, what is the best approach to read incremental data from a SQL table and implement SCD Type 1 while loading into the silver and gold layers? Can this be implemented using DLT p...

Latest Reply
MariuszK
Valued Contributor III

Hi, you can use ADF to extract data incrementally into the bronze layer, and use DLT to load this data into silver with SCD1 or SCD2. You will need to store information about the last extracted ID somewhere and read it in the ADF pipeline to get only new recor...
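For the DLT side of that pattern, a minimal sketch (assuming ADF has already landed the incremental rows in a bronze table; all table and column names are placeholders) using apply_changes with SCD Type 1:

# Hypothetical DLT pipeline code; names are placeholders.
import dlt
from pyspark.sql.functions import col

@dlt.view
def bronze_customers():
    # incremental rows landed by ADF (placeholder table name)
    return spark.readStream.table("bronze.customers")

dlt.create_streaming_table("silver_customers")

dlt.apply_changes(
    target="silver_customers",
    source="bronze_customers",        # the view defined above
    keys=["customer_id"],             # business key for the upsert
    sequence_by=col("extracted_at"),  # ordering column, e.g. extraction timestamp
    stored_as_scd_type=1,             # SCD Type 1: keep only the latest row per key
)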

  • 0 kudos
Anmol_Chauhan
by New Contributor II
  • 680 Views
  • 1 replies
  • 0 kudos

Query with SHOW and DESCRIBE commands not working in Databricks Dashboard

Hi Folks, I am trying to retrieve the table properties of a Delta table in the Databricks dashboard data editor but I am getting an error message. It seems like the DESCRIBE and SHOW commands do not work with the dashboard editor, as the query works fine in the SQL editor and notebo...

Latest Reply
Advika_
Databricks Employee

Hello @Anmol_Chauhan! Dashboard Data Editor is designed for querying and working with datasets to create dashboard visualizations. However, outputs from commands like show, describe, or explain are not valid as datasets, which is why the query does n...

  • 0 kudos
shekharshukla
by New Contributor II
  • 586 Views
  • 1 replies
  • 0 kudos

Not able to access Table_tags in Databricks Apps:

When I try to fetch system.information_schema.schema_tags, it shows up, but when I try to fetch system.information_schema.table_tags it does not show up and returns an empty df. Is there anything I am missing?

assert os.getenv('DATABRICKS_WAREHOUS...

Latest Reply
Brahmareddy
Esteemed Contributor

Hi @shekharshukla, how are you doing today? As per my understanding, it looks like the system.information_schema.table_tags query is returning an empty DataFrame, which could be due to a couple of reasons. First, make sure that there are actually tags as...
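As a quick sanity check (not from the original reply; the catalog, schema, and table names are placeholders), you can set a tag on a table you own and confirm it shows up in the system view:

# Placeholder names; requires permission to alter the table and read system tables.
spark.sql("ALTER TABLE main.sales.orders SET TAGS ('domain' = 'sales')")

tags = spark.sql("""
    SELECT catalog_name, schema_name, table_name, tag_name, tag_value
    FROM system.information_schema.table_tags
    WHERE table_name = 'orders'
""")
tags.show(truncate=False)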

  • 0 kudos
Faizan_khan8171
by New Contributor
  • 790 Views
  • 1 replies
  • 0 kudos

UCX Assessment Dashboard Error: "The warehouse was not found"

Hello everyone, we recently installed UCX and were able to access the UCX Assessment Dashboard successfully. However, we’re now seeing an error stating: "The warehouse was not found." I suspect that someone may have accidentally deleted the warehouse ...

Latest Reply
Brahmareddy
Esteemed Contributor

Hi @Faizan_khan8171, how are you doing today? As per my understanding, it looks like the warehouse linked to your UCX Assessment Dashboard was deleted, which is likely causing the error. You can try checking under SQL Warehouses to see if it's still t...

  • 0 kudos
carolsun08
by New Contributor
  • 857 Views
  • 1 replies
  • 0 kudos

Repair job is useful but disabled if the job is triggered by another Run Job task

Hi, I regularly use the repair job option to rerun a subset of a larger data pipeline job. However, this option is disabled if I start this job from another "orchestration job" via Job Run tasks. The "repair" button is disabled in my case. This restr...

Latest Reply
Brahmareddy
Esteemed Contributor

Hi @carolsun08, how are you doing today? Yeah, it's a bit frustrating that the repair job option is disabled when running jobs through Job Run tasks in an orchestration job. While there's no official confirmation on when this might change, it would d...

  • 0 kudos
BobCat62
by New Contributor III
  • 1765 Views
  • 2 replies
  • 0 kudos

Resolved! Missing Delta Live Table in hive_metastore catalog

Hi experts, I defined my Delta table in an external location as follows:

%sql
CREATE OR REFRESH STREAMING TABLE pumpdata (
  Body string,
  EnqueuedTimeUtc string,
  SystemProperties string,
  _rescued_data string,
  Properties string
)
USING DELTA
LOCATION 'abfss://md...

Data Engineering
Delta Live Tables
Latest Reply
ashraf1395
Honored Contributor

Hey @BobCat62, this might help: DLT will be in direct publishing mode by default. If you select hive_metastore, you must specify the default schema in the DLT pipeline settings. If that is not done there, then at the time of defining the DLT table pass the schema_name...

  • 0 kudos
1 More Replies
MrFi
by New Contributor
  • 1172 Views
  • 1 replies
  • 0 kudos

500 Error on /ajax-api/2.0/fs/list When Accessing Unity Catalog Volume in Databricks

We are encountering an issue with volumes created inside Unity Catalog. We are using AWS and Terraform to host Databricks, and our Unity Catalog structure is as follows:
• Catalog: catalog_name
• Schemas: raw, bronze, silver, gold (all with external l...

Latest Reply
Brahmareddy
Esteemed Contributor

Hi @MrFi, how are you doing today? As per my understanding, it looks like the Unity Catalog UI might have trouble handling external volumes, even though dbutils works fine. Try running SHOW VOLUMES IN catalog_name.raw; to check if the volume is properl...

  • 0 kudos
