Hello everyone! We are currently facing an issue with a stream that has not been picking up new data since the 20th of July. We've validated that the bronze table has data that the silver table doesn't have. Also, looking at the logs, the silver stream is running but writing 0 files...
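A quick way to confirm whether the silver stream is actually reading anything is to inspect its progress metrics. A minimal sketch (the query names are whatever your streams were started with):

# List every active streaming query on the cluster and print its last
# progress report; numInputRows == 0 on every batch means the source is
# not delivering new rows to this stream.
for q in spark.streams.active:
    print(q.name, q.lastProgress)

If numInputRows stays at zero while the bronze table keeps growing, the silver stream's checkpoint or its source definition is the first place to look.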
I’ve run into an error that I can't figure out how to debug. We're trying to use Terraform through a service account. I don’t know if it’s a permissions issue on Databricks, in our account, or in AWS, but it seems that something is being blocked some...
Ok. I found the issue here. We had a *second* place where we were setting up the databricks provider, which I had not updated with the proper client credentials.
Is there a way to check if a table exists, without trying to drop it? Something like: select table_name from system_catalogs where database_name = 'mydb' and schema_name = 'myschema' and object_name = 'mytab';
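If you are on a recent runtime, PySpark exposes a direct existence check; a minimal sketch (the table name is illustrative):

# Returns True/False without touching the table itself (Spark 3.3+).
exists = spark.catalog.tableExists("mydb.myschema.mytab")

# On Unity Catalog workspaces you can also query the information schema,
# which is close to the system_catalogs query sketched above.
found = spark.sql("""
    SELECT table_name
    FROM system.information_schema.tables
    WHERE table_catalog = 'mydb'
      AND table_schema  = 'myschema'
      AND table_name    = 'mytab'
""").count() > 0

print(exists, found)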
Hi All, I'm wondering if anyone has had any luck setting up multi-valued parameters in SSRS using an ODBC connection to Databricks? I'm getting a "Cannot add multi value query parameter" error every time I change my parameter to multi-value. In the query s...
Hello, I am facing a similar kind of issue. I am working on a Power BI paginated report and Databricks is the source for the report. I was trying to pass the parameter by building the query in the expression builder as mentioned above. However, I have ended up w...
Hi! We want to run a query located in another notebook on every streaming micro-batch. We were trying dbutils.notebook.run but we always get the error: Context not valid. If you are calling this outside the main thread, you must set the Notebook context via dbutil...
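One hedged workaround is to skip the notebook call entirely and run the per-batch logic inside foreachBatch, which executes on the driver with a usable Spark session. A minimal sketch (table names and checkpoint path are illustrative):

# Run arbitrary SQL against each micro-batch instead of calling
# dbutils.notebook.run from inside the streaming callback.
def process_batch(batch_df, batch_id):
    batch_df.createOrReplaceTempView("microbatch")
    # Use the session attached to the batch DataFrame (Spark 3.3+),
    # not the outer session.
    batch_df.sparkSession.sql(
        "INSERT INTO target_table SELECT * FROM microbatch"
    )

(spark.readStream.table("source_table")
     .writeStream
     .foreachBatch(process_batch)
     .option("checkpointLocation", "/tmp/checkpoints/demo")
     .start())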
Query parameters means that you have to pass all parameters as part of the URL after the question mark, not in the body: "/api/1.2/commands/status?clusterId=$cid&contextId=$ec_id&commandId=$command_id"
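For completeness, a minimal sketch of the same call from Python with the requests library (the host, token, and IDs are placeholders standing in for the values from the snippet above):

import requests

token = "<personal-access-token>"
cid, ec_id, command_id = "<cluster-id>", "<context-id>", "<command-id>"

resp = requests.get(
    "https://<workspace-host>/api/1.2/commands/status",
    headers={"Authorization": f"Bearer {token}"},
    # Sent as URL query parameters, not as a request body.
    params={"clusterId": cid, "contextId": ec_id, "commandId": command_id},
)
print(resp.json())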
Our job is written in Scala on Databricks. It used to have the same problem, but we managed to work around it by putting all case classes in a separate cell. However, lately it has started to fail again with the same error: Could not initialize class $linec4a1...
Hi team, I am using cluster 9.1 on Databricks and am not able to generate an Excel file in blob storage. Below is the configuration:
Cluster: 9.1
Spark version: 3.1.1
Scala version: 2.12
Library: com.crealytics:spark-excel_2.12
Version: 3.1.1_0.18.2
Dependency: org.apache.poi:poi-5.2...
Hi @Databricks143, It appears that you’re encountering an issue while trying to generate an Excel file in Azure Databricks.
Let’s troubleshoot this step by step:
Library Dependencies:
Ensure that the necessary libraries are correctly installed i...
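For reference, a minimal sketch of a write with the com.crealytics spark-excel connector, assuming the spark-excel_2.12 library from the post is attached to the cluster (the DataFrame and output path are illustrative):

# Write a DataFrame to a single .xlsx file in mounted blob storage.
(df.write
   .format("com.crealytics.spark.excel")
   .option("header", "true")
   .mode("overwrite")
   .save("/mnt/blob/output/report.xlsx"))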
Does anyone have any recent examples of using Tableau and Delta Sharing? The video below mentions using the web data connector, but this connector has been deprecated in Tableau 2023.1. https://www.youtube.com/watch?v=Yg-5LXH9K1I&t=913s https://help.tableau.co...
Hi, I am still trying to figure out how to use Delta Sharing with Tableau. I've been looking for information for a month without any success. As mentioned before, the web data connector is deprecated. Any help would be appreciated. Thanks, Johnattan
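While the web data connector remains deprecated, one hedged alternative is to pull the shared table with the open delta-sharing Python client and hand Tableau a flat extract; a minimal sketch (the profile path and share coordinates are illustrative):

import delta_sharing

# The .share profile file is issued by the data provider.
profile = "/dbfs/FileStore/config.share"
table_url = profile + "#share_name.schema_name.table_name"

# Load the shared table into pandas, then export a CSV Tableau can open.
pdf = delta_sharing.load_as_pandas(table_url)
pdf.to_csv("/dbfs/FileStore/extract_for_tableau.csv", index=False)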
Databricks Runtime 14.2 now has row-level concurrency generally available and enabled by default for Delta tables with deletion vectors. This feature dramatically reduces conflicts between concurrent write operations.
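Since the feature applies to Delta tables with deletion vectors, a minimal sketch of enabling them on an existing table (the table name is illustrative):

# Deletion vectors are the prerequisite for row-level concurrency.
spark.sql("""
    ALTER TABLE my_catalog.my_schema.my_table
    SET TBLPROPERTIES ('delta.enableDeletionVectors' = 'true')
""")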
We currently have several workflows that are basically copies, the only difference being that they run with different service principals and so have different permissions and configuration based on who is running them. The way this is managed today is...
I have a Unity Catalog-enabled workspace where I am trying to call setCheckpointDir at runtime. The method appears to authenticate using fs.azure.account.key instead of storage credentials. I am using a Databricks access connector which has "Storage Blob ...
@Kaniz I have provided all the necessary permissions and was able to browse through the folders of the container added as an external location. I don't understand why the method setCheckpointDir looks for an account key when access is already provid...
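For reference, a minimal sketch of the failing call (the path is illustrative). setCheckpointDir writes through the cluster's Hadoop filesystem configuration, which is consistent with it asking for fs.azure.account.key rather than the Unity Catalog storage credential:

# RDD/DataFrame checkpointing goes via the Hadoop FS layer, not via
# Unity Catalog external-location credentials.
spark.sparkContext.setCheckpointDir(
    "abfss://container@account.dfs.core.windows.net/checkpoints"
)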
While trying to ingest data from an S3 bucket, we are running into a situation where the data is in sub-folders of multiple depths. Is there a good way of specifying patterns for this case? We tried using the following for a depth o...
Hi @Anup, When dealing with data in S3 buckets that are organized into sub-folders of varying depths, specifying patterns can be challenging.
However, there are some approaches you can consider:
Wildcard Patterns:
You’ve already used a wildcard p...
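Two hedged sketches for multi-depth folders (bucket and prefix names are illustrative): either enumerate one glob per known depth, or let Spark walk the tree:

# One glob matches exactly one directory depth, so list each depth you need.
paths = [
    "s3://my-bucket/root/*/*.json",      # depth 1
    "s3://my-bucket/root/*/*/*.json",    # depth 2
    "s3://my-bucket/root/*/*/*/*.json",  # depth 3
]
df = spark.read.json(paths)

# Or recurse to any depth in one pass; note this disables partition
# discovery from the folder names.
df = (spark.read
      .option("recursiveFileLookup", "true")
      .json("s3://my-bucket/root/"))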
Hi all, I tried to export several Excel files from Databricks, but there is always one extra underscore after ".xlsm" and ".xlsx" when I export them and try to open the files on my local system. I have to manually remove the underscore from the fil...
Hi, did you find a solution to this? I have the same/similar problem: when I save a dataframe from a Databricks notebook using to_excel(), it saves the file with the extension ".xlsx_" rather than ".xlsx", meaning to open it I have to manually download and ...
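One hedged workaround, assuming the underscore comes from the distributed-filesystem layer suffixing the file as it lands: write the workbook to the driver's local disk first, then copy it out in one shot (paths are illustrative; to_excel needs openpyxl installed on the cluster):

# Build the workbook on local disk, where no temp-file suffix is added.
pdf = df.toPandas()
local_path = "/tmp/report.xlsx"
pdf.to_excel(local_path, index=False)

# Copy the finished file to DBFS (or a mounted blob path).
dbutils.fs.cp(f"file:{local_path}", "dbfs:/FileStore/exports/report.xlsx")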