The Unity Catalog migration guide (https://docs.databricks.com/en/data-governance/unity-catalog/migrate.html#before-you-begin) states the following: Unity Catalog manages partitions differently than Hive. Hive commands that directly manipulate partitions are ...
Hi, my scenario is that an export of a table is dropped into ADLS every day. I would like to load this data into a UC table and then repeat the process every day, replacing the data. This seems to rule out DLT, as it is meant for incremental proc...
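For a full daily replace, a plain CREATE OR REPLACE TABLE over the exported files may be enough, without DLT. A minimal sketch, assuming a CSV export and placeholder catalog, schema, and path names (adjust all of these to your environment; read_files requires a recent runtime):

```sql
-- Hypothetical names and path; replace with your own.
CREATE OR REPLACE TABLE my_catalog.my_schema.daily_export AS
SELECT *
FROM read_files(
  'abfss://mycontainer@mystorageaccount.dfs.core.windows.net/exports/',
  format => 'csv',
  header => true
);
```

Scheduling this statement in a daily job replaces the table contents atomically on each run.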
I am not able to run our unit test suite due to a possible bug in the databricks-connect library. The problem is with the DataFrame transformation withColumnRenamed. When I run it on a Databricks cluster (Databricks Runtime 14.3 LTS), the column is ren...
The following doc suggests the ability to add column comments during MV creation via the `column list` parameter. Thus, the SQL code below is expected to generate a table where the columns `col_1` and `col_2` are commented; however, this is not the ca...
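For reference, the column-list syntax the doc describes looks roughly like this (view and table names are placeholders); this is the pattern the post reports as not producing the comments:

```sql
CREATE MATERIALIZED VIEW mv_example (
  col_1 COMMENT 'comment for col_1',
  col_2 COMMENT 'comment for col_2'
)
AS SELECT col_1, col_2 FROM source_table;
```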
Our jobs have been running fine so far without any issues on a specific workspace. These jobs read data from files on Azure ADLS storage containers and don't use the Hive metastore data at all. Now we attached the Unity metastore to this workspace, created...
Cool, I am happy it worked. It would be extremely hard to find out without looking into your env, but to be honest I struggled with those identities, principals and so on, so it's good that the solution will be on this forum. I don't know how you managed to g...
I am trying to connect to SQL through JDBC from a Databricks notebook. Below is my notebook command:

val df = spark.read.jdbc(jdbcUrl, "[MyTableName]", connectionProperties)
println(df.schema)

When I execute this command with DBR 10.4 LTS it works fin...
Try adding the following parameters to your SQL connection string. It fixed my problem on 13.X and 12.X:

;trustServerCertificate=true;hostNameInCertificate=*.database.windows.net;
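A minimal sketch of appending those parameters to an Azure SQL JDBC URL before passing it to spark.read.jdbc (the server and database names below are placeholders):

```python
# Base Azure SQL JDBC URL; server/database are hypothetical placeholders.
base_url = (
    "jdbc:sqlserver://myserver.database.windows.net:1433;"
    "database=mydb;encrypt=true"
)

# Parameters suggested above to fix the certificate validation error.
extra = ";trustServerCertificate=true;hostNameInCertificate=*.database.windows.net;"

jdbc_url = base_url + extra
print(jdbc_url)
```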
We are facing an issue integrating our Spring Boot JPA application with Databricks. Below are the steps and settings we used for the integration. When we start the Spring Boot application, we get a warning: HikariPool-1 - Driver doe...
I created a workflow with two tasks. It runs the first notebook and then waits for it to finish before starting the second notebook. I want to use the dynamic value {{job.start_time.iso_datetime}} as one of the parameters for both tasks. This should gi...
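Since {{job.start_time.iso_datetime}} is resolved at the job level, both tasks should receive the same value for a given run. A minimal sketch of consuming it inside a task, with a sample string standing in for the injected parameter (in a real notebook you would read it, e.g. via dbutils.widgets.get):

```python
from datetime import datetime

# Placeholder for the value the job would inject as a task parameter;
# in a notebook: start_time_param = dbutils.widgets.get("start_time")
start_time_param = "2024-05-01T09:30:00"

# Parse the ISO-8601 string so both tasks derive identical values.
start_time = datetime.fromisoformat(start_time_param)
run_date = start_time.date().isoformat()
print(run_date)
```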
Hi everyone, I am brand new to Databricks and am setting up my first Semantic Model with RLS, and have run into an unexpected problem. When I was testing my model with filters applied (which RLS would handle later on), it runs extremely fast. I look...
Are you trying to use Power BI RLS rules on top of DirectQuery?
Can you give an example of the rules you're trying to apply? Are they static roles, or dynamic roles based on the user's UPN/email being in the dataset?
Hi, I'm trying to set a dynamic value to use in a DLT query, and the code from the example documentation does not work:

SET startDate='2020-01-01';
CREATE OR REFRESH LIVE TABLE filtered
AS SELECT * FROM my_table
WHERE created_at > ${startDate};

It is g...
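One commonly suggested alternative, sketched here under the assumption that DLT substitutes ${...} references from the pipeline's configuration settings, is to define the value as a pipeline configuration key instead of a SET statement (the key name below is hypothetical):

```sql
-- Assumes the pipeline settings contain: "my_pipeline.startDate": "2020-01-01"
CREATE OR REFRESH LIVE TABLE filtered
AS SELECT * FROM my_table
WHERE created_at > '${my_pipeline.startDate}';
```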
Hi all, I have many API calls to run in a Python Databricks notebook, which I then run regularly in a Databricks Workflow job. When I test the following code on an all-purpose cluster, i.e. not via a job, it runs perfectly fine. However, when I ...
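For context, a self-contained sketch of the kind of fan-out loop described, with the real HTTP request replaced by a stub (call_api is a placeholder for the actual API call, e.g. a requests.get):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def call_api(item):
    # Stand-in for the real HTTP request; replace with the actual API call.
    return {"item": item, "status": "ok"}

items = list(range(10))
results = []

# Bounded thread pool so many I/O-bound calls run concurrently
# without overwhelming the driver.
with ThreadPoolExecutor(max_workers=4) as pool:
    futures = {pool.submit(call_api, i): i for i in items}
    for fut in as_completed(futures):
        results.append(fut.result())

print(len(results))
```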
Hi, I am a bit stumped atm because I cannot figure out how to get a DLT table definition picked up in a Python notebook.
1. I created a new notebook in Python
2. Added the following code:

%python
import dlt
from pyspark.sql.functions import *
@dlt.table(...
Ok, it seems that the default language of the notebook and the language of a particular cell can clash. If the default is set to Python, switching a cell to SQL won't work in DLT and vice versa. This is super unintuitive tbh.
Library installation failed for library due to user error for jar: "dbfs:////<<PATH>>/jackson-annotations-2.16.1.jar"
Error messages:
Library installation attempted on the driver node of cluster <<clusterId>> and failed. Please refer to the foll...
Hi, we are using Azure SQL as an external metastore. We are trying to access this external metastore from Databricks warehouse clusters but are getting the error: `data` property must be defined in SQL query response. However, we are able to connect to the same usi...
Hi, Databricks newbie here. I have copied Delta files from my Synapse workspace into DBFS. To add them as a table, I executed:

create table audit_payload using delta location '/dbfs/FileStore/data/general/audit_payload'

The command executed properly. Ho...
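One common pitfall in this situation is the path scheme: '/dbfs/...' is the local filesystem mount used by driver-side file APIs, while Spark SQL expects the 'dbfs:/' URI scheme. Assuming the Delta files really are at that DBFS location, a sketch of the corrected statement:

```sql
-- Note 'dbfs:/...' rather than '/dbfs/...'
CREATE TABLE audit_payload
USING DELTA
LOCATION 'dbfs:/FileStore/data/general/audit_payload';
```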