Data Engineering

Forum Posts

leungi
by New Contributor
  • 0 Views
  • 0 replies
  • 0 kudos

Unable to add column comment in Materialized View (MV)

The following doc suggests the ability to add column comments during MV creation via the `column list` parameter. Thus, the SQL code below is expected to generate a table where the columns `col_1` and `col_2` are commented; however, this is not the ca...
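For reference, a minimal sketch of the documented pattern the poster is describing: column comments declared in the column list of a materialized view. Table and column names are hypothetical; in a notebook this string would be passed to `spark.sql()`.

```python
# Hypothetical sketch of column comments in a CREATE MATERIALIZED VIEW
# column list; in a Databricks notebook you would run spark.sql(create_mv_sql).
create_mv_sql = """
CREATE MATERIALIZED VIEW example_mv (
  col_1 COMMENT 'first column comment',
  col_2 COMMENT 'second column comment'
)
AS SELECT col_1, col_2 FROM source_table
"""
print(create_mv_sql)
```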

kDev
by New Contributor
  • 5489 Views
  • 7 replies
  • 2 kudos

UnauthorizedAccessException: PERMISSION_DENIED: User does not have READ FILES on External Location

Our jobs have been running fine so far without any issues on a specific workspace. These jobs read data from files on Azure ADLS storage containers and don't use the Hive metastore data at all. Now we attached the Unity metastore to this workspace, created...
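This PERMISSION_DENIED error usually means the identity running the job lacks the READ FILES privilege on the Unity Catalog external location backing the ADLS path. A hedged sketch of the grant, with invented location and principal names; in a notebook the string would go through `spark.sql()`:

```python
# Hypothetical names: replace with your actual external location and the
# service principal (or user/group) that runs the job.
external_location = "my_adls_location"
principal = "job-service-principal"

# GRANT READ FILES on the external location; run via spark.sql(grant_sql).
grant_sql = (
    f"GRANT READ FILES ON EXTERNAL LOCATION `{external_location}` "
    f"TO `{principal}`"
)
print(grant_sql)
```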

Latest Reply
Wojciech_BUK
Contributor III
  • 2 kudos

Cool, I am happy it worked. It would be extremely hard to find out without looking into your env, but to be honest I struggled with those identities, principals, and so on, so it's good that the solution will be on this forum. I don't know how you managed to g...

6 More Replies
Hardy
by New Contributor III
  • 2980 Views
  • 5 replies
  • 4 kudos

The driver could not establish a secure connection to SQL Server by using Secure Sockets Layer (SSL) encryption

I am trying to connect to SQL Server through JDBC from a Databricks notebook. (Below is my notebook command.) val df = spark.read.jdbc(jdbcUrl, "[MyTableName]", connectionProperties); println(df.schema) When I execute this command with DBR 10.4 LTS it works fin...

Latest Reply
DBXC
Contributor
  • 4 kudos

Try adding the following parameters to your SQL connection string. It fixed my problem for 13.X and 12.X: ;trustServerCertificate=true;hostNameInCertificate=*.database.windows.net;
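A sketch of what the reply's fix looks like when appended to a SQL Server JDBC URL. The server and database names are hypothetical placeholders; only the two trailing parameters come from the reply.

```python
# Hypothetical base URL; only the two appended parameters are from the reply.
base_url = (
    "jdbc:sqlserver://myserver.database.windows.net:1433;"
    "database=mydb;encrypt=true"
)
jdbc_url = (
    base_url
    + ";trustServerCertificate=true"
    + ";hostNameInCertificate=*.database.windows.net"
)
print(jdbc_url)
```

Note that `trustServerCertificate=true` skips certificate validation, so it trades security for convenience; it is a workaround, not a hardened configuration.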

4 More Replies
satishnavik
by New Contributor II
  • 1908 Views
  • 5 replies
  • 0 kudos

How to connect Databricks Database with Springboot application using JPA

Facing an issue with integrating our Spring Boot JPA application with Databricks. Below are the steps and settings we did for the integration. When we start the Spring Boot application we get a warning: HikariPool-1 - Driver doe...

Latest Reply
172036
New Contributor II
  • 0 kudos

Was there any resolution to this?  Is Spring datasource supported now?

4 More Replies
mh_db
by New Contributor II
  • 31 Views
  • 0 replies
  • 0 kudos

How to get different dynamic value for each task in workflow

I created a workflow with two tasks. It runs the first notebook and then waits for it to finish before starting the second notebook. I want to use this dynamic value as one of the parameters {{job.start_time.iso_datetime}} for both tasks. This should gi...
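Since `{{job.start_time.iso_datetime}}` is resolved once per job run, both tasks should receive the same value. A sketch of consuming it in a notebook; the parameter name and the literal value standing in for the resolved reference are hypothetical.

```python
from datetime import datetime

# In the job configuration you would set a task parameter such as
#   start_time = {{job.start_time.iso_datetime}}
# and read it in the notebook with dbutils.widgets.get("start_time").
# Here a hypothetical literal stands in for the resolved value.
start_time_param = "2024-05-10T09:30:00"

start_time = datetime.fromisoformat(start_time_param)
print(start_time.date(), start_time.hour)
```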

Clampazzo
by New Contributor II
  • 71 Views
  • 3 replies
  • 0 kudos

Power BI RLS running extremely slowly with databricks

Hi Everyone, I am brand new to Databricks and am setting up my first Semantic Model with RLS and have run into an unexpected problem. When I was testing my model with filters applied (covering what the RLS would handle later on) it runs extremely fast. I look...

Data Engineering
Power BI
sql
Latest Reply
KTheJoker
Contributor II
  • 0 kudos

Are you trying to use Power BI RLS rules on top of DirectQuery? Can you give an example of the rules you're trying to apply? Are they static roles, or dynamic roles based on the user's UPN/email being in the dataset?

2 More Replies
MarkD
by New Contributor
  • 38 Views
  • 0 replies
  • 0 kudos

SET configuration in SQL DLT pipeline does not work

Hi, I'm trying to set a dynamic value to use in a DLT query, and the code from the example documentation does not work: SET startDate='2020-01-01'; CREATE OR REFRESH LIVE TABLE filtered AS SELECT * FROM my_table WHERE created_at > ${startDate}; It is g...

Data Engineering
Delta Live Tables
dlt
sql
pjv
by Visitor
  • 34 Views
  • 0 replies
  • 0 kudos

Asynchronous API calls from Databricks Workflow job

Hi all, I have many API calls to run in a Python Databricks notebook, which I then run regularly as a Databricks Workflow job. When I test the following code on an all-purpose cluster interactively, i.e. not via a job, it runs perfectly fine. However, when I ...
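The post's code is truncated, but a minimal sketch of fanning out many API calls with asyncio looks like the following. The `fetch()` body is a stand-in (`asyncio.sleep`) for a real HTTP request; everything here is illustrative, not the poster's actual code.

```python
import asyncio

async def fetch(i: int) -> int:
    # Placeholder for a real API call (e.g. an aiohttp request).
    await asyncio.sleep(0)
    return i * 2

async def main() -> list:
    # gather() schedules all coroutines concurrently on one event loop.
    return await asyncio.gather(*(fetch(i) for i in range(5)))

results = asyncio.run(main())
print(results)
```

One common job-vs-interactive difference: if an event loop is already running in the notebook environment, `asyncio.run()` raises a RuntimeError and you have to `await main()` directly instead.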

Mathias_Peters
by New Contributor III
  • 59 Views
  • 2 replies
  • 0 kudos

Resolved! DLT table not picked in python notebook

Hi, I am a bit stumped at the moment because I cannot figure out how to get a DLT table definition picked up in a Python notebook. 1. I created a new notebook in Python. 2. Added the following code: %python import dlt from pyspark.sql.functions import * @dlt.table(...
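The truncated snippet follows the standard `@dlt.table` decorator pattern. Since the `dlt` module is only importable inside a running DLT pipeline, the sketch below falls back to a minimal stub so the decorator shape can be exercised locally; the table name, comment, and return value are all hypothetical.

```python
# Sketch of the @dlt.table pattern; dlt only exists inside a DLT pipeline,
# so a stub stands in for it here purely to illustrate the decorator shape.
try:
    import dlt  # available inside a Delta Live Tables pipeline
except ImportError:
    class _StubDLT:
        @staticmethod
        def table(comment=None, name=None):
            def wrap(fn):
                return fn
            return wrap
    dlt = _StubDLT()

@dlt.table(comment="hypothetical example table")
def my_dlt_table():
    # In a real pipeline this would return a Spark DataFrame,
    # e.g. spark.read.table("source").
    return "dataframe-placeholder"

print(my_dlt_table())
```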

Latest Reply
Mathias_Peters
New Contributor III
  • 0 kudos

OK, it seems that the default language of the notebook and the language of a particular cell can clash. If the default is set to Python, switching a cell to SQL won't work in DLT, and vice versa. This is super unintuitive, to be honest.

1 More Replies
vvt1976
by Visitor
  • 35 Views
  • 0 replies
  • 0 kudos

Create table using a location

Hi, Databricks newbie here. I have copied Delta files from my Synapse workspace into DBFS. To add them as a table, I executed: create table audit_payload using delta location '/dbfs/FileStore/data/general/audit_payload' The command executed properly. Ho...
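One frequent gotcha with `LOCATION` is passing the `/dbfs/...` FUSE mount path where Spark expects the `dbfs:/...` URI. A small hedged sketch normalizing the path from the post before building the DDL; whether this is the poster's actual problem can't be confirmed from the truncated text.

```python
# Helper to turn a /dbfs/ FUSE path into the dbfs:/ URI that Spark SQL
# expects in a LOCATION clause; the path itself comes from the post.
def to_dbfs_uri(path: str) -> str:
    if path.startswith("/dbfs/"):
        return "dbfs:/" + path[len("/dbfs/"):]
    return path

location = to_dbfs_uri("/dbfs/FileStore/data/general/audit_payload")
ddl = f"CREATE TABLE audit_payload USING DELTA LOCATION '{location}'"
print(ddl)
```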

Data Engineering
data engineering
JohanS
by New Contributor III
  • 187 Views
  • 2 replies
  • 0 kudos

WorkspaceClient authentication fails when running on a Docker cluster

from databricks.sdk import WorkspaceClient; w = WorkspaceClient() raises ValueError: default auth: cannot configure default credentials ... I'm trying to instantiate a WorkspaceClient in a notebook on a cluster running a Docker image, but authentication fails. T...

Latest Reply
Srihasa_Akepati
New Contributor III
  • 0 kudos

@JohanS As discussed, default auth from a notebook using the SDK on DCS is yet to be tested by Engineering. Please use PAT auth for now. I will keep you posted on the progress of default auth on DCS.
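A sketch of the explicit PAT-auth workaround the reply suggests: pass the host and a personal access token directly instead of relying on default credential resolution. The env-var names follow the SDK's usual conventions; the fallback values are placeholders.

```python
import os

# Explicit PAT auth instead of default auth. The fallback values are
# hypothetical placeholders; with databricks-sdk installed you would then do:
#   from databricks.sdk import WorkspaceClient
#   w = WorkspaceClient(**auth_kwargs)
auth_kwargs = {
    "host": os.environ.get("DATABRICKS_HOST", "https://example.cloud.databricks.com"),
    "token": os.environ.get("DATABRICKS_TOKEN", "dapi-example-token"),
}
print(sorted(auth_kwargs))
```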

1 More Replies
Michael_Appiah
by New Contributor III
  • 2260 Views
  • 5 replies
  • 2 kudos

Resolved! Parameterized spark.sql() not working

Spark 3.4 introduced parameterized SQL queries, and Databricks also discussed this new functionality in a recent blog post (https://www.databricks.com/blog/parameterized-queries-pyspark). Problem: I cannot run any of the examples provided in the PySpark...
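The call shape from the blog post is `spark.sql(query, args={...})` with named `:marker` parameters. Spark isn't available here, so `substitute()` below is a simplified stand-in that only illustrates how markers bind to the `args` dict; real parameterized queries bind values on the engine side rather than via string replacement.

```python
# Named-marker query in the style of Spark 3.4 parameterized SQL.
query = "SELECT * FROM my_table WHERE category = :cat AND price > :min_price"
args = {"cat": "books", "min_price": 10}

def substitute(q: str, a: dict) -> str:
    # Illustrative stand-in only: shows marker-to-value binding, not the
    # safe engine-side binding that spark.sql(query, args=...) performs.
    for name, value in a.items():
        q = q.replace(f":{name}", repr(value))
    return q

print(substitute(query, args))
# On a runtime that supports it, you would run: spark.sql(query, args=args)
```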

Latest Reply
Cas
New Contributor III
  • 2 kudos

Thanks for the clarification @Michael_Appiah, very helpful! Is there already a timeline for when this will be supported in DBR 14.x? The alternatives are not SQL-injection-proof enough for us.

4 More Replies
SamGreene
by Contributor
  • 54 Views
  • 1 replies
  • 0 kudos

Using parameters in a SQL Notebook and COPY INTO statement

Hi, my scenario is that I have an export of a table being dropped in ADLS every day. I would like to load this data into a UC table and then repeat the process every day, replacing the data. This seems to rule out DLT, as it is meant for incremental proc...

Latest Reply
daniel_sahal
Esteemed Contributor
  • 0 kudos

@SamGreene Simply write your SQL queries as Python variables and then run them through spark.sql(qry).
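A sketch of that suggestion applied to the poster's daily ADLS load: build the COPY INTO statement from Python variables and hand it to `spark.sql()`. The storage path, table name, and file format are hypothetical placeholders.

```python
# Hypothetical names for the poster's scenario; substitute your own
# ADLS path, UC table, and file format.
source_path = "abfss://exports@myaccount.dfs.core.windows.net/daily/"
target_table = "my_catalog.my_schema.daily_export"

qry = f"""
COPY INTO {target_table}
FROM '{source_path}'
FILEFORMAT = PARQUET
COPY_OPTIONS ('mergeSchema' = 'true')
"""
print(qry)
# In a notebook: spark.sql(qry)
```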
