Data Engineering

Forum Posts

m208205
by New Contributor
  • 15 Views
  • 0 replies
  • 0 kudos

Difference in support for partitions between Hive and Unity Catalog

The Unity migration guide (https://docs.databricks.com/en/data-governance/unity-catalog/migrate.html#before-you-begin) states the following: Unity Catalog manages partitions differently than Hive. Hive commands that directly manipulate partitions are ...

Data Engineering
Unity Catalog
SamGreene
by Contributor
  • 62 Views
  • 2 replies
  • 0 kudos

Using parameters in a SQL Notebook and COPY INTO statement

Hi, My scenario is I have an export of a table being dropped in ADLS every day.  I would like to load this data into a UC table and then repeat the process every day, replacing the data.  This seems to rule out DLT as it is meant for incremental proc...

Latest Reply
daniel_sahal
Esteemed Contributor
  • 0 kudos

@SamGreene Simply write your SQL queries as Python variables and then run them through spark.sql(qry).
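
A rough sketch of that pattern for this scenario, in Python (the widget name, ADLS path, target table, and file format below are placeholders, not details from the thread):

# Build the COPY INTO statement as a Python string and execute it with spark.sql().
dbutils.widgets.text("load_date", "2024-05-10")  # hypothetical run parameter
load_date = dbutils.widgets.get("load_date")

source_path = f"abfss://exports@mystorageaccount.dfs.core.windows.net/table_export/{load_date}/"
target_table = "my_catalog.my_schema.daily_export"

# Replace the previous day's data before loading the new full export.
spark.sql(f"TRUNCATE TABLE {target_table}")

qry = f"""
COPY INTO {target_table}
FROM '{source_path}'
FILEFORMAT = PARQUET
COPY_OPTIONS ('mergeSchema' = 'true')
"""
spark.sql(qry)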

1 More Replies
dbal
by New Contributor III
  • 22 Views
  • 0 replies
  • 0 kudos

withColumnRenamed does not work with databricks-connect 14.3.0

I am not able to run our unit test suite due to a possible bug in the databricks-connect library. The problem is with the DataFrame transformation withColumnRenamed. When I run it in a Databricks cluster (Databricks Runtime 14.3 LTS), the column is ren...
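
For context, a minimal, self-contained version of the transformation in question (not the poster's exact unit test) looks like this; the post reports that the rename is visible on a Databricks Runtime 14.3 LTS cluster but not through databricks-connect 14.3.0:

# Minimal repro sketch using the databricks-connect 14.x session builder.
from databricks.connect import DatabricksSession

spark = DatabricksSession.builder.getOrCreate()

df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])
renamed = df.withColumnRenamed("value", "label")

# Expected output: ['id', 'label']
print(renamed.columns)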

leungi
by New Contributor
  • 26 Views
  • 0 replies
  • 0 kudos

Unable to add column comment in Materialized View (MV)

The following doc suggests the ability to add column comments during MV creation via the `column list` parameter. Thus, the SQL code below is expected to generate a table where the columns `col_1` and `col_2` are commented; however, this is not the ca...
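
For reference, the kind of statement being described looks roughly like the sketch below (all names are placeholders, and the post reports that the comments do not actually show up; creating a materialized view may also require a SQL warehouse or a recent Unity Catalog-enabled runtime):

# Hypothetical MV definition with a column list carrying COMMENT clauses.
ddl = """
CREATE MATERIALIZED VIEW my_catalog.my_schema.my_mv (
  col_1 COMMENT 'first column',
  col_2 COMMENT 'second column'
)
AS SELECT col_1, col_2 FROM my_catalog.my_schema.source_table
"""
spark.sql(ddl)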

kDev
by New Contributor
  • 5507 Views
  • 7 replies
  • 2 kudos

UnauthorizedAccessException: PERMISSION_DENIED: User does not have READ FILES on External Location

Our jobs have been running fine so far without any issues on a specific workspace. These jobs read data from files on Azure ADLS storage containers and don't use the Hive metastore data at all. Now we attached the Unity metastore to this workspace, created...

  • 5507 Views
  • 7 replies
  • 2 kudos
Latest Reply
Wojciech_BUK
Contributor III
  • 2 kudos

Cool, I am happy it worked. It would be extremely hard to figure this out without looking into your environment, but to be honest I struggled with those identities, principals and so on myself, so it's good that the solution will be on this forum. I don't know how you managed to g...
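
The actual fix isn't quoted in the reply above, but the privilege named in the error message is usually granted roughly like this (the external location and principal names are placeholders):

# Grant the missing privilege on the Unity Catalog external location to the
# identity that the job runs as (user, group, or service principal).
spark.sql("""
  GRANT READ FILES
  ON EXTERNAL LOCATION `adls_raw_container`
  TO `my-job-service-principal`
""")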

6 More Replies
Hardy
by New Contributor III
  • 2981 Views
  • 5 replies
  • 4 kudos

The driver could not establish a secure connection to SQL Server by using Secure Sockets Layer (SSL) encryption

I am trying to connect to SQL Server through JDBC from a Databricks notebook. (Below is my notebook command.) val df = spark.read.jdbc(jdbcUrl, "[MyTableName]", connectionProperties) println(df.schema) When I execute this command with DBR 10.4 LTS, it works fin...

Latest Reply
DBXC
Contributor
  • 4 kudos

Try to add the following parameters to your SQL connection string; it fixed my problem on 13.X and 12.X: ;trustServerCertificate=true;hostNameInCertificate=*.database.windows.net;
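
A sketch of that suggestion in Python (the original post used Scala; the server, database, table, and secret names here are placeholders):

# JDBC URL with the two extra parameters appended.
jdbc_url = (
    "jdbc:sqlserver://myserver.database.windows.net:1433;"
    "database=mydb;"
    "encrypt=true;"
    "trustServerCertificate=true;"
    "hostNameInCertificate=*.database.windows.net;"
)

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "[MyTableName]")
    .option("user", dbutils.secrets.get("my_scope", "sql_user"))
    .option("password", dbutils.secrets.get("my_scope", "sql_password"))
    .load()
)
print(df.schema)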

4 More Replies
satishnavik
by New Contributor II
  • 1922 Views
  • 5 replies
  • 0 kudos

How to connect a Databricks database with a Spring Boot application using JPA

We are facing an issue integrating our Spring Boot JPA application with Databricks. Below are the steps and settings we used for the integration. When we start the Spring Boot application, we get a warning: HikariPool-1 - Driver doe...

Latest Reply
172036
New Contributor II
  • 0 kudos

Was there any resolution to this?  Is Spring datasource supported now?

4 More Replies
mh_db
by New Contributor II
  • 56 Views
  • 0 replies
  • 0 kudos

How to get a different dynamic value for each task in a workflow

I created a workflow with two tasks. It runs the first notebook and then waits for it to finish before starting the second notebook. I want to use the dynamic value {{job.start_time.iso_datetime}} as one of the parameters for both tasks. This should gi...
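
For context, the usual way to consume a dynamic value reference in a notebook task is to map it to a task parameter in the job configuration (for example run_started_at = {{job.start_time.iso_datetime}}; the parameter name here is a placeholder) and then read it inside the notebook:

# Read the task parameter that the job populated from the dynamic value reference.
dbutils.widgets.text("run_started_at", "")
run_started_at = dbutils.widgets.get("run_started_at")
print(f"Job start time passed to this task: {run_started_at}")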

Clampazzo
by New Contributor II
  • 80 Views
  • 3 replies
  • 0 kudos

Power BI RLS running extremely slowly with Databricks

Hi everyone, I am brand new to Databricks and am setting up my first semantic model with RLS, and I have run into an unexpected problem. When I was testing my model with filters applied (covering what the RLS would handle later on), it ran extremely fast. I look...

Data Engineering
Power BI
sql
Latest Reply
KTheJoker
Contributor II
  • 0 kudos

Are you trying to use Power BI RLS rules on top of DirectQuery? Can you give an example of the rules you're trying to apply? Are they static roles, or dynamic roles based on the user's UPN/email being in the dataset?

2 More Replies
MarkD
by New Contributor
  • 45 Views
  • 0 replies
  • 0 kudos

SET configuration in SQL DLT pipeline does not work

Hi, I'm trying to set a dynamic value to use in a DLT query, and the code from the example documentation does not work: SET startDate='2020-01-01'; CREATE OR REFRESH LIVE TABLE filtered AS SELECT * FROM my_table WHERE created_at > ${startDate}; It is g...

Data Engineering
Delta Live Tables
dlt
sql
pjv
by Visitor
  • 36 Views
  • 0 replies
  • 0 kudos

Asynchronous API calls from Databricks Workflow job

Hi all, I have many API calls to run in a Python Databricks notebook, which I then run regularly as a Databricks Workflow job. When I test the following code on an all-purpose cluster directly, i.e. not via a job, it runs perfectly fine. However, when I ...
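
The poster's code isn't shown in full, but a generic pattern for this kind of asynchronous fan-out from a notebook looks like the sketch below (the URL list is a placeholder, and aiohttp may need to be pip-installed on the job cluster):

# Fire many API calls concurrently with asyncio + aiohttp and collect the results.
import asyncio
import aiohttp

URLS = [f"https://api.example.com/items/{i}" for i in range(100)]  # hypothetical endpoint

async def fetch(session, url):
    async with session.get(url) as resp:
        return await resp.json()

async def main():
    async with aiohttp.ClientSession() as session:
        return await asyncio.gather(*(fetch(session, url) for url in URLS))

results = asyncio.run(main())
print(len(results))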

Mathias_Peters
by New Contributor III
  • 65 Views
  • 2 replies
  • 0 kudos

Resolved! DLT table not picked up in Python notebook

Hi, I am a bit stumped at the moment because I cannot figure out how to get a DLT table definition picked up in a Python notebook. 1. I created a new notebook in Python. 2. Added the following code: %python import dlt from pyspark.sql.functions import * @dlt.table(...

Latest Reply
Mathias_Peters
New Contributor III
  • 0 kudos

Ok, it seems that the default language of the notebook and the language of a particular cell can clash. If the default is set to Python, switching a cell to SQL won't work in DLT and vice versa. This is super unintuitive tbh.
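
For anyone hitting the same thing, a minimal all-Python version of such a notebook (no %sql cells mixed in) looks roughly like this; the source path and table name are placeholders:

# Minimal Python DLT table definition in a notebook whose default language is Python.
import dlt
from pyspark.sql.functions import col

@dlt.table(name="my_bronze_table", comment="Raw events loaded by DLT")
def my_bronze_table():
    return (
        spark.read.format("json")
        .load("/Volumes/my_catalog/my_schema/raw_events/")
        .select(col("id"), col("event_ts"), col("payload"))
    )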

1 More Replies
vvt1976
by Visitor
  • 38 Views
  • 0 replies
  • 0 kudos

Create table using a location

Hi, Databricks newbie here. I have copied Delta files from my Synapse workspace into DBFS. To add them as a table, I executed: create table audit_payload using delta location '/dbfs/FileStore/data/general/audit_payload'. The command executed properly. Ho...

Data Engineering
data engineering