Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

by StevenW (New Contributor III)
  • 1004 Views
  • 2 replies
  • 0 kudos

Workflow parameter in SQL not working

I'm using the following input parameters when running from a workflow: wid_UnityCatalogName = dbutils.jobs.taskValues.get(taskKey="NB_XXX_Workflow_Parameters", key="p_UnityCatalogName", default="xx_lakehouse_dev") dbutils.widgets.text("UnityCat...

Latest Reply by filipniziol (Esteemed Contributor)

Hi @StevenW, I see that you are using a Python notebook and the view is then created in SQL. 1. If you are using the %sql magic command, you need to reference parameters as $parameter or :parameter (depending on the runtime). 2. If you are...
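For reference, a minimal sketch of the pattern discussed above, assuming a Python notebook task downstream of the NB_XXX_Workflow_Parameters task named in the post; the demo schema name is a hypothetical placeholder, and on older runtimes the $parameter form replaces :parameter.

    # Read the value published by the upstream task and expose it as a widget.
    wid_UnityCatalogName = dbutils.jobs.taskValues.get(
        taskKey="NB_XXX_Workflow_Parameters",
        key="p_UnityCatalogName",
        default="xx_lakehouse_dev",
    )
    dbutils.widgets.text("UnityCatalogName", wid_UnityCatalogName)

    # In a %sql cell the widget is referenced as a parameter marker, e.g.
    #   SELECT :UnityCatalogName AS catalog_name
    # From Python the value can simply be read back and interpolated:
    catalog_name = dbutils.widgets.get("UnityCatalogName")
    spark.sql(f"CREATE SCHEMA IF NOT EXISTS {catalog_name}.demo_schema")  # demo_schema is hypothetical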

1 More Reply
by noorbasha534 (Valued Contributor II)
  • 2888 Views
  • 7 replies
  • 6 kudos

Resolved! Retrieve table/view popularity

Dears, is there a way to retrieve the popularity score of a Unity Catalog object? I looked at the API documentation but couldn't find one that serves the need. Appreciate any thoughts. Br, Noor.

Latest Reply by noorbasha534 (Valued Contributor II)

@filipniziol Hi Filip, thank you. I did a quick test. In my environment, the table query (indirect) event is registered with "getTemporaryTableCredential". However, the view query (direct) event is with "getTable". Thanks for your time again. ...
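For reference, a minimal sketch (run from a notebook) of the kind of usage count being discussed, assuming the system.access.audit system table is enabled; the object name and the request_params key below are assumptions to adapt to what the audited events actually contain.

    # Count read events for one object over the last 30 days as a rough popularity proxy.
    # getTable / getTemporaryTableCredential are the action names mentioned in the thread.
    usage = spark.sql("""
        SELECT action_name, COUNT(*) AS events
        FROM system.access.audit
        WHERE action_name IN ('getTable', 'getTemporaryTableCredential')
          AND request_params['full_name_arg'] = 'main.sales.orders'  -- hypothetical object; key name is an assumption
          AND event_time >= current_timestamp() - INTERVAL 30 DAYS
        GROUP BY action_name
    """)
    usage.show()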

6 More Replies
by james_farrugia (New Contributor II)
  • 2454 Views
  • 4 replies
  • 0 kudos

Cross workspace REST API access denied due to network policies

Hi, our data workspace architecture consists of a collection of discrete workspaces segregated according to business function and environment. Moreover, they are not all deployed to the same region: dev and staging are deployed to Southeast Asia, wher...

Latest Reply by james_farrugia (New Contributor II)

Hi @filipniziol, thanks for the tip. I might repost further queries here, especially with regard to NSGs, as I've never manipulated these manually before. One question: why is it that VNet peering/NSG alteration is not required when invoking a servic...
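For context, a minimal sketch (run from a notebook) of the kind of cross-workspace REST call the thread is about, assuming a reachable target workspace URL and a token stored in a secret scope (both hypothetical placeholders); network restrictions such as NSG rules would surface here as connection or 403 errors.

    import requests

    target_workspace = "https://adb-1234567890123456.7.azuredatabricks.net"  # hypothetical workspace URL
    token = dbutils.secrets.get("cross_ws", "api_token")  # hypothetical secret scope/key

    # List clusters in the other workspace; this is where peering/NSG issues typically show up.
    resp = requests.get(
        f"{target_workspace}/api/2.0/clusters/list",
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.json())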

3 More Replies
by Maulik (New Contributor)
  • 900 Views
  • 1 reply
  • 0 kudos

How to set a callback for a Databricks Statement Execution SQL API query?

I'm using https://docs.databricks.com/api/workspace/statementexecution for long-running queries. My wait time is zero. Queries might take 1 hour and I don't want to poll https://docs.databricks.com/api/workspace/statementexecution/getstatemen...
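For context, a minimal sketch of the asynchronous submit and the status check the poster would rather not poll, using the linked Statement Execution API; the host, token, warehouse ID, and query are hypothetical placeholders.

    import requests

    host = "https://<workspace-host>"                               # placeholder
    headers = {"Authorization": "Bearer <personal-access-token>"}   # placeholder token

    # Submit asynchronously: wait_timeout=0s returns immediately with a statement_id.
    submit = requests.post(
        f"{host}/api/2.0/sql/statements",
        headers=headers,
        json={
            "warehouse_id": "<warehouse-id>",                                             # placeholder
            "statement": "SELECT COUNT(*) FROM some_catalog.some_schema.some_table",      # hypothetical query
            "wait_timeout": "0s",
        },
        timeout=30,
    )
    statement_id = submit.json()["statement_id"]

    # The status check that would otherwise be polled in a loop, shown once for reference.
    status = requests.get(f"{host}/api/2.0/sql/statements/{statement_id}", headers=headers, timeout=30).json()
    print(status["status"]["state"])  # e.g. PENDING / RUNNING / SUCCEEDED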

by abc1234 (New Contributor)
  • 432 Views
  • 0 replies
  • 0 kudos

Point or Load data from GCP DBFS to AWS DBFS

Hi, is there a way to point GCP DBFS to AWS DBFS so as to access the data in AWS DBFS from GCP DBFS at minimal cost? We are migrating jobs from AWS Databricks to GCP Databricks, and the data from the jobs uses AWS DBFS as an interim location. There s...
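A minimal sketch (run from a notebook) of one approach sometimes used during such migrations: reading the AWS data directly from the GCP workspace over s3a:// with S3 credentials set on the cluster. This assumes the GCP clusters can reach S3 and that the S3A connector is available; bucket names and secret scope/keys are hypothetical placeholders, and the cost trade-off is exactly the open question in the post.

    # Set AWS credentials on the Hadoop configuration so s3a:// paths resolve from the GCP workspace.
    sc._jsc.hadoopConfiguration().set("fs.s3a.access.key", dbutils.secrets.get("aws", "access_key"))  # placeholder
    sc._jsc.hadoopConfiguration().set("fs.s3a.secret.key", dbutils.secrets.get("aws", "secret_key"))  # placeholder

    # Read the interim data from the AWS bucket and land it on GCS for the migrated jobs.
    df = spark.read.parquet("s3a://my-aws-interim-bucket/path/")            # hypothetical source
    df.write.mode("overwrite").parquet("gs://my-gcp-interim-bucket/path/")  # hypothetical target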

by Giorgi (Contributor)
  • 9890 Views
  • 5 replies
  • 5 kudos

GitLab integration

I've followed the instructions and did the GitLab integration: 1) generated a Personal Access Token from GitLab; 2) added the token (from step 1) to User Settings (GitLab, email, token); 3) in Admin console -> Repos, Git URL Allow List permissions: Disabled (no restrictions); 4) in Adm...

Latest Reply by joshuat (Contributor)

Thanks for your reply - I did, see my reply above.

4 More Replies
by Sathish_itachi (New Contributor III)
  • 12350 Views
  • 19 replies
  • 13 kudos

Resolved! Encountering an error while accessing DBFS root folders

The DBFS file browser shows the following error in the UI: "storagecontext com.databricks.backend.storage.storagecontexttype$dbfsroot$@4155a7bf for workspace 2818007466707254 is not set in the customerstorageinfo".

Latest Reply by MJ_BE8 (New Contributor III)

Got this error today when trying to import a .csv file. It worked fine before (like, the last two weeks?). What happened?

18 More Replies
by borori (New Contributor II)
  • 3362 Views
  • 2 replies
  • 0 kudos

Resolved! Write operation to the Delta table is not completing

Using a cluster in serverless mode, three tables are joined and the data frame is written as follows: df.write.mode('append').saveAsTable('table name'). The schema is: date string (ymd format), id bigint, value string, partitioned by date. After about one minu...

Latest Reply by borori (New Contributor II)

Thank you for your advice. I couldn't come to a conclusion based on what you told me, but it gave me an opportunity to review all the logs again. The cause was that the amount of data became too large due to joining on null data. The advice was ...
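A minimal sketch of one way the described blow-up can happen, and of the fix, using tiny hypothetical frames: with a null-safe join every null key on one side matches every null key on the other, so dropping null keys before the join keeps the result bounded.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Tiny hypothetical frames; note the null join keys on both sides.
    df_a = spark.createDataFrame([("20240101", 1), (None, 2), (None, 3)], ["date", "id"])
    df_b = spark.createDataFrame([("20240101", "x"), (None, "y"), (None, "z")], ["date", "value"])

    # A null-safe join matches every null key with every other null key: 2 x 2 extra rows here,
    # and a huge intermediate result when millions of rows carry null keys.
    exploded = df_a.join(df_b, df_a["date"].eqNullSafe(df_b["date"]), "inner")
    print(exploded.count())  # 5

    # Dropping null keys before the join keeps it bounded.
    bounded = (
        df_a.filter(F.col("date").isNotNull())
            .join(df_b.filter(F.col("date").isNotNull()), "date", "inner")
    )
    print(bounded.count())  # 1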

1 More Reply
by pmarko1711 (New Contributor II)
  • 1300 Views
  • 2 replies
  • 0 kudos

External volume over S3 Access point

Can anybody confirm whether external volumes pointing to S3 access points work in Databricks on AWS? I have an S3 bucket, but can only access it via an S3 access point. The bucket is KMS-encrypted. I created an IAM role that can list and read the S3 access point...

Latest Reply by pmarko1711 (New Contributor II)

This looks fine to me. I am the owner of the (external) volume and have the READ VOLUME privilege on it. (As for the external location, I am also its owner and have READ FILES, BROWSE, CREATE EXTERNAL TABLE and CREATE EXTERNAL VOLUME.) One additional piece of info I g...
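For reference, a minimal sketch of the objects and grants being described, written as SQL run from a notebook; the access point alias, credential, catalog/schema, and principal names are hypothetical placeholders, and whether Unity Catalog accepts an S3 access point URL here is exactly the open question in the thread.

    # All names below are placeholders.
    spark.sql("""
        CREATE EXTERNAL LOCATION IF NOT EXISTS demo_accesspoint_loc
        URL 's3://my-access-point-alias-s3alias/prefix/'
        WITH (STORAGE CREDENTIAL demo_storage_credential)
    """)
    spark.sql("""
        CREATE EXTERNAL VOLUME IF NOT EXISTS main.demo_schema.demo_volume
        LOCATION 's3://my-access-point-alias-s3alias/prefix/volume/'
    """)
    # Privileges along the lines of those listed in the reply.
    spark.sql("GRANT READ VOLUME ON VOLUME main.demo_schema.demo_volume TO `data_readers`")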

1 More Reply
by turtleXturtle (New Contributor II)
  • 989 Views
  • 1 reply
  • 0 kudos

Refreshing a Delta external table

I'm having trouble with the REFRESH TABLE command - does it work with Delta external tables? I'm doing the following steps: Create table: CREATE TABLE IF NOT EXISTS `catalog`.`default`.`table_name` ( KEY DOUBLE, CUSTKEY DOUBLE, STATUS STRING, PRICE D...

Latest Reply by gchandra (Databricks Employee)

Step 3: insert the data; don't add it directly to the S3 folder. Once it's converted to Delta, it maintains the transaction log. Inserting a Parquet file (followed by another convert/refresh) won't work, as the rest of the dataset is already Delta. ...
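A minimal sketch of that advice, assuming the table from the post already exists as a Delta table and that a staging path holds the new Parquet batch (the staging path is a hypothetical placeholder): append through the table so the transaction log stays consistent, instead of copying files into its S3 folder.

    # Append through the Delta table rather than dropping Parquet files into its S3 location.
    spark.sql("""
        INSERT INTO `catalog`.`default`.`table_name`               -- table from the post
        SELECT * FROM parquet.`s3://my-bucket/new-batch/`          -- hypothetical staging path; schema must match
    """)
    # Or, from a DataFrame:
    # new_df.write.mode("append").saveAsTable("catalog.default.table_name")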

by chethankumar (New Contributor III)
  • 1865 Views
  • 2 replies
  • 0 kudos

How to assign account-level groups to a workspace using Terraform

In the workspace console, when I create groups the source is an account; basically, it is an account-level group. But provider "databricks" { host = var.databricks_host # client_id = "" # client_secret = " account_id = ...

Latest Reply by jennie258fitz (New Contributor III)

@chethankumar wrote: In the workspace console, when I create groups the source is an account; basically, it is an account-level group. But provider "databricks" { host = var.databricks_host # client_id = "" # client_secre...

1 More Reply
by adhi_databricks (Contributor)
  • 10761 Views
  • 13 replies
  • 3 kudos

Trying to use a Python source file as a module in a Databricks notebook

Hi everyone, I'm currently working on a project in Databricks (version 13.3 LTS) and could use some help with importing external Python files as modules into my notebook. I'm aiming to organize my code better and reuse functions across different notebo...

Latest Reply by filipniziol (Esteemed Contributor)

Hi @adhi_databricks, I am out of ideas in this case. Is utils.py the correct Python file, with no errors found? Could you test with some simple code like the below? I am starting to think there is something wrong with the file (although you mentioned it works i...
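A minimal sketch of the kind of simple test being suggested, assuming utils.py lives in a workspace folder; the path and the hello() function are hypothetical.

    import os
    import sys

    # Make the folder that contains utils.py importable so a plain `import utils` resolves.
    sys.path.append(os.path.abspath("/Workspace/Repos/some_user/some_repo/src"))  # hypothetical path

    import utils  # assumed to define, e.g., def hello(): return "hello from utils"
    print(utils.hello())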

12 More Replies
by Shan_n (New Contributor)
  • 1220 Views
  • 1 reply
  • 0 kudos

Geometry data type in SQL

Hi all, I am trying to create a table with a Geometry data type column in Databricks SQL. Unfortunately, I am getting a "not supported data type" error. Is there any way I can create a table with a Geometry data type? Thanks.

Latest Reply by szymon_dybczak (Esteemed Contributor III)

Hi @Shan_n, Databricks doesn't have native support for a geometry data type. You can look at the list of all available data types here: https://docs.databricks.com/en/sql/language-manual/sql-ref-datatypes.html. But there is a way to work with geospatial...
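A minimal sketch (run from a notebook) of one common workaround given the missing native type: keep the geometry as WKT in a STRING column and let a geospatial library parse it when needed; the catalog, schema, and table names are hypothetical.

    # Store geometries as WKT strings, since there is no native GEOMETRY type.
    spark.sql("""
        CREATE TABLE IF NOT EXISTS main.demo_schema.places (
            id       BIGINT,
            name     STRING,
            geom_wkt STRING  -- e.g. 'POINT(21.01 52.23)'
        )
    """)
    spark.sql("INSERT INTO main.demo_schema.places VALUES (1, 'Warsaw', 'POINT(21.01 52.23)')")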

by noorbasha534 (Valued Contributor II)
  • 1446 Views
  • 3 replies
  • 2 kudos

Resolved! ANALYZE table for stats collection

Hi all, I understand that ANALYZE TABLE for stats collection does not interfere with write and update operations on a Delta table. Please confirm. I'd like to execute the ANALYZE TABLE command after data loads of Delta tables, but at times the loads could be extended...
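For reference, a minimal sketch (run from a notebook) of the command in question as it would run after a load finishes; the table name is a hypothetical placeholder.

    # Table-level statistics only (cheaper):
    spark.sql("ANALYZE TABLE main.demo_schema.sales COMPUTE STATISTICS NOSCAN")

    # Column-level statistics used by the optimizer:
    spark.sql("ANALYZE TABLE main.demo_schema.sales COMPUTE STATISTICS FOR ALL COLUMNS")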

Latest Reply by noorbasha534 (Valued Contributor II)

@filipniziol thanks for your time in replying. Your answer is satisfactory and resolves my queries.

2 More Replies
by hemprasad (New Contributor II)
  • 3626 Views
  • 1 reply
  • 0 kudos

I am trying to use the Spark session of the compute in a Java JAR to run queries against Unity Catalog tables

I am trying to use the Spark session of the compute in a Java JAR to run queries against Unity Catalog tables. I get the following error: SparkSession spark = SparkSession.builder().appName("Databricks Query Example").confi...

Latest Reply by samantha789 (New Contributor II)

@hemprasad wrote: I am trying to use the Spark session of the compute in a Java JAR to run queries against Unity Catalog tables. I get the following error: SparkSession spark = SparkSession.builder().appName("Databricks Qu...

