Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

FennVerm_60454
by New Contributor II
  • 5721 Views
  • 4 replies
  • 1 kudos

Resolved! How to clean up extremely large delta log checkpoints and many small files?

AWS, by the way, if that matters. We have an old production table that has been running in the background for a couple of years, always with auto-optimize and auto-compaction turned off. In that time it has written many small files (like 10,000 an hour...

Latest Reply
siddhathPanchal
Databricks Employee
  • 1 kudos

Sometimes, if a Delta table has few commit versions, it won't create checkpoint files. The checkpoint file is what triggers the log cleanup activities. If you observe that there are no checkpoint files available for th...
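
For readers hitting the same problem, a minimal sketch of the usual cleanup sequence in PySpark (the table name and retention values are hypothetical):

    # Compact the accumulated small files into larger ones.
    spark.sql("OPTIMIZE my_catalog.my_schema.old_prod_table")

    # Delete data files no longer referenced by the table, keeping 7 days of history.
    spark.sql("VACUUM my_catalog.my_schema.old_prod_table RETAIN 168 HOURS")

    # Bound _delta_log growth: JSON commits older than the retention window are
    # cleaned up when a new checkpoint is written.
    spark.sql("""
        ALTER TABLE my_catalog.my_schema.old_prod_table SET TBLPROPERTIES (
            'delta.logRetentionDuration' = 'interval 30 days',
            'delta.checkpointInterval' = '10'
        )
    """)

Going forward, re-enabling optimized writes and auto-compaction (delta.autoOptimize.optimizeWrite and delta.autoOptimize.autoCompact) keeps the file count from regrowing.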

3 More Replies
Kayla
by Valued Contributor II
  • 1688 Views
  • 3 replies
  • 1 kudos

Resolved! Datadog Installation

Is anyone familiar with installing the Datadog agent on clusters? We're not having much luck. We honestly might not even be getting the init script to run, since we're not seeing it in the log, but we can get a generic "hello world" init script to run a...

Latest Reply
Kayla
Valued Contributor II
  • 1 kudos

Responding here with the solution I found. Hopefully it'll help anyone with similar issues. First, the Datadog install script is practically a matryoshka doll: the script creates another script, which creates a YAML file. One of the consequences of that...
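
A minimal sketch of making an init script's progress visible, assuming a cluster-scoped init script (stored on DBFS here for brevity; newer runtimes prefer workspace files or volumes). The Datadog install command itself is a placeholder; take the real one from Datadog's Databricks integration docs:

    # Write the init script once from a notebook, then attach its path to the
    # cluster under Advanced options > Init scripts.
    init_script = "\n".join([
        "#!/bin/bash",
        "set -euo pipefail",
        "# Log progress so a silent failure is visible somewhere inspectable.",
        'echo "datadog init started $(date)" >> /tmp/datadog_init.log',
        "# <install and configure the Datadog agent here, per Datadog's docs>",
    ])
    dbutils.fs.put("dbfs:/init-scripts/datadog-install.sh", init_script, True)

Enabling cluster log delivery (Advanced options > Logging) also captures init script stdout/stderr, which makes the "is it even running?" question answerable.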

2 More Replies
erigaud
by Honored Contributor
  • 2597 Views
  • 4 replies
  • 0 kudos

Pass Dataframe to child job in "Run Job" task

Hello, I have a Job A that runs a Job B. Job A defines a globalTempView, and I would like to somehow access it in the child job. Is that in any way possible? Can the same cluster be used for both jobs? If it is not possible, does someone know of a...

Latest Reply
rahuja
Contributor
  • 0 kudos

Hi @ranged_coop Yes, we are using the same job compute across different workflows. But I think different tasks are like different Docker containers, which is why it becomes an issue. It would be nice if you could explain a bit about the approach yo...
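
Since a globalTempView is scoped to the Spark application that created it, a child job running on its own cluster can't see it; the usual workaround is to persist the handoff. A minimal sketch (the table name is hypothetical):

    # Job A (parent): write the intermediate result to a real table.
    df.write.mode("overwrite").saveAsTable("my_catalog.staging.job_a_handoff")

    # Job B (child): read it back as its input.
    df = spark.read.table("my_catalog.staging.job_a_handoff")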

3 More Replies
tf32
by New Contributor II
  • 2458 Views
  • 2 replies
  • 1 kudos

Resolved! ERROR com.databricks.common.client.DatabricksServiceHttpClientException: DEADLINE_EXCEEDED

Hi, I got the error "com.databricks.WorkflowException: com.databricks.common.client.DatabricksServiceHttpClientException: DEADLINE_EXCEEDED" during the run of a job workflow with an interactive cluster, right at the start. It's a job that has been ...

Latest Reply
tf32
New Contributor II
  • 1 kudos

Yes, subsequent runs have been successful. Thank you for the explanation.
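
For transient errors like this one, a common guard is a task-level retry policy. A minimal sketch of the relevant Jobs API 2.1 task fields (names and values here are hypothetical):

    task_settings = {
        "task_key": "my_task",
        "notebook_task": {"notebook_path": "/Repos/me/my_notebook"},
        "max_retries": 2,                     # re-run up to twice on failure
        "min_retry_interval_millis": 300000,  # wait 5 minutes between attempts
        "retry_on_timeout": False,
    }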

1 More Replies
Avinash_Narala
by Valued Contributor II
  • 3292 Views
  • 2 replies
  • 2 kudos

Resolved! Databricks AI Assistant Cost Implications

I'm worried about how much the Databricks AI Assistant will cost me. I need to understand what I'll be charged for, especially when I give a prompt to the AI Assistant Pane, and how it will operate in the background.

Latest Reply
Avinash_Narala
Valued Contributor II
  • 2 kudos

Is there any token limit, like on the response or on the prompt we send?

1 More Replies
brian999
by Contributor
  • 2521 Views
  • 4 replies
  • 2 kudos

Resolved! Managing libraries in workflows with multiple tasks - need to configure a list of libs for all tasks

I have workflows with multiple tasks, each of which needs 5 different libraries to run. When I have to update those libraries, I have to go in and make the update in each and every task. So for one workflow I have 20 different places where I have to g...

Latest Reply
brian999
Contributor
  • 2 kudos

Actually I think I found most of a solution here in one of the replies: https://community.databricks.com/t5/administration-architecture/installing-libraries-on-job-clusters/m-p/37365/highlight/true#M245 It seems like I only have to define libs for the...
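
Another way to get a single place to update, a minimal sketch assuming the job is defined through the Jobs API 2.1 (package versions and notebook paths are hypothetical): declare the library list once and reference it from every task.

    # One list to maintain; every task points at it.
    LIBS = [
        {"pypi": {"package": "pandas==2.2.2"}},
        {"pypi": {"package": "requests==2.32.3"}},
    ]

    tasks = [
        {
            "task_key": name,
            "notebook_task": {"notebook_path": f"/Jobs/{name}"},
            "libraries": LIBS,
        }
        for name in ["extract", "transform", "load"]
    ]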

3 More Replies
Jreco
by Contributor
  • 3937 Views
  • 2 replies
  • 1 kudos

Resolved! SQLServer Incorrect syntax near the keyword 'WITH'

Hi mates! I'm trying to get some data from a SQL Server using a query; the query has a WITH statement, but I'm getting the following error: raise convert_exception(pyspark.errors.exceptions.connect.SparkConnectGrpcException: (com.microsoft.sqlserver.jdb...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @Jreco, You need to use the prepareQuery option and then the query, like below:

    url = "jdbc:sqlserver://server_name:1433;database=db_name"
    df = spark.read \
        .format("jdbc") \
        .option("url", url) \
        .option("prepareQuery", "with cte as ( SELECT ...
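
For reference, a complete version of that pattern (server, credentials, and the CTE body are hypothetical); prepareQuery is prefixed to query to form the statement that is actually sent to SQL Server:

    url = "jdbc:sqlserver://server_name:1433;database=db_name"

    df = (
        spark.read.format("jdbc")
        .option("url", url)
        .option("prepareQuery", "WITH cte AS (SELECT id, amount FROM dbo.sales)")
        .option("query", "SELECT * FROM cte WHERE amount > 0")
        .option("user", "<user>")
        .option("password", "<password>")
        .load()
    )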

1 More Replies
gaurav_khanna
by New Contributor II
  • 7047 Views
  • 4 replies
  • 3 kudos
Latest Reply
BartRJD
New Contributor II
  • 3 kudos

I am having the same issue (Azure Databricks). I have a compute cluster, analytics-compute-cluster, running in Single User access mode. The Event Log for the cluster says the cluster is running and the "Driver is healthy". I have Manage permissi...

3 More Replies
duttong
by New Contributor III
  • 4665 Views
  • 8 replies
  • 7 kudos

[Errno 11] resource temporarily unavailable

Hi Databricks Community, We faced a strange error today where the error below was returned when a notebook was being run. It only happens on Git-connected notebooks, and on rerun it succeeds. What is the issue?

Latest Reply
Witold
Honored Contributor
  • 7 kudos

Just follow https://status.azuredatabricks.net; there you'll see an active incident in West Europe.

7 More Replies
SrinuM
by New Contributor III
  • 1480 Views
  • 4 replies
  • 1 kudos

CLOUD_PROVIDER_LAUNCH_FAILURE (CLOUD_FAILURE) for workflow job with all-purpose cluster

One of our Databricks workflow jobs is failing occasionally with the error below, then works fine without any issue after re-running. What is the exact reason for the issue, and how can we fix it? Error: Unexpected failure while waiting for the cluster to be ...

Latest Reply
PSR100
New Contributor III
  • 1 kudos

These are cloud-provider-related errors, and we will not get much detail from the error message. Based on the error message, and given that you have enough CPU/VM quota available, I think the issue might be due to the storage creation stage in ...

3 More Replies
RKNutalapati
by Valued Contributor
  • 2173 Views
  • 3 replies
  • 0 kudos

Jobs API "run now" - How to set task-wise parameters

I have a job with multiple tasks, like Task1 -> Task2 -> Task3. I am trying to call the job using the "run now" API. Task details are below:
Task1 - executes a notebook with some input parameters
Task2 - runs "ABC.jar", so it's a JAR-based task ...

Latest Reply
Harsha777
New Contributor III
  • 0 kudos

Hi, it would be a good feature to pass parameters at the task level. We have scenarios where we would like to create a job with multiple tasks (notebook/dbt), each receiving its own parameters.
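
Until then, run-now parameters are keyed by task type rather than by task name, so each block is delivered to all tasks of that type. A minimal sketch (host, token, and job_id are hypothetical):

    import requests

    payload = {
        "job_id": 123,
        "notebook_params": {"input_date": "2024-07-01"},  # goes to notebook tasks
        "jar_params": ["arg1", "arg2"],                   # goes to JAR tasks
    }

    resp = requests.post(
        "https://<workspace-host>/api/2.1/jobs/run-now",
        headers={"Authorization": "Bearer <token>"},
        json=payload,
    )
    resp.raise_for_status()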

2 More Replies
safoineext
by New Contributor
  • 1238 Views
  • 1 reply
  • 0 kudos

Uploading a wheel to the workspace using `dbutils.fs.cp` and installing it on Runtime > 15

I have been trying to find an alternative to copying a wheel file from my local file system to Databricks and then installing it on the cluster, i.e. doing databricks_client.dbutils.fs.cp("file:/local..../..whl", "dbfs:/Workspace/users/..../..whl")...
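
A minimal sketch of one common alternative on recent runtimes, assuming a Unity Catalog volume is available (all paths are hypothetical):

    # Copy the wheel once to a governed location.
    dbutils.fs.cp(
        "file:/local_path/my_pkg-1.0-py3-none-any.whl",
        "/Volumes/my_catalog/my_schema/libs/my_pkg-1.0-py3-none-any.whl",
    )

    # Then, in a separate notebook cell on DBR 15+:
    # %pip install /Volumes/my_catalog/my_schema/libs/my_pkg-1.0-py3-none-any.whl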

Latest Reply
Rishabh_Tiwari
Databricks Employee
  • 0 kudos

Hi @safoineext, Thank you for reaching out to our community! We're here to help you. To ensure we provide you with the best support, could you please take a moment to review the responses and choose the one that best answers your question? Your feedb...

Mahesh_Yadav
by New Contributor II
  • 792 Views
  • 1 reply
  • 0 kudos

System Access Column lineage showing inaccurate results

Hi All, I have been trying to leverage the system column lineage table to check the overall journey of a column, but I am getting inaccurate results wherever unpivot transformations are used. Instead of showing the results in a way that 20 columns are ...
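
For anyone reproducing this, a minimal sketch of pulling the raw lineage rows for one target table so the unpivot fan-out can be inspected directly (the table name in the filter is hypothetical; columns as documented for system.access.column_lineage):

    spark.sql("""
        SELECT source_table_full_name, source_column_name,
               target_table_full_name, target_column_name, event_time
        FROM system.access.column_lineage
        WHERE target_table_full_name = 'my_catalog.my_schema.unpivoted_table'
        ORDER BY event_time DESC
    """).show(truncate=False)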

Latest Reply
Rishabh_Tiwari
Databricks Employee
  • 0 kudos

Hi @Mahesh_Yadav, Thank you for reaching out to our community! We're here to help you. To ensure we provide you with the best support, could you please take a moment to review the responses and choose the one that best answers your question? Your fee...

