Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

krishnaarige
by New Contributor
  • 1942 Views
  • 1 replies
  • 0 kudos

OperationalError: 250003: Failed to get the response. Hanging? method: get

OperationalError: 250003: Failed to get the response. Hanging? method: get, url: https://cdodataplatform.east-us-2.privatelink.snowflakecomputing.com:443/queries/01ae7ab6-0c04-e4bd-011c-e60552f6cf63/result?request_guid=315c25b7-f17d-4123-a2e5-6d82605...

Latest Reply
jose_gonzalez
Databricks Employee
  • 0 kudos

Could you please share the full error stack trace?

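For anyone hitting this, one generic way to capture the full stack trace the reply asks for is Python's `traceback` module. A minimal sketch, not specific to the Snowflake connector (`run_query` is a placeholder for whatever call is hanging):

```python
import traceback


def run_query_safely(run_query):
    # run_query stands in for the call that raises OperationalError;
    # on failure, return the complete stack trace as a string so it
    # can be logged or pasted into a forum reply.
    try:
        return run_query()
    except Exception:
        return traceback.format_exc()
```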
pygreg
by New Contributor
  • 1628 Views
  • 0 replies
  • 0 kudos

Workflows "Run now with different parameters" UI proposal

Hello everyone! I've been working with the Databricks platform for a few months now and I have a suggestion/proposal regarding the UI of Workflows. First, let me explain what I find not so ideal. Let's say we have a job with three Notebook Tas...

Rafal9
by New Contributor II
  • 4062 Views
  • 1 replies
  • 1 kudos

DAB: NameError: name '__file__' is not defined

Hi everyone, I am running a job task using an Asset Bundle. The bundle has been validated and deployed according to: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/work-tasks Part of the databricks.yml bundle: name: etldatabricks resourc...

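A common workaround for this NameError is to fall back to the working directory when `__file__` is missing. A minimal sketch, assuming the failure happens because the code runs in a notebook/REPL-style context where `__file__` is undefined:

```python
import os


def script_dir() -> str:
    # __file__ is not defined in notebook/REPL contexts, which is how
    # notebook tasks deployed via Asset Bundles are typically executed.
    try:
        return os.path.dirname(os.path.abspath(__file__))
    except NameError:
        # Fall back to the current working directory.
        return os.getcwd()
```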
Priyag1
by Honored Contributor II
  • 3548 Views
  • 2 replies
  • 7 kudos

Migration of all Databricks SQL content to the workspace browser

Databricks will force-migrate all Databricks SQL content (dashboards, queries, alerts) to the workspace browser. Visit My Queries, My Alerts, and My Dashboards and look for any un-migra...

Data Engineering
Dashboards
Databricks SQL
Visualization
Latest Reply
joseheliomuller
New Contributor III
  • 7 kudos

The ability to easily migrate queries and dashboards across Databricks Workspaces is extremely important. In my company we have dev, stg and production workspaces, with the same pipeline creating the data. We create our dashboards in DEV and then we have to...

1 More Replies
samye760
by New Contributor
  • 1787 Views
  • 0 replies
  • 0 kudos

Job Retry Wait Policy and Cluster Shutdown

Hi all, I have a Databricks Workflow job in which the final task makes an external API call. Sometimes this API will be overloaded and the call will fail. In the spirit of automation, I want this task to retry the call an hour later if it fails in the...

Data Engineering
clusters
jobs
retries
Workflows
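For reference, the Jobs API exposes task-level retry settings that can express "retry an hour later". A sketch of the relevant fragment (the task name is made up, and whether the job cluster stays up between attempts depends on the cluster configuration):

```yaml
tasks:
  - task_key: call_external_api   # hypothetical task name
    max_retries: 3
    # wait one hour between attempts
    min_retry_interval_millis: 3600000
    retry_on_timeout: true
```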
vlado101
by New Contributor II
  • 4009 Views
  • 1 replies
  • 1 kudos

Resolved! ANALYZE TABLE is not updating columns stats

Hello everyone, I am having an issue when running "ANALYZE TABLE COMPUTE STATISTICS FOR ALL COLUMNS". The way I understand it, this should update the min/max values for a column when you run it for all columns or for one column. One way to verify it from what I ...

Latest Reply
Priyanka_Biswas
Databricks Employee
  • 1 kudos

Hello @vlado101, the ANALYZE TABLE COMPUTE STATISTICS FOR ALL COLUMNS command in Databricks is used to compute statistics for all columns of a table. This information is persisted in the metastore and helps the query optimizer make decisions such as ...

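For reference, the two statements involved can be run back-to-back to recompute and then inspect column stats. A minimal sketch that only assembles the statements (table and column names are hypothetical, and actually running them requires a live Spark session):

```python
# Hypothetical table and column names.
table, column = "my_db.my_table", "amount"

# Recompute column-level statistics, then inspect what the
# metastore holds (min, max, num_nulls, etc.) for one column.
analyze_stmt = f"ANALYZE TABLE {table} COMPUTE STATISTICS FOR ALL COLUMNS"
describe_stmt = f"DESCRIBE EXTENDED {table} {column}"
```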
TimReddick
by Contributor
  • 8627 Views
  • 6 replies
  • 2 kudos

Using run_job_task in Databricks Asset Bundles

Do Databricks Asset Bundles support run_job_task tasks? I've made various attempts to add a run_job_task with a specified job_id. See the code snippet below. I tried substituting the job_id using ${...} syntax, as well as three other ways, which I've...

Data Engineering
Databrick Asset Bundles
run_job_task
Latest Reply
kyle_r
New Contributor II
  • 2 kudos

Ah, I see it is a known bug in the Databricks CLI: Asset bundle run_job_task fails · Issue #812 · databricks/cli (github.com). Anyone facing this issue should comment on and keep an eye on that ticket for resolution. 

5 More Replies
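For context, the shape being attempted looks roughly like this in `databricks.yml` (job names and the ID are placeholders; see the linked CLI issue for the status of the bug):

```yaml
resources:
  jobs:
    orchestrator:          # hypothetical parent job
      name: orchestrator
      tasks:
        - task_key: trigger_child_job
          run_job_task:
            job_id: 123456789   # placeholder ID of an existing job
```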
melodiesd
by New Contributor
  • 6904 Views
  • 0 replies
  • 0 kudos

Parse_Syntax_Error Help

Hello all, I'm new to Databricks and can't figure out why I'm getting an error in my SQL code. Error in SQL statement: ParseException: [PARSE_SYNTAX_ERROR] Syntax error at or near 'if'.(line 1, pos 0) == SQL == if OBJECT_ID('tempdb.#InitialData') IS N...

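The `if OBJECT_ID(...)` idiom in that statement is T-SQL (SQL Server), and Spark SQL has no procedural `IF` at the statement level, which is why the parser stops at `if`. The usual Spark SQL equivalent is conditional DDL. A sketch that only assembles the statements (names are hypothetical):

```python
# T-SQL pattern from the post (not valid Spark SQL):
#   if OBJECT_ID('tempdb.#InitialData') IS NOT NULL DROP TABLE #InitialData
#
# Idiomatic Spark SQL replacement: drop-if-exists, then recreate.
statements = [
    "DROP TABLE IF EXISTS initial_data",
    "CREATE TABLE initial_data AS SELECT * FROM source_data",
]
```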
GriffLehman
by New Contributor II
  • 1098 Views
  • 1 replies
  • 0 kudos

PROBLEM- Missing data in "Last Run" column in Databricks Workflows UI

Hello, I am having a pretty major problem with the Databricks Workflows UI: when I look at the list of jobs, the "Last Run" column does not have any data in it. This is kind of a big problem because now I don't have a good way of getting visibility in...

Latest Reply
GriffLehman
New Contributor II
  • 0 kudos

  

DE-cat
by New Contributor III
  • 1743 Views
  • 0 replies
  • 0 kudos

DatabricksStreamingQueryListener Stopping the stream

I am running the following structured streaming Scala code in a DB 13.3 LTS job:

val query = spark.readStream.format("delta")
  .option("ignoreDeletes", "true")
  .option("maxFilesPerTrigger", maxEqlPerBatch)
  .load(tblPath)
  .writeStream
  .qu...

JKR
by Contributor
  • 4107 Views
  • 4 replies
  • 1 kudos

Resolved! Got Failure: com.databricks.backend.common.rpc.SparkDriverExceptions$ReplFatalException error

Got the below failure on a scheduled job on an interactive cluster, and the next scheduled run executed fine. I want to know why this error occurred and how I can prevent it from happening again. And how can I debug these errors in the future? com.databricks.backend.commo...

Latest Reply
jose_gonzalez
Databricks Employee
  • 1 kudos

@JKR Just a friendly follow-up: did any of the responses help you to resolve your question? If so, please mark it as best. Otherwise, please let us know if you still need help.

3 More Replies
scvbelle
by New Contributor III
  • 3905 Views
  • 3 replies
  • 3 kudos

Resolved! DLT failure: ABFS does not allow files or directories to end with a dot

My DLT pipeline outlined below, which generically cleans identifier tables, successfully creates the initial streaming tables from the append-only sources, then fails when trying to create the second, cleaned tables with the following: It'**bleep** cl...

Data Engineering
abfss
azure
dlt
engineering
Latest Reply
Priyanka_Biswas
Databricks Employee
  • 3 kudos

Hi @scvbelle The error you're seeing is an IllegalArgumentException caused by the restriction in Azure Blob File System (ABFS) that does not allow files or directories to end with a dot. This error is thrown by the trailingPeriod...

2 More Replies
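A common mitigation is to strip trailing dots from generated names before they become ABFS paths. A minimal sketch (the helper name is made up):

```python
def sanitize_abfs_name(name: str) -> str:
    # ABFS rejects files/directories whose names end with '.',
    # so trim stray whitespace, then any trailing dots, from
    # generated table/file names before building paths.
    return name.strip().rstrip(".")
```

For example, a generated name like `"customers_v2."` becomes `"customers_v2"`, while names without trailing dots pass through unchanged.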
jaredwolf
by New Contributor II
  • 2294 Views
  • 2 replies
  • 6 kudos

_sqldf bugs in GCP workspaces?

Utilizing GCP instances on the 12.2 DBR ML runtime. Prior to ~7:10 CT last night, _sqldf commands in notebooks referencing the previously executed %sql cell would work locally as well as in scheduled Workflow Job runs. Now it appears that the code ...

Data Engineering
_sqldf
GCP
spark
SparkSQL
Workflows
Latest Reply
Kayla
Valued Contributor
  • 6 kudos

It looks like Azure was having the same issue; it might just be all 12.2 Photon clusters. https://community.databricks.com/t5/data-engineering/sqldf-bugs-in-gcp-workspaces/td-p/38578 That post says it has been fixed, but last I checked it was still fa...

1 More Replies
Ajay-Pandey
by Esteemed Contributor III
  • 4957 Views
  • 2 replies
  • 1 kudos

Resolved! How to solve - gRPC message exceeds maximum size 4194304 pubsub using databricks

I am getting the below error while streaming data from Pub/Sub using Databricks DLT pipelines. If anyone can help me increase the gRPC message size, it would help a lot.

Latest Reply
Ajay-Pandey
Esteemed Contributor III
  • 1 kudos

   

1 More Replies
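The 4194304 in the error is gRPC's default 4 MiB message cap. Where you control the client channel yourself, the standard knobs are the channel options below; a sketch only (the 64 MiB figure is an arbitrary example, and inside a managed DLT Pub/Sub source these options may not be exposed to user code):

```python
# Channel options understood by gRPC-based clients (e.g. Pub/Sub libraries).
MAX_MESSAGE_BYTES = 64 * 1024 * 1024  # example: raise the 4 MiB default

channel_options = [
    ("grpc.max_receive_message_length", MAX_MESSAGE_BYTES),
    ("grpc.max_send_message_length", MAX_MESSAGE_BYTES),
]
```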