Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

mortenhaga
by Contributor
  • 4157 Views
  • 4 replies
  • 4 kudos

Resolved! SQL Serverless Endpoint failing to start with Instance Profile

Hi all, super stoked about the PP of SQL Serverless, but it does seem that the instance profile I'm using doesn't have the required trust relationship to work with the Serverless Endpoint, although it works fine in "classic" mode. Does Serverless re...

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Thank you for sharing your valuable solution, it works properly.

3 More Replies
Prabakar
by Databricks Employee
  • 7838 Views
  • 2 replies
  • 7 kudos

Resolved! Library installation fails with mirror sync issue

While trying to install the ffmpeg package using an init script on a Databricks cluster, it fails with the error below.

Init script:
#!/bin/bash
set -e
sudo apt-get update
sudo apt-get -y install ffmpeg

Error message:
E: Failed to fetch http://security.ubuntu...

Latest Reply
Prabakar
Databricks Employee
  • 7 kudos

Cause: the VMs are pointing to a cached old mirror that is not up to date, so downloading the package fails. Workaround: use the init script below to install the "ffmpeg" package. To revert to the original lis...

1 More Replies
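The full workaround script is truncated above, but the general pattern for a stale-mirror failure is to repoint apt at the main Ubuntu archive before installing. A minimal sketch, written as an init script (the regional-mirror pattern in the sed expression is an assumption; adjust it to whatever mirror your VMs actually use):

```shell
# Write a replacement init script: swap the stale regional EC2 mirror for the
# main Ubuntu archive, refresh the package lists, then install ffmpeg.
cat > /tmp/install-ffmpeg.sh <<'EOF'
#!/bin/bash
set -e
# Assumption: sources.list points at a regional *.ec2.archive.ubuntu.com mirror
sudo sed -i 's|http://[a-z0-9.-]*\.ec2\.archive\.ubuntu\.com|http://archive.ubuntu.com|g' /etc/apt/sources.list
sudo apt-get clean
sudo apt-get update
sudo apt-get -y install ffmpeg
EOF
chmod +x /tmp/install-ffmpeg.sh
```

Upload the script to your cluster's init-script location as usual; it only differs from the failing script by the mirror rewrite and the `apt-get clean`.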
Sunny
by New Contributor III
  • 8220 Views
  • 7 replies
  • 4 kudos

Resolved! Retrieve job id and run id from scala

I need to retrieve the job id and run id of the job from a jar file in Scala. When I try to compile the code below in IntelliJ, the following error is shown.

import com.databricks.dbutils_v1.DBUtilsHolder.dbutils

object MainSNL {
  @throws(classOf[Exception]) de...

Latest Reply
Mohit_m
Valued Contributor II
  • 4 kudos

Maybe it's worth going through the Task parameter variables section of the doc below: https://docs.databricks.com/data-engineering/jobs/jobs.html#task-parameter-variables

6 More Replies
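The reply's pointer to task parameter variables can be sketched as follows: you pass the built-in `{{job_id}}` and `{{run_id}}` placeholders as task parameters, and Databricks substitutes the real values before your entry point receives them, so no `dbutils` import is needed in the jar at all. Python is shown for brevity; the same placeholders work for a Scala jar task's `args`. The flag names are illustrative:

```python
# In the job's task settings, set parameters such as:
#   ["--job-id", "{{job_id}}", "--run-id", "{{run_id}}"]
# Databricks replaces the {{...}} placeholders with real values at run time.
def parse_job_ids(argv):
    """Parse flag/value pairs like --job-id 123 --run-id 456 from a task's args."""
    pairs = dict(zip(argv[::2], argv[1::2]))
    return pairs.get("--job-id"), pairs.get("--run-id")

# Example with already-substituted values:
job_id, run_id = parse_job_ids(["--job-id", "123", "--run-id", "456"])
```

This sidesteps the IntelliJ compile problem entirely, since `DBUtilsHolder` never has to be linked into the jar.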
Mohit_m
by Valued Contributor II
  • 5288 Views
  • 1 replies
  • 2 kudos

Resolved! Databricks jobs create API throws unexpected error

Databricks jobs create API throws an unexpected error.

Error response:
{"error_code": "INVALID_PARAMETER_VALUE", "message": "Cluster validation error: Missing required field: settings.cluster_spec.new_cluster.size"}

Any idea on this?

Latest Reply
Mohit_m
Valued Contributor II
  • 2 kudos

Could you please specify num_workers in the JSON body and try the API again. Also, another recommendation: configure what you want in the UI, then press the "JSON" button, which shows the corresponding JSON that you can use for the API.

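A minimal sketch of a jobs create payload with the cluster size supplied, which is what the "Missing required field: ...new_cluster.size" error is complaining about. All field values (job name, runtime version, node type, notebook path) are placeholders:

```python
import json

payload = {
    "name": "example-job",
    "new_cluster": {
        "spark_version": "11.3.x-scala2.12",
        "node_type_id": "i3.xlarge",
        # The missing "size": either a fixed worker count...
        "num_workers": 2,
        # ...or, instead, an autoscale block:
        # "autoscale": {"min_workers": 1, "max_workers": 4},
    },
    "notebook_task": {"notebook_path": "/Shared/example"},
}
body = json.dumps(payload)
# POST body to https://<workspace-host>/api/2.1/jobs/create with a bearer token,
# e.g. requests.post(url, headers={"Authorization": f"Bearer {token}"}, data=body)
```

As the reply suggests, building the job once in the UI and copying its JSON is the easiest way to get a known-good payload to start from.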
lav
by New Contributor III
  • 1387 Views
  • 1 replies
  • 1 kudos

Correlated Column Exception in Spark SQL

Hi Johan, were you able to resolve the correlated column exception issue? I have been stuck on this since last week. If you can guide me, that would be a lot of help. Thanks.

Latest Reply
Johan_Van_Noten
New Contributor III
  • 1 kudos

Seems to be a duplicate of your comment on https://community.databricks.com/s/question/0D53f00001XCuCACA1/correlated-column-exception-in-sql-udf-when-using-udf-parameters. I guess you did that to be able to put other tags?

darshan
by New Contributor III
  • 22482 Views
  • 13 replies
  • 12 kudos

Resolved! Is there a way to run notebooks concurrently in same session?

Tried using dbutils.notebook.run(notebook.path, notebook.timeout, notebook.parameters), but it takes 20 seconds to start a new session. %run uses the same session, but I cannot figure out how to use it to run notebooks concurrently.

Latest Reply
rudesingh56
New Contributor II
  • 12 kudos

I’ve been struggling with opening multiple browser sessions to open more than one notebook at a time.

12 More Replies
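Since dbutils.notebook.run blocks until the child notebook finishes, the usual pattern is to fan it out over driver-side threads rather than multiple browser sessions. A minimal sketch (notebook paths, timeouts, and parameters below are placeholders; `run_fn` stands in for `dbutils.notebook.run`, which only exists inside a Databricks notebook):

```python
from concurrent.futures import ThreadPoolExecutor

def run_notebooks_concurrently(run_fn, specs, max_workers=4):
    """Submit each notebook spec to a thread pool and collect results in order."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [
            pool.submit(run_fn, s["path"], s.get("timeout", 600), s.get("parameters", {}))
            for s in specs
        ]
        return [f.result() for f in futures]

# Inside a Databricks notebook you would call:
# results = run_notebooks_concurrently(
#     dbutils.notebook.run,
#     [{"path": "/Shared/nb_a"}, {"path": "/Shared/nb_b", "parameters": {"x": "1"}}],
# )
```

This does not remove the per-run session startup cost, but the startups overlap instead of running back to back.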
TheOptimizer
by Contributor
  • 12253 Views
  • 5 replies
  • 8 kudos

Resolved! How to create delta table with identity column.

I'm sure this is probably some oversight on my part, but I don't see it. I'm trying to create a delta table with an identity column. I've tried every combination of the syntax I can think of.

%sql
create or replace table IDS.picklist (
  picklist_id...

Latest Reply
lucas_marchand
New Contributor III
  • 8 kudos

I was also getting this error; my cluster was running Databricks Runtime 9.1, so I changed it to 11.0 and it worked.

4 More Replies
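Identity columns require Databricks Runtime 10.4 or later, which is why moving off 9.1 fixed it. A sketch of the DDL (the table and column names follow the post, but the full definition is truncated above, so the columns here are illustrative):

```python
# Delta identity column DDL; run via spark.sql(ddl) in a notebook on DBR 10.4+.
ddl = """
CREATE OR REPLACE TABLE IDS.picklist (
  picklist_id BIGINT GENERATED ALWAYS AS IDENTITY,
  picklist_name STRING
) USING DELTA
"""
# In a Databricks notebook: spark.sql(ddl)
```

`GENERATED BY DEFAULT AS IDENTITY` is the variant to use if you also need to insert explicit ids.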
abd
by Contributor
  • 1580 Views
  • 0 replies
  • 0 kudos

Why use Databricks over other tools?

What is special about Databricks? What does Databricks provide that no other tool on the market provides? How can I convince someone to use Databricks rather than another tool?

spartakos
by New Contributor
  • 977 Views
  • 0 replies
  • 0 kudos

Big data ingest into Delta Lake

I have a feature table in BQ that I want to ingest into Delta Lake. This feature table in BQ has 100TB of data. The table can be partitioned by DATE. What best practices and approaches can I take to ingest this 100TB? In particular, what can I do to ...

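Since the table is partitionable by DATE, one common approach is to ingest in bounded date windows rather than one 100TB read, so each job is restartable and sized predictably. A minimal sketch (the window size, BQ table name, and Delta path are assumptions; the commented read uses the spark-bigquery connector available on Databricks):

```python
from datetime import date, timedelta

def date_windows(start, end, days=7):
    """Yield inclusive (window_start, window_end) date ranges covering [start, end]."""
    cur = start
    while cur <= end:
        stop = min(cur + timedelta(days=days - 1), end)
        yield cur, stop
        cur = stop + timedelta(days=1)

# Per window, on Databricks with the spark-bigquery connector:
# df = (spark.read.format("bigquery")
#         .option("table", "project.dataset.features")
#         .option("filter", f"DATE >= '{lo}' AND DATE <= '{hi}'")
#         .load())
# df.write.format("delta").mode("append").partitionBy("DATE").save("/mnt/delta/features")
```

Appending window by window also means a failed window can be re-run in isolation instead of restarting the whole backfill.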
merca
by Valued Contributor II
  • 2265 Views
  • 2 replies
  • 4 kudos

DLT schema ambiguity

I have schema:

 |-- costCentres: struct (nullable = true)
 |    |-- dimension1: struct (nullable = true)
 |    |    |-- name: string (nullable = true)
 |    |    |-- value: string (nullable = true)
 |    |-- dimension10: struct...

Latest Reply
PeteC
New Contributor III
  • 4 kudos

I've got the same problem - but using a SQL Select statement (with some explodes).

1 More Replies
Shellytest
by New Contributor
  • 1581 Views
  • 1 replies
  • 1 kudos
Latest Reply
Rheiman
Contributor II
  • 1 kudos

By mounting your storage resource in your Databricks workspace. Here is a sample using Azure Blob Storage: Azure Blob storage - Azure Databricks | Microsoft Docs

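A sketch of what that mount looks like in practice. The helper just assembles the wasbs source URL and account-key config name; the account, container, secret scope, and mount point below are placeholders, and `dbutils.fs.mount` itself is only available inside a Databricks notebook:

```python
def blob_mount_args(account, container):
    """Build the source URL and account-key config name for an Azure Blob mount."""
    source = f"wasbs://{container}@{account}.blob.core.windows.net"
    conf_key = f"fs.azure.account.key.{account}.blob.core.windows.net"
    return source, conf_key

# In a Databricks notebook:
# source, conf_key = blob_mount_args("mystorageacct", "mycontainer")
# dbutils.fs.mount(
#     source=source,
#     mount_point="/mnt/blob",
#     extra_configs={conf_key: dbutils.secrets.get("my-scope", "storage-key")},
# )
```

Keeping the storage key in a secret scope (rather than inline) is the usual practice.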
Laniel
by New Contributor
  • 1451 Views
  • 1 replies
  • 0 kudos

How do you get the cost of a notebook run?

How do you get the cost of a notebook run?

Latest Reply
Rheiman
Contributor II
  • 0 kudos

You can check your cloud provider's portal. Go to the subscription > costs field and you should be able to see the costs of the VMs and Databricks. For more granular information, consider installing Overwatch. Environment Setup :: Overwatch (databrick...

Reabouri
by New Contributor
  • 1444 Views
  • 1 replies
  • 1 kudos
Latest Reply
Rheiman
Contributor II
  • 1 kudos

Table ACLs, hashing, anonymization, and pseudonymization of PII, to name a few. You can learn all of this in the Databricks Academy course for professional data engineering.

harrisriaz
by New Contributor
  • 3971 Views
  • 2 replies
  • 5 kudos

Resolved! what are the key Data engineering problems that databricks solve?

What problems does Databricks address from a typical data engineering perspective, compared with other cloud DE tools?

Latest Reply
Rheiman
Contributor II
  • 5 kudos

Annoying things Databricks solves:
  • Sane data movement (fast parallelized compute, table versioning and history)
  • Environment management (spark + delta + java installed out of the box)
  • Cost and job monitoring (Overwatch)
I've only worked with it for 6 m...

1 More Replies
