Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

HQJaTu
by New Contributor III
  • 2584 Views
  • 3 replies
  • 2 kudos

Custom container doesn't launch systemd

Quite soon after moving from VMs to containers, I started crafting my own images. That way notebooks have all the necessary libraries already there, with no need to do any pip-installing in the notebook. As requirements get more complex, now I'm at ...

Latest Reply
futurewasfree
New Contributor II
  • 2 kudos

Are there any updates on this? I'm also very interested in having full-fledged Databricks system services integrated into DCS.

2 More Replies
Enrique1987
by New Contributor III
  • 2849 Views
  • 2 replies
  • 3 kudos

Resolved! When to activate Photon and when not to?

Photon appears as an option to check and uncheck as appropriate. The use of Photon leads to higher consumption of DBUs and higher costs. At what point does it pay off, and when should it not be enabled? More costs for the use of Photon, but at the same time less...

Latest Reply
sunlight
New Contributor II
  • 3 kudos

Hi all, based on the discussion, can we load a huge flat file (CSV, 10 GB) using a Photon-accelerated runtime? Just dump that file into Delta from cloud storage like S3 or Blob Storage. Is this one of those ideal use cases for using Photon where it will be ...
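A minimal notebook sketch of the CSV-to-Delta load being asked about (assuming a Photon-enabled cluster, the `spark` session that Databricks notebooks predefine, and hypothetical S3 paths and table names; Photon is applied automatically for supported operators, so the code itself does not change):

```python
# Sketch only: paths, options, and the target table name are illustrative.
# `spark` is the SparkSession Databricks notebooks predefine.
df = (spark.read
      .format("csv")
      .option("header", "true")
      .option("inferSchema", "true")  # for a 10 GB file, an explicit schema avoids a second scan
      .load("s3://my-bucket/landing/huge_file.csv"))

(df.write
   .format("delta")
   .mode("overwrite")
   .saveAsTable("main.bronze.huge_file"))
```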

1 More Replies
Optum
by New Contributor III
  • 11306 Views
  • 9 replies
  • 5 kudos

Databricks JDBC & Remote Write

Hello, I'm trying to write to a Delta table in my Databricks instance from a remote Spark session on a different cluster with the Simba Spark driver. I can do reads, but when I attempt a write, I get the following error: {  df.write.format("jdbc...

Latest Reply
RoK1
New Contributor II
  • 5 kudos

Any update on the issue?

8 More Replies
maddan80
by New Contributor II
  • 1503 Views
  • 4 replies
  • 3 kudos

Oracle Essbase connectivity

Team, I wanted to understand the best way of connecting to Oracle Essbase to ingest data into Delta Lake.

Latest Reply
BigRoux
Databricks Employee
  • 3 kudos

I would start by looking at Oracle DataDirect ODBC. It is optimized for Oracle and it supports Essbase. I believe this driver is included with Essbase. Hope this helps. Lou.

3 More Replies
Fraip
by New Contributor
  • 1117 Views
  • 1 reply
  • 0 kudos

Unable to read or write files from an external S3 location (Databricks Free Trial)

Hi! I'm trying the Databricks free trial and I tried to link it to an S3 bucket I set up, but I get errors related to serverless policies and unauthorized access whenever I try to read or write to S3, though I have no problem just listing the files that exi...

Latest Reply
MuthuLakshmi
Databricks Employee
  • 0 kudos

Your error may be caused by serverless network policy restrictions and/or missing S3 permissions. In the free trial, you cannot use your own S3 buckets with serverless compute. For full access, use a paid workspace and configure both network policy a...

BalaRamesh
by New Contributor II
  • 390 Views
  • 3 replies
  • 0 kudos

Delta Live Tables - if no target schema is defined, where will live tables be created?

Currently I am working with Delta Live Tables. One of my ex-team members designed the job and left the target schema empty in the destination settings (Settings --> Destination --> Target Schema). Where will Delta Live Tables be created if it is empt...

Latest Reply
MuthuLakshmi
Databricks Employee
  • 0 kudos

@BalaRamesh If you have a catalog specified, there will be a storage location for it, and you will see the MV created there. Refer to this doc to understand storage locations: https://docs.databricks.com/aws/en/connect/unity-catalog/cloud-storage/m...

2 More Replies
Garrus990
by New Contributor II
  • 1002 Views
  • 2 replies
  • 1 kudos

How to run a python task that uses click for CLI operations

Hey, in my application I am using Click to facilitate CLI operations. It works locally, in notebooks, and when scripts are run locally, but it fails in Databricks. I defined a task that, as an entrypoint, accepts the file where the click-decorated functio...

  • 1002 Views
  • 2 replies
  • 1 kudos
Latest Reply
VZLA
Databricks Employee
  • 1 kudos

The SystemExit issue you’re seeing is typical with Click, as it’s designed for standalone CLI applications and automatically calls sys.exit() after running a command. This behavior can trigger SystemExit exceptions in non-CLI environments, like Datab...
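A minimal, stdlib-only sketch of the workaround described above (the entrypoint here is a stand-in for a Click-decorated command, not real Click code; with real Click you can also invoke the command with `standalone_mode=False` so it returns instead of exiting):

```python
import sys

def cli_entrypoint():
    # Stand-in for a Click command: Click calls sys.exit() when the command
    # finishes, which raises SystemExit inside a Databricks task.
    print("running CLI logic")
    sys.exit(0)

exit_code = None
try:
    cli_entrypoint()
except SystemExit as e:
    # Swallow a clean exit so the task succeeds; re-raise real failures.
    exit_code = 0 if e.code is None else e.code
    if exit_code != 0:
        raise
```

With real Click, calling `my_command.main(args, standalone_mode=False)` avoids the `sys.exit()` call entirely, so no exception handling is needed.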

1 More Replies
nchittampelly
by New Contributor
  • 844 Views
  • 1 reply
  • 0 kudos

What is the best way to connect Oracle CRM Cloud from Databricks?

What is the best way to connect Oracle CRM Cloud from Databricks?

Latest Reply
SP_6721
Contributor
  • 0 kudos

Hi @nchittampelly Follow these steps: Use a compatible JDBC driver, such as the CData Oracle Sales Cloud JDBC or HCM Cloud JDBC driver. Upload the driver to Databricks: add the driver JAR to your cluster via the Libraries tab. For Un...

eyalo
by New Contributor II
  • 1633 Views
  • 1 reply
  • 0 kudos

Ingest from FTP server doesn't work

Hi, I am trying to connect to my FTP server and store the files in a dataframe with the following code:
%pip install ftputil
from ftputil import FTPHost
Host = "92.118.67.49"
Login = "StrideNBM-DF_BO"
Passwd = "Sdf123456"
ftp_dir = "/dwh-reports/"
with FTPHost(...

Latest Reply
Kayla
Valued Contributor II
  • 0 kudos

I'm afraid I don't have an answer, and I know this is an old post, but if you haven't already, I'd recommend changing the password if that is/was a genuine password.
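Beyond rotating the exposed password, a stdlib-only sketch of keeping credentials out of the notebook (function and environment-variable names are illustrative; on Databricks a secret scope read via `dbutils.secrets.get(scope, key)` would be the usual home for these values):

```python
import os
from ftplib import FTP

def list_ftp_dir(host, user, passwd, remote_dir):
    """Return the file names under remote_dir on the given FTP server."""
    with FTP(host) as ftp:
        ftp.login(user=user, passwd=passwd)
        ftp.cwd(remote_dir)
        return ftp.nlst()

# Credentials come from the environment (or a Databricks secret scope),
# never from literals committed to a notebook.
ftp_host = os.environ.get("FTP_HOST", "")
ftp_user = os.environ.get("FTP_USER", "")
ftp_passwd = os.environ.get("FTP_PASSWD", "")
```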

Sudheerreddy25
by New Contributor II
  • 5429 Views
  • 8 replies
  • 1 kudos

Resolved! Exam got suspended in the middle without any reason.

Hi Team, my Databricks Certified Data Engineer Associate (Version 3) exam got suspended on 25th August and is in an in-progress state. I was continuously in front of the camera when suddenly the alert appeared, and the support person asked me to show the...

Latest Reply
Sneha2
New Contributor II
  • 1 kudos

Hi Team, during the exam I was asked to show my room from all four sides, which I did promptly. There was no one else in my room, no background noise, and no inappropriate behavior or activity of any kind. Despite my compliance, my exam was unexpecte...

7 More Replies
alsetr
by New Contributor II
  • 674 Views
  • 3 replies
  • 0 kudos

Executor OOM Error with AQE enabled

We have a Databricks Spark job. After migrating from Databricks Runtime 10.4 to 15.4, one of our Spark jobs that uses a broadcast hint started to fail with this error:
```
ERROR Executor: Exception in task 2.0 in stage 371.0 (TID 16912)
org.apache.spark.memory.S...
```

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

Ok, that is up to you. An executor will not be able to take all the RAM. You can try to work with the spark.executor parameters.
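For reference, a sketch of the kind of cluster-level Spark configuration being suggested (the values are illustrative and must fit the node type; `spark.executor.memory`, `spark.executor.cores`, and `spark.executor.memoryOverhead` are standard Spark properties, set in the cluster's Spark config):

```
spark.executor.memory 20g
spark.executor.cores 4
spark.executor.memoryOverhead 4g
```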

2 More Replies
alsetr
by New Contributor II
  • 367 Views
  • 1 reply
  • 0 kudos

Disable Databricks-generated error messages

Since Databricks Runtime 12.2, Databricks has started to wrap Spark exceptions in its own exceptions: https://learn.microsoft.com/en-us/azure/databricks/error-messages/
While for some users it might be handy, for our team it is not convenient, as we canno...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

Databricks does not use vanilla Spark. They added optimizations like AQE, Unity Catalog, etc. So looking for the error in the Spark source code will not always work (though in a lot of cases it will).

deano2025
by New Contributor II
  • 555 Views
  • 2 replies
  • 0 kudos

Resolved! How to create an external location that accesses a public s3 bucket

Hi, I'm trying to create an external location that accesses a public S3 bucket (for open data). However, I'm not having any success. I'm confused about what to specify as the storage credential (IAM role), since it's a public bucket that is out of my contr...

Latest Reply
deano2025
New Contributor II
  • 0 kudos

Thanks @Isi Now that you've explained external locations, I think it does indeed make sense that they are probably unnecessary in this case. Thanks for clarifying!

1 More Replies
carlos_tasayco
by New Contributor III
  • 434 Views
  • 1 reply
  • 0 kudos

Materializing tables in custom schemas is not supported.

Hello, I have been seeing this: https://www.databricks.com/blog/publish-multiple-catalogs-and-schemas-single-dlt-pipeline
Now DLT pipelines support multiple schemas; however, it is not working in my case. Did I do something wrong? Thanks in advance.

Latest Reply
MauricioS
New Contributor III
  • 0 kudos

Hi Carlos, hope you are doing well. Did you get any update on this issue? I'm currently running into the same problem.

ankitmit
by New Contributor III
  • 2035 Views
  • 6 replies
  • 3 kudos

DLT Apply Changes

Hi, in DLT, how do we specify which columns we don't want to overwrite when using the "apply changes" operation (in the attached example, we want to avoid overwriting the "created_time" column)? I am using this sample code: dlt.apply_changes(...

Latest Reply
juice
New Contributor II
  • 3 kudos

I'm experiencing the same issue, where the columns added to except_column_list are being dropped instead of ignored.
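For context, a sketch of the call shape under discussion (table and column names are reconstructed from the thread's description, not verified code; `except_column_list` is the documented `dlt.apply_changes` parameter for excluding columns, and per the reports above the excluded columns may end up dropped from the target rather than left untouched):

```python
import dlt
from pyspark.sql.functions import col

dlt.create_streaming_table("target")

dlt.apply_changes(
    target="target",
    source="source",
    keys=["id"],
    sequence_by=col("updated_time"),
    # Intent here is "never overwrite created_time"; the behavior reported in
    # this thread is that the excluded column is dropped instead.
    except_column_list=["created_time"],
)
```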

5 More Replies
