Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

v_n66
by New Contributor III
  • 3029 Views
  • 8 replies
  • 2 kudos

Resolved! dbutils.notebooks.exit() is not returning results to the variable only on some notebooks

dbutils.notebooks.exit() is not returning results to the variable, but only in some notebooks. The issue is parent-child notebook communication; I need a solution.

Latest Reply
v_n66
New Contributor III
  • 2 kudos

@SteveW @lorenzoscandola It's been working without any issue since yesterday. Thank you guys for the support.

7 More Replies
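Worth noting for anyone landing on this thread: the child-to-parent handoff only works via `dbutils.notebook.run()` (a `%run` cell does not capture a return value), and `dbutils.notebook.exit()` returns a single string, so structured results are typically JSON-encoded. A minimal sketch of the pattern — the `dbutils` calls are shown as comments because they exist only inside a Databricks notebook:

```python
import json

# Child notebook side: serialize the result and hand it back as one string.
def child_result():
    result = {"status": "ok", "rows_processed": 1234}
    payload = json.dumps(result)
    # In the actual child notebook, the last line would be:
    # dbutils.notebook.exit(payload)
    return payload

# Parent notebook side: run the child and decode its exit value.
def parse_child_output(raw):
    # In the actual parent notebook, `raw` would come from:
    # raw = dbutils.notebook.run("./child_notebook", timeout_seconds=600)
    return json.loads(raw)

payload = parse_child_output(child_result())
```

The notebook path and timeout above are placeholders; the point is only the serialize-exit-decode round trip.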
Rishabh-Pandey
by Databricks MVP
  • 16712 Views
  • 8 replies
  • 8 kudos

Resolved! Connect Databricks to Teradata

Hey, I want to know whether we can connect Databricks to a Teradata database, and if yes, what the procedure would be. Help would be appreciated.

Latest Reply
BroData
New Contributor II
  • 8 kudos

There are two main ways to connect to Teradata from Databricks using Python.
Way 1: Using Python libraries (e.g., sqlalchemy, pyjdbc, pyodbc, jaydebeapi, and so on)
Pros: Provides a comprehensive solution, allowing us to: query data, trigger stored pro...

7 More Replies
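Following up on the JDBC route mentioned in the accepted answer: a minimal sketch of reading a Teradata table through Spark's generic JDBC source. Host, credentials, and table name are placeholders, and the Teradata JDBC driver JAR must be attached to the cluster; the `spark.read` call is left as a comment since it only runs on a cluster.

```python
# Sketch: reading Teradata through Spark's generic JDBC source.
# All connection values below are placeholders.
jdbc_options = {
    "url": "jdbc:teradata://td-host.example.com/DATABASE=mydb",
    "driver": "com.teradata.jdbc.TeraDriver",
    "dbtable": "mydb.my_table",
    "user": "td_user",
    "password": "td_password",
}

# On a Databricks cluster (with the Teradata driver JAR installed):
# df = spark.read.format("jdbc").options(**jdbc_options).load()
```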
Nick_Pacey
by New Contributor III
  • 4636 Views
  • 3 replies
  • 1 kudos

Issue when trying to create a Foreign Catalog to an On-Prem SQL Server Instance

Hi, we are creating a lakehouse federated connection to our 2016 on-prem SQL Server. It has a named instance in place, so we only want and need to connect to that instance. From this connection, we want to create a foreign catalog of a database on the ...

Latest Reply
trueray_3150
New Contributor II
  • 1 kudos

Hi @Nick_Pacey, thank you. I already did that, using this in the code:
jdbc_url = "jdbc:sqlserver://999.99.999.99\\instance:7777;encrypt=true;trustServerCertificate=true;database=mydatabase"
jdbc_username = "myusername"
jdbc_password = "mypassword"
jdbc_driver =...

2 More Replies
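For reference, the declarative route the original question was after — a connection plus a foreign catalog — looks roughly like the following. This is only a sketch held as SQL strings: the host, port, credentials, and names are placeholders, and the exact OPTIONS keys (including how a named instance is addressed) should be checked against the Lakehouse Federation docs.

```python
# Sketch of Lakehouse Federation DDL; all identifiers and values are placeholders.
create_connection = """
CREATE CONNECTION sqlserver_onprem TYPE sqlserver
OPTIONS (
  host 'myhost.example.com',
  port '1433',
  user 'myusername',
  password 'mypassword'
)
"""

create_catalog = """
CREATE FOREIGN CATALOG sqlserver_cat
USING CONNECTION sqlserver_onprem
OPTIONS (database 'mydatabase')
"""

# On Databricks these would be executed with spark.sql(create_connection), etc.
```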
Direo
by Contributor II
  • 39565 Views
  • 6 replies
  • 1 kudos

Resolved! Importing CA certificate into a Databricks cluster

Hi! I was following the guide outlined here: https://kb.databricks.com/en_US/python/import-custom-ca-cert (also tried this: https://stackoverflow.com/questions/73043589/configuring-tls-ca-on-databricks) to add a CA root certificate to a Databricks cluster, but...

Latest Reply
jash281098
New Contributor II
  • 1 kudos

@Debayan One question: will the same approach work for a JKS file containing a private key certificate for X.509 authentication to a MongoDB Atlas database? The usual way of adding the Spark configs below is not working: spark.driver.extraJavaOptions -Djavax.net.ssl.ke...

5 More Replies
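The KB article's approach boils down to appending the root CA's PEM block to whichever trust bundle the failing client actually reads (for `requests`, that's certifi's `cacert.pem`; JVM clients need a keystore instead, which is the separate question raised above). A minimal sketch of the append step with placeholder paths — on a real cluster this usually lives in an init script, paired with pointing clients at the bundle via `REQUESTS_CA_BUNDLE` / `SSL_CERT_FILE`:

```python
import shutil

# Placeholder paths: where the custom CA was uploaded, and the bundle the
# failing client reads.
CUSTOM_CA = "/dbfs/certs/my_root_ca.pem"
CA_BUNDLE = "/usr/local/share/ca-bundle.crt"

def append_ca(custom_ca: str, bundle: str) -> None:
    """Append the custom root CA's PEM block to an existing CA bundle."""
    with open(custom_ca, "rb") as src, open(bundle, "ab") as dst:
        dst.write(b"\n")          # keep PEM blocks separated
        shutil.copyfileobj(src, dst)
```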
Stentone
by New Contributor III
  • 1382 Views
  • 3 replies
  • 0 kudos

DLT Direct Publish Mode does not Handle Constraint Dependencies

I'm having some issues with direct publish mode when defining a DLT workflow that includes tables whose schemas define foreign key constraints. When the foreign key constraints reference tables that are not directly defined in any joins of the ...

Latest Reply
lingareddy_Alva
Esteemed Contributor
  • 0 kudos

@Stentone This is a tricky situation where you want to leverage the metadata benefits (like the ERD visualization) without running into execution dependencies. Let me help you solve this issue. The error suggests that DLT is trying to validate the for...

2 More Replies
HQJaTu
by New Contributor III
  • 3397 Views
  • 3 replies
  • 2 kudos

Custom container doesn't launch systemd

Quite soon after moving from VMs to containers, I started crafting my own images. That way notebooks have all the necessary libraries already there, with no need to do any pip-installing in the notebook. As requirements get more complex, I'm now at ...

Latest Reply
futurewasfree
New Contributor II
  • 2 kudos

Are there any updates on this? I'm also very interested in having full-fledged Databricks system services integrated into DCS.

2 More Replies
Enrique1987
by New Contributor III
  • 4575 Views
  • 2 replies
  • 3 kudos

Resolved! When to enable Photon, and when not to?

Photon appears as an option to check and uncheck as appropriate. Using Photon leads to higher consumption of DBUs and higher costs. At what point does it pay off, and when should it not be enabled? More cost for the use of Photon, but at the same time less...

Latest Reply
sunlight
New Contributor II
  • 3 kudos

Hi all, based on the discussion: can we load a huge flat file (CSV, 10 GB) using the Photon-accelerated runtime? Just dump that file into Delta from cloud storage like S3 or Blob Storage. Is this one of those ideal use cases for using Photon, where it will be ...

1 More Replies
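A useful way to frame the question in this thread: Photon bills a higher DBU rate, so it pays off exactly when the speed-up outweighs the rate multiplier. A back-of-envelope check — the 2x multiplier below is illustrative, not published pricing:

```python
# Illustrative breakeven: on a fixed cluster, cost is proportional to
# runtime x DBU rate, so Photon wins only if runtime shrinks by more
# than the rate goes up. The default multiplier is a placeholder.
def photon_pays_off(base_runtime_hr, photon_runtime_hr, dbu_multiplier=2.0):
    return photon_runtime_hr * dbu_multiplier < base_runtime_hr

# A 10 h scan-heavy job that Photon cuts to 4 h at 2x DBUs costs the
# equivalent of 8 h of non-Photon DBUs -- cheaper. Cut only to 6 h, and
# it is not (12 h equivalent).
```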
Optum
by Databricks Partner
  • 14324 Views
  • 9 replies
  • 5 kudos

Databricks JDBC & Remote Write

Hello, I'm trying to write to a Delta table in my Databricks instance from a remote Spark session on a different cluster with the Simba Spark driver. I can do reads, but when I attempt a write, I get the following error: {  df.write.format("jdbc...

Latest Reply
RoK1
New Contributor II
  • 5 kudos

Any update on the issue?

8 More Replies
Fraip
by New Contributor
  • 3972 Views
  • 1 replies
  • 0 kudos

Unable to read files from or write to an external S3 location (Databricks Free Trial)

Hi! I'm trying the Databricks free trial and tried to link it to an S3 bucket I set up, but I get errors related to serverless policies and unauthorized access whether I try to read or write to S3. I have no problem just listing the files that exi...

Latest Reply
MuthuLakshmi
Databricks Employee
  • 0 kudos

Your error may be caused by serverless network policy restrictions and/or missing S3 permissions. In the free trial, you cannot use your own S3 buckets with serverless compute. For full access, use a paid workspace and configure both network policy a...

BalaRamesh
by New Contributor II
  • 1102 Views
  • 3 replies
  • 0 kudos

Delta Live Tables - if there is no target schema defined, where will live tables be created?

Currently I am working with Delta Live Tables. One of my ex-team members designed the job and left the target schema empty in the destination settings (Settings --> Destination --> Target Schema). Where will Delta Live Tables be created if it is empt...

Latest Reply
MuthuLakshmi
Databricks Employee
  • 0 kudos

@BalaRamesh If you have a catalog specified, there will be a storage location for it, and you will see the MV created there. Refer to this doc to understand storage locations: https://docs.databricks.com/aws/en/connect/unity-catalog/cloud-storage/m...

2 More Replies
eyalo
by New Contributor II
  • 2361 Views
  • 1 replies
  • 0 kudos

Ingest from FTP server doesn't work

Hi, I am trying to connect to my FTP server and store the files in a dataframe with the following code:
%pip install ftputil
from ftputil import FTPHost
Host = "92.118.67.49"
Login = "StrideNBM-DF_BO"
Passwd = "Sdf123456"
ftp_dir = "/dwh-reports/"
with FTPHost(...

Latest Reply
Kayla
Valued Contributor II
  • 0 kudos

I'm afraid I don't have an answer, and I know this is an old post, but if you haven't already, I'd recommend changing the password if that is/was a genuine password.

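For what it's worth, the download itself can be done with the stdlib `ftplib` (no `%pip install` needed). A sketch that separates the transfer from the FTP session so the buffering logic stays testable — host, credentials, and path below are placeholders, and as the reply notes, never post real credentials publicly:

```python
import io

def download_to_buffer(retr):
    """Collect FTP data chunks into an in-memory buffer.

    `retr` is any callable that takes a per-chunk writer callback, e.g.
    lambda cb: ftp.retrbinary("RETR /dwh-reports/report.csv", cb).
    """
    buf = io.BytesIO()
    retr(buf.write)
    buf.seek(0)
    return buf

# Real usage sketch (placeholders throughout):
# from ftplib import FTP
# with FTP("ftp.example.com") as ftp:
#     ftp.login("user", "password")
#     data = download_to_buffer(
#         lambda cb: ftp.retrbinary("RETR /dwh-reports/report.csv", cb))
```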
Sudheerreddy25
by New Contributor II
  • 7537 Views
  • 8 replies
  • 1 kudos

Resolved! Exam got suspended in the middle without any reason

Hi Team, my Databricks Certified Data Engineer Associate (Version 3) exam got suspended on 25th August and it is in an in-progress state. I was continuously in front of the camera when suddenly an alert appeared, and the support person asked me to show the...

Latest Reply
Sneha2
New Contributor II
  • 1 kudos

Hi Team, during the exam I was asked to show my room from all four sides, which I did promptly. There was no one else in my room, no background noise, and no inappropriate behavior or activity of any kind. Despite my compliance, my exam was unexpecte...

7 More Replies
alsetr
by Databricks Partner
  • 904 Views
  • 1 replies
  • 0 kudos

Disable Databricks-generated error messages

Since Databricks Runtime 12.2, Databricks has started to wrap Spark exceptions in its own exceptions: https://learn.microsoft.com/en-us/azure/databricks/error-messages/ While for some users it might be handy, for our team it is not convenient, as we canno...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

Databricks does not use vanilla Spark. They added optimizations like AQE, Unity Catalog, etc. So looking for the error in the Spark source code will not always work (though in a lot of cases it will).

deano2025
by New Contributor II
  • 1722 Views
  • 2 replies
  • 0 kudos

Resolved! How to create an external location that accesses a public s3 bucket

Hi, I'm trying to create an external location that accesses a public S3 bucket (for open data). However, I'm not having any success. I'm confused about what to specify as the storage credential (IAM role), since it's a public bucket that is out of my contr...

Latest Reply
deano2025
New Contributor II
  • 0 kudos

Thanks @Isi Now that you've explained external locations, I think it does indeed make sense that they are probably unnecessary in this case. Thanks for clarifying!

1 More Replies
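For completeness: one commonly used way to read a public bucket on classic (non-serverless) compute, without any storage credential, is Hadoop's anonymous credentials provider set as Spark config rather than an external location — which matches the thread's conclusion that an external location is unnecessary here. A sketch (the bucket path is a placeholder, and this may not apply on serverless compute):

```python
# Spark/Hadoop config for anonymous (unauthenticated) S3A access to a
# public bucket. Held as a dict here; applied via spark.conf on a cluster.
spark_conf = {
    "fs.s3a.aws.credentials.provider":
        "org.apache.hadoop.fs.s3a.AnonymousAWSCredentialsProvider",
}

# On a cluster:
# for k, v in spark_conf.items():
#     spark.conf.set(k, v)
# df = spark.read.option("header", "true").csv("s3a://some-public-bucket/path/")
```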
carlos_tasayco
by Contributor
  • 1496 Views
  • 1 replies
  • 0 kudos

Materializing tables in custom schemas is not supported.

Hello, I have been seeing this: https://www.databricks.com/blog/publish-multiple-catalogs-and-schemas-single-dlt-pipeline Now DLT pipelines support multiple schemas; however, it is not working in my case. Did I do something wrong? Thanks in advance.

Latest Reply
MauricioS
Databricks Partner
  • 0 kudos

Hi Carlos, hope you are doing well. Did you get any update on this issue? I'm currently running into the same problem.
