Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

rijin-thomas
by New Contributor II
  • 761 Views
  • 4 replies
  • 3 kudos

Mongo Db connector - Connection timeout when trying to connect to AWS Document DB

I am on Databricks Runtime 14.3 LTS (Spark 3.5.0, Scala 2.12) with mongodb-spark-connector_2.12:10.2.0. Trying to connect to DocumentDB using the connector, all I get is a connection timeout. I tried using PyMongo, which works as expected, and I can ...

Latest Reply
Sanjeeb2024
Valued Contributor
  • 3 kudos

Hi @rijin-thomas - Can you please allow the CIDR block of the Databricks account VPC in the AWS DocumentDB security group (the executor connectivity point stated by @bianca_unifeye)?

3 More Replies
tvdh
by New Contributor II
  • 279 Views
  • 1 reply
  • 1 kudos

Resolved! Tab navigation between fields in dashboards is random

Tab navigation between fields in published dashboards seems very random. I have a dashboard with multiple text input fields (mapped to query parameters / filters). I expect to move logically between them when pressing Tab (keyboard navigation), but I mo...

Latest Reply
Advika
Community Manager
  • 1 kudos

Hello @tvdh! You can share this as product feedback so it’s visible to the Databricks product team and can be tracked and prioritized.

Upendra_Dwivedi
by Databricks Partner
  • 3716 Views
  • 2 replies
  • 1 kudos

Resolved! Databricks App OBO User Authorization

Hi all, we are using the on-behalf-of user authorization method for our app, and the x-forwarded-access-token expires after some time, forcing us to redeploy the app to rectify the issue. I am not sure what the issue is or how we can keep the token aliv...

Latest Reply
jpt
New Contributor II
  • 1 kudos

I am confronted with a similar error. I am also using OBO user auth and access the token via st.context.headers.get('x-forwarded-access-token') for every query, without saving it in a cache. Still, after 1 hour, I am hit with the...

1 More Replies
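The pattern the replies above converge on can be sketched as a small helper: read the forwarded OBO token fresh on every request instead of caching it, since the token the Databricks Apps proxy forwards is short-lived (the thread reports roughly one hour). The header name `x-forwarded-access-token` comes from the thread; the helper function and its error message are illustrative, not an official API.

```python
# Minimal sketch: read the per-request OBO token from forwarded headers.
# In a Streamlit Databricks App the real headers come from st.context.headers;
# a plain dict stands in here so the helper is framework-agnostic.
def get_obo_token(headers) -> str:
    """Return the user's forwarded OAuth token, failing loudly if absent."""
    token = headers.get("x-forwarded-access-token")
    if not token:
        # Hypothetical error message; adapt to your app's error handling.
        raise RuntimeError("No OBO token forwarded; is user authorization enabled?")
    return token

# Fetch this per request (do not cache): the token expires, so a cached copy
# eventually produces the auth errors described in this thread.
```

In a Streamlit app this would be called as `get_obo_token(st.context.headers)` inside each request, rather than once at startup.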
Ved88
by Databricks Partner
  • 697 Views
  • 5 replies
  • 1 kudos

Databricks all-purpose cluster

Getting the below error while executing a notebook: "Failure starting repl. Try detaching and re-attaching the notebook." I can see the cluster has all libraries installed.

Latest Reply
Ved88
Databricks Partner
  • 1 kudos

Hi, we are not using the Hive metastore anywhere, so I'm not sure why that host ((host=consolidated-westeuropec2-prod-metastore-0.mysql.database.azure.com)(port=3306)) is coming up in the driver log. Will I need to whitelist it? We have another use case simi...

4 More Replies
csondergaardp
by New Contributor II
  • 616 Views
  • 2 replies
  • 2 kudos

Resolved! [PATH_NOT_FOUND] Structured Streaming uses wrong checkpoint location

I'm trying to perform a simple example using Structured Streaming on a directory created as a Volume. The use case is purely educational; I am investigating various forms of triggers. Basic info: Catalog: "dev_catalog", Schema: "stream", Volume: "streamin...

Latest Reply
cgrant
Databricks Employee
  • 2 kudos

Your checkpoint code looks correct. What is the source of `df`? Is it `/Volumes/dev_catalog/default/streaming_basics/`? The path looks incorrect - add `stream` to it.

1 More Replies
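The reply above hinges on the stream source and its checkpoint agreeing on the Volume's catalog/schema/volume segments (the poster's schema `stream` was missing from one path). A minimal sketch, assuming the names from the thread, builds both paths from one set of components so they cannot silently diverge; the Spark calls are shown as comments since they need a live session.

```python
# Build Unity Catalog Volume paths from one set of components so the stream
# source and its checkpoint cannot disagree on catalog/schema/volume.
def volume_path(catalog: str, schema: str, volume: str, *parts: str) -> str:
    return "/".join(["/Volumes", catalog, schema, volume, *parts])

# Names from the thread (not defaults): dev_catalog / stream / streaming_basics.
# The "input" and "_checkpoint" subfolders are illustrative.
source = volume_path("dev_catalog", "stream", "streaming_basics", "input")
checkpoint = volume_path("dev_catalog", "stream", "streaming_basics", "_checkpoint")

# In a notebook (sketch, assuming a CSV Auto Loader source and a target table):
#   df = (spark.readStream.format("cloudFiles")
#         .option("cloudFiles.format", "csv").load(source))
#   (df.writeStream.option("checkpointLocation", checkpoint)
#      .toTable("dev_catalog.stream.target"))
```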
HarishKumarM
by New Contributor
  • 632 Views
  • 1 reply
  • 0 kudos

Resolved! Zerobus Connector Issue

I was trying to implement the example posted at the link below for the Zerobus connector, to test its functionality on my Free Edition workspace, but unfortunately I am getting the below error. Reference code: https://learn.microsoft.com/en-us/azure/databricks/...

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Hey @HarishKumarM, I did some digging and found some helpful information for troubleshooting. What the error means: your workspace isn't currently enrolled in the Zerobus Ingest preview. Even though Zerobus is labeled a Public Preview, it's st...

RevanthV
by Contributor
  • 483 Views
  • 3 replies
  • 3 kudos

Resolved! Data validation with df writes using append mode

Hi team, recently I came across a situation where I had to write a huge amount of data, and it took 6 hrs to complete. Later, when I checked the target data, I saw 20% of the total records were written incorrectly or corrupted, because the source data itself was corr...

Latest Reply
RevanthV
Contributor
  • 3 kudos

Hey @K_Anudeep, thanks a lot for tagging me into the GitHub issue. This is exactly the "validate and commit" feature I want, and I see you have already raised a PR for the same with a new option called . I will try this out and check if it satisfie...

2 More Replies
ramsai
by New Contributor II
  • 1009 Views
  • 5 replies
  • 2 kudos

Resolved! Updating Job Creator to Service Principal

Regarding data governance best practices: I have jobs created by a user who has left the organization, and I need to change the job creator to a service principal. Currently, it seems the only option is to clone the job and update it. Is this the rec...

Latest Reply
Sanjeeb2024
Valued Contributor
  • 2 kudos

I agree with @nayan_wylde; for auditing, the creator is important and it should be immutable by nature.

4 More Replies
jfvizoso
by New Contributor II
  • 13577 Views
  • 6 replies
  • 0 kudos

Can I pass parameters to a Delta Live Table pipeline at running time?

I need to execute a DLT pipeline from a Job, and I would like to know if there is any way of passing a parameter. I know you can have settings in the pipeline that you use in the DLT notebook, but it seems you can only assign values to them when crea...

Latest Reply
Sudharsan
New Contributor II
  • 0 kudos

@DeepakAI: May I know how you resolved it?

5 More Replies
Dhruv-22
by Contributor III
  • 637 Views
  • 2 replies
  • 1 kudos

Resolved! BUG: Merge with schema evolution doesn't work in update clause

I am referring to this link in the Databricks documentation; here is a screenshot of the same. According to the documentation, the UPDATE command should work when the target table doesn't have the column but it is present in the source. I tried the same with ...

Latest Reply
Dhruv-22
Contributor III
  • 1 kudos

Hi @iyashk-DB, thanks for the response; it will help in resolving the issue. But can you mark this as a bug and report it? Specifying just the column without the table name is a little risky.

1 More Replies
SaugatMukherjee
by New Contributor III
  • 1913 Views
  • 2 replies
  • 1 kudos

Resolved! Structured streaming for iceberg tables

According to https://iceberg.apache.org/docs/latest/spark-structured-streaming/ , we can stream from Iceberg tables. I have ensured that my source table is Iceberg version 3, but no matter what I do, I get "Iceberg does not support streaming reads". Looki...

Latest Reply
SaugatMukherjee
New Contributor III
  • 1 kudos

Hi, Iceberg streaming is possible in Databricks; one does not need to change to Delta Lake. In my previous attempt, I used "load" while reading the source Iceberg table. One should instead use "table": load apparently takes a path and not a ta...

1 More Replies
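The resolution above (address the Iceberg source by table name via `table`, not by path via `load`) can be sketched as follows. The Spark calls are comments since they need a live session, and the helper is just an illustrative guard distinguishing a three-level table name from a filesystem path; the table and path names are placeholders.

```python
# Illustrative guard: streaming reads of an Iceberg source should address it
# by a catalog.schema.table name, not a filesystem path.
def looks_like_table_name(ref: str) -> bool:
    """True for 'catalog.schema.table'-style references, False for paths."""
    return not ref.startswith("/") and ref.count(".") == 2

# In a notebook (hypothetical names):
#   df = spark.readStream.table("main.lake.events_iceberg")   # by name: streams
#   df = spark.readStream.load("/Volumes/main/lake/events")   # load() expects a path,
#                                                             # which is what failed here
```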
rcatelli
by New Contributor
  • 440 Views
  • 1 reply
  • 0 kudos

OBO auth implementation in Streamlit not working

Hello, I am currently trying to implement OBO auth in a Streamlit Databricks app but I'm getting the following error message: requests.exceptions.HTTPError: 400 Client Error: PERMISSION_DENIED: User does not have USE CATALOG on Catalog '...'. Config: host=, a...

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

Hi @rcatelli, here's a quick example: https://docs.databricks.com/aws/en/dev-tools/databricks-apps/auth#user-authorization and https://docs.databricks.com/aws/en/dev-tools/databricks-apps/auth#example-query-with-user-authorization . Get the user token from...

Naren1
by New Contributor
  • 459 Views
  • 1 reply
  • 1 kudos

Resolved! Cluster Config

Hi, can we pass a parameter into the job activity from the ADF side to change the environment inside the job cluster configuration?

Latest Reply
K_Anudeep
Databricks Employee
  • 1 kudos

Hello @Naren1, yes, you can pass parameters from ADF to a Databricks Job run, but you generally can't use those parameters to change the job cluster configuration (node type, Spark version, autoscale, init scripts, etc.) for that run. In an ADF Data...

halsgbs
by New Contributor III
  • 274 Views
  • 1 reply
  • 0 kudos

Resolved! Alert ID within job yaml file - different environments

Hi, I am trying to trigger an alert through a job. The issue I'm experiencing is that we have the same alert name in our dev/test/pre/prod environments, but they all have different alert IDs, and I have to input an alert ID within the job yam...

Latest Reply
Raman_Unifeye
Honored Contributor III
  • 0 kudos

@halsgbs, your notebook task workaround failed because Databricks Jobs expect a static alert_id at the time the job is submitted or created, not a dynamic variable evaluated during the run. The best way to deal with this is an Asset Bundle (DAB), where you get ...

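The Asset Bundle approach suggested above can be sketched with a per-target variable in `databricks.yml`: declare the variable once, override it per deployment target, and reference it in the job definition. This is a sketch; the alert IDs and target names are placeholders, not values from the thread.

```yaml
# databricks.yml (sketch): one variable, overridden per deployment target.
variables:
  alert_id:
    description: Alert to trigger from the job (differs per environment)

targets:
  dev:
    variables:
      alert_id: "11111111-1111-1111-1111-111111111111"   # placeholder
  prod:
    variables:
      alert_id: "22222222-2222-2222-2222-222222222222"   # placeholder

# Referenced in the job's task definition as ${var.alert_id}, so the same
# bundle deploys against each environment's own alert.
```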
turagittech
by Contributor
  • 973 Views
  • 2 replies
  • 1 kudos

Resolved! Databricks Governance Dashboards

I am looking for any prebuilt governance dashboards. I see a governance portal in some demo videos; is that, or something similar, available to load into our environments? I am aware of the data quality and profiling features, but a single view of key indicators would be a...

Latest Reply
ckunal_eng
New Contributor III
  • 1 kudos

@MoJaMa This looks beautiful and insightful. @turagittech We are currently building a governance dashboard separately in Power BI; let me know if you want some KPIs or tips on how to start with that.

1 More Replies