- 686 Views
- 1 replies
- 0 kudos
DBR minor version details are not exposed. However, the documentation mentions that Databricks performs maintenance releases every two weeks. How can I determine whether I am using the same minor version?
Latest Reply
The code snippet below can help determine the DBR hash string for your DBR version; the hash string is unique to each DBR minor version.

```scala
val scalaVersion = scala.util.Properties.versionString
val hadoopVersion = org.apache.hadoop.util.VersionInfo.getVersion
```
- 759 Views
- 1 replies
- 0 kudos
Are there any event streams that are, or could be, exposed in AWS (such as CloudWatch/EventBridge events or SNS messages)? In particular, I'm interested in events that detail jobs being run. The use case here would be for monitoring jobs from our web app...
Latest Reply
You could write code that calls the PutLogEvents API at the beginning of each job to write custom events to CloudWatch, or use the AWS SDK to send an SNS notification and route it to the desired consumer.
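As a minimal sketch of the first approach, the job could emit a custom CloudWatch log event at startup using boto3. The log group/stream names and the event fields (`job_id`, `run_id`, `status`) are illustrative assumptions, not a Databricks convention:

```python
import json
import time

def build_job_event(job_id: str, run_id: str, status: str) -> dict:
    """Build one CloudWatch log event describing a job run.

    The payload fields are illustrative assumptions for this sketch.
    """
    return {
        # PutLogEvents expects a millisecond epoch timestamp.
        "timestamp": int(time.time() * 1000),
        "message": json.dumps({"job_id": job_id, "run_id": run_id, "status": status}),
    }

def send_job_event(event: dict, log_group: str, log_stream: str) -> None:
    """Send one event to CloudWatch Logs (requires AWS credentials)."""
    import boto3  # imported here so the builder above works without boto3 installed
    client = boto3.client("logs")
    client.put_log_events(
        logGroupName=log_group,
        logStreamName=log_stream,
        logEvents=[event],
    )

# Example: call at the start of a job run.
event = build_job_event("nightly_etl", "run-123", "STARTED")
```

The same pattern applies to the SNS route: build the message dict once, then hand it to `boto3.client("sns").publish` instead of CloudWatch Logs.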
- 553 Views
- 1 replies
- 0 kudos
Are Repos HIPAA compliant, or is there a plan and timeline to support this? Customer is getting a warning when trying to enable the Repos feature in a HIPAA deployment on Azure Databricks.
Latest Reply
There is a plan to support this. For a timeline, please reach out to your Databricks account team.
- 637 Views
- 1 replies
- 0 kudos
I am working on a disaster recovery plan for my environment, which includes Databricks. Where do I start with my planning? What do I need to consider when building a DR plan?
Latest Reply
Depending on your RPO/RTO requirements, there are different recovery strategies that could be considered (active/passive, active/active) for Databricks deployments. A detailed explanation of these approaches is given here
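As one small, hedged illustration of the active/passive pattern mentioned above, notebooks can be kept in sync by exporting them from the primary workspace and importing them into the secondary via the Workspace API. The host names, token, and notebook path below are placeholder assumptions; a real DR plan covers far more (jobs, clusters, data, secrets):

```python
import json
import urllib.request

# Placeholder workspace URLs; real values come from your deployment.
PRIMARY = "https://primary-workspace.cloud.databricks.com"
SECONDARY = "https://secondary-workspace.cloud.databricks.com"

def build_export_request(host: str, token: str, path: str) -> urllib.request.Request:
    """GET /api/2.0/workspace/export returns the base64-encoded notebook source."""
    url = f"{host}/api/2.0/workspace/export?path={path}&format=SOURCE"
    return urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})

def build_import_request(host: str, token: str, path: str, content_b64: str) -> urllib.request.Request:
    """POST /api/2.0/workspace/import writes the notebook into the other workspace."""
    body = json.dumps({
        "path": path,
        "format": "SOURCE",
        "content": content_b64,
        "overwrite": True,
    }).encode("utf-8")
    return urllib.request.Request(
        f"{host}/api/2.0/workspace/import",
        data=body,
        method="POST",
        headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
    )

# A sync job would send the export request, then feed its "content" field
# into an import request against the secondary workspace.
req = build_export_request(PRIMARY, "example-token", "/Shared/etl_notebook")
```

Running such a sync on a schedule keeps the passive workspace close to the primary, with the sync interval bounded by your RPO.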
- 546 Views
- 1 replies
- 0 kudos
I would like the credentials of the user who initiates a job to be used as the credentials for the job run. Is this possible?
Latest Reply
Is this in Azure? If so, it is not supported currently. https://docs.microsoft.com/en-us/azure/databricks/security/credential-passthrough/adls-passthrough