Administration & Architecture

Forum Posts

User16869510359
by Esteemed Contributor
  • 686 Views
  • 1 reply
  • 0 kudos

Resolved! How to determine if I am using the same DBR minor version?

DBR minor version details are not exposed. However, the documentation mentions that Databricks performs maintenance releases every two weeks. How can I determine whether I am using the same minor version?

Latest Reply
User16869510359
Esteemed Contributor
  • 0 kudos

The code snippet below can help determine the DBR hash string for a DBR version; the hash string is unique to each DBR minor version. val scalaVersion = scala.util.Properties.versionString   val hadoopVersion = org.apache.hadoop.util.VersionInf... (a runnable sketch along these lines follows below)
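
The preview above is cut off mid-snippet; a minimal, runnable sketch of the same idea, assuming a Scala notebook attached to a Databricks cluster, could look like this. The SHA-256 fingerprint step is an illustrative way to compare runtimes across clusters, not necessarily how the original snippet finished.

    // Sketch: fingerprint the runtime by combining component versions.
    // The combined string (and its hash) identifies the Scala/Hadoop/Spark
    // versions present on the cluster.
    import java.security.MessageDigest

    val scalaVersion  = scala.util.Properties.versionString              // e.g. "version 2.12.10"
    val hadoopVersion = org.apache.hadoop.util.VersionInfo.getVersion()  // e.g. "2.7.4"
    val sparkVersion  = org.apache.spark.SPARK_VERSION                   // e.g. "3.1.2"

    val fingerprintSource = Seq(scalaVersion, hadoopVersion, sparkVersion).mkString("|")

    // Hash the combined version string so it is easy to compare across clusters.
    val fingerprint = MessageDigest
      .getInstance("SHA-256")
      .digest(fingerprintSource.getBytes("UTF-8"))
      .map("%02x".format(_))
      .mkString

    println(s"Runtime fingerprint: $fingerprint")

If the printed fingerprints match on two clusters, the underlying component versions match; a difference indicates the clusters are on different builds.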

  • 0 kudos
Anonymous
by Not applicable
  • 621 Views
  • 2 replies
  • 0 kudos
Latest Reply
sajith_appukutt
Honored Contributor II
  • 0 kudos

Deleting a workspace doesn't delete the root bucket. You could choose to use the same root bucket for more than one workspace (though this is not recommended). It is recommended to automate the infrastructure creation via Terraform or the quickstart so that cleanup...

  • 0 kudos
1 More Replies
Anonymous
by Not applicable
  • 759 Views
  • 1 reply
  • 0 kudos

Monitoring jobs

Are there any event streams that are or could be exposed in AWS (such as CloudWatch EventBridge events or SNS messages)? In particular, I'm interested in events that detail jobs being run. The use case here would be for monitoring jobs from our web app...

Latest Reply
sajith_appukutt
Honored Contributor II
  • 0 kudos

You could write code to call the PutLogEvents API at the beginning of each job to write out custom events to CloudWatch, or use the AWS SDK to send an SNS notification and route it to a desired consumer (see the sketch below).
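
As a rough illustration of the SNS option, a job could publish a custom "job started" event to a topic when it begins and let subscribers route it onward. This is a sketch only: it assumes the AWS SDK for Java v2 SNS module is attached to the cluster, that credentials resolve from the instance profile or environment, and the topic ARN and job name shown are hypothetical.

    // Sketch: publish a "job started" event to an SNS topic at the start of a job.
    import software.amazon.awssdk.services.sns.SnsClient
    import software.amazon.awssdk.services.sns.model.PublishRequest

    // Hypothetical topic; replace with the ARN of a topic your consumers subscribe to.
    val topicArn = "arn:aws:sns:us-west-2:123456789012:databricks-job-events"

    val snsClient = SnsClient.create()

    val message =
      s"""{"event":"job_started","job_name":"nightly-etl","timestamp":${System.currentTimeMillis()}}"""

    val request = PublishRequest.builder()
      .topicArn(topicArn)
      .subject("Databricks job started")
      .message(message)
      .build()

    snsClient.publish(request)
    snsClient.close()

The same pattern works for CloudWatch Logs: build a PutLogEventsRequest with the CloudWatchLogsClient instead of publishing to SNS.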

  • 0 kudos
User16765131552
by Contributor III
  • 553 Views
  • 1 reply
  • 0 kudos

Azure Databricks Repos and HIPAA

Are Repos HIPAA compliant, or is there a plan and timeline to support this? A customer is getting a warning when trying to enable the Repos feature in a HIPAA deployment on Azure Databricks.

Latest Reply
sajith_appukutt
Honored Contributor II
  • 0 kudos

There is a plan to support this. For the timeline, please reach out to your Databricks account team.

  • 0 kudos
MoJaMa
by Valued Contributor II
  • 866 Views
  • 1 reply
  • 0 kudos
Latest Reply
Ryan_Chynoweth
Honored Contributor III
  • 0 kudos

Unfortunately this is not possible. The default user workspace name will be the user's email address.

  • 0 kudos
User16826992666
by Valued Contributor
  • 637 Views
  • 1 reply
  • 0 kudos

What do I need to think about for Disaster Recovery planning?

I am working on a disaster recovery plan for my environment, which includes Databricks. Where do I start with my planning? What do I need to consider when building a DR plan?

Latest Reply
sajith_appukutt
Honored Contributor II
  • 0 kudos

Depending on your RPO/RTO, there are different recovery strategies that could be considered for Databricks deployments (active/passive, active/active). A detailed explanation of these approaches is provided here.

  • 0 kudos
User16826992666
by Valued Contributor
  • 546 Views
  • 1 reply
  • 0 kudos

Can you use credential passthrough for users running jobs?

I would like the credentials of the user who initiates a job to be used as the credentials for the job run. Is this possible?

Latest Reply
sajith_appukutt
Honored Contributor II
  • 0 kudos

Is this in Azure? If so, it is not currently supported; see https://docs.microsoft.com/en-us/azure/databricks/security/credential-passthrough/adls-passthrough

  • 0 kudos
MoJaMa
by Valued Contributor II
  • 489 Views
  • 1 reply
  • 0 kudos
Latest Reply
MoJaMa
Valued Contributor II
  • 0 kudos

Yes. We support this. Please see https://docs.databricks.com/administration-guide/workspace/storage.html#modify-the-storage-location-for-notebook-results and https://docs.databricks.com/administration-guide/workspace/storage.html#configure-the-storage...

  • 0 kudos
User16826992666
by Valued Contributor
  • 1055 Views
  • 1 reply
  • 0 kudos
Latest Reply
Ryan_Chynoweth
Honored Contributor III
  • 0 kudos

Databricks only charges for compute time while machines are being used. If a machine is "IDLE", Databricks does not charge you for it. The cloud providers will charge you for the machines that are running regardless of whether they are IDL...

  • 0 kudos
MoJaMa
by Valued Contributor II
  • 821 Views
  • 1 reply
  • 1 kudos
Latest Reply
MoJaMa
Valued Contributor II
  • 1 kudos

Yes. It is not self-service. We can "merge" accounts on AWS, such that you can manage all your Databricks workspaces from a single place at https://accounts.cloud.databricks.com/login. Please contact your Databricks representative.

  • 1 kudos
User16776431030
by New Contributor III
  • 624 Views
  • 1 reply
  • 0 kudos
Latest Reply
Anonymous
Not applicable
  • 0 kudos

The DEK is cached in memory for several read/write operations and evicted from memory at a regular interval such that new requests require another request to your cloud service’s key management system. If you delete or revoke your key, reading or wri...

  • 0 kudos