Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

User16765131552
by Databricks Employee
  • 3088 Views
  • 1 reply
  • 0 kudos

Azure Databricks Repos and HIPAA

Are Repos HIPAA compliant, or is there a plan and timeline to support this? Customer is getting a warning when trying to enable the Repos feature in a HIPAA deployment on Azure Databricks.

Latest Reply
sajith_appukutt
Databricks Employee
  • 0 kudos

There is a plan to support this. For timeline, please reach out to your Databricks account team.

MoJaMa
by Databricks Employee
  • 2326 Views
  • 1 reply
  • 0 kudos
Latest Reply
Ryan_Chynoweth
Databricks Employee
  • 0 kudos

Unfortunately this is not possible. The default user workspace name will be the user's email address.

User16826992666
by Databricks Employee
  • 1831 Views
  • 1 reply
  • 0 kudos

What do I need to think about for Disaster Recovery planning?

I am working on a disaster recovery plan for my environment which includes Databricks. Where do I start with my planning? What all do I need to consider when building a DR plan?

Latest Reply
sajith_appukutt
Databricks Employee
  • 0 kudos

Depending on your RPO/RTO requirements, there are different recovery strategies that could be considered for Databricks deployments (active/passive, active/active). A detailed explanation of these approaches is provided here.
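
As a rough illustration of the active/passive pattern, the sketch below copies one notebook from a primary to a secondary workspace with the Workspace export/import REST API. The hosts, tokens, and notebook path are hypothetical placeholders, and a real DR plan would also cover data, jobs, pools, and configuration, not just notebooks.

```python
import base64
import requests

# Hypothetical placeholders - substitute your own workspace URLs and tokens.
PRIMARY = {"host": "https://primary.cloud.databricks.com", "token": "<primary-pat>"}
SECONDARY = {"host": "https://secondary.cloud.databricks.com", "token": "<secondary-pat>"}

def export_notebook(ws, path):
    """Export a notebook's source from a workspace (Workspace API 2.0)."""
    r = requests.get(
        f"{ws['host']}/api/2.0/workspace/export",
        headers={"Authorization": f"Bearer {ws['token']}"},
        params={"path": path, "format": "SOURCE"},
    )
    r.raise_for_status()
    return base64.b64decode(r.json()["content"])

def import_notebook(ws, path, content):
    """Import the exported source into the secondary workspace, overwriting if present."""
    r = requests.post(
        f"{ws['host']}/api/2.0/workspace/import",
        headers={"Authorization": f"Bearer {ws['token']}"},
        json={
            "path": path,
            "format": "SOURCE",
            "language": "PYTHON",   # assumes a Python notebook
            "overwrite": True,
            "content": base64.b64encode(content).decode(),
        },
    )
    r.raise_for_status()

# Replicate a single notebook as part of a scheduled active/passive sync.
src = export_notebook(PRIMARY, "/Shared/etl/daily_load")
import_notebook(SECONDARY, "/Shared/etl/daily_load", src)
```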

User16826992666
by Databricks Employee
  • 1848 Views
  • 1 reply
  • 0 kudos

Can you use credential passthrough for users running jobs?

I would like it if I could make it so that the credentials of the user who initiates a job are used as the credentials for the job run. Is this possible?

Latest Reply
sajith_appukutt
Databricks Employee
  • 0 kudos

Is this in Azure? If so, it is not supported currently. https://docs.microsoft.com/en-us/azure/databricks/security/credential-passthrough/adls-passthrough

MoJaMa
by Databricks Employee
  • 3077 Views
  • 1 reply
  • 0 kudos
Latest Reply
MoJaMa
Databricks Employee
  • 0 kudos

Yes. We support this. Please see https://docs.databricks.com/administration-guide/workspace/storage.html#modify-the-storage-location-for-notebook-results and https://docs.databricks.com/administration-guide/workspace/storage.html#configure-the-storage...
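
As a sketch of the setting those pages cover, assuming the `storeInteractiveNotebookResultsInCustomerAccount` key described in the admin guide, an admin could toggle it through the workspace-conf API like this (host and token are placeholders):

```python
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"   # placeholder
TOKEN = "<admin-personal-access-token>"                   # placeholder

# Store interactive notebook results in the workspace's own root storage
# (workspace-conf API; key name taken from the linked admin guide).
resp = requests.patch(
    f"{HOST}/api/2.0/workspace-conf",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"storeInteractiveNotebookResultsInCustomerAccount": "true"},
)
resp.raise_for_status()

# Read the setting back to confirm it took effect.
check = requests.get(
    f"{HOST}/api/2.0/workspace-conf",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"keys": "storeInteractiveNotebookResultsInCustomerAccount"},
)
print(check.json())
```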

User16826992666
by Databricks Employee
  • 2568 Views
  • 1 reply
  • 1 kudos
Latest Reply
Ryan_Chynoweth
Databricks Employee
  • 1 kudos

Databricks only charges for compute time while machines are being used. If a machine is "IDLE", Databricks does not charge you for those machines. The cloud providers will charge you for the machines that are running regardless of whether they are idle.
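
A back-of-the-envelope illustration of that distinction, using purely hypothetical rates rather than actual pricing:

```python
# Hypothetical rates - check your cloud provider and Databricks pricing pages for real numbers.
vm_rate_per_hour = 0.50      # cloud provider charge while the VM is running
dbu_per_hour = 2.0           # DBUs consumed per hour of cluster usage
dbu_price = 0.15             # price per DBU

hours_vm_running = 24        # VM left running all day
hours_cluster_used = 8       # hours the cluster actually did work

cloud_cost = vm_rate_per_hour * hours_vm_running                  # billed on uptime
databricks_cost = dbu_per_hour * dbu_price * hours_cluster_used   # billed on usage

print(f"Cloud provider: ${cloud_cost:.2f}, Databricks: ${databricks_cost:.2f}")
# Cloud provider: $12.00, Databricks: $2.40
```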

MoJaMa
by Databricks Employee
  • 2355 Views
  • 1 reply
  • 1 kudos
Latest Reply
MoJaMa
Databricks Employee
  • 1 kudos

Yes. It is not self-service. We can "merge" accounts on AWS, such that you can manage all your Databricks workspaces from a single place at https://accounts.cloud.databricks.com/login. Please contact your Databricks representative.

tj-cycyota
by Databricks Employee
  • 1781 Views
  • 1 reply
  • 0 kudos

How long does customer KMS key get cached on each instance?

How long does customer KMS key get cached on each instance?

Latest Reply
Anonymous
Not applicable
  • 0 kudos

The DEK is cached in memory for several read/write operations and evicted from memory at a regular interval such that new requests require another request to your cloud service’s key management system. If you delete or revoke your key, reading or wri...
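
As a conceptual sketch only, not Databricks' implementation, a small time-based cache captures the behaviour described: the data encryption key (DEK) is reused for a while, and once it expires the next operation has to call back to the key management service.

```python
import time

class DekCache:
    """Illustrative TTL cache for a data encryption key (DEK)."""

    def __init__(self, kms_fetch, ttl_seconds=300):
        self._kms_fetch = kms_fetch   # callable that asks the KMS to unwrap/return a DEK
        self._ttl = ttl_seconds
        self._dek = None
        self._expires_at = 0.0

    def get_dek(self):
        now = time.time()
        if self._dek is None or now >= self._expires_at:
            # Cache miss or expiry: go back to the key management service.
            self._dek = self._kms_fetch()
            self._expires_at = now + self._ttl
        return self._dek

# Usage: several reads/writes reuse the cached key; after the TTL the KMS is called again.
cache = DekCache(kms_fetch=lambda: b"\x00" * 32, ttl_seconds=300)
key = cache.get_dek()
```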

MoJaMa
by Databricks Employee
  • 2499 Views
  • 1 reply
  • 0 kudos
Latest Reply
MoJaMa
Databricks Employee
  • 0 kudos

These permissions are part of the list described in Step 6.c here: https://docs.databricks.com/administration-guide/account-api/iam-role.html. It is required because we use tags to identify the owners, and other minimum information, of clusters on AWS. It...
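
For illustration only, the tag-related portion of such a cross-account policy generally looks something like the fragment below; the authoritative actions, conditions, and resource ARNs are the ones listed in Step 6 of the linked page.

```python
import json

# Illustrative fragment of a cross-account IAM policy statement covering cluster tags.
# Exact actions, conditions, and ARNs must come from the linked Databricks doc (Step 6).
tag_statement = {
    "Sid": "AllowDatabricksClusterTagging",
    "Effect": "Allow",
    "Action": ["ec2:CreateTags", "ec2:DeleteTags"],
    "Resource": [
        "arn:aws:ec2:*:*:instance/*",
        "arn:aws:ec2:*:*:volume/*",
    ],
}

print(json.dumps({"Version": "2012-10-17", "Statement": [tag_statement]}, indent=2))
```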

Srikanth_Gupta_
by Databricks Employee
  • 3514 Views
  • 1 reply
  • 0 kudos
Latest Reply
User16826994223
Databricks Employee
  • 0 kudos

The recipient’s client authenticates to the sharing server (via a bearer token or other method) and asks to query a specific table. The client can also provide filters on the data (e.g. “country=US”) as a hint to read just a subset of the data. The se...
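
The flow described here matches the open-source Delta Sharing protocol; the sketch below shows the recipient side using the `delta-sharing` Python client, with a hypothetical profile file and share/schema/table names.

```python
# pip install delta-sharing
import delta_sharing

# Hypothetical profile file issued by the data provider; it contains the sharing
# server endpoint and the bearer token used to authenticate.
profile = "/path/to/recipient.share"

# List everything this recipient has been granted access to.
client = delta_sharing.SharingClient(profile)
for table in client.list_all_tables():
    print(table)

# Query one shared table; the URL format is <profile>#<share>.<schema>.<table>.
table_url = f"{profile}#sales_share.retail.transactions"
df = delta_sharing.load_as_pandas(table_url)
print(df.head())
```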
