Administration & Architecture

Forum Posts

User16765131552
by Contributor III
  • 571 Views
  • 1 reply
  • 0 kudos

Azure Databricks Repos and HIPAA

Are Repos HIPAA compliant, or is there a plan and timeline to support this? A customer is getting a warning when trying to enable the Repos feature in a HIPAA deployment on Azure Databricks.

Latest Reply
sajith_appukutt
Honored Contributor II
  • 0 kudos

There is a plan to support this. For a timeline, please reach out to your Databricks account team.

MoJaMa
by Valued Contributor II
  • 878 Views
  • 1 reply
  • 0 kudos
Latest Reply
Ryan_Chynoweth
Honored Contributor III
  • 0 kudos

Unfortunately this is not possible. The default user workspace name will be the user's email address.

User16826992666
by Valued Contributor
  • 643 Views
  • 1 reply
  • 0 kudos

What do I need to think about for Disaster Recovery planning?

I am working on a disaster recovery plan for my environment, which includes Databricks. Where do I start with my planning, and what do I need to consider when building a DR plan?

Latest Reply
sajith_appukutt
Honored Contributor II
  • 0 kudos

Depending on your RPO/RTO requirements, there are different recovery strategies that could be considered for Databricks deployments (active/passive, active/active). A detailed explanation of these approaches is available here.

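As a starting point for the active/passive strategy mentioned in the reply, below is a minimal sketch that replicates notebooks from a primary workspace into a secondary (DR) workspace using the Databricks Workspace REST API (list, export, mkdirs, import). The environment variable names and the "/Users" root are assumptions for illustration; a real DR plan would also cover jobs, cluster configurations, secrets, and the underlying data.

```python
"""Minimal sketch: copy notebooks from a primary workspace into a DR workspace.

This is only one building block of an active/passive setup. Host and token
environment variable names below are illustrative, not a Databricks convention.
"""
import os
import requests

PRIMARY_HOST = os.environ["PRIMARY_DATABRICKS_HOST"]    # e.g. https://primary.cloud.databricks.com
PRIMARY_TOKEN = os.environ["PRIMARY_DATABRICKS_TOKEN"]
DR_HOST = os.environ["DR_DATABRICKS_HOST"]
DR_TOKEN = os.environ["DR_DATABRICKS_TOKEN"]


def _get(host, token, endpoint, params):
    resp = requests.get(f"{host}/api/2.0/workspace/{endpoint}",
                        headers={"Authorization": f"Bearer {token}"}, params=params)
    resp.raise_for_status()
    return resp.json()


def _post(host, token, endpoint, payload):
    resp = requests.post(f"{host}/api/2.0/workspace/{endpoint}",
                         headers={"Authorization": f"Bearer {token}"}, json=payload)
    resp.raise_for_status()


def list_notebooks(path):
    """Recursively yield (path, language) for every notebook in the primary workspace."""
    for obj in _get(PRIMARY_HOST, PRIMARY_TOKEN, "list", {"path": path}).get("objects", []):
        if obj["object_type"] == "DIRECTORY":
            yield from list_notebooks(obj["path"])
        elif obj["object_type"] == "NOTEBOOK":
            yield obj["path"], obj["language"]


def copy_notebook(path, language):
    """Export one notebook from the primary workspace and import it into the DR workspace."""
    exported = _get(PRIMARY_HOST, PRIMARY_TOKEN, "export", {"path": path, "format": "SOURCE"})
    _post(DR_HOST, DR_TOKEN, "mkdirs", {"path": os.path.dirname(path)})
    _post(DR_HOST, DR_TOKEN, "import",
          {"path": path, "format": "SOURCE", "language": language,
           "content": exported["content"], "overwrite": True})


if __name__ == "__main__":
    for nb_path, nb_language in list_notebooks("/Users"):
        copy_notebook(nb_path, nb_language)
```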
User16826992666
by Valued Contributor
  • 552 Views
  • 1 reply
  • 0 kudos

Can you use credential passthrough for users running jobs?

I would like the credentials of the user who initiates a job to be used as the credentials for the job run. Is this possible?

Latest Reply
sajith_appukutt
Honored Contributor II
  • 0 kudos

Is this in Azure? If so, it is not currently supported: https://docs.microsoft.com/en-us/azure/databricks/security/credential-passthrough/adls-passthrough

MoJaMa
by Valued Contributor II
  • 500 Views
  • 1 reply
  • 0 kudos
Latest Reply
MoJaMa
Valued Contributor II
  • 0 kudos

Yes, we support this. Please see https://docs.databricks.com/administration-guide/workspace/storage.html#modify-the-storage-location-for-notebook-results and https://docs.databricks.com/administration-guide/workspace/storage.html#configure-the-storage...

User16826992666
by Valued Contributor
  • 1075 Views
  • 1 reply
  • 0 kudos
Latest Reply
Ryan_Chynoweth
Honored Contributor III
  • 0 kudos

Databricks only charges for compute time while machines are being used. If a machine is "IDLE", then Databricks does not charge you for those machines. The cloud provider will charge you for the machines that are running regardless of whether they are IDLE...

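To make the split concrete, here is a small worked example that follows the description in the reply: the DBU charge is computed only for the hours the machines are in use, while the cloud provider's VM charge accrues for every hour the VMs are up. All rates and hours below are hypothetical; actual DBU rates depend on the workload type, pricing tier, and instance type.

```python
# Hypothetical cost breakdown following the reply above: Databricks bills DBUs
# for the hours the machines are actually in use, while the cloud provider
# bills for every hour the VMs are up. All numbers are made up.

ACTIVE_HOURS = 6           # hours the cluster is actually running workloads
TOTAL_UPTIME_HOURS = 10    # hours the VMs are up, including idle time
NUM_WORKERS = 4

DBU_PER_NODE_HOUR = 0.75   # DBUs consumed per node-hour (varies by instance type)
DBU_PRICE = 0.40           # $ per DBU (varies by tier and workload type)
VM_PRICE_PER_HOUR = 0.50   # $ per VM-hour charged by the cloud provider

nodes = NUM_WORKERS + 1    # workers plus the driver

databricks_cost = ACTIVE_HOURS * nodes * DBU_PER_NODE_HOUR * DBU_PRICE
cloud_cost = TOTAL_UPTIME_HOURS * nodes * VM_PRICE_PER_HOUR

print(f"Databricks (DBU) charge:     ${databricks_cost:.2f}")  # 6 * 5 * 0.75 * 0.40 = $9.00
print(f"Cloud provider (VM) charge:  ${cloud_cost:.2f}")       # 10 * 5 * 0.50 = $25.00
```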
MoJaMa
by Valued Contributor II
  • 833 Views
  • 1 reply
  • 1 kudos
Latest Reply
MoJaMa
Valued Contributor II
  • 1 kudos

Yes, but it is not self-service. We can "merge" accounts on AWS so that you can manage all your Databricks workspaces from a single place at https://accounts.cloud.databricks.com/login. Please contact your Databricks representative.

User16776431030
by New Contributor III
  • 632 Views
  • 1 reply
  • 0 kudos
Latest Reply
Anonymous
Not applicable
  • 0 kudos

The DEK is cached in memory for several read/write operations and evicted from memory at a regular interval, so that new requests require another call to your cloud service’s key management system. If you delete or revoke your key, reading or wri...

MoJaMa
by Valued Contributor II
  • 1183 Views
  • 1 reply
  • 0 kudos
Latest Reply
MoJaMa
Valued Contributor II
  • 0 kudos

These permissions are part of the list described in Step 6.c here: https://docs.databricks.com/administration-guide/account-api/iam-role.html. They are required because we use tags to identify the owners, and other minimum information, of clusters on AWS. It...

Srikanth_Gupta_
by Valued Contributor
  • 1534 Views
  • 1 reply
  • 0 kudos
Latest Reply
User16826994223
Honored Contributor III
  • 0 kudos

The recipient’s client authenticates to the sharing server (via a bearer token or other method) and asks to query a specific table. The client can also provide filters on the data (e.g. “country=US”) as a hint to read just a subset of the data. The se...

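For a concrete view of that flow from the recipient's side, the open-source delta-sharing Python connector wraps the bearer-token authentication and the table query into a couple of calls. This is only a sketch: the profile file path and the share/schema/table names below are placeholders.

```python
# Sketch using the open-source `delta-sharing` Python connector
# (pip install delta-sharing). The profile file is a small JSON document
# containing the sharing server endpoint and the recipient's bearer token;
# the file path and table coordinates below are placeholders.
import delta_sharing

profile = "/path/to/recipient.share"   # profile file provided by the data provider

# Discover which shares, schemas, and tables the recipient has been granted.
client = delta_sharing.SharingClient(profile)
for table in client.list_all_tables():
    print(table.share, table.schema, table.name)

# Query one shared table: the connector authenticates with the bearer token,
# asks the sharing server for the table's data files, and loads them into pandas.
url = f"{profile}#my_share.my_schema.my_table"
df = delta_sharing.load_as_pandas(url)
print(df.head())
```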