By default, all users have access to the DE/DS and ML personas, so this is not an entitlement that needs to be granted from that perspective, and it will not show up as an option. However, if your workspace is enabled for Databricks SQL, then an admin can choose...
SSO: Yes. MFA: Yes, but this is under the purview of your identity provider, so your IdP is responsible for the implementation, since Databricks does not have access to the user's SSO credentials. See https://docs.databricks.com/security/security-overview-e...
The exact version that will be tagged as the next LTS after DBR 7.3 LTS has not been decided. The likely timeline is Fall 2021, and it will likely be built on Ubuntu 20.04.
That option is still available, but as of release 3.48 we have added the option to download usage data via the Account API, without setting up delivery to a bucket. See https://docs.databricks.com/release-notes/product/2021/june.html#use-an-api-to-download-usage-data-dir...
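For reference, here is a minimal sketch of what a direct usage download might look like against the Account API. The endpoint path, parameters, and credentials below are assumptions based on the linked release note, so verify them against the current Account API documentation:

```python
# Hypothetical sketch: download billable usage directly via the Account API
# instead of configuring delivery to an S3 bucket.
# Endpoint path, parameters, and auth style are assumptions -- check the docs.
import requests

ACCOUNT_ID = "<databricks-account-id>"   # placeholder
USERNAME = "<account-admin-email>"       # placeholder
PASSWORD = "<password-or-token>"         # placeholder

resp = requests.get(
    f"https://accounts.cloud.databricks.com/api/2.0/accounts/{ACCOUNT_ID}/usage/download",
    params={"start_month": "2021-01", "end_month": "2021-06"},
    auth=(USERNAME, PASSWORD),
)
resp.raise_for_status()

# The response body is a CSV of usage records; save it locally.
with open("usage.csv", "wb") as f:
    f.write(resp.content)
```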
Yes, since June 2021. Please refer to https://docs.databricks.com/spark/latest/sparkr/shiny-notebooks.html
Tags can be added when creating a cluster through the API using the cluster tag data structure. In the UI, you can add tags under the advanced options of the create-cluster form.
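For illustration, a minimal sketch of attaching tags through the Clusters API using the `custom_tags` field; the host, token, node type, and DBR version are placeholders, so adjust them for your workspace:

```python
# Sketch: create a cluster with custom tags via the Clusters API.
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"   # placeholder
TOKEN = "<personal-access-token>"                        # placeholder

payload = {
    "cluster_name": "tagged-cluster",
    "spark_version": "9.1.x-scala2.12",
    "node_type_id": "i3.xlarge",
    "num_workers": 2,
    # Tags are supplied as a simple key/value map.
    "custom_tags": {
        "team": "data-engineering",
        "cost-center": "1234",
    },
}

resp = requests.post(
    f"{HOST}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
resp.raise_for_status()
print(resp.json()["cluster_id"])
```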
I do not believe this is possible right now. The only way to do this would be with cluster policies, and cluster policies do not support this functionality. Check out the cluster policy documentation.
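For context, a cluster policy is a JSON definition that constrains attributes of the create-cluster request. A minimal sketch of creating one via the Cluster Policies API follows; the policy name and the attributes it pins are purely illustrative and not tied to the original question:

```python
# Sketch: create a cluster policy via the Cluster Policies API.
# The definition below is illustrative only.
import json
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"   # placeholder
TOKEN = "<personal-access-token>"                        # placeholder

# Pin the DBR version and cap the autoscaling range.
definition = {
    "spark_version": {"type": "fixed", "value": "9.1.x-scala2.12"},
    "autoscale.max_workers": {"type": "range", "maxValue": 10},
}

resp = requests.post(
    f"{HOST}/api/2.0/policies/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"name": "demo-policy", "definition": json.dumps(definition)},
)
resp.raise_for_status()
print(resp.json()["policy_id"])
```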
Check out this doc: https://docs.databricks.com/resources/limits.html
- 0 kudos
- 2080 Views
- 0 replies
- 0 kudos
Can you customize the welcome email that is sent to users when they are added to a workspace?
Currently the email has a general welcome message, but I would like to add more details specific to our company.
What happens to clusters when a DBR reaches End of Support?
When a DBR version reaches its end-of-support date, will clusters running that version be removed?
When a DBR version reaches End of Support, it means that the version will no longer receive security patches and workloads running on that version will no longer be eligible for Databricks support. Unsupported versions may be subject to security vulne...
What does it mean when a feature is in public preview?
I am confused about certain features being in public preview vs. GA. What is the difference between the two, and when should I start using a feature?
If a feature has been released to public preview, it is available for use by any interested user and is fully supported. The feature is considered stable and can be used in production, backed by an SLA. So a feature in public preview is generally rea...
Do I have to choose an availability zone when creating a cluster?
I am worried about running out of IPs in my subnets. Is there any way to load balance across AZs based on IP availability?
If you don't want to choose an AZ at cluster creation, or are worried about IP availability, you can use the Automatic Availability Zone (Auto-AZ) feature. This will configure the cluster to automatically choose an AZ when the cluster starts, based on t...
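As a hedged sketch, requesting Auto-AZ through the Clusters API is typically done by setting the AWS `zone_id` to `auto`; the host, token, and instance details below are placeholders:

```python
# Sketch: request Automatic Availability Zone (Auto-AZ) selection at cluster
# creation by setting the AWS zone_id to "auto".
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"   # placeholder
TOKEN = "<personal-access-token>"                        # placeholder

payload = {
    "cluster_name": "auto-az-cluster",
    "spark_version": "9.1.x-scala2.12",
    "node_type_id": "i3.xlarge",
    "num_workers": 2,
    # "auto" lets Databricks pick an AZ at start time based on
    # available IPs in the workspace subnets.
    "aws_attributes": {"zone_id": "auto"},
}

resp = requests.post(
    f"{HOST}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
resp.raise_for_status()
```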
Resolved! Accessing Delta tables
Is it possible to access Delta tables outside of DBR (Databricks Runtime)?
Delta Lake is an independent open-source project (under the Linux Foundation) and is based on an open format. This has led to the community building connectors for Delta from multiple engines, not just Spark. Outside of DBR, Delta Lake could be used ...
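As one illustration of such a connector, here is a minimal sketch of reading a Delta table with the open-source delta-rs Python bindings (the `deltalake` package); the table path is a placeholder:

```python
# Sketch: read a Delta table outside of Databricks Runtime using the
# open-source delta-rs bindings (the `deltalake` PyPI package).
from deltalake import DeltaTable

dt = DeltaTable("/data/events")   # placeholder path to a Delta table
print(dt.version())               # current table version from the Delta log
df = dt.to_pandas()               # materialize the table as a pandas DataFrame
print(df.head())
```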