By default, all users have access to the DE/DS and ML personas, so this is not an entitlement that needs to be granted and will not show up as an option. However, if your workspace is enabled for Databricks SQL, then an Admin can choos...
SSO: Yes. MFA: Yes, but MFA is under the purview of your Identity Provider, so your IdP is responsible for the implementation, since Databricks does not have access to the user's SSO credentials. https://docs.databricks.com/security/security-overview-e...
The exact version that will be tagged as the next LTS after DBR 7.3 LTS has not been decided. The likely timeline is Fall 2021, and it will likely be built on Ubuntu 20.04.
That option is still available, but in platform release 3.48 we added the option to download usage data via the Account API, without setting up delivery to a bucket. See https://docs.databricks.com/release-notes/product/2021/june.html#use-an-api-to-download-usage-data-dir...
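If it helps, here is a minimal sketch of how that download call can be constructed. The endpoint path and query parameters reflect my reading of the Account API release notes and should be verified against the linked docs; the account ID and months below are placeholders.

```python
# Hedged sketch: build the billable-usage download URL for the Account API.
# The path and parameter names are assumptions based on the release notes.
def usage_download_url(account_id, start_month, end_month):
    """Return the usage-download URL for the given account and month range."""
    base = "https://accounts.cloud.databricks.com/api/2.0/accounts"
    return (
        f"{base}/{account_id}/usage/download"
        f"?start_month={start_month}&end_month={end_month}"
    )

url = usage_download_url("<account-id>", "2021-01", "2021-06")
# Fetch as an account admin, e.g. requests.get(url, auth=(user, password));
# the response body is usage data in CSV form.
```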
Yes, since June 2021. Please refer to https://docs.databricks.com/spark/latest/sparkr/shiny-notebooks.html
Tags can be added when creating a cluster through the API using the cluster tag data structure. In the UI, you can add tags under the advanced options of the create-cluster form.
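For the API route, here is a hedged sketch of a create-cluster request body with custom tags (POST /api/2.0/clusters/create). The cluster name, runtime version, node type, and tag values are illustrative assumptions, not requirements.

```python
import json

# Hedged sketch: a cluster-create body whose custom_tags (the cluster tag data
# structure: plain key/value pairs) propagate to the underlying cloud resources.
def build_cluster_payload(name, tags):
    """Return an example cluster-create request body with custom tags."""
    return {
        "cluster_name": name,
        "spark_version": "7.3.x-scala2.12",  # example LTS runtime
        "node_type_id": "i3.xlarge",         # example instance type
        "num_workers": 2,
        "custom_tags": tags,
    }

payload = build_cluster_payload("etl-nightly", {"team": "data-eng", "cost-center": "1234"})
print(json.dumps(payload, indent=2))
# POST this body to https://<workspace-url>/api/2.0/clusters/create with a bearer token.
```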
I do not believe this is possible right now. The only way to do this would be with cluster policies, and cluster policies do not support this functionality. Check out the cluster policy documentation here.
Check out this doc: https://docs.databricks.com/resources/limits.html
Can you customize the welcome email that is sent to users when they are added to a workspace?
Currently the email has a general welcome message, but I would like to add more details specific to our company.
What happens to clusters when a DBR reaches End of Support?
When a DBR version reaches its end of support date will clusters running that version be removed?
When a DBR version reaches End of Support, it means that the version will no longer receive security patches, and workloads running on that version will no longer be eligible for Databricks support. Unsupported versions may be subject to security vulne...
What does it mean when a feature is in public preview?
I am confused about certain features being in public preview vs GA. What is the difference between these, and when should I start using a feature?
If a feature has been released to public preview, it is available for use by any interested users and is fully supported. The feature is considered stable and can be used in production backed by an SLA. So a feature in public preview is generally rea...
Do I have to choose an availability zone when creating a cluster?
I am worried about running out of IPs in my subnets. Is there any way to load balance across AZs based on IP availability?
If you don't want to choose an AZ at cluster creation or are worried about IP availability you can use the Automatic Availability Zone (Auto-AZ) feature. This will configure the cluster to automatically choose an AZ when the cluster starts based on t...
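As a sketch of what opting into Auto-AZ looks like in a Clusters API request, the key is setting `zone_id` to `"auto"` in `aws_attributes`; the other fields below are illustrative assumptions.

```python
# Hedged sketch: an example cluster-create body that requests Auto-AZ, so
# Databricks picks an availability zone with available IPs each time the
# cluster starts, instead of pinning it to a fixed AZ like "us-east-1a".
payload = {
    "cluster_name": "auto-az-cluster",
    "spark_version": "7.3.x-scala2.12",  # example runtime
    "node_type_id": "i3.xlarge",         # example instance type
    "num_workers": 2,
    "aws_attributes": {"zone_id": "auto"},  # Auto-AZ selection
}
```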
Resolved! Accessing Delta tables
Is it possible to access Delta tables outside of DBR (Databricks Runtime) ?
Delta Lake is an independent open-source project (under the Linux Foundation) and is based on an open format. This has led the community to build connectors for Delta from multiple engines, not just Spark. Outside of DBR, Delta Lake could be used ...