The recipient’s client authenticates to the sharing server (via a bearer token or other method) and asks to query a specific table. The client can also provide filters on the data (e.g. “country=US”) as a hint to read just a subset of the data. The se...
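As a rough illustration of the request described above, here is a sketch of how a client might build a Delta Sharing "query table" request. The endpoint path and the `predicateHints` field name follow the open Delta Sharing protocol, but the share/schema/table names and the filter are illustrative assumptions, not from the original post.

```python
# Sketch of a Delta Sharing "query table" request (illustrative names).
# Field names follow the open Delta Sharing protocol; verify against the spec.
import json

def build_query_request(share, schema, table, predicate_hints=None):
    """Build the URL path and JSON body for a table query.

    predicate_hints lets the client suggest filters (e.g. "country = 'US'")
    so the server can prune files; the server may ignore the hints.
    """
    path = f"/shares/{share}/schemas/{schema}/tables/{table}/query"
    body = {}
    if predicate_hints:
        body["predicateHints"] = list(predicate_hints)
    return path, json.dumps(body)

# Hypothetical share/schema/table names for illustration only.
path, body = build_query_request("sales", "retail", "orders",
                                 predicate_hints=["country = 'US'"])
# The client would POST this with an "Authorization: Bearer <token>" header.
```

In practice the recipient would use a client library rather than raw HTTP, but the shape of the request is the same: identify the table, optionally pass filter hints, and authenticate with the bearer token.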
Resolved! What are the LTS Databricks Runtimes?
What are the LTS Databricks Runtime versions available in the cluster configuration page? Example: DBR 7.3 LTS.
Long Term Support (LTS) versions are supported for two years, and non-LTS versions are supported for six months. Refer to the links below for more details. https://docs.databricks.com/release-notes/runtime/databricks-runtime-ver.html#runtime-support https://do...
By default, all users have access to the DE/DS and ML personas, so this is not an entitlement that's needed from that perspective, and it will not show up as an option. However, if your workspace is enabled for Databricks SQL, then an Admin can choos...
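If an admin grants the Databricks SQL persona programmatically, the SCIM API's entitlements mechanism is the usual route. Here is a minimal sketch of the PATCH body such a request might carry; the entitlement value `databricks-sql-access` and the SCIM operation shape are assumptions based on the Databricks SCIM API, so verify them against the current docs.

```python
# Sketch: SCIM PATCH body granting a user the Databricks SQL entitlement.
# The entitlement value "databricks-sql-access" is an assumption from the
# Databricks SCIM API docs; check the current reference before using it.
import json

def grant_sql_access_patch():
    return json.dumps({
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": [{
            "op": "add",
            "path": "entitlements",
            "value": [{"value": "databricks-sql-access"}],
        }],
    })

patch_body = grant_sql_access_patch()
# This body would be sent as a PATCH to the SCIM Users endpoint for the user.
```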
SSO: Yes. MFA: Yes, but this is under the purview of your Identity Provider, so your IdP is responsible for the implementation, since Databricks does not have access to the user's SSO credentials. https://docs.databricks.com/security/security-overview-e...
The exact version that will be tagged as the next LTS after DBR 7.3 LTS has not been decided. The likely timeline for this is Fall 2021, and it will likely be built on Ubuntu 20.04.
That option is still available, but in 3.48 we added the option to download usage data using the Account API, without setting up delivery to a bucket. See https://docs.databricks.com/release-notes/product/2021/june.html#use-an-api-to-download-usage-data-dir...
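To make the Account API route concrete, here is a sketch of the URL such a download request might target. The endpoint path, host, and query parameter names are assumptions based on the release note above; check the current Account API reference before relying on them.

```python
# Sketch: download billable usage via the Account API (no bucket delivery).
# Host, path, and parameter names are assumptions; verify against the docs.
def usage_download_url(account_id, start_month, end_month):
    base = "https://accounts.cloud.databricks.com"
    return (f"{base}/api/2.0/accounts/{account_id}/usage/download"
            f"?start_month={start_month}&end_month={end_month}")

# Hypothetical account ID and date range for illustration.
url = usage_download_url("my-account-id", "2021-01", "2021-06")
# A real call would authenticate (e.g. account owner credentials) and
# save the CSV response to disk.
```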
Yes, since June 2021. Please refer to https://docs.databricks.com/spark/latest/sparkr/shiny-notebooks.html
Tags can be added when creating a cluster through the API using the cluster tag data structure. In the UI, you can add tags under the advanced options of the create cluster form.
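As a sketch of what the cluster tag data structure looks like, here is the `custom_tags` field inside a Clusters API create request. The cluster name, node type, and tag keys are illustrative assumptions, not values from the original post.

```python
# Sketch: the custom_tags field in a Clusters API create request.
# Cluster name, node type, and tag keys below are hypothetical examples.
import json

create_request = {
    "cluster_name": "etl-nightly",          # hypothetical name
    "spark_version": "7.3.x-scala2.12",
    "node_type_id": "i3.xlarge",            # hypothetical node type
    "num_workers": 2,
    "custom_tags": {                        # tags propagate to cloud resources
        "team": "data-eng",
        "cost-center": "1234",
    },
}
payload = json.dumps(create_request)
# POST this payload to the clusters create endpoint with a valid token.
```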
I do not believe this is possible right now. The only way to do this would be with cluster policies, and cluster policies do not support this functionality. Check out the cluster policy documentation.
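For context on what cluster policies do support (pinning and limiting cluster attributes), here is a minimal sketch of a policy definition. The `type`/`value`/`values` structure is an assumption based on the cluster policies documentation, and the specific versions and node types are illustrative.

```python
# Sketch of a cluster policy definition, showing the kind of controls
# policies do support. The definition format is an assumption based on
# the cluster policies docs; the concrete values are hypothetical.
import json

policy_definition = {
    "spark_version": {"type": "fixed", "value": "7.3.x-scala2.12"},
    "autotermination_minutes": {"type": "range", "maxValue": 120},
    "node_type_id": {"type": "allowlist",
                     "values": ["i3.xlarge", "i3.2xlarge"]},
}
policy_json = json.dumps(policy_definition)
```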
You can get the job details from the Jobs "get" API, which takes the job ID as a parameter. This returns all the information available about the job, specifically the job name. Please note that there is no field called "job description" in the...
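To show where the name lives in such a response, here is a small sketch. The sample response dict is hand-written for illustration; real Jobs "get" responses carry many more fields.

```python
# Sketch: pull the job name out of a Jobs "get" API response.
# Jobs API responses nest the user-editable fields under "settings".
def job_name(get_response: dict) -> str:
    return get_response["settings"]["name"]

# Hand-written sample response (hypothetical job ID and name).
sample = {"job_id": 42, "settings": {"name": "nightly-etl"}}
name = job_name(sample)
```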
Check out this doc: https://docs.databricks.com/resources/limits.html
Can you customize the welcome email that is sent to users when they are added to a workspace?
Currently the email has a general welcome message, but I would like to add more details specific to our company.
What happens to clusters when a DBR reaches End of Support?
When a DBR version reaches its end-of-support date, will clusters running that version be removed?
When a DBR version reaches End of Support, the version will no longer receive security patches, and workloads running on that version will no longer be eligible for Databricks support. Unsupported versions may be subject to security vulne...
What does it mean when a feature is in public preview?
I am confused about certain features being in public preview vs GA. What is the difference between these, and when should I start using a feature?
If a feature has been released to public preview, it is available for use by any interested users and is fully supported. The feature is considered stable and can be used in production backed by an SLA. So a feature in public preview is generally rea...
Do I have to choose an availability zone when creating a cluster?
I am worried about running out of IPs in my subnets. Is there any way to load balance across AZs based on IP availability?
If you don't want to choose an AZ at cluster creation or are worried about IP availability you can use the Automatic Availability Zone (Auto-AZ) feature. This will configure the cluster to automatically choose an AZ when the cluster starts based on t...
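As a sketch of how Auto-AZ is requested, here is the relevant fragment of a cluster create request. The `"auto"` value for `zone_id` is an assumption based on the Auto-AZ feature description, and the cluster name is illustrative; verify against the Clusters API docs.

```python
# Sketch: requesting Automatic Availability Zone (Auto-AZ) selection in a
# cluster create request. zone_id "auto" lets Databricks pick the AZ at
# cluster start based on IP availability; this value is an assumption.
cluster_spec = {
    "cluster_name": "auto-az-demo",          # hypothetical name
    "aws_attributes": {
        "zone_id": "auto",                   # instead of a fixed AZ like "us-west-2a"
    },
}
```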