- 7816 Views
- 6 replies
- 3 kudos
Different settings per target with Asset bundles
When generating the standard setup with databricks bundle init, we get a databricks.yml that references resources/*. The targets are set in databricks.yml and the resources (pipelines and jobs) are set in different files. I have dlt pipelines th...
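A common pattern for this (a sketch only; the bundle name and the `catalog` variable are placeholders, not from the thread) is to declare bundle variables in databricks.yml, override them per target, and reference them from the files under resources/:

```yaml
# databricks.yml (sketch)
bundle:
  name: my_bundle            # placeholder name

include:
  - resources/*.yml

variables:
  catalog:                   # hypothetical variable
    default: dev_catalog

targets:
  dev:
    default: true
  prod:
    variables:
      catalog: prod_catalog  # per-target override
```

A pipeline definition in resources/ can then use `${var.catalog}`, so the same resource file deploys with different settings per target.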
- 885 Views
- 2 replies
- 1 kudos
Resolved! Using the API, get the list of the schemas and tables a group or user has permissions for
I am attempting to use the Databricks API to get a list of the schemas and tables a group or user has permissions for. Is this possible? Is there another method I should be using instead? I see the Unity Catalog > Grants > Get permissions endpoint c...
- 1 kudos
Thanks, @Walter_C. In my case, I was able to get the data I needed by using the Databricks SQL Driver for Node.js, querying the information_schema.table_privileges table.
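The accepted approach can be sketched in Python as well, via the Databricks SQL connector (requires the databricks-sql-connector package; the environment-variable names and the named-parameter style are assumptions, not from the thread):

```python
# Sketch: list table privileges for a principal by querying
# system.information_schema.table_privileges.
import os

QUERY = """
    SELECT table_catalog, table_schema, table_name, privilege_type
    FROM system.information_schema.table_privileges
    WHERE grantee = :principal
"""

def list_table_privileges(principal: str):
    # Imported lazily so the sketch reads without the package installed.
    from databricks import sql

    with sql.connect(
        server_hostname=os.environ["DATABRICKS_SERVER_HOSTNAME"],
        http_path=os.environ["DATABRICKS_HTTP_PATH"],
        access_token=os.environ["DATABRICKS_TOKEN"],
    ) as conn:
        with conn.cursor() as cur:
            cur.execute(QUERY, {"principal": principal})
            return cur.fetchall()
```

Similar views (e.g. schema_privileges) exist in information_schema for other securable types.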
- 4911 Views
- 5 replies
- 0 kudos
Azure Entra SSO Error: Your user has not been registered
I have set up SSO within Databricks and automatic user provisioning with Azure Entra, and confirmed it is working for all users. However, one user is presented with this error when signing in. The user is in the enterprise app within Azure Entra and the user i...
- 0 kudos
Figured out the issue: it seems the email address is case-sensitive.
- 1083 Views
- 0 replies
- 1 kudos
Databricks Clean Room costing
Hi, can you shed some light on how the compute and data-sharing costing is done for various scenarios: 1. Collaborator 1 and Collaborator 2 have Databricks accounts in the same region and on the same cloud. Is there a DBU cost, and who will pay for it? I a...
- 1583 Views
- 2 replies
- 0 kudos
Resolved! Databricks Spot Instance: Completion Guarantee
Databricks allows spot instances to be used for worker nodes. I am considering using them for interactive clusters. Do I have a guarantee that code will complete without any errors even if spot instances are evicted? I would accept execution delays but no ...
- 0 kudos
You could explore the "SPOT_WITH_FALLBACK" feature if you don't want your jobs to fail because of eviction, but this is currently not supported with interactive clusters. Hopefully they extend this to all compute options soon. Create a pipeline ...
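For a job cluster, the fallback behavior mentioned in the reply is requested through the cluster spec's aws_attributes (a sketch against the Clusters API; the numeric values are illustrative):

```json
{
  "aws_attributes": {
    "first_on_demand": 1,
    "availability": "SPOT_WITH_FALLBACK",
    "spot_bid_price_percent": 100
  }
}
```

With first_on_demand set to 1, the driver runs on-demand while workers use spot capacity, falling back to on-demand if spot instances cannot be acquired.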
- 4427 Views
- 4 replies
- 0 kudos
Show all privileges granted to principal
Given the name of a principal in Databricks (I'm using account-level groups), is there an easy way to query or otherwise obtain all privileges granted to this principal? I know I can obtain the information by querying several of the system.informati...
- 0 kudos
This link will provide details on how to verify all the privileges granted to Service Principals
- 955 Views
- 1 reply
- 0 kudos
Databricks Cache Options
Hi, we are working on a Databricks solution hosted on AWS. We are exploring the caching options in Databricks. Apart from the Databricks cache and the Spark cache, what are the options? Is it feasible to use 3rd-party cache solutions like AWS ElastiCache ...
- 0 kudos
Databricks provides several caching options to enhance performance by minimizing Input and Output (I/O) read and write operations. These include: Databricks Disk Cache: This cache accelerates data reads by creating copies of remote Parquet data file...
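As a sketch of the first option in the reply, the disk cache is controlled through cluster Spark configuration (property names taken from the disk-cache settings; the size value is illustrative):

```
spark.databricks.io.cache.enabled true
spark.databricks.io.cache.maxDiskUsage 50g
```

The Spark cache, by contrast, is invoked per DataFrame in code (df.cache() or df.persist()) and holds deserialized results rather than copies of remote files.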
- 1691 Views
- 3 replies
- 0 kudos
Resolved! How could we share the Databricks ML runtime cluster among users when enabling Unity Catalog
Hi team, currently we use the Databricks ML runtime to run our workflows and sometimes do EDA. We want to create a Databricks ML runtime cluster for the team to share. With Unity Catalog enabled, how could we create a shared ML runt...
- 0 kudos
Right now there is no plan to support the ML runtime on shared clusters. Engineering is working on additional solutions, but no ETA is currently available. Regarding why it is not supported, the principal reason is isolation, which is not available in ...
- 690 Views
- 0 replies
- 0 kudos
Delta live table : run_as
Does Databricks have any plans to decouple the owner from the "run_as" identity in Delta Live Tables, as can be done in jobs? The problem arises especially when using DABs. The service principal used to deploy DLTs shouldn't be the owner AND the runn...
- 3400 Views
- 0 replies
- 0 kudos
Pass secret in spark config when value is in form a.b.c={{secrets/scope/secret}}
I am configuring the cluster for a spark-submit task and I am trying to specify `spark.executor.extraJavaOptions a.b.c={{secrets/scope/secret}}`, but the literal {{secrets/scope/secret}} is being passed in rather than the secret value itself. I know th...
- 976 Views
- 1 reply
- 0 kudos
Updating python version from 3.8 to 3.12 for s3 ingestion
I went to the AWS CloudFormation stack and edited the template from Python 3.8 to 3.12, then updated it. I did this for both the workspace stack and the S3 ingestion stack. Will it break anything? Do I need to make any changes in the Python code in the templa...
- 0 kudos
Hi @Retired_mod, thanks a lot! I will look at StackSets. As I mentioned, the code was not written by me but by Databricks. Why does Databricks not use a newer Python version in its default stacks? We are low on resources and heavily rely on the default Datab...
- 2155 Views
- 3 replies
- 5 kudos
Resolved! Move to 100% Serverless
Hi all, a few questions about the upcoming transition to 100% serverless, if anyone has any info that would be great! When will the move to serverless occur? I understand from 1st July (today), but has anyone seen a roadmap? What will the move to serverl...
- 5 kudos
Hi, our Databricks contact just assured us of the following after we asked about this issue: Databricks is officially (but won't be GA in every region till end of July) 100% Serverless OPTIONAL. We understand many of our customers have begged for 100%...
- 1091 Views
- 1 reply
- 0 kudos
Resolved! Unity Catalog Pandas on Spark Limitation
According to the Databricks UC documentation, below are some of the limitations on shared-mode clusters. 1. In Databricks Runtime 13.3 LTS and above, Python scalar UDFs and Pandas UDFs are supported. Other Python UDFs, including UDAFs, UDTFs, and Panda...
- 0 kudos
@RamlaSuhra, UDFs on Unity Catalog are a feature that, at the moment, is still in the Public Preview stage. This means development has not yet finished. UDFs can be used on DBR 13.3 and above; UDAFs are already available for DBR 15.2 ...
- 765 Views
- 2 replies
- 0 kudos
New Users don't receive onboarding email
When I create a new user in Databricks, the new user does not receive their onboarding email. It is not in their junk mail, deleted items, or inbox. However, when I reset that user's password, they do receive the password reset link and are a...
- 0 kudos
Hi @billraper, I'm sorry to hear about the trouble. Would you mind sharing more about whether this is happening with a community portal profile or with a product profile? Please share the link to the profile with which you're experiencing this issue...
- 601 Views
- 1 reply
- 0 kudos
cannot login to account management
Hi, I am not able to log in to account management (https://accounts.cloud.databricks.com); it somehow enforces SSO, and I cannot log in with username and password.