- 690 Views
- 1 replies
- 0 kudos
Resolved! Databricks Clean Room costing
Hi, can you throw some light on how compute and data-sharing costs work for various scenarios: 1. Collaborator 1 and Collaborator 2 have Databricks accounts in the same region and same cloud. Is there a DBU cost, and who will pay for it? I a...
Hi @ShankarM, Collaborators in the same region and cloud: If both Collaborator 1 and Collaborator 2 have Databricks accounts in the same region and cloud, they will incur DBU (Databricks Unit) costs based on their usage. DBUs represent compute resou...
- 801 Views
- 2 replies
- 1 kudos
Security considerations for OAuth secrets when using a Service Principal to authenticate with Databricks
What are the security considerations we need to keep in mind when we want to use OAuth secrets with a Service Principal to access Azure Databricks, when identity federation is disabled and the workspace is not yet onboarded onto Unity Catalog? Can we co...
Thank you @Kaniz_Fatma for the response. I do have follow-up questions: What kind of encryption is used to store the OAuth secret? Is there any way the OAuth secret can be generated by someone who is not a manager of that SPN? We need this as a part of segr...
- 1333 Views
- 2 replies
- 0 kudos
Resolved! Databricks Spot Instance: Completion Guarantee
Databricks allows using spot instances for worker nodes, and I am considering using them for interactive clusters. Do I have a guarantee that code will complete without any errors even if spot instances are evicted? I would accept execution delays but no ...
You could explore the "SPOT_WITH_FALLBACK" feature if you don't want your jobs to fail because of eviction, but this is currently not supported with interactive clusters. Hoping that they may extend this to all compute options soon. Create a pipeline ...
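As a sketch of what the fallback option looks like, here is a job-cluster spec fragment in the shape of the Databricks Clusters API `aws_attributes` block; the runtime version, instance type, and bid price are placeholder values, not recommendations.

```python
# Sketch of a job-cluster spec using SPOT_WITH_FALLBACK (AWS).
# Field names follow the Databricks Clusters API; the concrete
# values below are hypothetical placeholders.
job_cluster_spec = {
    "spark_version": "14.3.x-scala2.12",
    "node_type_id": "i3.xlarge",
    "num_workers": 4,
    "aws_attributes": {
        # Keep the driver on an on-demand instance; workers from this
        # index onward may be spot instances.
        "first_on_demand": 1,
        # Fall back to on-demand capacity if spot instances are
        # evicted or unavailable at launch, so the run is not lost.
        "availability": "SPOT_WITH_FALLBACK",
        "spot_bid_price_percent": 100,
    },
}

print(job_cluster_spec["aws_attributes"]["availability"])
```

With this setting, eviction costs you time (the node is replaced with on-demand capacity) rather than a failed run, which matches the "delays acceptable, failures not" requirement in the question.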
- 13058 Views
- 9 replies
- 9 kudos
Resolved! Installing libraries on job clusters
Simple question: what is the recommended way to install libraries on job clusters? There does not seem to be a "Libraries" tab in the UI, as opposed to regular clusters. Does that mean the only option is to use init scripts?
You may want to copy the required libs to a volume and load them during cluster setup, to avoid downloading the libs on every run.
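For job clusters, libraries are typically declared per task rather than in a UI tab. The fragment below sketches a Jobs API 2.1 task with a task-level `libraries` field; the notebook path, package pin, and volume path are hypothetical examples.

```python
# Sketch of a Jobs API 2.1 task that installs libraries on its job
# cluster via the task-level "libraries" field (job clusters have no
# "Libraries" tab in the UI). Paths and versions are placeholders.
task = {
    "task_key": "etl",
    "notebook_task": {"notebook_path": "/Workspace/etl/main"},
    "job_cluster_key": "shared_job_cluster",
    "libraries": [
        # Installed from PyPI at cluster start.
        {"pypi": {"package": "great-expectations==0.18.12"}},
        # A wheel pre-staged on a volume, so runs don't re-download it.
        {"whl": "/Volumes/main/default/libs/mylib-1.0-py3-none-any.whl"},
    ],
}

print(len(task["libraries"]))
```

The same structure appears under `tasks[].libraries` in a Databricks Asset Bundle or a `databricks jobs create` payload, so init scripts are not the only option.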
- 3179 Views
- 1 replies
- 2 kudos
Resolved! Pass secret in spark config when value is in form a.b.c={{secrets/scope/secret}}
I am configuring the Cluster for a spark-submit task and I am trying to specify `spark.executor.extraJavaOptions a.b.c={{secrets/scope/secret}}`, but the literal {{secrets/scope/secret}} is being passed in rather than the secret value itself. I know th...
Hi @macmiller1, Instead of using the {{secrets/scope/secret}} syntax, you can try using environment variables. If you prefer to use the {{secrets/scope/secret}} syntax, you can try escaping the equal sign (=) in your value. One way to do this is by us...
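A minimal sketch of the environment-variable route: Databricks resolves `{{secrets/scope/key}}` references inside `spark_env_vars` at cluster start, so the secret never appears as a literal in the Java options. The property name `a.b.c.fromEnv` and the idea that application code reads the variable name from it are hypothetical illustrations, not a documented convention.

```python
# Sketch of a cluster spec that pulls the secret through an
# environment variable instead of inlining it in extraJavaOptions.
# Scope/secret names are placeholders.
cluster_spec = {
    "spark_env_vars": {
        # Resolved to the real secret value when the cluster starts.
        "ABC_SECRET": "{{secrets/scope/secret}}",
    },
    "spark_conf": {
        # Hypothetical pattern: the Java option names the env var, and
        # the application reads os.environ / System.getenv at runtime,
        # so no '=<secret>' literal appears in the config.
        "spark.executor.extraJavaOptions": "-Da.b.c.fromEnv=ABC_SECRET",
    },
}

print(cluster_spec["spark_env_vars"]["ABC_SECRET"])
```

This sidesteps the parsing problem entirely, because the value containing `=` never has to survive the spark-conf line.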
- 569 Views
- 1 replies
- 0 kudos
Delta live table : run_as
Does Databricks have any plans to decouple the owner from the "run_as" identity in Delta Live Tables, as can be done for jobs? The problem arises especially when using DABs. The service principal used to deploy DLTs shouldn't be the owner AND the runn...
Hi @KevinGagnon, Databricks currently does not have plans to decouple the owner from the "run_as" identity in Delta Live Tables, unlike what can be done with jobs. The key points are: The Delta Live Table pipeline runs using the credentials of the p...
- 3387 Views
- 5 replies
- 0 kudos
Show all privileges granted to principal
Given the name of a principal in Databricks (I'm using account-level groups), is there an easy way to query or otherwise obtain all privileges granted to this principal? I know I can obtain the information by querying several of the system.inform...
This link will provide details on how to verify all the privileges granted to Service Principals
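Since the question mentions the `system.information_schema` views, here is a sketch of a query that unions the privilege views for one principal. It assumes the system schema is enabled on the metastore; the group name is a placeholder, and there are further `*_privileges` views (volumes, functions, external locations) you may want to add.

```python
# Sketch: gather privileges granted to one principal from the Unity
# Catalog system tables. The grantee value is a placeholder.
principal = "data-engineers"

query = """
SELECT 'catalog' AS level, catalog_name AS object, privilege_type
FROM system.information_schema.catalog_privileges
WHERE grantee = :p
UNION ALL
SELECT 'schema', schema_name, privilege_type
FROM system.information_schema.schema_privileges
WHERE grantee = :p
UNION ALL
SELECT 'table', table_name, privilege_type
FROM system.information_schema.table_privileges
WHERE grantee = :p
"""

# In a notebook you would run something like:
#   display(spark.sql(query, args={"p": principal}))
print(query.count("UNION ALL"))
```

A single union like this avoids hand-checking each view separately, which was the pain point in the question.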
- 435 Views
- 1 replies
- 0 kudos
Databricks Cache Options
Hi, we are working on a Databricks solution hosted on AWS and are exploring the caching options in Databricks. Apart from the Databricks cache and the Spark cache, what are the options? Is it feasible to use third-party cache solutions like AWS Elastic Cache ...
Databricks provides several caching options to enhance performance by minimizing Input and Output (I/O) read and write operations. These include: Databricks Disk Cache: This cache accelerates data reads by creating copies of remote Parquet data file...
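To make the distinction concrete: the disk cache is configured per cluster via Spark conf, while the Spark cache is invoked in code. The conf keys below are the standard `spark.databricks.io.cache.*` settings; the size limits shown are placeholder values to tune for your node type.

```python
# Sketch: the Databricks disk cache is toggled per cluster via Spark
# conf rather than in application code. Sizes are placeholders.
disk_cache_conf = {
    # Enable local SSD caching of remote Parquet/Delta files.
    "spark.databricks.io.cache.enabled": "true",
    # Cap the disk space the cache may use per node.
    "spark.databricks.io.cache.maxDiskUsage": "50g",
    # Cap the memory used for cached file metadata.
    "spark.databricks.io.cache.maxMetaDataCache": "1g",
}

# The Spark cache, by contrast, is invoked in code, e.g.:
#   df.cache()                      # lazily caches a DataFrame
#   spark.sql("CACHE TABLE sales")  # caches a table
print(disk_cache_conf["spark.databricks.io.cache.enabled"])
```

An external store such as ElastiCache sits outside both mechanisms: Spark has no built-in integration, so you would read/write it explicitly from your application code.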
- 886 Views
- 3 replies
- 0 kudos
How can we share a Databricks ML runtime cluster among users when Unity Catalog is enabled
Hi team, currently we use the Databricks ML runtime to run our workflows and sometimes do EDA. We want to create a Databricks ML runtime cluster for the team to share. With Unity Catalog enabled, how could we create a shared ML runt...
Right now there is no plan to support the ML runtime on shared clusters. Engineering is working on additional solutions, but no ETA is currently available. As to why it is not supported, the principal reason is isolation, which is not available in ...
- 698 Views
- 2 replies
- 0 kudos
Updating the Python version from 3.8 to 3.12 for S3 ingestion
I went to the AWS CloudFormation stack, edited the template from Python 3.8 to 3.12, and updated it. I did this for both the workspace stack and the S3 ingestion stack. Will it break anything? Do I need to make any changes to the Python code in the templa...
Hi @Kaniz_Fatma, thanks a lot! I will look at StackSets. As I mentioned, the code was not written by me but by Databricks. Why does Databricks not use a newer Python version in its default stacks? We are low on resources and heavily rely on the default Datab...
- 1328 Views
- 4 replies
- 6 kudos
Resolved! Move to 100% Serverless
Hi all, a few questions about the upcoming transition to 100% serverless; if anyone has any info, that would be great! When will the move to serverless occur? I understand from 1st July (today), but has anyone seen a roadmap? What will the move to serverl...
Hi, our Databricks contact just assured us of the following after we asked about this issue: Databricks is officially (but won't be GA in every region till end of July) 100% serverless OPTIONAL. We understand many of our customers have begged for 100%...
- 571 Views
- 1 replies
- 0 kudos
Databricks (GCP) cluster not resolving hostname to IP address
We have MongoDB hosts that must resolve to the private internal load-balancer IPs of another cluster, and we are unable to add host aliases in the Databricks GKE cluster so that Spark can connect to MongoDB and resolve t...
Hi @Sadam97, First, verify whether your DNS server is responding; you can do this by running a ping command from a Databricks notebook against your secondary DNS server. Then launch a Web Terminal from the cluster workspace and edit the /etc/resolv.conf fil...
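Before editing resolver files, a quick check from a notebook can confirm whether the cluster can resolve the host at all. This is a minimal sketch using the standard library; the hostname passed in would be your MongoDB or load-balancer name, and `localhost` below is only a stand-in.

```python
# Minimal DNS check to run from a Databricks notebook. The hostname
# you actually care about (your MongoDB host) replaces the
# placeholder used here.
import socket

def resolve(hostname: str):
    """Return the IPv4 address for hostname, or None if it fails."""
    try:
        return socket.gethostbyname(hostname)
    except socket.gaierror:
        return None

# Placeholder host; substitute your MongoDB / load-balancer name.
addr = resolve("localhost")
print(addr)
```

If this returns None for your host, the fix belongs at the DNS or /etc/hosts level (for example, an init script that appends the host-to-IP mapping on cluster start) rather than in the Spark connector configuration.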
- 773 Views
- 1 replies
- 0 kudos
Resolved! Unity Catalog Pandas on Spark Limitation
According to the Databricks UC documentation, below are some of the limitations of shared-mode clusters: 1. In Databricks Runtime 13.3 LTS and above, Python scalar UDFs and Pandas UDFs are supported. Other Python UDFs, including UDAFs, UDTFs, and Panda...
@RamlaS, UDFs on Unity Catalog are a feature that is currently still in the Public Preview stage, meaning development is not yet finished. UDFs can be used on DBR 13.3 and above; UDAFs are already available from DBR 15.2 on G...
- 596 Views
- 2 replies
- 0 kudos
New Users don't receive onboarding email
When I create a new user in Databricks, the new user does not receive their onboarding email. It is not in their junk mail, deleted items, or inbox. However, when I reset that user's password, they do receive the password reset link and are a...
Hi @billraper, I'm sorry to hear about the trouble. Would you mind sharing whether this is happening with a community portal profile or with a product profile? Please share the link to the profile with which you're experiencing this issue...
- 489 Views
- 2 replies
- 0 kudos
cannot login to account management
Hi, I am not able to log in to account management (https://accounts.cloud.databricks.com). It somehow enforces SSO, and I cannot log in with a username and password.
Hi @JoyceZhang, Thank you for contacting Databricks Community Discussion Forum. Please note that for any issues related to the Databricks Community Edition product, you can find helpful resources here. If you encounter any difficulties beyond what'...