- 9107 Views
- 1 replies
- 1 kudos
Notebook and folder owner
Hi all, we can use this API https://docs.databricks.com/api/workspace/dbsqlpermissions/transferownership to transfer the ownership of a query. Is there anything similar for notebooks and folders?
Workspace object permissions: manage which users can read, run, edit, or manage directories, files, and notebooks. https://docs.databricks.com/api/workspace/workspace/setpermissions
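A note on the API linked above: there is no direct "transfer ownership" call for notebooks, but `PUT /api/2.0/permissions/notebooks/{notebook_id}` replaces a notebook's access control list, which lets you grant another user CAN_MANAGE. A minimal sketch using only the standard library (the workspace URL, token, and notebook ID are placeholders):

```python
import json
import urllib.request

def notebook_permissions_payload(user_name, permission_level):
    """Request body for PUT /api/2.0/permissions/notebooks/{id}."""
    return {
        "access_control_list": [
            {"user_name": user_name, "permission_level": permission_level}
        ]
    }

# Placeholders: substitute your workspace URL, token, and the notebook's
# numeric object ID (from GET /api/2.0/workspace/get-status, not the path).
HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

def set_notebook_permissions(notebook_id, user_name, level="CAN_MANAGE"):
    # PUT replaces the object's ACL; PATCH would merge into it instead.
    req = urllib.request.Request(
        f"{HOST}/api/2.0/permissions/notebooks/{notebook_id}",
        data=json.dumps(notebook_permissions_payload(user_name, level)).encode(),
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "application/json"},
        method="PUT",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Valid permission levels for notebooks include CAN_READ, CAN_RUN, CAN_EDIT, and CAN_MANAGE; the same permissions endpoint family covers directories (folders) as well.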
- 5234 Views
- 4 replies
- 2 kudos
Resolved! Networking cost reduction for NAT Gateway and Shared Catalog
Use case and context: we have a Databricks workspace in a specific region, reading and writing files from/to the same region. We also read from a Shared Catalog in a different company, a data provider, which points to multi-region S3 buckets. The r...
Thanks @Kaniz_Fatma for all the suggestions. After some days of monitoring NAT cost, I realized that the S3 Gateway Endpoint was actually working; the problem was that I thought the change would be reflected right away i...
- 13738 Views
- 2 replies
- 1 kudos
Resolved! Error: default auth: cannot configure default credentials, please check...
Hello all, I'm experiencing quite a strange error. The problem happens inside a GitLab pipeline:
$ databricks current-user me
Error: default auth: cannot configure default credentials, please check https://docs.databricks.com/en/dev-tools/au...
Hello Kaniz, the problem is not in the Databricks CLI but is due to some interactions happening inside the GitLab pipeline. According to the documentation here: Databricks personal access token authentication | Databricks on AWS (at the bottom o...
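For anyone hitting the same "default auth" error in CI: the CLI's default credential chain reads the DATABRICKS_HOST and DATABRICKS_TOKEN environment variables, so setting those as masked GitLab CI/CD variables usually resolves it. A small sketch that fails fast with a clearer message before invoking the CLI (the wrapper function and its message are my own, not part of the CLI):

```python
import os
import subprocess

# The CLI's default auth chain reads these variables (among other
# sources); export them as masked CI/CD variables in the GitLab project
# so `databricks current-user me` can authenticate non-interactively.
REQUIRED_VARS = ("DATABRICKS_HOST", "DATABRICKS_TOKEN")

def missing_auth_vars(env):
    """Return the names from REQUIRED_VARS that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

def run_cli_check():
    """Fail with a clear message instead of the CLI's generic auth error."""
    missing = missing_auth_vars(os.environ)
    if missing:
        raise RuntimeError(f"Set {', '.join(missing)} in the pipeline settings")
    subprocess.run(["databricks", "current-user", "me"], check=True)
```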
- 9236 Views
- 2 replies
- 3 kudos
External locations being shared across workspaces
Currently, we have 3 Unity Catalog-enabled workspaces sharing the same metastore. Now, when we create an external location or storage credential in any of the workspaces, it is reflected across all workspaces. We are looking for some best practices...
Hi @Debi-Moha, currently there is no mechanism to isolate external locations and storage credentials per workspace, since the metastore is shared across the workspaces. Please check the document below for recommendations on securing extern...
- 15584 Views
- 3 replies
- 1 kudos
Resolved! Bitbucket Cloud Repo Integration with Token
Hey, I am using Bitbucket Cloud and I want to connect my repository to Databricks. I am able to connect with my personal app password, but what I am looking for is authentication as a technical user. I need the integration to point to my dbt repo, wh...
Hi @Kaniz_Fatma, thank you for your response! With the link you provided, I was able to authenticate with Bitbucket Cloud. The solution was to use x-token-auth as the username. I had tried the generated email address before, which didn't work. Tha...
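To make the accepted answer concrete: with a Bitbucket Cloud repository access token, the HTTPS remote uses the literal username x-token-auth and the token as the password. A small helper that builds such a URL (the workspace and repo names are hypothetical; never commit the resulting URL, since it embeds the token):

```python
def bitbucket_https_url(workspace, repo, access_token):
    """HTTPS remote URL for Bitbucket Cloud using the literal
    'x-token-auth' username with an access token as the password."""
    return (f"https://x-token-auth:{access_token}"
            f"@bitbucket.org/{workspace}/{repo}.git")

# Hypothetical example values:
url = bitbucket_https_url("acme-data", "dbt-models", "<repo-access-token>")
```

In the Databricks Git credentials UI, the same pair applies: x-token-auth as the Git username and the token as the password.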
- 1934 Views
- 1 replies
- 0 kudos
Monitor and Alert Databricks Resource Utilization and Cost Consumption
We want to build a monitoring and alerting solution for Azure Databricks that captures resource utilization details (like aggregated CPU %, memory %, etc.) and cost consumption at the account level. We have Unity Catalog enabled and there are multipl...
@smehta_0908 Greetings! You can use Datadog for monitoring CPU and memory of clusters: https://docs.datadoghq.com/integrations/databricks/?tab=driveronly For cost consumption at the account level, you can make use of billable usage logs using the Acc...
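On the billable usage side, the account-level API exposes a usage download endpoint that returns CSV for a month range. A sketch of the URL construction, assuming the documented `GET /api/2.0/accounts/{account_id}/usage/download` shape; the account host differs per cloud (e.g. accounts.azuredatabricks.net on Azure) and availability is cloud-specific, so verify against the docs for your platform. The account ID below is a placeholder:

```python
def usage_download_url(account_host, account_id, start_month, end_month):
    """URL for the account-level billable usage CSV download.
    Months are formatted as YYYY-MM."""
    return (f"https://{account_host}/api/2.0/accounts/{account_id}"
            f"/usage/download?start_month={start_month}&end_month={end_month}")

# Placeholder account ID; authenticate the GET with account-admin credentials.
url = usage_download_url("accounts.azuredatabricks.net",
                         "<account-id>", "2024-01", "2024-03")
```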
- 8574 Views
- 2 replies
- 0 kudos
Is there a way to configure a cluster to have no internet access?
When experimenting with LLMs on Databricks clusters, I have become interested in knowing whether the LLM (Llama 2 or otherwise) tries to make calls to the internet (e.g., the setting use_remote_code=True in Hugging Face models, as just one example). Mo...
- 2861 Views
- 1 replies
- 1 kudos
Resolved! Install xml maven library
When I'm trying to install the XML package, I'm getting a PERMISSION_DENIED error.
Hey @FlavioSM, the error message indicates that the library you're trying to install (com.databricks:spark-xml_2.13:0.17.0) is not on the allowlist for shared clusters in your Databricks workspace. Shared clusters are clusters that multiple users can ...
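For reference, a metastore admin can add the Maven coordinate to the allowlist via the Unity Catalog artifact-allowlists API. A sketch of the request body, assuming the documented `PUT /api/2.1/unity-catalog/artifact-allowlists/LIBRARY_MAVEN` endpoint; double-check the endpoint and match types against the API reference for your workspace version:

```python
def maven_allowlist_payload(coordinates):
    """Body for PUT /api/2.1/unity-catalog/artifact-allowlists/LIBRARY_MAVEN.
    Each coordinate is allowlisted as a prefix match."""
    return {
        "artifact_matchers": [
            {"artifact": coord, "match_type": "PREFIX_MATCH"}
            for coord in coordinates
        ]
    }

# The coordinate from the error message in this thread:
body = maven_allowlist_payload(["com.databricks:spark-xml_2.13:0.17.0"])
```

Note that PUT replaces the whole allowlist for that artifact type, so fetch the current list first and append to it rather than overwriting.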
- 3049 Views
- 5 replies
- 0 kudos
Resolved! SCIM Synchronization for Email Change Cases in Azure AD
Hi everyone, I would like to know if the following behavior is expected or if it is a misconfiguration in SCIM. We are going through a change in the email addresses of some users, so we ran a test, changing the email of one of them, but the result was not OK beca...
Hey there! Thanks a bunch for being part of our awesome community! We love having you around and appreciate all your questions. Take a moment to check out the responses – you'll find some great info. Your input is valuable, so pick the best solution...
- 11921 Views
- 4 replies
- 0 kudos
Resolved! Databricks job creator update with API
Hi team, greetings. Do you know if there is a way to update the creator of a databricks_job using the API? The documentation does not show a "creator" property, and when I tried setting the creator, the property was not updated in the workspace UI. The...
Hi @danmlopsmaz, thanks for bringing up your concerns; always happy to help. I understand you want to change the creator of the job, but at this moment you can change the owner, not the creator. However, you should be able to clone an exist...
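The cloning approach from the answer can be sketched with the Jobs API: fetch the job with `GET /api/2.1/jobs/get`, reuse its `settings` object as the body of `POST /api/2.1/jobs/create`, and the new job's creator will be whoever made the create call. A minimal helper (the sample job dict is hypothetical):

```python
def clone_job_settings(job):
    """Given the response of GET /api/2.1/jobs/get, build the body for
    POST /api/2.1/jobs/create. Server-managed fields such as job_id,
    creator_user_name, and created_time live outside `settings` and are
    dropped automatically; the caller of jobs/create becomes the new
    job's creator."""
    return dict(job.get("settings", {}))

# Hypothetical jobs/get response, trimmed to the relevant fields:
fetched = {
    "job_id": 123,
    "creator_user_name": "old.user@example.com",
    "created_time": 1700000000000,
    "settings": {"name": "nightly-etl", "max_concurrent_runs": 1},
}
create_body = clone_job_settings(fetched)
```

After creating the clone, the old job can be deleted; note that permissions and run history do not carry over and must be re-applied separately.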
- 12882 Views
- 3 replies
- 0 kudos
How to register a published app
Is there a way to register a web app with Databricks to enable it to access a user's data using OAuth2? i.e., be included in the list of published apps with a pre-negotiated scope?
- 3794 Views
- 7 replies
- 2 kudos
Resolved! CloudWatch Agent Init Script Fails
Hi, I am trying to install the CloudWatch log agent on my cluster, using this tutorial from AWS: https://aws.amazon.com/blogs/mt/how-to-monitor-databricks-with-amazon-cloudwatch/ They provide an init script there, but when I try to start my cluster I get...
Hi @Yeshwanth, I have the exact same issue here. I have tried to update my cluster with the init script that you kindly shared in this thread. However, my error is still:
Init script failure: Cluster scoped init script s3://databricks-init-scripts-caden...
- 7580 Views
- 1 replies
- 0 kudos
Resolved! Workspace-local groups once a workspace has been migrated/associated to Unity Catalog
Hello, once we associate/migrate a Databricks workspace to Unity Catalog, all workspace groups will sync with Unity Catalog and be renamed to workspace-local groups. Databricks recommends removing workspace-local groups. Do we k...
Hi @VJ3, when you associate or migrate a Databricks workspace to Unity Catalog, all workspace groups synchronize with Unity Catalog, and the workspace groups are renamed to workspace-local groups. Databricks does indeed recommend removin...
- 8554 Views
- 6 replies
- 1 kudos
Is a central UC Catalog management a Good Practice?
I am working at a large company with many more-or-less independent divisions, and we are currently working on the rollout of Unity Catalog in Azure. The idea was to have a central infrastructure repository (deployed via Terraform) to manage all central...
Workspace Admins: Consider configuring permissions for workspace admins in the account console to strike a balance between autonomy and governance. @Kaniz_Fatma, do you have any information about this configuration? I cannot find such a setting in the Acco...
- 13934 Views
- 1 replies
- 1 kudos
Resolved! Guidance for creating DLT pipeline with raw json call logs
Hi, I am looking for some support on how to handle the following situation. I have a call center that generates call log files in JSON format that are sent to an S3 bucket. Some of the raw JSON files contain more than one call log object, and they ar...
Hi @chrisf_sts, handling call log files in JSON format and creating a Delta Live Tables processing pipeline involves several steps. Let's break it down: Ingestion: you can ingest the raw JSON files directly into a bronze table without separating t...
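On the specific wrinkle of files containing more than one call log object: concatenated top-level JSON objects can be split with `json.JSONDecoder.raw_decode` before (or inside) the bronze ingestion step. A minimal pure-Python sketch, with hypothetical field names:

```python
import json

def split_json_objects(text):
    """Yield each top-level JSON object from a string that may contain
    several concatenated objects, with or without whitespace between."""
    decoder = json.JSONDecoder()
    idx, n = 0, len(text)
    while idx < n:
        # Skip whitespace between objects.
        while idx < n and text[idx].isspace():
            idx += 1
        if idx >= n:
            break
        # raw_decode parses one object starting at idx and returns
        # (object, index just past it).
        obj, end = decoder.raw_decode(text, idx)
        yield obj
        idx = end

# Example: one raw file body holding two call log objects.
raw = '{"call_id": 1, "duration": 42}\n{"call_id": 2, "duration": 7}'
logs = list(split_json_objects(raw))
```

In the pipeline itself, the same logic could run inside a UDF over files read as whole text, or you may be able to avoid it entirely if the files are newline-delimited, since Spark's JSON reader handles JSON Lines natively.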