- 2454 Views
- 8 replies
- 2 kudos
Resolved! Creating Groups with API and Python
I am working on a notebook to help me create Azure Databricks Groups. When I create a group in a workspace using the UI, it automatically creates the group at the account level and links them. When I create a group using the API, and I create the w...
I have a couple of questions regarding the token needed to achieve this. If I create a workspace PAT token, is it limited to only that workspace or to all the workspaces I have access to? And do my account admin privileges translate to the PAT token I create i...
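For the account-level half of this workflow, here is a minimal sketch using the Databricks SDK for Python. It assumes Azure service principal credentials for the account console (a workspace PAT authenticates only against its own workspace, so it cannot call account-level APIs); the account ID, workspace ID, and group name are placeholders.

```python
from databricks.sdk import AccountClient
from databricks.sdk.service import iam

# Account-level APIs authenticate against the account console, not a workspace.
a = AccountClient(
    host="https://accounts.azuredatabricks.net",  # Azure account console
    account_id="<account-id>",                    # placeholder
    azure_client_id="<client-id>",                # service principal credentials
    azure_client_secret="<client-secret>",
    azure_tenant_id="<tenant-id>",
)

# Step 1: create the group at the account level.
group = a.groups.create(display_name="data-engineers")

# Step 2: assign it to a workspace, where it then shows up with Source "Account".
a.workspace_assignment.update(
    workspace_id=1234567890,                      # placeholder workspace ID
    principal_id=int(group.id),
    permissions=[iam.WorkspacePermission.USER],
)
```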
- 635 Views
- 1 replies
- 0 kudos
How to write files to Databricks Volumes while running code in local VS Code (without cp)
I always struggle to use VS Code seamlessly with Databricks. It's so not user friendly. Do you also feel the same?
Hi @gowtham-talluru, If you're trying to write directly to Volumes from local code, you can use the Databricks SDK for Python. Try something like this:
from databricks.sdk import WorkspaceClient
w = WorkspaceClient()
with open("local_file.csv", "rb") as ...
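A fuller version of that truncated snippet, as a hedged sketch: it assumes the databricks-sdk package is installed locally and authentication is already configured (for example via environment variables or a config profile); the volume path is a placeholder.

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Stream the local file straight into a Unity Catalog volume, no `cp` step needed.
with open("local_file.csv", "rb") as f:
    w.files.upload(
        "/Volumes/main/default/my_volume/local_file.csv",  # placeholder volume path
        f,
        overwrite=True,
    )
```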
- 1402 Views
- 2 replies
- 2 kudos
Resolved! Transfer Account Ownership
I have the same issue as this previous user, who had their question resolved before an actionable solution was provided: https://community.databricks.com/t5/data-engineering/how-to-transfer-ownership-of-a-databricks-cloud-standard-account/td-p/34737 I...
You are going to have a very difficult time with the transfer, as it can only be done on the back end by Databricks. Your only real option would be to have your customer create their own account and migrate the workspace assets over outside of having...
- 896 Views
- 2 replies
- 1 kudos
Users' usage report (for frontend Power BI)
Hi All, I'm currently working on retrieving usage information by querying system tables. At the moment, I'm using the system.access.audit table. However, I've noticed that the list of users retrieved appears to be incomplete when compared to si...
Thank you for the reply. If I understand correctly, when using PBI DirectQuery connectivity, the user being used is not a service principal but the end user who opens the PBI dashboard, correct? Did you implement any usage report? Regards, Uri
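As a hedged sketch of one way to check which end users are actually reaching the warehouse (relevant to the DirectQuery question above): the system.query.history system table records who executed each statement. Run it in a notebook where `spark` is defined, and verify the table and column names against your own system schemas first.

```python
# Who ran queries in the last 30 days, and how many each.
recent_users = spark.sql("""
    SELECT executed_by, COUNT(*) AS query_count
    FROM system.query.history
    WHERE start_time >= current_date() - INTERVAL 30 DAYS
    GROUP BY executed_by
    ORDER BY query_count DESC
""")
recent_users.show(truncate=False)
```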
- 1921 Views
- 1 replies
- 0 kudos
Unable to enable external sharing when creating a Delta Share - Azure Databricks trial
I have started a PayGo Azure tenancy and a Databricks 14-day trial. I signed up using my Gmail account. With the above user, I logged into Azure, created a workspace, and tried to share a schema via Delta Sharing. I am unable to share to an open user ...
Hi @mandarsu, To enable external Delta Sharing in Databricks:
- Enable External Sharing: Go to the Databricks Account Console, open your Unity Catalog metastore settings, and enable the "External Delta Sharing" option.
- Check Permissions: Ensure you have th...
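Once external sharing is enabled, creating an open (token-based) recipient might look like the sketch below. This is a hedged illustration using the databricks-sdk package; the recipient name is a placeholder, and TOKEN authentication is the mode used for recipients outside Databricks.

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import sharing

w = WorkspaceClient()

# TOKEN authentication corresponds to "open" sharing (recipient has no Databricks account).
recipient = w.recipients.create(
    name="external_partner",          # placeholder
    authentication_type=sharing.AuthenticationType.TOKEN,
)

# Activation details the external user needs to download their credential file.
print(recipient.tokens)
```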
- 4597 Views
- 2 replies
- 2 kudos
Resolved! Cost
Do you have information that helps me optimize costs and follow up?
@Athul97 provided a pretty solid list of best practices. To go deeper into Budgets & Alerts, I have found a lot of success with the Consumption and Budget feature in the Databricks Account Portal under the Usage menu. Once you embed tagging in...
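Once tagging is in place, spend can also be sliced directly from system tables. A hedged sketch, assuming the system.billing schema is enabled and that a custom tag named team exists; verify the column names against your own system schemas:

```python
# DBUs consumed in the last 30 days, grouped by job and by a "team" custom tag.
usage_by_tag = spark.sql("""
    SELECT
        usage_metadata.job_id,
        custom_tags['team'] AS team_tag,
        SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    WHERE usage_date >= current_date() - INTERVAL 30 DAYS
    GROUP BY ALL
    ORDER BY dbus DESC
""")
usage_by_tag.show(truncate=False)
```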
- 1352 Views
- 3 replies
- 0 kudos
Python Databricks SDK: get object path from ID
When using the Databricks SDK to get permissions of objects, we get inherited_from_object=['/directories/1636517342231743']. From what I can see, the workspace list and get_status methods only work with the actual path. Is there a way to look up that di...
@BriGuy Here is a small code snippet we have used. Hope this works well for you:
from databricks.sdk import WorkspaceClient
w = WorkspaceClient()
def find_path_by_object_id(target_id, base_path="/"):
    items = w.workspace.list(path=base_p...
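For reference, a runnable completion of that truncated snippet, as a sketch: it performs a depth-first search over the workspace tree (which can be slow on large workspaces) and assumes the databricks-sdk package with authentication already configured.

```python
from typing import Optional

from databricks.sdk import WorkspaceClient
from databricks.sdk.service.workspace import ObjectType

w = WorkspaceClient()

def find_path_by_object_id(target_id: int, base_path: str = "/") -> Optional[str]:
    """Depth-first search of the workspace tree for the object with this ID."""
    for item in w.workspace.list(path=base_path):
        if item.object_id == target_id:
            return item.path
        if item.object_type == ObjectType.DIRECTORY:
            found = find_path_by_object_id(target_id, item.path)
            if found:
                return found
    return None

# e.g. resolve inherited_from_object=['/directories/1636517342231743']
print(find_path_by_object_id(1636517342231743))
```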
- 3449 Views
- 4 replies
- 3 kudos
Drop table - permission management
Hello, I'm trying to wrap my head around permission management for dropping tables in UC-enabled schemas. According to the docs: to drop a table you must have the MANAGE privilege on the table, be its owner, or be the owner of the schema, catalog, or meta...
Hi @PiotrM, I see there is a feature request already in place. It's been considered for the future: https://databricks.aha.io/ideas/ideas/DB-I-7480
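For readers landing here, the privilege route the docs describe can be exercised with plain SQL. A hedged sketch (MANAGE is a newer Unity Catalog privilege, so verify it is available on your workspace; the catalog, schema, table, and group names are placeholders):

```python
# Grant MANAGE on one table instead of making the group its owner.
spark.sql("GRANT MANAGE ON TABLE main.sales.orders TO `data-engineers`")

# Confirm what the principals on this table can now do.
spark.sql("SHOW GRANTS ON TABLE main.sales.orders").show(truncate=False)
```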
- 1896 Views
- 2 replies
- 0 kudos
spark.databricks documentation
I cannot find any documentation related to the spark.databricks.* properties. I was able to find the Spark-related documentation, but it does not contain any information on the possible properties or arguments for spark.databricks in particular. Thank you!
Thus, as of now, the documentation is lacking an obvious and easy-to-provide element that can only be found partially, spread across random threads on the internet, or gained by guess-asking the platform developers. When will it be made available?
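In the meantime, one hedged workaround is to dump whatever spark.databricks.* properties are explicitly set on a running classic cluster. This only surfaces values that have been set, not the full catalog of supported options, and sparkContext access is restricted on some access modes:

```python
# List every explicitly-set spark.databricks.* property on this cluster.
conf_pairs = spark.sparkContext.getConf().getAll()
for key, value in sorted(conf_pairs):
    if key.startswith("spark.databricks."):
        print(f"{key} = {value}")
```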
- 1196 Views
- 1 replies
- 0 kudos
How do worker nodes get packages during scale-up?
Hi, We are working with a repository from which we used to download the artifact/Python package using an index URL in a global init script, but now the logic is going to change: we need to provide credentials to download the package and th...
Yes, the new worker node will execute the global init script independently when it starts. It does not get the package from the driver or other existing nodes; it will hit the configured index URL directly and try to download the package on its own....
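One way to centralize the credential handling is to manage the init script itself through the SDK, with the script reading the token from an environment variable (which could, for instance, be populated from a secret via the cluster's environment variable settings). A hedged sketch; the repository URL, variable name, and secret wiring are all placeholders and assumptions:

```python
import base64

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Each node (driver and every new worker) runs this independently at startup.
# ARTIFACT_TOKEN is assumed to be set as a cluster environment variable.
init_script = """#!/bin/bash
mkdir -p /etc/pip
cat > /etc/pip/pip.conf <<EOF
[global]
index-url = https://user:${ARTIFACT_TOKEN}@repo.example.com/simple
EOF
"""

w.global_init_scripts.create(
    name="configure-private-pypi",
    script=base64.b64encode(init_script.encode()).decode(),  # API expects base64
    enabled=True,
)
```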
- 2885 Views
- 5 replies
- 4 kudos
Resolved! Account level groups
When I query my user from an account client and workspace client, I get different answers. Why is this? In addition, why can I only see some account level groups from my workspace, and not others?
If you have a relatively modern Databricks instance, when you create a group in the workspace UI, it creates an account-level group (which you can see in the "Source" column – it says "Account"). So this process essentially consists of two steps: 1) create a...
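A small sketch that makes the difference visible, assuming the databricks-sdk package with both account-level and workspace-level authentication configured (the account ID is a placeholder; remaining credentials come from the environment):

```python
from databricks.sdk import AccountClient, WorkspaceClient

a = AccountClient(
    host="https://accounts.azuredatabricks.net",  # account console endpoint
    account_id="<account-id>",                    # placeholder
)
w = WorkspaceClient()

# Every group defined at the account level...
account_groups = {g.display_name for g in a.groups.list()}

# ...versus only the groups created in or assigned to this workspace.
workspace_groups = {g.display_name for g in w.groups.list()}

print("Account-level groups not visible in this workspace:",
      account_groups - workspace_groups)
```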
- 804 Views
- 1 replies
- 0 kudos
Resolved! How to create a function using the functions API in databricks?
https://docs.databricks.com/api/workspace/functions/create This documentation gives the sample request payload, and one of the fields is type_json, and there is very little explanation of what is expected in this field. What am I supposed to pass here...
Hi @chinmay0924, The type_json field describes your function's input parameters and return type using a specific JSON format. You'll need to include each parameter's name, type (like "STRING", "INT", "ARRAY", or "STRUCT"), and position, along with th...
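As a hedged illustration of the shape involved: type_json typically carries the Spark StructField-style JSON serialization of the parameter, alongside the human-readable type_text/type_name fields. Treat the exact schema as an assumption to verify against the API reference; the parameter here is hypothetical.

```python
import json

# One input parameter of the create-function request body.
param = {
    "name": "x",
    "type_name": "INT",
    "type_text": "int",
    "position": 0,
    # StructField-style JSON describing the same parameter:
    "type_json": json.dumps(
        {"name": "x", "type": "integer", "nullable": False, "metadata": {}}
    ),
}
print(param["type_json"])
```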
- 2135 Views
- 3 replies
- 1 kudos
Terraform - Azure Databricks workspace without NAT gateway
Hi all, I have experienced an increase in costs, even when not using Databricks compute. It is due to the NAT gateway that is (suddenly) automatically deployed. When creating Azure Databricks workspaces using Terraform, a NAT gateway is created. When ...
In Azure Databricks, a NAT Gateway will be required (by Microsoft) for all egress from VMs, which affects Databricks compute: Azure updates | Microsoft Azure
- 2516 Views
- 2 replies
- 0 kudos
Resolved! External Locations to Azure Storage via Private Endpoint
When working with Azure Databricks (with VNet injection) to connect securely to an Azure Storage account via private endpoint, there are a few locations it needs to connect to. Firstly, the VNet that Databricks is connected to, which works well when con...
I've read that in the documentation, and when I now tried with an Access Connector for Azure Databricks instead of my own service principal, it seems to have worked, shockingly, even if I completely block network access on the storage account with ze...
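For anyone reproducing this, a hedged sketch of the Access Connector flow via the databricks-sdk package; the connector resource ID, storage account, container, and object names are placeholders:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import catalog

w = WorkspaceClient()

# Storage credential backed by the Access Connector's managed identity.
cred = w.storage_credentials.create(
    name="adls_cred",
    azure_managed_identity=catalog.AzureManagedIdentityRequest(
        access_connector_id=(
            "/subscriptions/<sub>/resourceGroups/<rg>/providers/"
            "Microsoft.Databricks/accessConnectors/<connector>"
        )
    ),
)

# External location on the container, using that credential; with a private
# endpoint in place, traffic stays off the public network.
loc = w.external_locations.create(
    name="raw_zone",
    url="abfss://raw@<storageaccount>.dfs.core.windows.net/",
    credential_name=cred.name,
)
```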
- 1421 Views
- 5 replies
- 0 kudos
Any documentation mentioning connectivity from Azure SQL Database to Azure Databricks
Is any documentation available for connecting from an Azure SQL database to an Azure Databricks SQL workspace? We created a SQL warehouse personal access token for a user in a different team who can connect from his on-prem SQL DB to Databricks using the conn...
Here are some considerations:
- SQL Trigger: Define a trigger in Azure SQL that activates on specific DML operations (e.g., INSERT, UPDATE).
- External Call: The trigger can log events to an intermediate service (like a control table or Event Grid). ...
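Whatever the trigger mechanism, the downstream service ends up connecting to the warehouse with the usual SQL-warehouse coordinates. A hedged sketch with the databricks-sql-connector package (hostname, HTTP path, and token are placeholders):

```python
from databricks import sql

# Same server hostname / HTTP path / PAT the on-prem side would use.
with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",
    http_path="/sql/1.0/warehouses/abcdef1234567890",
    access_token="<personal-access-token>",
) as conn:
    with conn.cursor() as cur:
        cur.execute("SELECT current_user(), current_catalog()")
        print(cur.fetchone())
```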