- 1215 Views
- 2 replies
- 2 kudos
JDBC Insert Performance and Unsupported Data Types
We are reaching out regarding two observations with the Databricks JDBC driver: We've noticed that each INSERT query is taking approximately 1 second to execute via the JDBC driver (please refer to the attached screenshot). This seems unusually slow f...
Hi @ankit_kothiya1, please find below my findings for your query. 1. Slow INSERT Performance via Databricks JDBC Driver. Observation: Each INSERT query takes about 1 second via the Databricks JDBC driver, which is unusually slow for high-throughput use ...
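The usual mitigation for per-row INSERT latency is to batch many rows into a single statement instead of one round trip per row. Below is a minimal sketch of that idea using the Databricks SQL Connector for Python (my own substitution; the thread itself uses the JDBC driver, where the analogue is a multi-row INSERT or PreparedStatement batching). Hostname, HTTP path, token, and table name are placeholders.
```
# Minimal sketch (not the JDBC driver itself): batch rows into one multi-row
# INSERT via the Databricks SQL Connector for Python.
# Connection details and the table name are placeholders.
from databricks import sql

rows = [(1, "alpha"), (2, "beta"), (3, "gamma")]

conn = sql.connect(
    server_hostname="<workspace-host>",
    http_path="<warehouse-http-path>",
    access_token="<personal-access-token>",
)
cursor = conn.cursor()
try:
    # One statement for all rows; for untrusted data prefer parameter markers.
    values = ", ".join(f"({i}, '{name}')" for i, name in rows)
    cursor.execute(f"INSERT INTO my_catalog.my_schema.my_table VALUES {values}")
finally:
    cursor.close()
    conn.close()
```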
- 923 Views
- 2 replies
- 0 kudos
Guidance on Populating the cloud_infra_cost Table in System Catalog
In the system catalog, there are three tables: cloud_infra_cost, list_prices, and usage. While the list_prices and usage tables contain cost-related information, the cloud_infra_cost table is currently empty. I am using AWS cloud. Can anyone provide ...
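While cloud_infra_cost has no data, a common stopgap is to estimate list-price DBU cost from the other two billing tables. A hedged sketch, assuming the documented system.billing.usage and system.billing.list_prices schemas apply to your account (run in a Databricks notebook where spark and display are predefined):
```
# Hedged sketch: estimate list-price DBU cost from the billing system tables
# while cloud_infra_cost is empty. Verify column names against your schemas.
dbu_cost = spark.sql("""
    SELECT
        u.workspace_id,
        u.sku_name,
        DATE(u.usage_start_time)                  AS usage_date,
        SUM(u.usage_quantity)                     AS dbus,
        SUM(u.usage_quantity * p.pricing.default) AS est_list_cost
    FROM system.billing.usage u
    JOIN system.billing.list_prices p
      ON u.sku_name = p.sku_name
     AND u.usage_start_time >= p.price_start_time
     AND (p.price_end_time IS NULL OR u.usage_start_time < p.price_end_time)
    GROUP BY 1, 2, 3
    ORDER BY usage_date DESC
""")
display(dbu_cost)
```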
I have opted in to the features in preview, but I don't see any data in this table.
- 830 Views
- 3 replies
- 1 kudos
Principals given access to and their owners
Hi all, in a large global data platform built with Azure Databricks, I would like to know the best practice for maintaining a record of the principals to which Databricks objects (typically views) have been given access; for example, a view has been given access to a servic...
@noorbasha534 - I created a helper function/script to do this in my environment that queries the Unity Catalog system tables to generate a unique list of impacted principals/users. It takes in a list of fully qualified object names and will display ...
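In the same spirit as the helper described above, a minimal sketch that lists distinct grantees for a set of objects, assuming the documented system.information_schema.table_privileges schema (the object list is a placeholder; run in a notebook):
```
# Hedged sketch: given fully qualified object names, list the distinct
# principals that hold privileges on them via the UC system tables.
object_names = ["main.reporting.sales_view", "main.reporting.customer_view"]  # placeholders

def impacted_principals(fully_qualified_names):
    frames = []
    for fqn in fully_qualified_names:
        catalog, schema, table = fqn.split(".")
        frames.append(spark.sql(f"""
            SELECT grantee, privilege_type, table_catalog, table_schema, table_name
            FROM system.information_schema.table_privileges
            WHERE table_catalog = '{catalog}'
              AND table_schema  = '{schema}'
              AND table_name    = '{table}'
        """))
    out = frames[0]
    for df in frames[1:]:
        out = out.unionByName(df)
    return out.select("grantee").distinct()

display(impacted_principals(object_names))
```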
- 219 Views
- 0 replies
- 0 kudos
View Refresh Frequency
Dear all, we have around 5,000+ finished data products (aka views) in several schemas of Unity Catalog. One question that comes from business users frequently is: how frequently do these get refreshed? The answer is not simple, as the underlying t...
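Since a view is only as fresh as its sources, one pragmatic starting point is to report the last commit time of each underlying Delta table. A minimal sketch (the table list is a placeholder, and DESCRIBE HISTORY applies to Delta tables only):
```
# Hedged sketch: report when each underlying Delta table last changed,
# as a proxy for "how fresh is this view". The table list is a placeholder.
underlying_tables = ["main.silver.orders", "main.silver.customers"]

for tbl in underlying_tables:
    last = spark.sql(f"DESCRIBE HISTORY {tbl} LIMIT 1").select("timestamp", "operation").first()
    print(f"{tbl}: last operation '{last['operation']}' at {last['timestamp']}")
```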
- 2014 Views
- 8 replies
- 2 kudos
Resolved! Creating Groups with API and Python
I am working on a notebook to help me create Azure Databricks Groups. When I create a group in a workspace using the UI, it automatically creates the group at the account level and links them. When I create a group using the API, and I create the w...
I have a couple of questions regarding the token to achieve this. If I create a workspace PAT token, is it limited to only that workspace, or does it cover all the workspaces I have access to? And do my account admin privileges translate to the PAT token I create i...
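On the token question: a workspace PAT authenticates only against that workspace's APIs, while account-level SCIM calls go through the account endpoint with account-level credentials. A hedged sketch of creating the group at the account level with the Databricks SDK (host and account_id are placeholders; account-level auth such as OAuth is assumed rather than a workspace PAT):
```
# Hedged sketch: create a group at the account level with the Databricks SDK.
# host/account_id are placeholders; authentication must be configured for the
# account (e.g. OAuth), not a workspace PAT.
from databricks.sdk import AccountClient

a = AccountClient(
    host="https://accounts.azuredatabricks.net",
    account_id="<your-account-id>",
)

group = a.groups.create(display_name="data-platform-readers")
print(group.id, group.display_name)
```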
- 429 Views
- 1 replies
- 0 kudos
How to write files to Databricks Volumes while running code in local VS Code (without cp)
How to write files to Databricks Volumes while running code in local VS Code (without cp)? I always struggle to seamlessly use VS Code with Databricks. It's just not user friendly. Do you also feel the same?
Hi @gowtham-talluru, if you're trying to write directly to Volumes from local code, you can use the Databricks SDK for Python. Try something like this: from databricks.sdk import WorkspaceClient; w = WorkspaceClient(); with open("local_file.csv", "rb") as ...
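A fuller version of that idea, as a hedged sketch (the volume path is a placeholder; the SDK resolves authentication from your local Databricks configuration or a named profile):
```
# Hedged sketch: upload a local file to a Unity Catalog Volume from VS Code
# using the Databricks SDK. The volume path and profile are placeholders.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # or WorkspaceClient(profile="my-profile")

with open("local_file.csv", "rb") as f:
    w.files.upload(
        "/Volumes/my_catalog/my_schema/my_volume/local_file.csv",
        f,
        overwrite=True,
    )
```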
- 1052 Views
- 2 replies
- 2 kudos
Resolved! Transfer Account Ownership
I have the same issue as this previous user, who had their question resolved before an actionable solution was provided: https://community.databricks.com/t5/data-engineering/how-to-transfer-ownership-of-a-databricks-cloud-standard-account/td-p/34737 I...
You are going to have a very difficult time with the transfer, as it can only be done on the back end by Databricks. Your only real option would be to have your customer create their own account and migrate the workspace assets over outside of having...
- 1338 Views
- 2 replies
- 3 kudos
How to access UnityCatalog's Volume inside Databricks App?
I am more familiar with DBFS, which seems to be replaced by Unity Catalog Volumes now. When I create a Databricks App, it allowed me to add a resource to pick a UC volume. How do I actually access the volume inside the app? I cannot find any example, the a...
As I mentioned, I am not reading a table, so Spark is not the right fit here (plus I don't want to include Spark as a dependency just to read a CSV). I also don't have dbutils. I found this works: ```cfg = Config()  # This is available inside the app; w = Wor...
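Expanding that fragment into a hedged, self-contained sketch (the volume path is a placeholder; inside an app, Config() is assumed to resolve credentials from the app environment):
```
# Hedged sketch: read a CSV from a UC Volume inside a Databricks App without
# Spark or dbutils. The volume path is a placeholder.
import csv
import io

from databricks.sdk import WorkspaceClient
from databricks.sdk.core import Config

cfg = Config()                  # available inside the app environment
w = WorkspaceClient(config=cfg)

resp = w.files.download("/Volumes/my_catalog/my_schema/my_volume/data.csv")
text = io.TextIOWrapper(resp.contents, encoding="utf-8")
for row in csv.reader(text):
    print(row)
```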
- 768 Views
- 2 replies
- 1 kudos
Users' usage report (for frontend Power BI)
Hi All, I'm currently working on retrieving usage information by querying system tables. At the moment, I'm using the system.access.audit table. However, I've noticed that the list of users retrieved appears to be incomplete when compared to si...
Thank you for the reply. If I understand correctly, when using PBI DirectQuery connectivity, the identity being used is not a service principal but the end user who opens the PBI dashboard, correct? Did you implement any usage report? Regards, Uri
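As a starting point for such a report, here is a hedged sketch against system.access.audit (column names as documented for that table; the service filter and time window are assumptions to tune):
```
# Hedged sketch: distinct users who triggered Databricks SQL audit events in
# the last 30 days. Adjust the service/action filters for your use case.
usage = spark.sql("""
    SELECT
        user_identity.email AS user_email,
        DATE(event_time)    AS event_date,
        COUNT(*)            AS events
    FROM system.access.audit
    WHERE service_name = 'databrickssql'
      AND event_time >= current_timestamp() - INTERVAL 30 DAYS
    GROUP BY 1, 2
    ORDER BY event_date DESC, events DESC
""")
display(usage)
```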
- 1613 Views
- 1 replies
- 0 kudos
Unable to enable external sharing when creating a Delta Share - Azure Databricks trial
I have started a PayGo Azure tenancy and a Databricks 14-day trial. I signed up using my Gmail account. With the above user, I logged into Azure, created a workspace, and tried to share a schema via Delta Sharing. I am unable to share to an open user ...
Hi @mandarsu, to enable external Delta Sharing in Databricks: 1. Enable external sharing: go to the Databricks Account Console, open your Unity Catalog metastore settings, and enable the “External Delta Sharing” option. 2. Check permissions: ensure you have th...
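Once external sharing is enabled on the metastore, the SQL side typically looks like the hedged sketch below (names are placeholders; it assumes you hold the CREATE SHARE and CREATE RECIPIENT privileges):
```
# Hedged sketch: create a share, add a table, create an open (token-based)
# recipient, and grant it access. Run on a Unity Catalog-enabled compute.
spark.sql("CREATE SHARE IF NOT EXISTS my_share")
spark.sql("ALTER SHARE my_share ADD TABLE my_catalog.my_schema.my_table")
spark.sql("CREATE RECIPIENT IF NOT EXISTS open_partner COMMENT 'Token-based recipient'")
spark.sql("GRANT SELECT ON SHARE my_share TO RECIPIENT open_partner")

# The recipient details (including the activation link) can then be inspected:
display(spark.sql("DESCRIBE RECIPIENT open_partner"))
```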
- 3997 Views
- 2 replies
- 2 kudos
Resolved! Cost
Do you have information that helps me optimize costs and follow up?
@Athul97 provided a pretty solid list of best practices. To go deeper into Budgets & Alerts, I have found a lot of good success with the Consumption and Budget feature in the Databricks Account Portal under the Usage menu. Once you embed tagging in...
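As an illustration of where tagging pays off, a hedged sketch that rolls up DBU usage per tag value from the billing system table (the 'team' tag key is hypothetical; column names follow the documented system.billing.usage schema):
```
# Hedged sketch: monthly DBU usage per value of a hypothetical 'team' tag,
# as an input to budgets and alerts.
by_team = spark.sql("""
    SELECT
        custom_tags['team']                   AS team,
        DATE_TRUNC('month', usage_start_time) AS usage_month,
        SUM(usage_quantity)                   AS dbus
    FROM system.billing.usage
    GROUP BY 1, 2
    ORDER BY usage_month DESC, dbus DESC
""")
display(by_team)
```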
- 407 Views
- 0 replies
- 0 kudos
GKE Cluster Shows "Starting" Even After It's Turned On
Curious if anyone else has run into this. After changing to GKE based clusters, they all turn on but don't show as turned on - we'll have it show as "Starting" but be able to see the same cluster in the dropdown that's already active. "Changing" to t...
- 355 Views
- 0 replies
- 0 kudos
Spark executor logs path
We are running Spark workloads and have enabled cluster log delivery to push executor logs to Azure Blob. While that's running fine, I'd also like to know the local path of the executor logs so that I can make use of OneAgent from Dynatrace and send...
- 1088 Views
- 3 replies
- 0 kudos
Python Databricks SDK: get object path from ID
When using the Databricks SDK to get permissions of objects, we get inherited_from_object=['/directories/1636517342231743']. From what I can see, the workspace list and get_status methods only work with the actual path. Is there a way to look up that di...
@BriGuy Here is a small code snippet which we have used; hope this works well for you: from databricks.sdk import WorkspaceClient; w = WorkspaceClient(); def find_path_by_object_id(target_id, base_path="/"): items = w.workspace.list(path=base_p...
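A complete, hedged take on that snippet (it assumes the SDK's workspace.list supports recursive listing and that the numeric part of /directories/&lt;id&gt; is the object_id to match):
```
# Hedged sketch: walk the workspace tree and return the path whose object_id
# matches the numeric id from inherited_from_object.
from typing import Optional

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

def find_path_by_object_id(target_id: int, base_path: str = "/") -> Optional[str]:
    # recursive=True lists the whole subtree; for large workspaces consider
    # narrowing base_path first.
    for item in w.workspace.list(path=base_path, recursive=True):
        if item.object_id == target_id:
            return item.path
    return None

print(find_path_by_object_id(1636517342231743))
```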
- 2767 Views
- 1 replies
- 0 kudos
Lakeflow Connect: can't change general privilege requirements
I want to set up Lakeflow Connect to ETL data from Azure SQL Server (Microsoft SQL Azure (RTM) - 12.0.2000.8 Feb 9 2025) using change tracking (we don't need the data retention of CDC). In the documentation, there is a list of system tables, views ...
Got the same issue. Did you find a way to configure the required permissions?