- 608 Views
- 1 replies
- 0 kudos
Resolved! scroll bar disappears on widgets in dashboards
Databricks newbie. I've created a dashboard that has several widgets to allow users to select multiple values from a drop-down list. When I first open the widget to select the values, there is a scroll bar on the right side of the box which allows me...
Hello @RichC! You’re not missing any setting here. This is expected behavior. The scrollbar auto-hides after a couple of seconds, but it’s still active. If you start scrolling again (mouse wheel or trackpad), the scrollbar will reappear.
- 824 Views
- 3 replies
- 1 kudos
Tracing SQL costs
Hello, Databricks community! In our Account Usage Dashboard, the biggest portion of our costs is labeled simply "SQL". We want to drill deeper to see where the SQL costs are coming from. By querying the `system.billing.usage` table we see that it's mos...
@simenheg First of all, it's not an error: serverless SQL often produces null metadata fields. To trace the costs, use the SQL Warehouse Query History and join the billing data with it - system.billing.usage.usage_da...
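The join described in the reply can be sketched roughly as follows. This is a hedged sketch: `system.billing.usage` and `system.query.history` are the documented system tables, but the struct field names used here (`usage_metadata.warehouse_id`, `compute.warehouse_id`) should be verified against your workspace's actual schema.

```sql
-- Hedged sketch: attribute "SQL" billing usage to warehouses, then count the
-- statements that ran on each warehouse from query history.
SELECT
  u.usage_metadata.warehouse_id   AS warehouse_id,
  SUM(u.usage_quantity)           AS dbus,
  COUNT(DISTINCT q.statement_id)  AS statements
FROM system.billing.usage u
LEFT JOIN system.query.history q
  ON q.compute.warehouse_id = u.usage_metadata.warehouse_id
 AND DATE(q.start_time)     = u.usage_date
WHERE u.billing_origin_product = 'SQL'
GROUP BY u.usage_metadata.warehouse_id
ORDER BY dbus DESC;
```

From there you can break the cost down further by `q.executed_by` or `q.statement_text` to see who or what is driving the spend.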
- 1409 Views
- 4 replies
- 3 kudos
Resolved! AI/BI Dashboard embed issue in Databricks App
Hi everyone, I've created an AI/BI Dashboard in Azure Databricks, successfully published it, and generated an embed link. My goal is to embed this dashboard inside a Databricks App (Streamlit) using an iframe. However, when I try to render the dashboard...
Hi @Louis_Frolio, I changed my master menu to use page navigation and put the iframe inside a submenu, and it works. Thanks for your insightful solution.
- 488 Views
- 2 replies
- 0 kudos
Databricks Job: unable to read Databricks job run parameters in Scala code and SQL queries
We created a Databricks job with a JAR (Scala code) and provided jar parameters, and we are able to read those as arguments in the main method. When we run the job with parameters (run parameters / job parameters), those parameters are not able to re...
Hi @Louis_Frolio, thank you for your suggestion. We are following the approach you recommended, but we encountered an issue while creating the job. We are creating a Databricks job using a JSON file through a pipeline. When we declare job-level parame...
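One hedged way to make job-level parameters visible to a JAR task's `main` is to forward them explicitly with dynamic value references in the job JSON. The job name, class name, and parameter name below are illustrative, not taken from the thread:

```json
{
  "name": "example-jar-job",
  "parameters": [
    {"name": "run_date", "default": "2025-01-01"}
  ],
  "tasks": [
    {
      "task_key": "main",
      "spark_jar_task": {
        "main_class_name": "com.example.Main",
        "parameters": ["{{job.parameters.run_date}}"]
      }
    }
  ]
}
```

With this shape, `args(0)` in the Scala `main` method receives the resolved value of the job-level `run_date` parameter at run time.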
- 711 Views
- 2 replies
- 1 kudos
Resolved! Databricks Asset Bundles capability for cross cloud migration
Hi everyone, we are planning a migration from Azure Databricks to GCP Databricks and would like to understand whether Databricks Asset Bundles (DAB) can be used to migrate workspace assets such as jobs, pipelines, notebooks, and custom serving endpoin...
@iyashk-DB Thanks for the details, it helps.
- 727 Views
- 2 replies
- 1 kudos
How do you manage alerts?
Hey all, I'm curious: how do teams manage Databricks alerts? My use case is that I have around 10 Spark workflows and need to validate their output tables. My first iteration was to create alerts manually, e.g. define the SQL, evaluation criteria, notificat...
Hi @smirnoal, if you want more dynamic behaviour, you can use Python for Databricks Asset Bundles, which extends Databricks Asset Bundles so that you can define resources in Python code. These definitions can coexist with resources defined in Y...
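Whichever deployment mechanism you pick, generating the alert definitions rather than hand-writing ten of them removes the per-workflow boilerplate. A minimal, Databricks-agnostic sketch: `make_alert` and all field names here are illustrative (not the actual bundles resource schema); the point is only that one loop can emit every definition.

```python
# Hypothetical output tables for the ~10 workflows mentioned in the question.
TABLES = [f"catalog.schema.output_{i}" for i in range(10)]

def make_alert(table: str) -> dict:
    """Build one illustrative alert definition: fail when the table saw no rows today."""
    short = table.rsplit(".", 1)[-1]
    return {
        "display_name": f"freshness_{short}",
        "query_text": f"SELECT COUNT(*) AS n FROM {table} WHERE load_date = current_date()",
        # Trigger when n == 0, i.e. nothing was written today.
        "condition": {"op": "EQUAL", "column": "n", "threshold": 0},
        "notify": ["data-team@example.com"],
    }

# One definition per workflow output table, keyed for use as named resources.
alerts = {f"alert_{i}": make_alert(t) for i, t in enumerate(TABLES)}
```

The same loop body can be pointed at whichever resource class the bundles Python API expects, so adding an eleventh workflow is a one-line change to `TABLES`.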
- 711 Views
- 4 replies
- 7 kudos
Issue when creating Salesforce Connector
Hi, I'm trying to create a Salesforce Connector in Lakeflow. In the "Salesforce authentication" step, I enter my Salesforce username and password and then get stuck on the following error message: "OAUTH_APPROVAL_ERROR_GENERIC". My Salesforce...
Hi guys, sorry to bother you again. I know, the Salesforce admin in my company is quite "narrow-minded" regarding security issues, and I don't know enough about security protocols to answer him. Do you know why we need a "Connected App" while we ar...
- 342 Views
- 3 replies
- 0 kudos
Can't display histogram in Databricks Notebooks
Is this a known bug? It says to use display(dataframe), but it is not working. How can I display it using Databricks visualizations? Thanks in advance.
The describe output looks like a completely different table from the image of values; they have different field names and different types. At what point do you get the error? Does it let you get the visualisation pop-up? It should work fine. Is it when you tr...
- 696 Views
- 2 replies
- 3 kudos
Resolved! Databricks Free Edition Account Migration
Hello, I set up a Databricks Free Edition account with the intention of running it on Azure, since my environment is based in the Azure cloud. However, the account was provisioned on AWS instead. Is there a way to migrate it? Please provide the steps ...
@libpekin Short answer: it's AWS-only, and there is no automated path to migrate to Azure.
- 702 Views
- 5 replies
- 0 kudos
Proxy configuration while bootstrapping
I am trying to start a cluster in Azure Databricks; our policy is to use a proxy for outbound traffic. I have configured http_proxy, https_proxy, HTTP_PROXY, HTTPS_PROXY, no_proxy and NO_PROXY in env variables and globally. Made sure the proxy is bypassin...
- 764 Views
- 2 replies
- 2 kudos
Resolved! Model serving with provisioned throughput fails
I'm trying to serve a model with provisioned throughput, but I'm getting this error: "Build could not start due to an internal error. If you are serving a model from UC and Azure storage firewall or Private Link is configured on your storage account, pl...
Hi team, creating an endpoint in your workspace requires serverless compute, so you need to update the storage account's firewall to allow Databricks serverless compute via your workspace's Network Connectivity Configuration (NCC). If the storage account f...
- 941 Views
- 1 replies
- 1 kudos
Resolved! Updating projects created from Databricks Asset Bundles
Hi all, we are using Databricks Asset Bundles for our data science / ML projects. The asset bundle we have has spawned quite a few projects by now, and now we need to make some updates to the asset bundle. The updates should also be added to the spaw...
Greetings @Sleiny, here's what's really going on, plus a pragmatic, field-tested plan you can execute without tearing up your repo strategy. What's happening: Databricks Asset Bundle templates are used at initialization time ...
- 1358 Views
- 5 replies
- 6 kudos
Resolved! AbfsRestOperationException when adding privatelink.dfs.core.windows.net
Hey Databricks forum, I have been searching a lot but can't find a solution. I have the following setup: a vnet connected to the Databricks workspace, with a public subnet (delegated to Microsoft.Databricks/workspaces) plus an NSG, and a private subnet (d...
Yes, that's the solution! I thought I had tested this (maybe some caching). When I changed it to abfss://metastore@<storageaccount>.dfs.core.windows.net it still failed with "Failed to access cloud storage: [AbfsRestOperationException] The storage publ...
- 4804 Views
- 6 replies
- 1 kudos
Unable to destroy NCC private endpoint
Hi Team, accidentally we removed one of the NCC private endpoints from our storage account that was created using Terraform. When I tried to destroy and recreate it, I encountered the following error. According to some articles, the private endpoint w...
Just let the state forget about it: `terraform state rm 'your_module.your_terraformresource'`. You can find that Terraform resource with `terraform state list | grep -i databricks_mws_ncc_private_endpoint_rule`, and later validate the id with: terraform stat...
- 1716 Views
- 3 replies
- 3 kudos
Resolved! Lakebase -- Enable RLS in synced Table
Dear all, I am currently testing Lakebase for integration in our overall system. In particular, I need to enable RLS on a Lakebase table which is synced from a "Delta Streaming Table" in UC. Setting up the data sync was no trouble; in UC I am the owne...
Hello @DaPo! Could you please confirm whether you are the owner of the table within the Lakebase Postgres (not just in Unity Catalog)? Also, can you try creating a view on the synced table and then configuring RLS on that view?
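If ownership in Postgres checks out, standard Postgres RLS syntax should apply. A hedged sketch with illustrative table and column names (Lakebase may restrict DDL on synced tables, which is why the view workaround above is worth trying):

```sql
-- Enable RLS on the synced table; requires table ownership in Postgres.
ALTER TABLE synced_orders ENABLE ROW LEVEL SECURITY;

-- Example policy: each session only sees rows for its own region.
-- The session variable name and predicate are illustrative.
CREATE POLICY orders_by_region ON synced_orders
  USING (region = current_setting('app.current_region', true));

-- Alternative: expose a view instead; with security_invoker (Postgres 15+),
-- the underlying table's RLS is evaluated against the querying user.
CREATE VIEW orders_visible WITH (security_invoker = true) AS
  SELECT * FROM synced_orders;
```

Note that plain views in Postgres cannot carry policies themselves; RLS attaches to tables, and a view either filters explicitly in its definition or defers to the base table's policies via `security_invoker`.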