Databricks Platform Discussions
Dive into comprehensive discussions covering various aspects of the Databricks platform. Join the conversation to deepen your understanding and maximize your usage of the Databricks platform.

Browse the Community

Data Engineering

Join discussions on data engineering best practices, architectures, and optimization strategies with...

11801 Posts

Data Governance

Join discussions on data governance practices, compliance, and security within the Databricks Commun...

509 Posts

Generative AI

Explore discussions on generative artificial intelligence techniques and applications within the Dat...

330 Posts

Machine Learning

Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithm...

995 Posts

Warehousing & Analytics

Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Communi...

660 Posts

Activity in Databricks Platform Discussions

by NatJ, New Contributor
  • 19 Views
  • 0 replies
  • 0 kudos

Removing access to Lakehouse and only allowing Databricks One?

Hello, I am trying to set up a user group for business users in our Azure Databricks that will only be able to query data. It looks like Databricks One is the solution to use. So I followed the documentation and granted the user group Consumer Access...

by ScottH, New Contributor III
  • 24 Views
  • 0 replies
  • 0 kudos

Can I create a serverless budget policy via Python SDK on Azure Databricks?

Hi, I am trying to use the Databricks Python SDK (v0.74.0) to automate the creation of budget policies in our Databricks account. See the Python code below where I am trying to create a serverless budget policy. Note the error. When I click the "Diagn...

ScottH_0-1766168891911.png
by excavator-matt, Contributor
  • 107 Views
  • 1 reply
  • 0 kudos

How do I grant access to find a table in Databricks, without giving access to query the table?

Hi! By default it seems users can only see tables and views in Unity Catalog that they have the SELECT permission/privilege on. However, we would like to use Unity Catalog as a data catalog of the tables we have. They wouldn't then be able to request access t...

Data Governance
permission
privilege
Unity Catalog
Latest Reply
iyashk-DB
Databricks Employee
  • 0 kudos

@excavator-matt you can grant BROWSE privilege on your catalog to a broad audience (for example, the “All account users” group). This lets users see object metadata (names, comments, lineage, search results, information_schema, etc.) in Catalog Explo...
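For reference, the grant the reply describes is a single SQL statement; a minimal sketch follows, where the catalog and group names are placeholders and, in a notebook, you would execute the statement with spark.sql:

```python
# Hypothetical catalog/group names -- adjust for your workspace.
# BROWSE lets users see object metadata (names, comments, lineage)
# without granting SELECT on the underlying data.
catalog = "main"
group = "`account users`"

stmt = f"GRANT BROWSE ON CATALOG {catalog} TO {group}"
print(stmt)  # in a Databricks notebook: spark.sql(stmt)
```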

by RichC, Visitor
  • 40 Views
  • 1 reply
  • 0 kudos

scroll bar disappears on widgets in dashboards

Databricks newbie here. I've created a dashboard that has several widgets to allow users to select multiple values from a drop-down list. When I first open the widget to select the values, there is a scroll bar on the right side of the box which allows me...

Latest Reply
Advika
Community Manager
  • 0 kudos

Hello @RichC! You’re not missing any setting here. This is expected behavior. The scrollbar auto-hides after a couple of seconds, but it’s still active. If you start scrolling again (mouse wheel or trackpad), the scrollbar will reappear.

by Maxrb, New Contributor
  • 131 Views
  • 7 replies
  • 2 kudos

pkgutil walk_packages stopped working in DBR 17.2

Hi, after moving from Databricks Runtime 17.1 to 17.2, suddenly my pkgutil walk_packages doesn't identify any packages within my repository anymore. This is my example code: import pkgutil; import os; packages = pkgutil.walk_packages([os.getcwd()]); print...

Latest Reply
Louis_Frolio
Databricks Employee
  • 2 kudos

Hey @Maxrb , Just thinking out loud here, but this might be worth experimenting with. You could try using a Unity Catalog Volume as a lightweight package repository. Volumes can act as a secure, governed home for Python wheels (and JARs), and Databri...
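Independent of the runtime change, a minimal self-contained check of what pkgutil.walk_packages returns for a directory can help isolate whether the problem is the path list or the import machinery; the package name below is made up:

```python
import os
import sys
import pkgutil
import tempfile

# Build a throwaway package on disk: mypkg/__init__.py and mypkg/mod.py.
with tempfile.TemporaryDirectory() as root:
    pkg_dir = os.path.join(root, "mypkg")
    os.makedirs(pkg_dir)
    open(os.path.join(pkg_dir, "__init__.py"), "w").close()
    open(os.path.join(pkg_dir, "mod.py"), "w").close()

    # walk_packages only descends into a package if it can import it,
    # so the parent directory must be on sys.path.
    sys.path.insert(0, root)
    try:
        names = sorted(m.name for m in pkgutil.walk_packages([root]))
    finally:
        sys.path.remove(root)

print(names)  # ['mypkg', 'mypkg.mod']
```

If this prints only the top-level package, the import step failed silently: walk_packages swallows ImportError by default unless an onerror callback is supplied.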

6 More Replies
by Joost1024, New Contributor
  • 171 Views
  • 4 replies
  • 0 kudos

Read Array of Arrays of Objects JSON file using Spark

Hi Databricks Community! This is my first post in this forum, so I hope you can forgive me if it's not according to the forum best practices. After lots of searching, I decided to share the peculiar issue I'm running into with this community. I try to lo...

Latest Reply
Joost1024
New Contributor
  • 0 kudos

I guess I was a bit overenthusiastic in accepting the answer. When I run the following on the single-object array of arrays (as shown in the original post), I get a single row with column "value" and value null. from pyspark.sql import functions as F,...
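Setting Spark aside for a moment, the shape in question is easy to pin down with the stdlib; a toy payload (field names invented) shows that one level of flattening turns the array-of-arrays into plain rows:

```python
import json
from itertools import chain

# Toy stand-in for the file: a top-level JSON array of arrays of objects.
raw = '[[{"id": 1, "name": "a"}], [{"id": 2, "name": "b"}, {"id": 3, "name": "c"}]]'

parsed = json.loads(raw)                   # list[list[dict]]
rows = list(chain.from_iterable(parsed))   # flatten one level -> list[dict]

print(rows[0])  # {'id': 1, 'name': 'a'}
```

Once the rows are dicts rather than nested lists, each object maps cleanly onto one DataFrame row.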

3 More Replies
by jpassaro, New Contributor
  • 81 Views
  • 1 reply
  • 0 kudos

does databricks respect parallel vacuum setting?

I am trying to run VACUUM on a Delta table that I know has millions of obsolete files. Out of the box, VACUUM runs the deletes in sequence on the driver. That is bad news for me! According to the OSS Delta docs, the setting spark.databricks.delta.vacuum.pa...

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Greetings @jpassaro, thanks for laying out the context and the links. Let me clarify what's actually happening here and how I'd recommend moving forward. Short answer: no. On Databricks Runtime, the spark.databricks.delta.vacuum.parallelDelete.enabl...

by piotrsofts, Contributor
  • 116 Views
  • 4 replies
  • 2 kudos

Accessing Knowledge Base from Databricks One

Is it possible to use the Knowledge Assistant from Databricks One?

Latest Reply
Louis_Frolio
Databricks Employee
  • 2 kudos

@piotrsofts , if you are happy please accept as a solution so others can be confident in the approach.  Cheers, Louis.

3 More Replies
by simenheg, Visitor
  • 76 Views
  • 3 replies
  • 1 kudos

Tracing SQL costs

Hello, Databricks community! In our Account Usage Dashboard, the biggest portion of our costs is labeled simply "SQL". We want to drill deeper to see where the SQL costs are coming from. By querying the `system.billing.usage` table we see that it's mos...

Latest Reply
Raman_Unifeye
Contributor III
  • 1 kudos

@simenheg - first of all, it's not an error; Serverless SQL often produces null metadata fields. So you will need to follow these steps for the cost: use the SQL Warehouse Query History, and join the billing data with the SQL query history - system.billing.usage.usage_da...
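The join logic the reply sketches can be illustrated with toy rows in plain Python (the real query runs in SQL against the system tables; all names and numbers here are illustrative):

```python
# Toy stand-ins for billing usage windows and warehouse query history.
billing = [
    {"warehouse_id": "w1", "start": 0, "end": 10, "dbus": 5.0},
    {"warehouse_id": "w1", "start": 10, "end": 20, "dbus": 3.0},
]
queries = [
    {"statement_id": "q1", "warehouse_id": "w1", "start_time": 2},
    {"statement_id": "q2", "warehouse_id": "w1", "start_time": 12},
]

# Attribute each usage window to the queries that started inside it:
# match on warehouse id first, then on the time interval.
attributed = [
    (q["statement_id"], b["dbus"])
    for b in billing
    for q in queries
    if q["warehouse_id"] == b["warehouse_id"]
    and b["start"] <= q["start_time"] < b["end"]
]
print(attributed)  # [('q1', 5.0), ('q2', 3.0)]
```

In practice the interval match is approximate: long-running queries span multiple billing windows, so the numbers are an attribution, not an exact per-query invoice.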

2 More Replies
by ismaelhenzel, Contributor III
  • 25 Views
  • 0 replies
  • 0 kudos

Declarative Pipelines - Dynamic Overwrite

Regarding the limitations of declarative pipelines—specifically the inability to use replaceWhere—I discovered through testing that materialized views actually support dynamic overwrites. This handles several scenarios where replaceWhere would typica...

by oye, New Contributor II
  • 92 Views
  • 3 replies
  • 0 kudos

Unavailable GPU compute

Hello, I would like to create an ML compute with a GPU. I am on GCP europe-west1, and the only available options for me are the G2 family and one instance of the A3 family (a3-highgpu-8g [H100]). I have been trying multiple times at different times but I ...

Latest Reply
SP_6721
Honored Contributor II
  • 0 kudos

Hi @oye, you're hitting a cloud capacity issue, not a Databricks configuration problem. The Databricks GCP GPU docs list A2 and G2 as the supported GPU instance families. A3/H100 is not in the supported list: https://docs.databricks.com/gcp/en/comput...

2 More Replies
by abhijit007, New Contributor III
  • 205 Views
  • 4 replies
  • 3 kudos

Resolved! AI/BI Dashboard embed issue in Databricks App

Hi everyone, I've created an AI/BI Dashboard in Azure Databricks, successfully published it, and generated an embed link. My goal is to embed this dashboard inside a Databricks App (Streamlit) using an iframe. However, when I try to render the dashboard...

Administration & Architecture
AIBI Dashboard
Databricks Apps
Latest Reply
abhijit007
New Contributor III
  • 3 kudos

Hi @Louis_Frolio, I have made changes to my master menu with page navigation and used an iframe inside the submenu, and it does work... Thanks for your insightful solution.

3 More Replies
by shivamrai162, New Contributor III
  • 168 Views
  • 2 replies
  • 1 kudos

Resolved! Not able to add scorer to multi agent supervisor

Hello, when I try to add scorers to the multi-agent endpoint based on the last 10 traces that I have logged (visible in the Experiments tab), I get this error. Also, are there any demos which I can refer to regarding the tabs within the evaluation bar expla...

shivamrai162_0-1763609354150.png shivamrai162_2-1763609468060.png
Latest Reply
stbjelcevic
Databricks Employee
  • 1 kudos

Hi @shivamrai162 , Did you add the last 10 traces to the evaluation dataset? You can follow the steps here to make sure you added the traces to the evaluation dataset. To answer your second question, here is a good article that covers the concepts an...

1 More Replies
by Sunil_Patidar, New Contributor
  • 79 Views
  • 1 reply
  • 1 kudos

Unable to read from or write to Snowflake Open Catalog via Databricks

I have Snowflake Iceberg tables whose metadata is stored in Snowflake Open Catalog. I am trying to read these tables from the Open Catalog, and write back to the Open Catalog, using Databricks. I have explored the available documentation but haven't bee...

Latest Reply
Louis_Frolio
Databricks Employee
  • 1 kudos

Greetings @Sunil_Patidar ,  Databricks and Snowflake can interoperate cleanly around Iceberg today — but how you do it matters. At a high level, interoperability works because both platforms meet at Apache Iceberg and the Iceberg REST Catalog API. Wh...
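As a rough sketch of the meeting point the reply describes, a Spark session can be pointed at an Iceberg REST catalog with a handful of settings. The keys follow the Apache Iceberg Spark catalog convention; the catalog alias, URI, and warehouse values below are placeholders, not Snowflake-specific guidance:

```python
# Hypothetical Iceberg REST catalog configuration for a Spark session.
# The alias "open_cat" and all values are placeholders.
catalog = "open_cat"
confs = {
    f"spark.sql.catalog.{catalog}": "org.apache.iceberg.spark.SparkCatalog",
    f"spark.sql.catalog.{catalog}.type": "rest",
    f"spark.sql.catalog.{catalog}.uri": "https://<open-catalog-host>/api/catalog",
    f"spark.sql.catalog.{catalog}.warehouse": "<catalog-name>",
}
# In a notebook you would apply these with spark.conf.set(k, v) or
# SparkSession.builder.config(k, v) before querying open_cat tables.
print(confs[f"spark.sql.catalog.{catalog}.type"])
```

Authentication settings (credentials or OAuth scopes) vary by catalog provider and are omitted here.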

by 969091, New Contributor II
  • 37612 Views
  • 11 replies
  • 10 kudos

Send custom emails from a Databricks notebook without using a third-party SMTP server. Would like to utilize Databricks' existing SMTP or the Databricks API.

We want to use an existing Databricks SMTP server, or the Databricks API if it can be used to send custom emails. Databricks Workflows sends email notifications on success, failure, etc. of jobs, but cannot send custom emails. So we want to send custom emails to di...

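For context on what "custom emails" involves: composing the message itself needs only the Python stdlib; it is the final send that requires some SMTP endpoint, which is exactly the piece the thread is asking Databricks to provide. Addresses and subject below are made up:

```python
from email.message import EmailMessage

# Hypothetical addresses/subject; building the message needs no server.
msg = EmailMessage()
msg["From"] = "jobs@example.com"
msg["To"] = "team@example.com"
msg["Subject"] = "Custom notebook notification"
msg.set_content("Run finished: 0 failures.")

# Sending would be smtplib.SMTP(host).send_message(msg) -- the host
# is the missing piece without a third-party SMTP server.
print(msg["Subject"])
```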
Latest Reply
Shivaprasad
Contributor
  • 10 kudos

Were you able to get the custom email working from a Databricks notebook? I was trying but was not successful. Let me know.

10 More Replies