Databricks Platform Discussions
Dive into comprehensive discussions covering various aspects of the Databricks platform. Join the conversation to deepen your understanding and maximize your usage of the Databricks platform.

Browse the Community

Data Engineering

Join discussions on data engineering best practices, architectures, and optimization strategies with...

11801 Posts

Data Governance

Join discussions on data governance practices, compliance, and security within the Databricks Commun...

509 Posts

Generative AI

Explore discussions on generative artificial intelligence techniques and applications within the Dat...

330 Posts

Machine Learning

Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithm...

995 Posts

Warehousing & Analytics

Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Communi...

660 Posts

Activity in Databricks Platform Discussions

s_agarwal · New Contributor
  • 73 Views
  • 1 reply
  • 0 kudos

Queries from Serverless compute referring to older/deleted/vacuumed version of the delta tables.

Hi Team, I have a Unity Catalog managed Delta table which I can successfully query using the regular compute/cluster options. But when I try to query the same table using Serverless/SQL Warehouse compute, it refers to an older version /...

Latest Reply
Saritha_S
Databricks Employee
  • 0 kudos

Hi @s_agarwal, please find below my findings for your query:
  • Serverless uses cached Unity Catalog metadata.
  • Your UC metadata points to an old Delta version.
  • Regular clusters bypass this cache.
  • Fix by refreshing or forcing a UC metadata rewrite.
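The refresh step described above can be sketched as a small helper that emits the SQL one would run against the SQL warehouse (the three-level table name below is a placeholder, not from the thread):

```python
def refresh_statements(table: str) -> list[str]:
    """Return the SQL statements to run on the SQL warehouse to force
    fresh Delta/Unity Catalog metadata for `table` (a placeholder
    three-level name like catalog.schema.table)."""
    return [
        f"REFRESH TABLE {table}",          # drop cached metadata/data for the table
        f"SELECT * FROM {table} LIMIT 1",  # re-read through the refreshed snapshot
    ]

for stmt in refresh_statements("main.sales.orders"):
    print(stmt)
```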

seefoods · Valued Contributor
  • 73 Views
  • 1 reply
  • 0 kudos

Spark conf for serverless jobs

Hello guys, I use serverless on Databricks Azure, and I have built a decorator which instantiates a SparkSession. My job uses Autoloader / Kafka with mode availableNow. Does someone know which Spark conf is required, because I want to add it? Thanks. import...

Latest Reply
Saritha_S
Databricks Employee
  • 0 kudos

Hi @seefoods  Please find below my findings for your case. You don’t need (and can’t meaningfully add) any Spark conf to enable availableNow on Databricks Serverless. Let me explain clearly, and then show what is safe to do in your decorator. availa...
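A decorator along the lines the post describes might look like the sketch below, with the session factory injected so the same job code runs on serverless (where most Spark confs are fixed by the platform) and on classic compute. `with_session`, `build_session`, and `run_batch` are illustrative names, not part of any Databricks API; the stub session stands in for a real SparkSession so the sketch runs anywhere:

```python
import functools

def with_session(build_session):
    """Decorator factory: inject a lazily built session as the job's
    first argument. On Databricks serverless one would pass something
    like `lambda: SparkSession.builder.getOrCreate()` and set no extra
    confs, since the platform fixes most of them anyway."""
    def decorator(job):
        @functools.wraps(job)
        def wrapper(*args, **kwargs):
            session = build_session()  # created only when the job runs
            return job(session, *args, **kwargs)
        return wrapper
    return decorator

# Stub session so the sketch is runnable outside Databricks.
@with_session(lambda: {"mode": "availableNow"})
def run_batch(session):
    return session["mode"]

print(run_batch())  # -> availableNow
```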

Joost1024 · New Contributor
  • 212 Views
  • 5 replies
  • 0 kudos

Read Array of Arrays of Objects JSON file using Spark

Hi Databricks Community! This is my first post in this forum, so I hope you can forgive me if it's not according to forum best practices. After lots of searching, I decided to share the peculiar issue I'm running into with this community. I try to lo...

Latest Reply
Joost1024
New Contributor
  • 0 kudos

I guess I was a bit overenthusiastic in accepting the answer. When I run the following on the single-object array of arrays (as shown in the original post), I get a single row with column "value" and value null. from pyspark.sql import functions as F,...
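For reference, the shape being parsed is a JSON array whose elements are themselves arrays of objects. A plain-Python sketch of flattening that shape (in Spark the equivalent would typically be `from_json` with an `ArrayType(ArrayType(StructType(...)))` schema followed by `explode`; the sample payload here is invented):

```python
import json
from itertools import chain

# Invented sample payload: an array of arrays of objects.
payload = '[[{"id": 1}, {"id": 2}], [{"id": 3}]]'

# Flatten the outer array so each inner object becomes one row.
rows = list(chain.from_iterable(json.loads(payload)))
print(rows)  # -> [{'id': 1}, {'id': 2}, {'id': 3}]
```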

4 More Replies
NatJ · New Contributor
  • 49 Views
  • 0 replies
  • 0 kudos

Removing access to Lakehouse and only allowing Databricks One?

Hello, I am trying to set up a user group for business users in our Azure Databricks workspace that will only be able to query data. It looks like Databricks One is the solution to use, so I followed the documentation and granted the user group Consumer Access...

ScottH · New Contributor III
  • 55 Views
  • 0 replies
  • 0 kudos

Can I create a serverless budget policy via Python SDK on Azure Databricks?

Hi, I am trying to use the Databricks Python SDK (v0.74.0) to automate the creation of budget policies in our Databricks account. See the Python code below where I am trying to create a serverless budget policy. Note the error. When I click the "Diagn...

excavator-matt · Contributor
  • 126 Views
  • 1 reply
  • 0 kudos

How do I grant access to find a table in Databricks, without giving access to query the table?

Hi! By default, it seems users can only see tables and views in Unity Catalog that they have the SELECT permission/privilege on. However, we would like to use Unity Catalog as a data catalog of the tables we have. They wouldn't then be able to request access t...

Data Governance
permission
privilege
Unity Catalog
Latest Reply
iyashk-DB
Databricks Employee
  • 0 kudos

@excavator-matt you can grant BROWSE privilege on your catalog to a broad audience (for example, the “All account users” group). This lets users see object metadata (names, comments, lineage, search results, information_schema, etc.) in Catalog Explo...
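The grant described above can be sketched as the SQL one would run; a small helper makes it easy to apply across catalogs (the catalog names and the principal below are placeholders, not from the thread):

```python
def browse_grants(catalogs, principal="account users"):
    """Return GRANT BROWSE statements for each catalog. BROWSE exposes
    object metadata (names, comments, lineage) without allowing SELECT.
    Catalog names and the principal here are placeholders."""
    return [f"GRANT BROWSE ON CATALOG {c} TO `{principal}`" for c in catalogs]

for stmt in browse_grants(["main", "analytics"]):
    print(stmt)
```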

RichC · New Contributor
  • 79 Views
  • 1 reply
  • 0 kudos

scroll bar disappears on widgets in dashboards

Databricks newbie here. I've created a dashboard that has several widgets to allow users to select multiple values from a drop-down list. When I first open the widget to select the values, there is a scroll bar on the right side of the box which allows me...

Latest Reply
Advika
Community Manager
  • 0 kudos

Hello @RichC! You’re not missing any setting here. This is expected behavior. The scrollbar auto-hides after a couple of seconds, but it’s still active. If you start scrolling again (mouse wheel or trackpad), the scrollbar will reappear.

Maxrb · New Contributor
  • 159 Views
  • 7 replies
  • 2 kudos

pkgutil walk_packages stopped working in DBR 17.2

Hi, after moving from Databricks Runtime 17.1 to 17.2, suddenly my pkgutil walk_packages doesn't identify any packages within my repository anymore. This is my example code: import pkgutil import os packages = pkgutil.walk_packages([os.getcwd()]) print...

Latest Reply
Louis_Frolio
Databricks Employee
  • 2 kudos

Hey @Maxrb , Just thinking out loud here, but this might be worth experimenting with. You could try using a Unity Catalog Volume as a lightweight package repository. Volumes can act as a secure, governed home for Python wheels (and JARs), and Databri...
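For anyone reproducing the original symptom locally, here is a minimal, self-contained `pkgutil.walk_packages` check. It builds a throwaway package in a temp directory, so the result doesn't depend on the repo layout:

```python
import os
import pkgutil
import tempfile

# Build a throwaway package so the walk has something to find.
with tempfile.TemporaryDirectory() as root:
    pkg_dir = os.path.join(root, "demo_pkg")
    os.makedirs(pkg_dir)
    open(os.path.join(pkg_dir, "__init__.py"), "w").close()

    # walk_packages yields a ModuleInfo per module/package under `root`.
    found = [m.name for m in pkgutil.walk_packages([root])]
    print(found)  # -> ['demo_pkg']
```

On a working runtime this prints the package name; an empty list points at the path argument or the filesystem view rather than the code.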

6 More Replies
jpassaro · New Contributor
  • 100 Views
  • 1 reply
  • 0 kudos

Does Databricks respect the parallel vacuum setting?

I am trying to run VACUUM on a Delta table that I know has millions of obsolete files. Out of the box, VACUUM runs the deletes in sequence on the driver. That is bad news for me! According to the OSS Delta docs, the setting spark.databricks.delta.vacuum.pa...

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Greetings @jpassaro ,  Thanks for laying out the context and the links. Let me clarify what’s actually happening here and how I’d recommend moving forward. Short answer No. On Databricks Runtime, the spark.databricks.delta.vacuum.parallelDelete.enabl...
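As a reference for the command itself, a small helper that builds the VACUUM statement (the table name and retention are placeholders; per the reply above, on Databricks Runtime the parallel-delete conf is not honored, so the statement is the same either way):

```python
def vacuum_statement(table: str, retain_hours: int = 168) -> str:
    """Build a Delta VACUUM statement. 168 hours (7 days) is Delta's
    default retention; the table name is a placeholder."""
    return f"VACUUM {table} RETAIN {retain_hours} HOURS"

print(vacuum_statement("main.sales.orders"))
# -> VACUUM main.sales.orders RETAIN 168 HOURS
```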

piotrsofts · Contributor
  • 125 Views
  • 4 replies
  • 2 kudos

Accessing Knowledge Base from Databricks One

Is it possible to use the Knowledge Assistant from Databricks One?

Latest Reply
Louis_Frolio
Databricks Employee
  • 2 kudos

@piotrsofts, if you are happy, please accept this as a solution so others can be confident in the approach. Cheers, Louis.

3 More Replies
simenheg · New Contributor
  • 112 Views
  • 3 replies
  • 1 kudos

Tracing SQL costs

Hello, Databricks community! In our Account Usage Dashboard, the biggest portion of our costs is labeled simply "SQL". We want to drill deeper to see where the SQL costs are coming from. By querying the `system.billing.usage` table we see that it's mos...

Latest Reply
Raman_Unifeye
Contributor III
  • 1 kudos

@simenheg - first of all, it's not an error: Serverless SQL often produces null metadata fields. To trace the cost, follow these steps:
  • Use the SQL Warehouse Query History.
  • Join the billing data with the SQL query history: system.billing.usage.usage_da...
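The join-and-attribute idea can be illustrated in plain Python: apportion each warehouse's billed DBUs across its queries by runtime share. The records below are invented; on Databricks one would pull the real rows from the billing and query-history system tables:

```python
from collections import defaultdict

# Invented sample rows standing in for billing and query-history records.
billing = [{"warehouse_id": "wh1", "dbus": 100.0}]
queries = [
    {"warehouse_id": "wh1", "user": "ana", "seconds": 30},
    {"warehouse_id": "wh1", "user": "bob", "seconds": 10},
]

# Total runtime per warehouse, then each user's share of the DBUs.
runtime = defaultdict(float)
for q in queries:
    runtime[q["warehouse_id"]] += q["seconds"]

cost_by_user = defaultdict(float)
for b in billing:
    for q in queries:
        if q["warehouse_id"] == b["warehouse_id"]:
            cost_by_user[q["user"]] += b["dbus"] * q["seconds"] / runtime[b["warehouse_id"]]

print(dict(cost_by_user))  # -> {'ana': 75.0, 'bob': 25.0}
```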

2 More Replies
ismaelhenzel · Contributor III
  • 31 Views
  • 0 replies
  • 0 kudos

Declarative Pipelines - Dynamic Overwrite

Regarding the limitations of declarative pipelines (specifically the inability to use replaceWhere), I discovered through testing that materialized views actually support dynamic overwrites. This handles several scenarios where replaceWhere would typica...

oye · New Contributor II
  • 102 Views
  • 3 replies
  • 0 kudos

Unavailable GPU compute

Hello, I would like to create an ML compute with a GPU. I am on GCP europe-west1, and the only available options for me are the G2 family and one instance of the A3 family (a3-highgpu-8g [H100]). I have been trying multiple times at different times but I ...

Latest Reply
SP_6721
Honored Contributor II
  • 0 kudos

Hi @oye, you’re hitting a cloud capacity issue, not a Databricks configuration problem. The Databricks GCP GPU docs list A2 and G2 as the supported GPU instance families. A3/H100 is not in the supported list: https://docs.databricks.com/gcp/en/comput...

2 More Replies
abhijit007 · New Contributor III
  • 212 Views
  • 4 replies
  • 3 kudos

Resolved! AI/BI Dashboard embed issue in Databricks App

Hi everyone, I’ve created an AI/BI Dashboard in Azure Databricks, successfully published it, and generated an embed link. My goal is to embed this dashboard inside a Databricks App (Streamlit) using an iframe. However, when I try to render the dashboard...

Administration & Architecture
AIBI Dashboard
Databricks Apps
Latest Reply
abhijit007
New Contributor III
  • 3 kudos

Hi @Louis_Frolio, I made changes to my master menu with page navigation and used an iframe inside the submenu, and it works... Thanks for your insightful solution.

3 More Replies
shivamrai162 · New Contributor III
  • 174 Views
  • 2 replies
  • 1 kudos

Resolved! Not able to add scorer to multi agent supervisor

Hello, when I try to add scorers to a multi-agent endpoint based on the last 10 traces that I have logged and that are visible in the Experiments tab, I get this error. Also, are there any demos I can refer to regarding the tabs within the evaluation bar expla...

Latest Reply
stbjelcevic
Databricks Employee
  • 1 kudos

Hi @shivamrai162 , Did you add the last 10 traces to the evaluation dataset? You can follow the steps here to make sure you added the traces to the evaluation dataset. To answer your second question, here is a good article that covers the concepts an...

1 More Replies