Discussions
Engage in dynamic conversations covering diverse topics within the Databricks Community. Explore discussions on data engineering, machine learning, and more. Join the conversation and expand your knowledge base with insights from experts and peers.

Activity in Discussions

RichC
by Visitor
  • 12 Views
  • 1 reply
  • 0 kudos

scroll bar disappears on widgets in dashboards

Databricks newbie. I've created a dashboard that has several widgets to allow users to select multiple values from a drop-down list. When I first open the widget to select the values, there is a scroll bar on the right side of the box which allows me...

Latest Reply
Advika
Community Manager
  • 0 kudos

Hello @RichC! You’re not missing any setting here. This is expected behavior. The scrollbar auto-hides after a couple of seconds, but it’s still active. If you start scrolling again (mouse wheel or trackpad), the scrollbar will reappear.

lance-gliser
by New Contributor
  • 4479 Views
  • 9 replies
  • 0 kudos

Databricks apps - Volumes and Workspace - FileNotFound issues

I have a Databricks App I need to integrate with Volumes using local Python os functions. I've set up a simple test:  def __init__(self, config: ObjectStoreConfig): self.config = config # Ensure our required paths are created ...
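For anyone following along, here is a minimal sketch of the kind of test described above (the config class and volume path are assumptions, not the poster's actual code):

```python
import os
from dataclasses import dataclass


# Hypothetical stand-in for the poster's ObjectStoreConfig; only the base path is modeled here.
@dataclass
class ObjectStoreConfig:
    base_path: str  # e.g. "/Volumes/<catalog>/<schema>/<volume>/app-data"


class ObjectStore:
    def __init__(self, config: ObjectStoreConfig):
        self.config = config
        # Ensure our required paths are created. This assumes the /Volumes FUSE path
        # is visible to the app process, which is exactly what the thread reports
        # failing with FileNotFoundError.
        os.makedirs(self.config.base_path, exist_ok=True)

    def list_files(self) -> list[str]:
        # List whatever is currently in the volume directory.
        return os.listdir(self.config.base_path)
```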

Latest Reply
SarahA
New Contributor II
  • 0 kudos

I would also like to see some responses to this problem.

8 More Replies
Maxrb
by New Contributor
  • 114 Views
  • 7 replies
  • 2 kudos

pkgutil.walk_packages stopped working in DBR 17.2

Hi, after moving from Databricks Runtime 17.1 to 17.2, my pkgutil.walk_packages suddenly doesn't identify any packages within my repository anymore. This is my example code: import pkgutil import os packages = pkgutil.walk_packages([os.getcwd()]) print...
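For readability, roughly what that flattened snippet looks like as a runnable block (the print expression is truncated in the preview, so the list comprehension below is an assumption):

```python
import os
import pkgutil

# Walk the current working directory and report any importable packages found there.
# Per the post, this finds the repo's packages on DBR 17.1 but returns nothing on 17.2.
packages = pkgutil.walk_packages([os.getcwd()])
print([pkg.name for pkg in packages])
```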

Latest Reply
Louis_Frolio
Databricks Employee
  • 2 kudos

Hey @Maxrb, just thinking out loud here, but this might be worth experimenting with. You could try using a Unity Catalog Volume as a lightweight package repository. Volumes can act as a secure, governed home for Python wheels (and JARs), and Databri...
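A rough sketch of that pattern, assuming a wheel has already been uploaded to a Volume (the catalog/schema/volume names and wheel filename below are placeholders):

```python
# In a notebook cell: install a wheel stored in a Unity Catalog Volume.
%pip install /Volumes/main/default/packages/my_package-0.1.0-py3-none-any.whl
```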

6 More Replies
PavanDasa
by Visitor
  • 20 Views
  • 1 reply
  • 0 kudos

Certificate not received even after 5 days of exam completion

Hi Team, I haven't received the digital credentials even 5 days after successfully completing the Data Engineer Associate certification. Need help on this. Thank you

Latest Reply
cert-ops
Databricks Employee
  • 0 kudos

Hello @PavanDasa, please check your spam folder just in case it went there. If you do not receive your badge within 48 hours, please file a ticket with our support team. Please also provide them with your email address so that they can look up your account (it ...

Joost1024
by New Contributor
  • 148 Views
  • 4 replies
  • 0 kudos

Read Array of Arrays of Objects JSON file using Spark

Hi Databricks Community! This is my first post in this forum, so I hope you can forgive me if it's not according to the forum best practices. After lots of searching, I decided to share the peculiar issue I'm running into in this community. I try to lo...

Latest Reply
Joost1024
New Contributor
  • 0 kudos

I guess I was a bit too enthusiastic in accepting the answer. When I run the following on the single-object array of arrays (as shown in the original post), I get a single row with column "value" and value null. from pyspark.sql import functions as F,...
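Not the accepted answer, but one workaround sketch for this shape of file: read it as whole text and parse with an explicit schema instead of relying on inference (the path and field names are illustrative, and `spark` is the notebook's SparkSession):

```python
from pyspark.sql import functions as F, types as T

# Explicit schema for a top-level array of arrays of objects, e.g. [[{"id": 1}], [{"id": 2}]].
schema = T.ArrayType(T.ArrayType(T.StructType([T.StructField("id", T.LongType())])))

# Read the whole file as a single string row, then parse it with the schema above.
raw = spark.read.text("/Volumes/main/default/raw/nested.json", wholetext=True)
parsed = raw.select(F.from_json("value", schema).alias("outer"))

# Explode twice to get one row per innermost object.
rows = (
    parsed
    .select(F.explode("outer").alias("inner"))
    .select(F.explode("inner").alias("obj"))
    .select("obj.*")
)
rows.show()
```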

3 More Replies
jpassaro
by Visitor
  • 57 Views
  • 1 reply
  • 0 kudos

Does Databricks respect the parallel vacuum setting?

I am trying to run VACUUM on a Delta table that I know has millions of obsolete files. Out of the box, VACUUM runs the deletes in sequence on the driver. That is bad news for me! According to the OSS Delta docs, the setting spark.databricks.delta.vacuum.pa...

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Greetings @jpassaro, thanks for laying out the context and the links. Let me clarify what's actually happening here and how I'd recommend moving forward. Short answer: No. On Databricks Runtime, the spark.databricks.delta.vacuum.parallelDelete.enabl...
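For reference, the OSS-style usage being discussed looks like this (the table name and retention window are placeholders; as the reply notes, Databricks Runtime may not honor this flag the same way):

```python
# Open-source Delta Lake setting referenced in the question; on Databricks Runtime
# the reply above indicates it is not the lever that controls VACUUM parallelism.
spark.conf.set("spark.databricks.delta.vacuum.parallelDelete.enabled", "true")

# Run the cleanup with an explicit retention window.
spark.sql("VACUUM my_catalog.my_schema.my_table RETAIN 168 HOURS")
```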

piotrsofts
by Contributor
  • 97 Views
  • 4 replies
  • 2 kudos

Accessing Knowledge Base from Databricks One

Is it possible to use the Knowledge Assistant from Databricks One?

Latest Reply
Louis_Frolio
Databricks Employee
  • 2 kudos

@piotrsofts, if you are happy, please accept this as a solution so others can be confident in the approach. Cheers, Louis.

3 More Replies
Anonym40
by New Contributor II
  • 19 Views
  • 2 replies
  • 1 kudos

Ingesting data from APIs

Hi, I need to ingest some data available at an API endpoint. I was thinking of this option: 1. make an API call from a notebook and save the data to ADLS; 2. use Auto Loader to load the data from the ADLS location. But then, I have some doubts - like I can directly write ...

Latest Reply
Raman_Unifeye
Contributor III
  • 1 kudos

@Anonym40 - it's generally a good idea to decouple the direct API calls from the rest of the data pipeline. By staging the data in ADLS, you are insulating your downstream processes from the upstream source and getting more restartability/maintainability in your e2e flow....
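A minimal sketch of that staging pattern, assuming a simple GET endpoint and an ADLS landing path (both placeholders; `dbutils` and `spark` are the Databricks notebook globals):

```python
import json
import time

import requests  # assumes the compute has network access to the API

# 1. Land the raw API response in ADLS (placeholder endpoint and paths).
landing_path = "abfss://raw@mystorageaccount.dfs.core.windows.net/api_landing/"
schema_path = "abfss://raw@mystorageaccount.dfs.core.windows.net/api_landing_schema/"

response = requests.get("https://api.example.com/v1/orders", timeout=30)
response.raise_for_status()
dbutils.fs.put(f"{landing_path}orders_{int(time.time())}.json", json.dumps(response.json()), True)

# 2. Let Auto Loader pick up whatever lands in that folder.
stream = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", schema_path)
    .load(landing_path)
)
```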

1 More Replies
simenheg
by Visitor
  • 49 Views
  • 3 replies
  • 1 kudos

Tracing SQL costs

Hello, Databricks community! In our Account Usage Dashboard, the biggest portion of our costs is labeled simply "SQL". We want to drill deeper to see where the SQL costs are coming from. By querying the `system.usage.billing` table we see that it's mos...

Latest Reply
Raman_Unifeye
Contributor III
  • 1 kudos

@simenheg - first of all, it's not an error, as Serverless SQL often produces null metadata fields. So you will need to follow the steps below for the cost: use the SQL Warehouse Query History; join the billing data with the SQL query history - system.billing.usage.usage_da...
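A hedged sketch of the first step of that approach: aggregating the "SQL" spend per warehouse from the billing system table, which can then be joined to the SQL warehouse query history by warehouse and time window as suggested above. Column names reflect my reading of system.billing.usage and may need adjusting for your workspace.

```python
# DBUs attributed to SQL, broken down by day and warehouse.
sql_cost_by_warehouse = spark.sql("""
    SELECT
        usage_date,
        usage_metadata.warehouse_id AS warehouse_id,
        SUM(usage_quantity)         AS dbus
    FROM system.billing.usage
    WHERE billing_origin_product = 'SQL'
    GROUP BY usage_date, usage_metadata.warehouse_id
    ORDER BY usage_date DESC, dbus DESC
""")
sql_cost_by_warehouse.show()
```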

2 More Replies
ymmmm
by Visitor
  • 71 Views
  • 6 replies
  • 1 kudos

Account reset and loss of access to paid Databricks Academy Labs subscription

Hello, I am facing an issue with my Databricks Academy account. During a normal sign-in using my usual email address, I was asked to re-enter my first and last name, as if my account was being created again. After that, my account appeared to be reset,...

Latest Reply
ymmmm
Visitor
  • 1 kudos

Thank you for your support and your help.

5 More Replies
jlancaster86
by New Contributor
  • 241 Views
  • 3 replies
  • 2 kudos

Can't find zip file for "Get Started with Databricks Free Edition"

Hi, all. I'm getting stuck on the second step of the "Get Started with Databricks Free Edition" course. It's obviously instructing me to download a zip file from a GitHub repository, but there is no link provided for the repository. What am I missing...

Latest Reply
Advika
Community Manager
  • 2 kudos

Update: The course instructions have now been updated. You can find the required ZIP file in the Repository section at the bottom of the lesson page. Apologies for the inconvenience, @jlancaster86 & @zwilk.

2 More Replies
Hubert-Dudek
by Databricks MVP
  • 55 Views
  • 1 reply
  • 2 kudos

Databricks Advent Calendar 2025 #18

Automatic file retention in Auto Loader is one of my favourite new features of 2025. Automatically move cloud files to cold storage or just delete them.

[Attachment: 2025_18.png]
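A sketch of how that feature is configured, as I understand it; the cleanSource option names and paths below are assumptions, so check the Auto Loader docs for your DBR version before relying on them.

```python
# Auto Loader stream that archives processed source files after a retention period.
stream = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.cleanSource", "MOVE")  # or "DELETE" to remove files outright
    .option("cloudFiles.cleanSource.retentionDuration", "30 days")
    .option("cloudFiles.cleanSource.moveDestination", "abfss://archive@mystorage.dfs.core.windows.net/cold/")
    .load("abfss://landing@mystorage.dfs.core.windows.net/events/")
)
```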
Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 2 kudos

Thanks for sharing @Hubert-Dudek! That's a really great feature. It simplified the data maintenance process a lot at one of my clients.

ismaelhenzel
by Contributor III
  • 18 Views
  • 0 replies
  • 0 kudos

Declarative Pipelines - Dynamic Overwrite

Regarding the limitations of declarative pipelines—specifically the inability to use replaceWhere—I discovered through testing that materialized views actually support dynamic overwrites. This handles several scenarios where replaceWhere would typica...
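Outside of declarative pipelines, the behavior being referred to looks roughly like classic dynamic partition overwrite; this is a sketch only (table and DataFrame names are placeholders, and the pipeline syntax itself is not shown here).

```python
# Dynamic partition overwrite: only the partitions present in updates_df are rewritten,
# which covers many cases where replaceWhere would otherwise be used.
(
    updates_df.write
    .format("delta")
    .mode("overwrite")
    .option("partitionOverwriteMode", "dynamic")
    .saveAsTable("my_catalog.my_schema.events")
)
```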

oye
by New Contributor II
  • 80 Views
  • 3 replies
  • 0 kudos

Unavailable GPU compute

Hello, I would like to create an ML compute with a GPU. I am on GCP europe-west1 and the only available options for me are the G2 family and one instance of the A3 family (a3-highgpu-8g [H100]). I have been trying multiple times at different times but I ...

Latest Reply
SP_6721
Honored Contributor II
  • 0 kudos

Hi @oye, you're hitting a cloud capacity issue, not a Databricks configuration problem. The Databricks GCP GPU docs list A2 and G2 as the supported GPU instance families. A3/H100 is not in the supported list: https://docs.databricks.com/gcp/en/comput...

2 More Replies
abhijit007
by New Contributor III
  • 187 Views
  • 4 replies
  • 3 kudos

Resolved! AI/BI Dashboard embed issue in Databricks App

Hi everyone, I've created an AI/BI Dashboard in Azure Databricks, successfully published it, and generated an embed link. My goal is to embed this dashboard inside a Databricks App (Streamlit) using an iframe. However, when I try to render the dashboard...

Administration & Architecture
AIBI Dashboard
Databricks Apps
Latest Reply
abhijit007
New Contributor III
  • 3 kudos

Hi @Louis_Frolio, I have made changes to my master menu with page navigation and used an iframe inside the submenu, and it does work... Thanks for your insightful solution.
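For anyone following along, a minimal Streamlit sketch of the iframe approach (the embed URL is a placeholder for the link generated when the dashboard was published):

```python
import streamlit as st
import streamlit.components.v1 as components

# Placeholder for the published dashboard's embed link.
EMBED_URL = "https://<workspace-host>/embed/dashboardsv3/<dashboard-id>"

st.title("AI/BI dashboard")
# Render the published dashboard inside the app page.
components.iframe(EMBED_URL, height=800, scrolling=True)
```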

3 More Replies