Databricks Platform Discussions
Dive into comprehensive discussions covering various aspects of the Databricks platform. Join the co...
Engage in vibrant discussions covering diverse learning topics within the Databricks Community. Expl...
Databricks newbie. I've created a dashboard that has several widgets to allow users to select multiple values from a drop-down list. When I first open the widget to select the values, there is a scroll bar on the right side of the box which allows me...
Hello @RichC! You’re not missing any setting here. This is expected behavior. The scrollbar auto-hides after a couple of seconds, but it’s still active. If you start scrolling again (mouse wheel or trackpad), the scrollbar will reappear.
I have a Databricks App I need to integrate with volumes using local Python os functions. I've set up a simple test: def __init__(self, config: ObjectStoreConfig): self.config = config # Ensure our required paths are created ...
I would also like to see some responses to this problem.
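While the original test code is truncated, Unity Catalog volumes are exposed to Databricks Apps as ordinary filesystem paths (`/Volumes/<catalog>/<schema>/<volume>`), so plain `os` calls work. Here is a minimal sketch of that pattern; the class and method names are illustrative (not Databricks APIs), and the base path is injected so the same code can be exercised locally:

```python
import os

class ObjectStore:
    """Hypothetical wrapper: treat a UC volume path (e.g.
    /Volumes/<catalog>/<schema>/<volume>) as a plain directory and use
    ordinary os-level file operations against it."""

    def __init__(self, base_path: str):
        self.base_path = base_path
        # Volumes appear as ordinary directories, so makedirs works for sub-paths
        os.makedirs(os.path.join(base_path, "staging"), exist_ok=True)

    def write(self, name: str, data: bytes) -> str:
        path = os.path.join(self.base_path, "staging", name)
        with open(path, "wb") as f:
            f.write(data)
        return path

    def read(self, name: str) -> bytes:
        with open(os.path.join(self.base_path, "staging", name), "rb") as f:
            return f.read()
```

One thing that commonly bites apps specifically: the app runs as a service principal, so it needs explicit READ VOLUME / WRITE VOLUME grants on the volume, independent of your own permissions.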
Hi, after moving from Databricks Runtime 17.1 to 17.2, pkgutil.walk_packages suddenly doesn't identify any packages within my repository anymore. This is my example code: import pkgutil import os packages = pkgutil.walk_packages([os.getcwd()]) print...
Hey @Maxrb , Just thinking out loud here, but this might be worth experimenting with. You could try using a Unity Catalog Volume as a lightweight package repository. Volumes can act as a secure, governed home for Python wheels (and JARs), and Databri...
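For anyone debugging this: `pkgutil.walk_packages` only reports directories that are importable packages on the paths you pass it, so if a runtime upgrade changes the working directory (or repo files stop being materialized there), an empty result is the usual symptom. A self-contained reproduction of the expected behavior, using a throwaway directory rather than a real repo:

```python
import os
import pkgutil
import tempfile

# walk_packages reports packages (directories containing __init__.py)
# found on the explicit path list you pass in.
root = tempfile.mkdtemp()
pkg_dir = os.path.join(root, "demo_pkg_walk")
os.makedirs(pkg_dir)
open(os.path.join(pkg_dir, "__init__.py"), "w").close()

found = [m.name for m in pkgutil.walk_packages([root])]
print(found)  # -> ['demo_pkg_walk']
```

If the same call returns `[]` on 17.2, it is worth printing `os.getcwd()` and `os.listdir(os.getcwd())` first to confirm the repo contents are actually where the code is looking.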
Hi Team, I haven't received the digital credentials even 5 days after successfully completing the Data Engineer Associate certification. Need help on this. Thank you
Hello @PavanDasa, Please check your spam just in case it went there. If you do not receive your badge in 48 hours, please file a ticket with our support team. Please also provide them with your email address so that they can look up your account (it ...
Hi Databricks Community! This is my first post in this forum, so I hope you can forgive me if it's not according to the forum best practices. After lots of searching, I decided to share the peculiar issue I'm running into with this community. I try to lo...
I guess I was a bit overenthusiastic in accepting the answer. When I run the following on the single-object array of arrays (as shown in the original post), I get a single row with column "value" and value null. from pyspark.sql import functions as F,...
I am trying to run VACUUM on a Delta table that I know has millions of obsolete files. Out of the box, VACUUM runs the deletes in sequence on the driver; that is bad news for me! According to the OSS Delta docs, the setting spark.databricks.delta.vacuum.pa...
Greetings @jpassaro , Thanks for laying out the context and the links. Let me clarify what's actually happening here and how I'd recommend moving forward. Short answer: No. On Databricks Runtime, the spark.databricks.delta.vacuum.parallelDelete.enabl...
Is it possible to use the Knowledge Assistant from Databricks One?
@piotrsofts , if you are happy please accept as a solution so others can be confident in the approach. Cheers, Louis.
Hi, I need to ingest some data available at an API endpoint. I was thinking of this option: 1. make an API call from a notebook and save the data to ADLS; 2. use Auto Loader to load the data from the ADLS location. But then, I have some doubts, like whether I can directly write ...
@Anonym40 - it's generally a good idea to decouple the direct API calls from the rest of your data pipeline. By staging the data to ADLS, you are insulating downstream processes from upstream failures and gaining restartability and maintainability in your e2e flow....
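The staging half of that pattern can be sketched as below. The function name and file layout are illustrative: each API response is persisted as an immutable, timestamped JSON-lines file in a landing path (on Databricks this would be an ADLS location or a UC volume), which Auto Loader then picks up incrementally:

```python
import json
import os
import time

def stage_api_payload(records, staging_dir):
    """Hypothetical helper: write one API response batch as a timestamped
    JSON-lines file into a staging directory. Files are write-once, which
    is what Auto Loader's incremental discovery expects."""
    os.makedirs(staging_dir, exist_ok=True)
    fname = f"api_batch_{int(time.time() * 1000)}.jsonl"
    path = os.path.join(staging_dir, fname)
    with open(path, "w") as f:
        for rec in records:
            f.write(json.dumps(rec) + "\n")
    return path
```

Downstream, a stream along the lines of `spark.readStream.format("cloudFiles").option("cloudFiles.format", "json").load(staging_path)` would consume these files; that part only runs on Databricks, so it is left as prose here.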
Hello, Databricks community! In our Account Usage Dashboard, the biggest portion of our costs is labeled simply "SQL". We want to drill deeper to see where the SQL costs are coming from. By querying the `system.billing.usage` table we see that it's mos...
@simenheg - first of all, it's not an error: serverless SQL often produces null metadata fields. To trace the cost you will need to: 1. use the SQL Warehouse Query History; 2. join the billing data with the SQL query history - system.billing.usage.usage_da...
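As a starting point before any join, SQL spend can usually be grouped per warehouse straight from the billing table. A sketch of that query follows; the column names (`usage_metadata.warehouse_id`, `billing_origin_product`, `usage_quantity`) are from my recollection of the system-tables schema and should be verified against the docs for your workspace:

```python
# Assumed column names -- check against the system.billing.usage schema.
attribution_sql = """
SELECT
  u.usage_metadata.warehouse_id AS warehouse_id,
  DATE(u.usage_start_time)      AS usage_date,
  SUM(u.usage_quantity)         AS dbus
FROM system.billing.usage AS u
WHERE u.billing_origin_product = 'SQL'
GROUP BY 1, 2
ORDER BY dbus DESC
"""
# On Databricks: spark.sql(attribution_sql).display()
```

From there, joining the result to the query-history system table on `warehouse_id` and the time window narrows cost down to individual users or statements.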
Hello, I am facing an issue with my Databricks Academy account. During a normal sign-in using my usual email address, I was asked to re-enter my first and last name, as if my account was being created again. After that, my account appeared to be reset,...
Hi, all. I'm getting stuck on the second step of the "Get Started with Databricks Free Edition" course. It's obviously instructing me to download a zip file from a GitHub repository, but there is no link provided for the repository. What am I missing...
Update: The course instructions have now been updated. You can find the required ZIP file in the Repository section at the bottom of the lesson page. Apologies for the inconvenience, @jlancaster86 & @zwilk.
Automatic file retention in Auto Loader is one of my favourite new features of 2025. It can automatically move ingested cloud files to cold storage, or simply delete them.
Thanks for sharing @Hubert-Dudek ! That's a really great feature. It simplified a lot of the data-maintenance process at one of my clients.
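For readers who want to try the feature, the relevant Auto Loader options look roughly like the sketch below. Option names and values are given as I understand the `cleanSource` feature and should be checked against the cloudFiles documentation for your runtime; the archive path is a placeholder:

```python
# Assumed option names -- verify against the Auto Loader (cloudFiles) docs.
autoloader_options = {
    "cloudFiles.format": "json",
    # Move fully ingested source files to a cold-storage archive location...
    "cloudFiles.cleanSource": "MOVE",
    "cloudFiles.cleanSource.moveDestination": "abfss://archive@example.dfs.core.windows.net/cold/",
    # ...or set cleanSource to "DELETE" to remove them after a retention period:
    "cloudFiles.cleanSource.retentionDuration": "30 days",
}
# On Databricks:
# spark.readStream.format("cloudFiles").options(**autoloader_options).load(source_path)
```

The retention duration matters because files are only moved or deleted once Auto Loader is confident they have been fully processed.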
Regarding the limitations of declarative pipelines, specifically the inability to use replaceWhere: I discovered through testing that materialized views actually support dynamic overwrites. This handles several scenarios where replaceWhere would typica...
Hello, I would like to create an ML compute with a GPU. I am on GCP europe-west1 and the only available options for me are the G2 family and one instance of the A3 family (a3-highgpu-8g [H100]). I have been trying multiple times at different times but I ...
Hi @oye, you're hitting a cloud capacity issue, not a Databricks configuration problem. The Databricks GCP GPU docs list A2 and G2 as the supported GPU instance families. A3/H100 is not in the supported list: https://docs.databricks.com/gcp/en/comput...
Hi everyone, I've created an AI/BI Dashboard in Azure Databricks, successfully published it, and generated an embed link. My goal is to embed this dashboard inside a Databricks App (Streamlit) using an iframe. However, when I try to render the dashboard...
Hi @Louis_Frolio , I have made changes to my master menu with page navigation and used an iframe inside the submenu, and it does work... Thanks for your insightful solution.
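For others landing on this thread, the Streamlit side of the embedding can be sketched as below. The embed-URL shape in `build_embed_url` is illustrative only; in practice, copy the exact link from the dashboard's Share > Embed dialog rather than constructing it by hand:

```python
def build_embed_url(workspace_url: str, dashboard_id: str) -> str:
    # Hypothetical URL shape -- use the link from the Embed dialog instead.
    return f"{workspace_url}/embed/dashboardsv3/{dashboard_id}"

def render_dashboard(url: str) -> None:
    # Imported here so build_embed_url stays usable without Streamlit installed.
    import streamlit.components.v1 as components
    components.iframe(url, height=800)
```

Note that the workspace must also allow the App's domain in the dashboard's approved embedding domains, or the iframe will render blank.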