Databricks Platform Discussions
Dive into comprehensive discussions covering various aspects of the Databricks platform. Join the co...
Engage in vibrant discussions covering diverse learning topics within the Databricks Community. Expl...
Hi everyone, I’ve created an AI/BI Dashboard in Azure Databricks, successfully published it, and generated an embed link. My goal is to embed this dashboard inside a Databricks App (Streamlit) using an iframe. However, when I try to render the dashboard...
Hi @Louis_Frolio, I changed my master menu to use page navigation and put the iframe inside a submenu, and it works now... Thanks for your insightful solution.
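For anyone hitting the same wall, here is a minimal sketch of that pattern, assuming a Streamlit-based Databricks App and a published AI/BI dashboard embed link (the workspace URL and dashboard ID below are placeholders):

```python
# Minimal sketch: render a published AI/BI dashboard inside a Streamlit page.
# EMBED_URL is a placeholder; use the link from the dashboard's Share > Embed dialog.
import streamlit as st
import streamlit.components.v1 as components

EMBED_URL = (
    "https://adb-1234567890123456.7.azuredatabricks.net"
    "/embed/dashboardsv3/<dashboard-id>"
)

st.title("Embedded AI/BI dashboard")

# Wrap the embed link in an iframe; the height values are arbitrary.
components.html(
    f'<iframe src="{EMBED_URL}" width="100%" height="800" frameborder="0"></iframe>',
    height=820,
)
```

components.iframe(EMBED_URL, height=800) would work just as well; components.html only makes the iframe attributes explicit.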
The global fashion industry has undergone a dramatic digital transformation, with e-commerce becoming the primary growth engine for leading apparel, luxury, and lifestyle brands. Today’s consumers expect seamless navigation, immersive product discove...
Hello, when I try to add scorers to a multi-agent endpoint based on the last 10 traces that I have logged and that are visible in the Experiments tab, I get this error. Also, are there any demos I can refer to regarding the tabs within the evaluation bar expla...
Hi @shivamrai162, did you add the last 10 traces to the evaluation dataset? You can follow the steps here to make sure you added the traces to the evaluation dataset. To answer your second question, here is a good article that covers the concepts an...
I am trying to run VACUUM on a Delta table that I know has millions of obsolete files. Out of the box, VACUUM runs the deletes in sequence on the driver. That is bad news for me! According to the OSS Delta docs, the setting spark.databricks.delta.vacuum.pa...
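For reference, a minimal sketch of what the post describes, assuming a Databricks notebook where spark is already defined and relying on the OSS Delta setting named above (worth confirming your runtime honours it; the table name is a placeholder):

```python
# Ask VACUUM to delete files in parallel on the cluster instead of serially on the driver.
spark.conf.set("spark.databricks.delta.vacuum.parallelDelete.enabled", "true")

# Dry run first to see how many obsolete files would be removed.
spark.sql("VACUUM main.sales.orders RETAIN 168 HOURS DRY RUN").show(truncate=False)

# Then run the real thing with the default 7-day retention window.
spark.sql("VACUUM main.sales.orders RETAIN 168 HOURS")
```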
New unified Databricks navigation: Databricks plans to enable the new navigation experience (Public Preview) by default for all users. You’ll be able to opt out by clicking Disable new UI in the sidebar. The goal of the new experience is to reduce click...
Hello, I am facing an issue with my Databricks Academy account. During a normal sign-in using my usual email address, I was asked to re-enter my first and last name, as if my account was being created again. After that, my account appeared to be reset,...
Is it possible to use the Knowledge Assistant from Databricks One?
@piotrsofts, short answer: yes, you can build and use a Databricks Knowledge Assistant today. Exposing it directly inside the Databricks One chat UI is currently limited to specific agent types. What the Knowledge Assistant is: The Knowledge Assista...
I have Snowflake Iceberg tables whose metadata is stored in Snowflake Open Catalog. I am trying to read these tables from the Open Catalog and write back to the Open Catalog using Databricks. I have explored the available documentation but haven’t bee...
Greetings @Sunil_Patidar, Databricks and Snowflake can interoperate cleanly around Iceberg today, but how you do it matters. At a high level, interoperability works because both platforms meet at Apache Iceberg and the Iceberg REST Catalog API. Wh...
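As a concrete illustration of that REST-catalog meeting point, here is a rough sketch of pointing an OSS Spark session at an Iceberg REST catalog such as Snowflake Open Catalog. The package version, endpoint URI, warehouse name, and OAuth credential are all placeholders/assumptions, and inside Databricks itself the Unity Catalog federation route described in the reply is usually the cleaner path:

```python
# Rough sketch (OSS Spark + Apache Iceberg): register Snowflake Open Catalog as an
# Iceberg REST catalog named "open_catalog". All endpoint/credential values are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .config("spark.jars.packages",
            "org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.6.1")  # assumed version
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.open_catalog", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.open_catalog.type", "rest")
    .config("spark.sql.catalog.open_catalog.uri",
            "https://<account>.snowflakecomputing.com/polaris/api/catalog")  # placeholder
    .config("spark.sql.catalog.open_catalog.warehouse", "<open_catalog_name>")
    .config("spark.sql.catalog.open_catalog.credential", "<client_id>:<client_secret>")
    .config("spark.sql.catalog.open_catalog.scope", "PRINCIPAL_ROLE:ALL")
    .getOrCreate()
)

# Read an Iceberg table registered in Open Catalog and write a copy back through it.
df = spark.table("open_catalog.db.orders")
df.writeTo("open_catalog.db.orders_copy").createOrReplace()
```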
Hi, after moving from Databricks Runtime 17.1 to 17.2, suddenly my pkgutil walk_packages doesn't identify any packages within my repository anymore. This is my example code: import pkgutil; import os; packages = pkgutil.walk_packages([os.getcwd()]); print...
Hmmm, I have not personally experienced this. I dug a little deeper in our internal docs and leveraged some internal tools to put together another approach for you. Please give this a try and let me know. You’re running into a subtle but very re...
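A sketch of one workaround consistent with that hint, assuming the repository root is the working directory; the idea is to put the root on sys.path explicitly instead of relying on the runtime to do it, which may have changed between 17.1 and 17.2:

```python
# Sketch: walk packages under the repo root, making the root importable first.
import os
import sys
import pkgutil

repo_root = os.getcwd()
if repo_root not in sys.path:
    sys.path.insert(0, repo_root)  # ensure top-level packages in the repo can be imported

# onerror keeps a single unimportable package from aborting the whole walk.
packages = list(pkgutil.walk_packages([repo_root], onerror=lambda name: None))
print([p.name for p in packages])
```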
I have created both AWS and Databricks accounts, but I cannot move to the further steps in the AWS Marketplace (the configure and launch section).
Hello @Prathy! Also, please check out this video: https://www.youtube.com/watch?v=uzjHI0DNbbs. Refer to the deck linked in the video’s description (https://drive.google.com/file/d/1ovZd...) and check slide no. 16, titled “Linking AWS to your Databricks ...
We want to use an existing Databricks SMTP server, or to know whether the Databricks API can be used to send custom emails. Databricks Workflows sends email notifications on success, failure, etc. of jobs, but cannot send custom emails. So we want to send custom emails to di...
Were you able to get the custom email working from a Databricks notebook? I was trying but was not successful. Let me know.
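In case it helps, the usual workaround is to go through an external SMTP relay rather than anything built into Databricks. A minimal sketch, where the host, port, addresses, and credentials are all placeholders and the secrets would normally come from a secret scope:

```python
# Minimal sketch: send a custom email from a notebook via an external SMTP relay.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["Subject"] = "Daily load summary"
msg["From"] = "noreply@example.com"
msg["To"] = "team@example.com"
msg.set_content("All tables refreshed successfully.")

# Placeholder relay; in practice pull the credentials from a Databricks secret scope.
with smtplib.SMTP("smtp.example.com", 587) as server:
    server.starttls()
    server.login("smtp_user", "smtp_password")
    server.send_message(msg)
```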
Automatic file retention in Auto Loader is one of my favourite new features of 2025: it can automatically move processed cloud files to cold storage or just delete them.
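For anyone curious what that looks like in practice, a rough sketch of the cleanSource options as I remember them (the option names and the landing/archive paths are assumptions and should be checked against the Auto Loader docs for your runtime):

```python
# Rough sketch: Auto Loader stream that archives processed source files after 7 days.
df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.cleanSource", "MOVE")  # or "DELETE" to remove files outright
    .option("cloudFiles.cleanSource.moveDestination", "s3://my-bucket/archive/")  # placeholder
    .option("cloudFiles.cleanSource.retentionDuration", "7 days")
    .load("s3://my-bucket/landing/")  # placeholder source path
)
```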
We created a Databricks job with a JAR (Scala code) and provided parameters/JAR parameters, and we are able to read those as arguments in the main method. When we run the job with parameters (run parameters / job parameters), those parameters are not able to re...
Hi @Louis_Frolio, thank you for your suggestion. We are following the approach you recommended, but we encountered an issue while creating the job. We are creating a Databricks job using a JSON file through a pipeline. When we declare job-level parame...
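For comparison, a trimmed sketch of a job JSON where a job-level parameter is forwarded to the JAR task's main-method arguments through a dynamic value reference; the class name, cluster ID, and parameter name are placeholders, and the exact field names should be checked against the Jobs API version the pipeline targets:

```json
{
  "name": "jar-job-with-parameters",
  "parameters": [
    { "name": "run_date", "default": "2025-01-01" }
  ],
  "tasks": [
    {
      "task_key": "main_task",
      "existing_cluster_id": "<cluster-id>",
      "spark_jar_task": {
        "main_class_name": "com.example.Main",
        "parameters": ["{{job.parameters.run_date}}"]
      }
    }
  ]
}
```

With a reference like {{job.parameters.run_date}} in the task's parameters array, the resolved value should arrive as a plain element of args in the Scala main method, so no extra parsing is needed there.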
I have tried the Databricks job task to refresh a Power BI dataset and I have found 2 issues. 1. I set up tables in Power BI Desktop using Import mode. After deploying the model to Power BI Service, I was able to download it as an Import mode model. However...
Can you send a screenshot of the refresh Power BI task in the Jobs UI within Databricks, please?