Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
I am currently unable to use the Maps (Choropleth) feature in the new Databricks dashboards; however, I am able to use it in the legacy dashboard. I would like my users to use a single dashboard that has all the combined features rather than exposing them to 2 differ...
Thanks for your input @Brahmareddy. I just met some folks from Databricks at a booth in Missouri and they confirmed end of April would be the planned timeline for the maps feature release.
Hi community, we have deployed the wheel package internally in our bundle repository:

artifacts:
  rnc_lib:
    type: whl
    build: poetry build
    path: .

# For passing wheel package to workspace
sync:
  include:
    - ./dist/*.whl

The problem is t...
Hi @jeremy98, you can upload the wheel to a shared workspace location and configure it for cluster-level installation by attaching it as a library. Or you can automate the process by adding the wheel to the libraries section of your databricks.yml...
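To illustrate that second option, here is a minimal sketch of a databricks.yml fragment that attaches a built wheel to a job task via the task-level libraries section (the job name, task key, and notebook path are hypothetical placeholders):

```yaml
# Hypothetical bundle fragment: the wheel built into ./dist
# is attached to a job task as a library.
resources:
  jobs:
    my_job:
      name: my_job
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./notebooks/main
          libraries:
            - whl: ./dist/*.whl
```

With this in place, `databricks bundle deploy` uploads the wheel and the cluster installs it when the task runs.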
Hi, I want to change the cluster I am using. However, when I click on the "Compute" tab on the platform, I get automatically redirected to the "SQL Warehouses" page. I am not able to click and enter the "Compute" page. How can I solve this? Thank you
@jmeer Does the platform’s Compute tab refer to the Compute option in the sidebar? Could you please tell me exactly what you’re clicking on when you get redirected?
Hi there @naineel, one approach is to convert your project into a .whl file, then create a Python wheel task for it and schedule it: https://docs.databricks.com/aws/en/jobs/python-wheel
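As a rough sketch of what that looks like in a bundle, assuming your wheel's metadata defines a package `my_package` with a console entry point `main` (both names are placeholders):

```yaml
# Hypothetical job that runs a Python wheel task on a schedule.
resources:
  jobs:
    nightly_etl:
      name: nightly_etl
      schedule:
        quartz_cron_expression: "0 0 2 * * ?"
        timezone_id: "UTC"
      tasks:
        - task_key: run_wheel
          python_wheel_task:
            package_name: my_package
            entry_point: main
          libraries:
            - whl: ./dist/*.whl
```

The entry_point must match a console script declared in your wheel's packaging configuration (e.g. in pyproject.toml).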
Greetings, I see that Delta Live Tables has various real-time connectors such as Kafka, Kinesis, Google's Pub/Sub, and so on. I also see that Apache had maintained an MQTT connector for Spark through the 2.x series, called Bahir, but dropped it in versi...
Hi all, I'm wondering if anyone has had any luck setting up multi-valued parameters in SSRS using an ODBC connection to Databricks? I'm getting a "Cannot add multi value query parameter" error every time I change my parameter to multi value. In the query s...
Hi, I am working on having SSRS reports access Databricks and facing similar challenges. I see you tried this back in 2022. Can you please advise on the approach to handle the multi-value parameters? Thanks, Sam
Hello. I'm currently having an issue that I simply cannot understand nor find an adequate work-around for. Recently, my team within our organization has undergone the effort of migrating our Python code from Databricks notebooks into regular Python m...
Lol, OK, so in my case it was because I had a file called databricks.py which clashed with the installed databricks package. Renaming my file to databricks_utils.py solved it.
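For anyone hitting a similar import error, here is a minimal, self-contained sketch of the underlying mechanism: a local file named after an installed module shadows it, because the script's directory sits first on the module search path. (The stdlib `json` module stands in for the `databricks` package here.)

```python
import os
import sys
import tempfile

# Create a directory containing a file that shares its name
# with a standard-library module.
shadow_dir = tempfile.mkdtemp()
with open(os.path.join(shadow_dir, "json.py"), "w") as f:
    f.write("MARKER = 'shadowed'\n")

# Put that directory first on sys.path, the same position a
# script's own directory normally occupies.
sys.path.insert(0, shadow_dir)
sys.modules.pop("json", None)  # forget any cached import

import json  # resolves to the local json.py, not the stdlib module

print(json.MARKER)
```

Renaming or deleting the shadowing file (as in the fix above) restores the normal import.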
Hi, I recently found a blog online about Databricks using AI Agents to automate ETL, but I can't find where these capabilities are located in Databricks. Does anyone know? Here is the blog: https://www.heliverse.com/blog/databricks-ai-agents-streamlini...
Hi Niil, Databricks has introduced AI Agent sub-categories as part of its Generative AI capabilities. We can now automate tasks such as Extract, Transform, and Load (ETL). For example, with an Information Extraction Agent we can transform large volumes ...
Hey there, in our local development flow we heavily rely on Databricks Asset Bundles and Databricks Connect. Recently, locally run workflows (i.e. just PySpark Python files) have begun to frequently fail with the following gRPC error: pyspark.errors.e...
@marcelhfm it might be a Spark Connect issue; I would say it is the same for the rest of you. Nothing much to do until the situation is fixed by Databricks.
Hi all, what are the best options available today for observability and monitoring of Databricks jobs across all workspaces? We have hundreds of workspaces and it is hard to monitor which jobs failed and which succeeded. We tried using: 1. Team webhoo...
Hey Brahmareddy, thanks so much for responding. Sorry, I forgot to mention we are on Azure. Let's go through them one by one. 1. Audit logs (Azure Monitor): AFAIK this requires init scripts and a JAR build, which is not supported on serverless, or it's not the ca...
This is my DLT pipeline event_log - why is it not in a readable format, and how can I correct it? This is my pipeline code:

import logging

logger = logging.getLogger(__name__)
logger.info("Error")
raise Exception("Error is error")  # note: raising a bare string is invalid in Python 3
Hi @ashraf1395, I'm working with Delta Live Tables (DLT) and the event_log table. I would like to know if it is possible to access the event handler that DLT uses, to write custom logs and send them to this table when events are published. If this is n...
Hi all, I would like to publish the event_log of my DLT pipeline to a specific schema in Unity Catalog. Following this article (https://docs.databricks.com/gcp/en/dlt/observability#query-the-event-log), this can be done by writing this into the DLT's set...
Hi @susanne, indeed, I tried to create it using DABs as well. This feature is not available using DABs, I guess; maybe they will add it once event_log moves from public preview to GA. The Databricks API will be a good alternative, but if you try it using...
Hi there, I have a DLT pipeline and I recently came to know about the event_log feature. I want to deploy my DLT pipeline along with the event_log using Databricks Asset Bundles, but I am not able to find any resources for it. If anyone has tried it, yo...
Hi there @21f3001806, I guess you are talking about this: https://docs.databricks.com/api/workspace/pipelines/create#event_log. It's still in public preview. I tried creating it through the UI or by changing the pipeline settings, and it worked. But when I imp...
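For reference, a hedged sketch of what the pipeline-settings fragment from that API page looks like (the catalog, schema, and table names here are placeholders, and since the event_log field is still in public preview its shape may change):

```json
{
  "name": "my_dlt_pipeline",
  "event_log": {
    "catalog": "main",
    "schema": "observability",
    "name": "dlt_event_log"
  }
}
```

With this set, the pipeline's event log is published as a table at the given three-level Unity Catalog location instead of staying internal to the pipeline.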