Metabase support
- 3749 Views
- 3 replies
- 4 kudos
Databricks x Metabase: Hi, as someone who previously used Metabase as the self-service BI tool in their org, I was disappointed to see that Databricks is not supported officially: https://www.metabase.com/data_sources/ The community drivers project is...
Reply (4 kudos): It happened! With the release of Metabase 51, Databricks is now officially supported as a driver. Links: Databricks and Metabase docs; video guide.
Unable to list a folder with square brackets in its name
- 1679 Views
- 1 reply
- 1 kudos
I am trying to iterate over a container (ADLS Gen2) in Azure Databricks using PySpark. Basically I am using dbutils.fs.ls to list the contents of a folder with a recursive function. This logic works perfectly fine for all folders except for the folders eh...
Reply (1 kudos): [] are valid characters, so it is an issue with dbutils.fs. dbutils.fs.ls("/Workspace/Users/user@databricks.com/qwe[rt]") fails with java.net.URISyntaxException: Illegal character in path at index. I tested a workaround in the meantime that can help ...
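A minimal sketch of one possible workaround (the actual workaround referenced in the reply is truncated above): bypass dbutils.fs and its URI parsing by listing the folder through the driver's local filesystem view, which works for /Workspace paths and for DBFS via the /dbfs FUSE mount; the folder path below is the one from the reply and is otherwise hypothetical.

    import os

    # Folder with square brackets in its name. Workspace files are visible to
    # the driver at their literal path; DBFS paths are reachable under /dbfs.
    folder = "/Workspace/Users/user@databricks.com/qwe[rt]"

    # os.listdir uses plain filesystem calls, so the brackets are never parsed
    # as URI characters and no URISyntaxException is raised.
    for name in os.listdir(folder):
        full_path = os.path.join(folder, name)
        print(full_path, "dir" if os.path.isdir(full_path) else "file")

For unmounted abfss:// locations this does not apply directly; there, listing the parent folder (whose name has no brackets) with dbutils.fs.ls and filtering on the returned names is another option.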
Notebook cell gets hung up but code completes
- 6814 Views
- 10 replies
- 1 kudos
Have been running into an issue when running a pymc-marketing model in a Databricks notebook. The cell that fits the model gets hung up and the progress bar stops moving; however, the code completes and dumps all needed output into a folder. After the...
Reply (1 kudos): @tim-mcwilliams I'm not sure if you found a workaround or a fix for this issue. We have recently found another issue (the integration between PyMC and the Databricks kernel does not go well; specifically, the rendering logic of the progress bar in PyMC) that...
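If the hang is tied to rendering the sampling progress bar, a minimal sketch of a workaround is to disable it via pm.sample's progressbar flag; the model below is a toy example, not the pymc-marketing model from the question.

    import pymc as pm

    # Toy model purely for illustration.
    with pm.Model():
        mu = pm.Normal("mu", mu=0.0, sigma=1.0)
        pm.Normal("obs", mu=mu, sigma=1.0, observed=[0.1, -0.3, 0.7])

        # With progressbar=False the notebook never has to render the live
        # widget that appears to stall in Databricks; sampling still runs
        # and returns its InferenceData as usual.
        idata = pm.sample(draws=1000, tune=1000, progressbar=False)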
[Error: reached the limit for number of private messages]
- 190 Views
- 1 reply
- 0 kudos
Dear team, today I wanted to reply to my friend in a Databricks private message but failed with the error "You have reached the limit for number of private messages that you can send for now. Please try again later." Could you help me check on this? Thanks!
Reply (0 kudos): This is likely a rate limit imposed to prevent spam or excessive messaging. Can you try again later and advise whether the problem persists?
Creating a Python package that uses dbutils.secrets
- 3429 Views
- 4 replies
- 0 kudos
Hello Databricks, I wanted to create a Python package that contains a script with a class; this class is where we give the scope and key and we get the secret. This package will be used inside a Databricks notebook. I want to use dbutils.secrets for ...
Reply (0 kudos):

    from pyspark.dbutils import DBUtils
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()
    dbutils = DBUtils(spark)

Adding this to the module file solves the problem.
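Building on that reply, here is a minimal sketch of what the packaged module might look like (module, class and method names are hypothetical): dbutils is constructed inside the package, so the notebook only imports the class and asks for a secret.

    # my_secrets.py -- hypothetical module inside the package
    from pyspark.sql import SparkSession
    from pyspark.dbutils import DBUtils


    class SecretReader:
        """Fetches secrets from a Databricks secret scope."""

        def __init__(self):
            # Reuse the active Spark session and build dbutils from it,
            # since dbutils is not injected into plain Python modules.
            spark = SparkSession.builder.getOrCreate()
            self._dbutils = DBUtils(spark)

        def get(self, scope: str, key: str) -> str:
            # Return the secret value stored under the given scope and key.
            return self._dbutils.secrets.get(scope=scope, key=key)

In the notebook that imports the package: SecretReader().get("my-scope", "my-key"), where the scope and key names are placeholders.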
Delta Live Tables pipeline to Event Hub
- 520 Views
- 4 replies
- 0 kudos
I want to read and load the data to Event Hub, and there is an error message: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 138.0 failed 4 times, most recent failure: Lost task 0.3 in stage 138.0 (TID 177) (10.139.6...
Reply (0 kudos): Hi, the error originates from the Event Hub connector. Kindly check this with the Event Hubs Spark connector team, and please use the latest connector: https://github.com/Azure/azure-event-hubs-spark There are known issues with the Event Hub connector like ...
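Separate from the azure-event-hubs-spark connector, Event Hubs also exposes a Kafka-compatible endpoint, so a plain Structured Streaming Kafka sink can write to it. A rough sketch of that alternative, run from a notebook rather than inside the DLT pipeline itself; the namespace, event hub, source table, secret scope and checkpoint path are all placeholders.

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import to_json, struct

    spark = SparkSession.builder.getOrCreate()

    # Placeholders -- substitute your own namespace, event hub and secret scope.
    namespace = "my-eventhubs-namespace"
    eventhub = "my-eventhub"
    connection_string = dbutils.secrets.get("my-scope", "eventhub-connection-string")

    # Event Hubs speaks the Kafka protocol on port 9093 with SASL_SSL / PLAIN:
    # the username is the literal "$ConnectionString" and the password is the
    # Event Hubs connection string.
    kafka_options = {
        "kafka.bootstrap.servers": f"{namespace}.servicebus.windows.net:9093",
        "kafka.security.protocol": "SASL_SSL",
        "kafka.sasl.mechanism": "PLAIN",
        "kafka.sasl.jaas.config": (
            "kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule required "
            f'username="$ConnectionString" password="{connection_string}";'
        ),
        "topic": eventhub,
    }

    # Serialize each row to JSON and stream it out as the Kafka message value.
    df = spark.readStream.table("my_catalog.my_schema.source_table")

    (df.select(to_json(struct("*")).alias("value"))
       .writeStream
       .format("kafka")
       .options(**kafka_options)
       .option("checkpointLocation", "/Volumes/my_catalog/my_schema/checkpoints/eventhub_sink")
       .start())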
%run working differently in different tasks of the same job
- 163 Views
- 1 reply
- 0 kudos
I am using %run "/Workspace/Shared/SCPO_POC/Tredence-Dev-Phase3/Publix Weekly Sales POC - Main Folder/Weekly_Workflow_Parametrized/Get_Widgets_Values" before all my notebooks of a single job, which has a bunch of child jobs and child tasks... The %run ...
Reply (0 kudos): Hi, can you please share the complete error? Can you try restarting the cluster? That might help to mitigate the issue.
Resolved! Delta Live Table Pipeline
- 885 Views
- 2 replies
- 1 kudos
I get this error message when trying to create a Delta Live Tables pipeline: com.databricks.pipelines.common.errors.deployment.DeploymentException: Failed to launch pipeline cluster 1207-112912-8e84v9h5: Encountered Quota Exhaustion issue in ...
How do I set up Databricks observability using AWS CloudWatch?
- 206 Views
- 1 reply
- 0 kudos
How do I set up Databricks observability, including metrics and logging? I am using AWS-based Databricks and want to monitor it. I plan to use CloudWatch as the observability tool but couldn't find proper documentation to configure it.
Reply (0 kudos): Hi @chethankumar, setting up CloudWatch might require custom setup, by configuring clusters with an init script. You might want to consider contacting your account team to guide you through the process. Additionally, please find the post which could be use...
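The fuller setup described in the reply goes through a cluster init script and the CloudWatch agent, but as a lightweight sketch of the idea, custom metrics can also be pushed to CloudWatch directly with boto3 from a notebook or job; the namespace, metric and dimension names below are made up, and the cluster's instance profile (or other AWS credentials) must allow cloudwatch:PutMetricData.

    import boto3

    # Assumes the cluster's instance profile grants cloudwatch:PutMetricData.
    cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

    # Hypothetical metric: number of rows processed by a job run.
    rows_processed = 12345

    cloudwatch.put_metric_data(
        Namespace="Databricks/Jobs",  # made-up namespace
        MetricData=[
            {
                "MetricName": "RowsProcessed",
                "Dimensions": [
                    {"Name": "JobName", "Value": "nightly_etl"},  # placeholder
                ],
                "Value": rows_processed,
                "Unit": "Count",
            }
        ],
    )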
Where can I quickly test custom metric queries?
- 211 Views
- 1 reply
- 0 kudos
I'm working on adapting some custom metrics to add to the tables' dashboard. Right now when I throw in a test query, I need to refresh the metrics, and it takes 5-10 minutes to let me know I've probably forgotten a parenthesis somewhere. Where can I test t...
Reply (0 kudos): Hi @HeckGunderson, have you tried running the query in a notebook within your Databricks workspace? https://docs.databricks.com/en/notebooks/index.html You can also do it via the CLI using the SQL connector. https://docs.databricks.com/ja/dev-tools/python-sql-...
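A minimal sketch of the SQL-connector route mentioned in the reply, using the databricks-sql-connector package against a SQL warehouse; the hostname, HTTP path, token and query are placeholders. The point is that a malformed query fails with a parser error in seconds instead of after a 5-10 minute metrics refresh.

    # pip install databricks-sql-connector
    from databricks import sql

    # Placeholders -- copy these from the SQL warehouse's connection details.
    with sql.connect(
        server_hostname="adb-1234567890123456.7.azuredatabricks.net",
        http_path="/sql/1.0/warehouses/abcdef1234567890",
        access_token="<personal-access-token>",
    ) as connection:
        with connection.cursor() as cursor:
            # Paste the candidate metric query here; a missing parenthesis
            # surfaces immediately as a syntax error.
            cursor.execute("SELECT count(*) AS row_count FROM my_catalog.my_schema.my_table")
            print(cursor.fetchall())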
Databricks Materialized View - DLT Serverless Incremental
- 367 Views
- 2 replies
- 2 kudos
I'm currently working with Delta Live Tables, utilizing materialized views and serverless computing. While testing the incremental updates of materialized views, I've observed that deleting a record from the source table triggers a complete refresh o...
Reply (2 kudos): EDIT: My Delta Lake table contains 136 columns. I initially tested with fewer columns, and both the updates and deletes were applied incrementally without issues. Specifically, I tested with 34 columns, and everything worked fine. However, when I inc...
Can't download resources from Databricks Academy courses
- 412 Views
- 2 replies
- 1 kudos
I seem to be unable to download hands-on lab resources from the Databricks Academy courses. No links to such files (.dbc, zip, slides, etc.) exist. It seems the problem is with the new UI. It was fine before. Please help.
Reply (1 kudos): There was a recent change: students are no longer able to download .dbc files. The slides are still available in Academy, but you can only view them from within Academy; you cannot download them. Hope this helps, Louis.
Passing parameters in the dashboard data section via asset bundles
- 429 Views
- 3 replies
- 0 kudos
New functionality allows deploying dashboards with asset bundles. Here is an example:

    # This is the contents of the resulting baby_gender_by_county.dashboard.yml file.
    resources:
      dashboards:
        baby_gender_by_county:
          display_name: "Baby gen...

Reply (0 kudos): This configuration:

    variables:
      catalog:
        description: "Catalog name for the dataset"
        default: "dev"

    parameters:
      catalog: ${var.catalog}

doesn't replace parameter values prod -> dev in the JSON when it is being deployed:

    "datasets": [
      {
        "displayName": "my_t...
In Databricks deployment, .py files are getting converted to notebooks
- 1733 Views
- 3 replies
- 2 kudos
A critical issue has arisen that is impacting our deployment planning for our client. We have encountered a challenge with our Azure CI/CD pipeline integration, specifically concerning the deployment of Python files (.py). Despite our best efforts, w...
Reply (2 kudos): Did you manage to find the solution? If so, could you share it? I have the same problem, and if I find the solution, I'll share it here.
How to enable Unity Catalog on Azure Databricks?
- 576 Views
- 1 reply
- 0 kudos
Hi guys, we have an Azure Databricks workspace upgraded from standard to premium, and we want to enable Unity Catalog so that we can federate with Snowflake. The workspace is more than two years old. How do we enable Unity Catalog? I am an admin but unde...
Reply (0 kudos): Unity Catalog is configured at the account level, and then you add workspaces into it. In your workspace selector, you should see "Manage account" at the bottom of the list (or go to https://accounts.azuredatabricks.net/). If you can sign in to tha...