Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

AlexVB
by New Contributor III
  • 3749 Views
  • 3 replies
  • 4 kudos

Metabase support

Databricks x Metabase. Hi, as someone who previously used Metabase as the self-service BI tool in their org, I was disappointed to see that Databricks is not officially supported: https://www.metabase.com/data_sources/ The community drivers project is...

Latest Reply
ramiro
New Contributor II
  • 4 kudos

It happened! With the release of Metabase 51, Databricks is now officially supported as a driver.
- Databricks and Metabase docs
- Video guide

2 More Replies
Loki466
by New Contributor
  • 1679 Views
  • 1 reply
  • 1 kudos

Unable to list a folder with square brackets in its name

I am trying to iterate over a container (ADLS Gen2) in Azure Databricks using PySpark. Basically, I am using dbutils.fs.ls to list the contents of a folder with a recursive function. This logic works perfectly fine for all folders except for the folders eh...

Latest Reply
NandiniN
Databricks Employee
  • 1 kudos

[ and ] are valid characters, so it is an issue with dbutils.fs. dbutils.fs.ls("/Workspace/Users/user@databricks.com/qwe[rt]") fails with java.net.URISyntaxException: Illegal character in path at index. Meanwhile, I tested a workaround that can help ...
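A minimal sketch of one possible workaround (not necessarily the one tested above; the mount path is an assumption): walk the folder through the /dbfs FUSE mount with plain Python os calls, which do not parse the path as a URI, so '[' and ']' in folder names are accepted.

import os

def list_recursive(root):
    # Yield every file path under root, including folders with [] in the name.
    for entry in os.scandir(root):
        if entry.is_dir():
            yield from list_recursive(entry.path)
        else:
            yield entry.path

# Hypothetical mount point; replace with your own ADLS Gen2 mount.
for path in list_recursive("/dbfs/mnt/my-container"):
    print(path)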

tim-mcwilliams
by New Contributor III
  • 6814 Views
  • 10 replies
  • 1 kudos

Notebook cell gets hung up but code completes

I have been running into an issue when running a pymc-marketing model in a Databricks notebook. The cell that fits the model gets hung up and the progress bar stops moving; however, the code completes and dumps all needed output into a folder. After the...

Latest Reply
Lingesh
Databricks Employee
  • 1 kudos

@tim-mcwilliams I'm not sure if you found a workaround or a fix for this issue. We have recently found another issue (the integration between PyMC and the Databricks kernel does not go well; specifically, the rendering logic of the progress bar in PyMC) that...
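One commonly used mitigation (an assumption here, not necessarily the fix referenced above) is to turn off the PyMC progress bar entirely so the cell does not appear hung while sampling; the model below is a placeholder, and progressbar=False is the relevant part.

import pymc as pm

# Placeholder model; only the sampling call matters for this workaround.
with pm.Model() as model:
    mu = pm.Normal("mu", 0.0, 1.0)
    pm.Normal("obs", mu=mu, sigma=1.0, observed=[0.1, -0.3, 0.5])
    # Disable the progress bar whose notebook rendering causes the apparent hang.
    idata = pm.sample(1000, progressbar=False)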

9 More Replies
NhanNguyen
by Contributor III
  • 190 Views
  • 1 reply
  • 0 kudos

[Error reached the limit for number of private messages]

Dear team, today I tried to reply to my friend in a Databricks private message but failed with the error "You have reached the limit for number of private messages that you can send for now. Please try again later." Could you help me check on this? Thanks!

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

This is likely a rate limit imposed to prevent spam or excessive messaging. Can you try again later and advise if the problem persists?

samarth_solanki
by New Contributor II
  • 3429 Views
  • 4 replies
  • 0 kudos

Creating a python package that uses dbutils.secrets

Hello Databricks, I wanted to create a Python package containing a Python script with a class; we give this class a scope and a key and it returns the secret. This package will be used inside a Databricks notebook. I want to use dbutils.secrets for ...

Latest Reply
krishnab
New Contributor II
  • 0 kudos

from pyspark.dbutils import DBUtils
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
dbutils = DBUtils(spark)

Adding this to the module file solves the problem.
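For completeness, a hypothetical sketch of the kind of module the question describes; the class name, scope, and key below are placeholders, not from the original post.

from pyspark.sql import SparkSession
from pyspark.dbutils import DBUtils

class SecretReader:
    # Reads a secret from a Databricks secret scope inside a plain Python module.
    def __init__(self):
        spark = SparkSession.builder.getOrCreate()
        self._dbutils = DBUtils(spark)

    def get(self, scope: str, key: str) -> str:
        return self._dbutils.secrets.get(scope=scope, key=key)

# Usage from a notebook that imports the package:
# token = SecretReader().get("my-scope", "my-key")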

3 More Replies
NIK251
by New Contributor III
  • 520 Views
  • 4 replies
  • 0 kudos

Delta Live Table Pipeline to EventHub

I want to read the data and load it to Event Hub. And there is an error message: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 138.0 failed 4 times, most recent failure: Lost task 0.3 in stage 138.0 (TID 177) (10.139.6...

Latest Reply
Sidhant07
Databricks Employee
  • 0 kudos

Hi, the error originates from the Event Hubs connector. Kindly check this with the Event Hubs Spark connector team. Please use the latest connector: https://github.com/Azure/azure-event-hubs-spark There are known issues with the Event Hubs connector like ...
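For reference, a hedged sketch of writing a streaming DataFrame to Event Hubs with the azure-event-hubs-spark connector linked above. It assumes the connector library is installed on the cluster, that spark, sc, and dbutils are the usual notebook globals, and that the secret scope and table names are placeholders; this is plain Structured Streaming, not DLT-specific syntax.

# Connection string kept in a secret scope (placeholder names).
connection_string = dbutils.secrets.get("my-scope", "eventhub-conn")

# The connector expects the connection string to be encrypted via its helper.
eh_conf = {
    "eventhubs.connectionString":
        sc._jvm.org.apache.spark.eventhubs.EventHubsUtils.encrypt(connection_string)
}

(spark.readStream.table("my_source_table")        # placeholder source table
      .selectExpr("to_json(struct(*)) AS body")   # Event Hubs expects a 'body' column
      .writeStream
      .format("eventhubs")
      .options(**eh_conf)
      .option("checkpointLocation", "/tmp/eventhub_checkpoint")
      .start())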

3 More Replies
naatyraja
by New Contributor
  • 163 Views
  • 1 reply
  • 0 kudos

%run working differently in different tasks of the same job

I am using %run "/Workspace/Shared/SCPO_POC/Tredence-Dev-Phase3/Publix Weekly Sales POC - Main Folder/Weekly_Workflow_Parametrized/Get_Widgets_Values" before all my notebooks of a single job, which has a bunch of child jobs and child tasks... The %run ...

Latest Reply
Sidhant07
Databricks Employee
  • 0 kudos

Hi, can you please share the complete error? Can you try restarting the cluster? That might help to mitigate the issue.

NIK251
by New Contributor III
  • 885 Views
  • 2 replies
  • 1 kudos

Resolved! Delta Live Table Pipeline

I get this error message when trying to create a Delta Live Tables pipeline. My error is: com.databricks.pipelines.common.errors.deployment.DeploymentException: Failed to launch pipeline cluster 1207-112912-8e84v9h5: Encountered Quota Exhaustion issue in ...

Latest Reply
NIK251
New Contributor III
  • 1 kudos

Thanks sir, I solved it.

1 More Replies
chethankumar
by New Contributor III
  • 206 Views
  • 1 reply
  • 0 kudos

How do I set up Databricks observability using AWS CloudWatch?

How do I set up Databricks observability, including metrics and logging? I am using AWS-based Databricks and want to monitor it. I plan to use CloudWatch as the observability tool but couldn't find proper documentation to configure it.

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @chethankumar, setting up CloudWatch might require custom setup, by configuring clusters with an init script. You might want to consider contacting your account team to guide you through the process. Additionally, please find the post which could be use...
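As a lightweight complement (an assumption, not the init-script setup described above), custom metrics can also be pushed straight from a notebook or job with boto3, provided the cluster's instance profile allows cloudwatch:PutMetricData; the namespace and metric name below are placeholders.

import boto3

# Assumes the cluster's instance profile grants cloudwatch:PutMetricData.
cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")  # adjust region

cloudwatch.put_metric_data(
    Namespace="Databricks/Jobs",         # placeholder namespace
    MetricData=[{
        "MetricName": "rows_processed",  # placeholder metric
        "Value": 12345,
        "Unit": "Count",
    }],
)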

HeckGunderson
by New Contributor
  • 211 Views
  • 1 reply
  • 0 kudos

Where can I quickly test custom metric queries?

I'm working on adapting some custom metrics to add to the tables' dashboard. Right now, when I throw in a test query, I need to refresh the metrics, and it takes 5-10 minutes to let me know I've probably forgotten a parenthesis somewhere. Where can I test t...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @HeckGunderson, have you tried running the code in a notebook within your Databricks workspace? https://docs.databricks.com/en/notebooks/index.html You can also do it from the CLI using the SQL connector. https://docs.databricks.com/ja/dev-tools/python-sql-...
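For quick iteration outside the dashboard refresh cycle, a hedged sketch using the Databricks SQL connector mentioned above (pip install databricks-sql-connector); the hostname, HTTP path, token, and query are placeholders for your own workspace values.

from databricks import sql

with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",  # placeholder
    http_path="/sql/1.0/warehouses/abc123",                        # placeholder
    access_token="<personal-access-token>",                        # placeholder
) as conn:
    with conn.cursor() as cursor:
        # Try the metric query on its own before wiring it into the dashboard.
        cursor.execute("SELECT count(*) AS null_ids FROM my_table WHERE id IS NULL")
        print(cursor.fetchall())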

ismaelhenzel
by Contributor
  • 367 Views
  • 2 replies
  • 2 kudos

Databricks Materialized View - DLT Serverless Incremental

I'm currently working with Delta Live Tables, utilizing materialized views and serverless computing. While testing the incremental updates of materialized views, I've observed that deleting a record from the source table triggers a complete refresh o...

Latest Reply
ismaelhenzel
Contributor
  • 2 kudos

EDIT: My Delta Lake table contains 136 columns. I initially tested with fewer columns, and both the updates and deletes were applied incrementally without issues. Specifically, I tested with 34 columns, and everything worked fine. However, when I inc...

1 More Replies
pankajmehta1242
by New Contributor
  • 412 Views
  • 2 replies
  • 1 kudos

Can't download resources from Databricks Academy courses

I seem to be unable to download hands-on lab resources from the Databricks Academy courses. No link for such files (like .dbc, .zip, slides, etc.) exists. It seems the problem is with the new UI. It was fine before. Please help.

Latest Reply
BigRoux
Databricks Employee
  • 1 kudos

There was a recent change: students are no longer able to download .dbc files. The slides are still available in Academy, but you can only view them from within Academy; you cannot download them. Hope this helps, Louis.

1 More Replies
drag7ter
by Contributor
  • 429 Views
  • 3 replies
  • 0 kudos

Passing parameters in a dashboard's data section via asset bundles

New functionality allows deploying dashboards with asset bundles. Here is an example:

# This is the contents of the resulting baby_gender_by_county.dashboard.yml file.
resources:
  dashboards:
    baby_gender_by_county:
      display_name: "Baby gen...

Latest Reply
drag7ter
Contributor
  • 0 kudos

variables:
  catalog:
    description: "Catalog name for the dataset"
    default: "dev"

parameters:
  catalog: ${var.catalog}

This doesn't replace parameter values prod -> dev in the JSON when it is being deployed:

"datasets": [
  {
    "displayName": "my_t...

2 More Replies
amit_jbs
by New Contributor II
  • 1733 Views
  • 3 replies
  • 2 kudos

In Databricks deployment, .py files are getting converted to notebooks

A critical issue has arisen that is impacting our deployment planning for our client. We have encountered a challenge with our Azure CI/CD pipeline integration, specifically concerning the deployment of Python files (.py). Despite our best efforts, w...

Latest Reply
amarantevitor94
New Contributor II
  • 2 kudos

Did you manage to find the solution? If so, could you share it? I have the same problem, and if I find the solution, I'll share it here.

2 More Replies
martkev
by New Contributor
  • 576 Views
  • 1 reply
  • 0 kudos

How to enable Unity Catalog on Azure Databricks?

Hi guys, we have an Azure Databricks workspace upgraded from Standard to Premium, and we want to enable Unity Catalog so that we can federate with Snowflake. The workspace is more than two years old. How do we enable Unity Catalog? I am an admin but unde...

Latest Reply
Rjdudley
Valued Contributor II
  • 0 kudos

Unity Catalog is configured at the account level, and then you add workspaces into it.  In your workspace selector, you should see "Manage account" at the bottom of the list (or go to https://accounts.azuredatabricks.net/).  If you can sign in to tha...


Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.
