Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

AlexVB
by New Contributor III
  • 4935 Views
  • 3 replies
  • 4 kudos

Metabase support

Databricks x Metabase. Hi, as someone who previously used Metabase as the self-service BI tool in their org, I was disappointed to see that Databricks is not officially supported: https://www.metabase.com/data_sources/ The community drivers project is...

  • 4935 Views
  • 3 replies
  • 4 kudos
Latest Reply
ramiro
New Contributor II
  • 4 kudos

It happened! With the release of Metabase 51, Databricks is now officially supported as a driver. See the Databricks and Metabase docs and the video guide.

  • 4 kudos
2 More Replies
tyorisoo
by New Contributor III
  • 1196 Views
  • 3 replies
  • 0 kudos

Databricks Lake House Data Clean Room Utilization

Hello. Is it possible to apply masking to column data, etc., for the data provided by the clean room creator? I am wondering because I don't think Delta Sharing allows masking of column data.

  • 1196 Views
  • 3 replies
  • 0 kudos
Latest Reply
ozaaditya
Contributor
  • 0 kudos

Yes, you are correct. Databricks cleanrooms are built on Delta Sharing, which is foundational to how data is securely shared in Databricks. Since Delta Sharing itself does not natively support column-level masking or row-level security, these feature...
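As a point of reference, one common workaround is to materialize a masked view or table on the provider side before adding it to the clean room, so consumers never see the raw column. A minimal sketch, not from the thread above: the catalog, table, view, and masking expression are all illustrative.

# Hypothetical example: mask the email column before exposing the data.
spark.sql("""
  CREATE OR REPLACE VIEW main.sales.customers_masked AS
  SELECT
    id,
    -- keep only the domain part of the email address
    CONCAT('***@', SPLIT(email, '@')[1]) AS email_masked,
    country
  FROM main.sales.customers
""")
# Add the masked view to the clean room / share instead of the raw table.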

  • 0 kudos
2 More Replies
Loki466
by New Contributor
  • 2089 Views
  • 1 replies
  • 1 kudos

Unable to list a folder with square bracket in name

I am trying to iterate over a container (ADLS Gen2) in Azure Databricks using PySpark. Basically, I am using dbutils.fs.ls to list the contents of each folder with a recursive function. This logic works perfectly fine for all folders except for the folders eh...

  • 2089 Views
  • 1 replies
  • 1 kudos
Latest Reply
NandiniN
Databricks Employee
  • 1 kudos

[] are valid characters, so this is an issue with dbutils.fs. dbutils.fs.ls("/Workspace/Users/user@databricks.com/qwe[rt]") fails with java.net.URISyntaxException: Illegal character in path at index. I tested a workaround meanwhile that can help ...
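One workaround that sidesteps the URI parsing entirely is to use Python's local file APIs instead of dbutils.fs.ls. A minimal sketch, assuming the folder is reachable through a FUSE path such as /Workspace/... or a /dbfs/mnt/... mount; the folder name below is the illustrative one from the reply.

import os

path = "/Workspace/Users/user@databricks.com/qwe[rt]"  # example folder from the reply

# os.listdir reads through the FUSE mount and does not parse the path as a URI,
# so square brackets in folder names do not raise URISyntaxException.
for name in os.listdir(path):
    print(os.path.join(path, name))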

  • 1 kudos
tim-mcwilliams
by New Contributor III
  • 11625 Views
  • 10 replies
  • 1 kudos

Notebook cell gets hung up but code completes

I have been running into an issue when running a pymc-marketing model in a Databricks notebook. The cell that fits the model gets hung up and the progress bar stops moving; however, the code completes and dumps all needed output into a folder. After the...

  • 11625 Views
  • 10 replies
  • 1 kudos
Latest Reply
Lingesh
Databricks Employee
  • 1 kudos

@tim-mcwilliams I'm not sure if you found a workaround or a fix for this issue. We have recently found another issue (the integration between PyMC and the Databricks kernel does not work well; specifically, the rendering logic of the progress bar in PyMC) that...
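If the progress-bar rendering is the culprit, one low-risk thing to try is disabling the progress bar during sampling. A minimal sketch, not confirmed as the official fix; the model below is a placeholder for the actual pymc-marketing model.

import pymc as pm

# Placeholder model; substitute your pymc-marketing model here.
with pm.Model() as model:
    mu = pm.Normal("mu", mu=0.0, sigma=1.0)
    obs = pm.Normal("obs", mu=mu, sigma=1.0, observed=[0.1, -0.2, 0.3])

    # progressbar=False skips the progress-bar rendering that appears
    # to interact badly with the notebook front end.
    trace = pm.sample(1000, progressbar=False)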

  • 1 kudos
9 More Replies
NhanNguyen
by Contributor III
  • 515 Views
  • 1 replies
  • 0 kudos

[Error reached the limit for number of private messages]

Dear team, today I tried to reply to my friend via Databricks private message but failed with the error "You have reached the limit for number of private messages that you can send for now. Please try again later." Could you help me check on this? Thanks!

  • 515 Views
  • 1 replies
  • 0 kudos
Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

This is likely a rate limit imposed to prevent spam or excessive messaging. Can you try again later and advise whether the problem persists?

  • 0 kudos
Takuya-Omi
by Valued Contributor III
  • 2470 Views
  • 1 replies
  • 0 kudos

About "Jobs Light Compute" Listed in the Azure Databricks Pricing Table

Hello, while reviewing the Azure Databricks pricing page to check the cost of Job Compute, I came across a term I hadn't seen before: "Jobs Light Compute." I suspect this refers to the now end-of-support Databricks Runtime known as Databricks Light: Da...

  • 2470 Views
  • 1 replies
  • 0 kudos
Latest Reply
Rjdudley
Honored Contributor
  • 0 kudos

"Light" is deprecated, and you can't create a new compute with that type.  Usually when deprecated products show up on pricing pages, someone is paying for extended support to Microsoft, but sometimes it means they didn't edit that page.  You can alw...

  • 0 kudos
samarth_solanki
by New Contributor II
  • 5111 Views
  • 4 replies
  • 0 kudos

Creating a python package that uses dbutils.secrets

Hello Databricks, I wanted to create a Python package containing a Python script with a class; this class is where we pass the scope and key and get the secret. This package will be used inside a Databricks notebook. I want to use dbutils.secrets for ...

  • 5111 Views
  • 4 replies
  • 0 kudos
Latest Reply
krishnab
New Contributor II
  • 0 kudos

from pyspark.dbutils import DBUtils
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
dbutils = DBUtils(spark)

Adding this to the module file solves the problem.
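For reference, a minimal sketch of a package module built on top of that; the class name, scope, and key below are illustrative, not from the original thread.

from pyspark.dbutils import DBUtils
from pyspark.sql import SparkSession


class SecretReader:
    """Small helper that fetches a secret by scope and key."""

    def __init__(self):
        spark = SparkSession.builder.getOrCreate()
        self._dbutils = DBUtils(spark)

    def get(self, scope: str, key: str) -> str:
        return self._dbutils.secrets.get(scope=scope, key=key)


# Usage inside a notebook (hypothetical scope/key names):
# token = SecretReader().get("my-scope", "my-key")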

  • 0 kudos
3 More Replies
NIK251
by New Contributor III
  • 1461 Views
  • 4 replies
  • 0 kudos

Delta Live Table Pipeline to EventHub

I want to read and load the data to Event Hub, and there is an error message: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 138.0 failed 4 times, most recent failure: Lost task 0.3 in stage 138.0 (TID 177) (10.139.6...

  • 1461 Views
  • 4 replies
  • 0 kudos
Latest Reply
Sidhant07
Databricks Employee
  • 0 kudos

Hi, the error originates from the Event Hubs connector. Kindly check this with the Event Hubs Spark connector team, and please use the latest connector: https://github.com/Azure/azure-event-hubs-spark There are known issues with the Event Hubs connector like ...
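For reference, a batch write to Event Hubs with that connector typically looks roughly like the sketch below. This is a minimal sketch, assuming the azure-event-hubs-spark library is installed on the cluster; the secret scope/key and source DataFrame are placeholders, and option names should be verified against the connector version you use.

# Connection string kept in a hypothetical secret scope/key.
connection_string = dbutils.secrets.get(scope="my-scope", key="eventhub-conn")

# The connector expects the connection string to be encrypted with its helper
# (sc is the notebook's SparkContext).
eh_conf = {
    "eventhubs.connectionString":
        sc._jvm.org.apache.spark.eventhubs.EventHubsUtils.encrypt(connection_string)
}

# The DataFrame must expose a 'body' column containing the event payload.
(df.selectExpr("CAST(value AS STRING) AS body")
   .write
   .format("eventhubs")
   .options(**eh_conf)
   .save())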

  • 0 kudos
3 More Replies
naatyraja
by New Contributor
  • 444 Views
  • 1 replies
  • 0 kudos

%run working differently in different tasks of the same job

I am using %run "/Workspace/Shared/SCPO_POC/Tredence-Dev-Phase3/Publix Weekly Sales POC - Main Folder/Weekly_Workflow_Parametrized/Get_Widgets_Values" before all my notebooks of a single job, which has a bunch of child jobs and child tasks... The %run ...

  • 444 Views
  • 1 replies
  • 0 kudos
Latest Reply
Sidhant07
Databricks Employee
  • 0 kudos

Hi, can you please share the complete error? Can you try restarting the cluster? That might help to mitigate the issue.

  • 0 kudos
chethankumar
by New Contributor III
  • 1404 Views
  • 1 replies
  • 0 kudos

How do I set up Databricks observability using AWS CloudWatch

How do I set up Databricks observability, including metrics and logging? I am using AWS-based Databricks and want to monitor it. I plan to use CloudWatch as the observability tool but couldn't find proper documentation to configure it.

  • 1404 Views
  • 1 replies
  • 0 kudos
Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @chethankumar, setting up CloudWatch typically requires custom setup, by configuring clusters with an init script. You might want to consider contacting your account team to guide you through the process. Additionally, please find the post which could be use...
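As one illustration of the custom-setup route, a notebook or job can also push its own metrics to CloudWatch directly. A minimal sketch, not an official integration; the namespace, metric name, region, and the assumption that boto3 plus instance-profile credentials are available on the cluster are all hypothetical.

import boto3

# Assumes the cluster's instance profile grants cloudwatch:PutMetricData.
cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")  # region is illustrative

cloudwatch.put_metric_data(
    Namespace="Databricks/CustomJobs",          # hypothetical namespace
    MetricData=[
        {
            "MetricName": "rows_processed",     # hypothetical metric
            "Value": 12345,
            "Unit": "Count",
            "Dimensions": [{"Name": "job_name", "Value": "my_job"}],
        }
    ],
)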

  • 0 kudos
HeckGunderson
by New Contributor
  • 543 Views
  • 1 replies
  • 0 kudos

Where can I quickly test custom metric queries?

I'm working on adapting some custom metrics to add to the tables' dashboard. Right now, when I throw in a test query, I need to refresh the metrics, and it takes 5-10 minutes to let me know I've probably forgotten a parenthesis somewhere. Where can I test t...

  • 543 Views
  • 1 replies
  • 0 kudos
Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @HeckGunderson, have you tried running the code in a notebook within your Databricks workspace? https://docs.databricks.com/en/notebooks/index.html You can also do it from the CLI using the SQL connector. https://docs.databricks.com/ja/dev-tools/python-sql-...
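For the connector route, a quick test loop could look roughly like this. A minimal sketch: the hostname, HTTP path, token, and query are placeholders, and it assumes the databricks-sql-connector package is installed.

from databricks import sql  # pip install databricks-sql-connector

# Placeholder connection details; take them from your SQL warehouse's connection tab.
with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",
    http_path="/sql/1.0/warehouses/abcdef1234567890",
    access_token="dapiXXXXXXXXXXXXXXXX",
) as conn:
    with conn.cursor() as cursor:
        # Try the candidate metric expression on a small sample first.
        cursor.execute("SELECT count(*) FROM samples.nyctaxi.trips LIMIT 1")
        print(cursor.fetchall())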

  • 0 kudos
eduardo_v
by New Contributor
  • 2837 Views
  • 0 replies
  • 0 kudos

Problem when using SynapseML LightGBM with Feature Store

I'm running a model using LightGBM with Spark, within a Pipeline, but when I log it to MLflow using the Feature Store log_model function, I can't replicate the pipeline in prediction. When I execute: predict = fs.score_batch(logged_model, df_pred) display(...

  • 2837 Views
  • 0 replies
  • 0 kudos
ismaelhenzel
by Contributor
  • 3462 Views
  • 2 replies
  • 2 kudos

Databricks Materialized View - DLT Serverless Incremental

I'm currently working with Delta Live Tables, utilizing materialized views and serverless computing. While testing the incremental updates of materialized views, I've observed that deleting a record from the source table triggers a complete refresh o...

  • 3462 Views
  • 2 replies
  • 2 kudos
Latest Reply
ismaelhenzel
Contributor
  • 2 kudos

EDIT: My Delta Lake table contains 136 columns. I initially tested with fewer columns, and both the updates and deletes were applied incrementally without issues. Specifically, I tested with 34 columns, and everything worked fine. However, when I inc...

  • 2 kudos
1 More Replies
pankajmehta1242
by New Contributor
  • 1112 Views
  • 2 replies
  • 3 kudos

Can't download resources from Databricks Academy courses

I seem to be unable to download hands-on lab resources from the Databricks Academy courses. No link for such files (like .dbc, zip, slides, etc.) exists. It seems the problem is with the new UI. It was fine before. Please help.

  • 1112 Views
  • 2 replies
  • 3 kudos
Latest Reply
BigRoux
Databricks Employee
  • 3 kudos

There was a recent change: students are no longer able to download .dbc files. The slides are still available in Academy, but you can only view them from within Academy; you cannot download them. Hope this helps, Louis.

  • 3 kudos
1 More Replies
Revathy123
by New Contributor III
  • 14593 Views
  • 5 replies
  • 3 kudos

Resolved! Databricks Monitoring

Hi everyone, can someone suggest the best native job monitoring tool available in Databricks to fulfill my need? We need to monitor the following: the number of failed jobs and their names for the last 24 hours, tables that are not getting data, latest inge...

  • 14593 Views
  • 5 replies
  • 3 kudos
Latest Reply
Yaadhudbe
New Contributor II
  • 3 kudos

You can use the Databricks API to collect all the required information: https://docs.databricks.com/api/workspace/jobs/list Load the output into a Delta table and use Databricks dashboards to display this data. Schedule the job for loading the databri...
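A rough sketch of that approach follows; the environment variables, the choice of the runs/list endpoint, and the target table name are assumptions, not from the original reply.

import os
import requests

# Assumes DATABRICKS_HOST (e.g. https://adb-....azuredatabricks.net) and a PAT are set.
host = os.environ["DATABRICKS_HOST"]
token = os.environ["DATABRICKS_TOKEN"]

# List recent job runs; /api/2.1/jobs/runs/list returns run metadata including state.
resp = requests.get(
    f"{host}/api/2.1/jobs/runs/list",
    headers={"Authorization": f"Bearer {token}"},
    params={"completed_only": "true", "limit": 25},
)
runs = resp.json().get("runs", [])

rows = [
    (r["run_id"], r.get("run_name"), r["state"].get("result_state"), r["start_time"])
    for r in runs
]

# Hypothetical target table feeding the monitoring dashboard.
(spark.createDataFrame(rows, "run_id LONG, run_name STRING, result_state STRING, start_time LONG")
     .write.mode("append")
     .saveAsTable("monitoring.job_runs"))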

  • 3 kudos
4 More Replies
