- 4935 Views
- 3 replies
- 4 kudos
Metabase support
Databricks x Metabase: Hi, as someone who previously used Metabase as their self-service BI tool in their org, I was disappointed to see that Databricks is not officially supported: https://www.metabase.com/data_sources/ The community drivers project is...
It happened! With the release of Metabase 51, Databricks is now officially supported as a driver. See the Databricks and Metabase docs and the video guide.
- 1196 Views
- 3 replies
- 0 kudos
Databricks Lake House Data Clean Room Utilization
Hello. Is it possible to do masking on column data, etc., for the data provided by the clean room creator? I am wondering because I don't think Delta Sharing allows masking of column data.
Yes, you are correct. Databricks cleanrooms are built on Delta Sharing, which is foundational to how data is securely shared in Databricks. Since Delta Sharing itself does not natively support column-level masking or row-level security, these feature...
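The reply notes that Delta Sharing does not natively support column-level masking, so any masking has to happen before the data is shared. One common provider-side pattern (an assumption here, not something stated in the thread) is to expose a dynamic view that masks the column and share the view instead of the base table. A minimal sketch, assuming Unity Catalog and placeholder catalog, schema, table, and group names:

```python
# Hypothetical example: mask an email column in a view that is shared instead of the base table.
# Catalog/schema/table names and the group name are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("""
    CREATE OR REPLACE VIEW main.clean_room.customers_masked AS
    SELECT
        customer_id,
        CASE
            WHEN is_account_group_member('pii_readers') THEN email
            ELSE sha2(email, 256)   -- hashed for everyone outside the trusted group
        END AS email,
        order_total
    FROM main.clean_room.customers
""")
```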
- 2089 Views
- 1 reply
- 1 kudos
Unable to list a folder with square bracket in name
I am trying to iterate over a container (ADLS Gen2) in Azure Databricks using PySpark. Basically, I am using dbutils.fs.ls to list the contents of a folder with a recursive function. This logic works perfectly fine for all folders except for the folders eh...
[] comes under valid characters, so it is an issue with dbutils.fs. dbutils.fs.ls("/Workspace/Users/user@databricks.com/qwe[rt]") fails with java.net.URISyntaxException: Illegal character in path at index. I tested a workaround meanwhile that can help ...
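Because the failure happens when the bracketed name is handed to dbutils.fs.ls (which parses the string as a URI), one possible workaround (an assumption, not necessarily the one tested in the reply above) is to list the folder with the Azure Data Lake Storage SDK, which does not URI-parse path segments. A minimal sketch, assuming azure-storage-file-datalake and azure-identity are installed and using placeholder account, container, and path names:

```python
# Hypothetical workaround: list ADLS Gen2 paths with the SDK instead of dbutils.fs.ls,
# so folder names containing "[" or "]" are returned without URI parsing.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
fs_client = service.get_file_system_client("<container>")

# get_paths() walks the tree server-side, so entries like "qwe[rt]" come back untouched.
for p in fs_client.get_paths(path="some/parent/folder", recursive=True):
    print(p.name, "dir" if p.is_directory else "file")
```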
- 11625 Views
- 10 replies
- 1 kudos
Notebook cell gets hung up but code completes
I have been running into an issue when running a pymc-marketing model in a Databricks notebook. The cell that fits the model gets hung up and the progress bar stops moving; however, the code completes and dumps all needed output into a folder. After the...
@tim-mcwilliams I'm not sure if you found a workaround or a fix for this issue. We have recently found another issue (the integration between PyMC and the Databricks kernel does not go well; specifically, the rendering logic of the progress bar in PyMC) that...
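If the hang really is tied to the progress-bar rendering described above, one hedged mitigation is simply to disable the progress bar during sampling. A minimal sketch with a placeholder model, assuming PyMC 4+ (pymc-marketing calls pm.sample under the hood, so the same keyword can usually be forwarded through its fit kwargs):

```python
# Hedged sketch: disable PyMC's live progress bar so the notebook cell is not tied
# to the widget that stops updating; the model here is only a placeholder.
import pymc as pm

with pm.Model():
    mu = pm.Normal("mu", mu=0.0, sigma=1.0)
    pm.Normal("obs", mu=mu, sigma=1.0, observed=[0.1, -0.3, 0.5])

    # progressbar=False skips the progress rendering that can appear to hang the cell.
    idata = pm.sample(draws=1000, tune=1000, progressbar=False)
```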
- 515 Views
- 1 reply
- 0 kudos
[Error reached the limit for number of private messages]
Dear team, today I wanted to reply to my friend in a Databricks private message but it failed with the error "You have reached the limit for number of private messages that you can send for now. Please try again later." Could you help me check on this? Thanks!
This is likely a rate limit imposed to prevent spam or excessive messaging. Can you try again later and let us know if the problem persists?
- 2470 Views
- 1 reply
- 0 kudos
About "Jobs Light Compute" Listed in the Azure Databricks Pricing Table
Hello,While reviewing the Azure Databricks pricing page to check the cost for Job Compute, I came across a term I hadn’t seen before: "Jobs Light Compute."I suspect this refers to the now end-of-support Databricks Runtime known as Databricks Light:Da...
"Light" is deprecated, and you can't create a new compute with that type. Usually when deprecated products show up on pricing pages, someone is paying for extended support to Microsoft, but sometimes it means they didn't edit that page. You can alw...
- 5111 Views
- 4 replies
- 0 kudos
Creating a python package that uses dbutils.secrets
Hello Databricks, I wanted to create a Python package containing a script with a class; this class is where we give a scope and key and get the secret. This package will be used inside a Databricks notebook. I want to use dbutils.secret for ...
from pyspark.dbutils import DBUtils
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
dbutils = DBUtils(spark)

Adding this to the module file solves the problem.
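Building on that snippet, a minimal sketch of what the package module could look like (the module and class names are hypothetical): the module resolves dbutils itself, so a notebook only imports the class and asks for a secret by scope and key.

```python
# secrets_client.py -- hypothetical module inside the package.
from pyspark.dbutils import DBUtils
from pyspark.sql import SparkSession


class SecretClient:
    """Small wrapper around dbutils.secrets for notebooks that import this package."""

    def __init__(self, scope: str):
        spark = SparkSession.builder.getOrCreate()
        self._dbutils = DBUtils(spark)          # resolves dbutils on a Databricks cluster
        self._scope = scope

    def get(self, key: str) -> str:
        return self._dbutils.secrets.get(scope=self._scope, key=key)


# In a notebook:
#   from secrets_client import SecretClient
#   token = SecretClient("my-scope").get("my-key")
```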
- 1461 Views
- 4 replies
- 0 kudos
Delta Live Table Pipeline to EventHub
I want to read and load the data to Event Hub. And there is an error message: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 138.0 failed 4 times, most recent failure: Lost task 0.3 in stage 138.0 (TID 177) (10.139.6...
Hi, the error originates from the Event Hubs connector. Kindly check this with the Event Hubs Spark connector team, and please use the latest connector: https://github.com/Azure/azure-event-hubs-spark There are known issues with the Event Hubs connector like ...
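As an alternative to the azure-event-hubs-spark connector recommended above (this is an assumption, not something suggested in the thread), Event Hubs also exposes a Kafka-compatible endpoint, so a Structured Streaming query can write to it with Spark's built-in Kafka sink. A minimal sketch, not DLT-specific, with placeholder namespace, event hub, source table, secret scope, and checkpoint path:

```python
# Hedged sketch: write a stream to Event Hubs through its Kafka-compatible endpoint.
# The namespace, event hub name, source table, secret scope/key, and checkpoint path
# are all placeholders.
connection_string = dbutils.secrets.get("my-scope", "eventhub-connection-string")

eh_kafka_options = {
    "kafka.bootstrap.servers": "<namespace>.servicebus.windows.net:9093",
    "kafka.security.protocol": "SASL_SSL",
    "kafka.sasl.mechanism": "PLAIN",
    # Databricks ships a shaded Kafka client, hence the kafkashaded prefix.
    "kafka.sasl.jaas.config": (
        "kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule required "
        f'username="$ConnectionString" password="{connection_string}";'
    ),
    "topic": "<eventhub-name>",
}

(spark.readStream.table("my_catalog.my_schema.source_table")
      .selectExpr("CAST(id AS STRING) AS key", "to_json(struct(*)) AS value")
      .writeStream
      .format("kafka")
      .options(**eh_kafka_options)
      .option("checkpointLocation", "/tmp/checkpoints/eventhub_sink")
      .start())
```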
- 444 Views
- 1 reply
- 0 kudos
%run working differently in different tasks of the same job
I am using %run "/Workspace/Shared/SCPO_POC/Tredence-Dev-Phase3/Publix Weekly Sales POC - Main Folder/Weekly_Workflow_Parametrized/Get_Widgets_Values" before all my notebooks of a single job, which has a bunch of child jobs and child tasks... The %run ...
Hi, can you please share the complete error? Can you try restarting the cluster? That might help mitigate the issue.
- 1404 Views
- 1 reply
- 0 kudos
How do I set up Databricks observability using AWS cloudwatch
How do I set up Databricks observability, including metrics and logging? I am using AWS-based Databricks and want to monitor it. I plan to use CloudWatch as the observability tool but couldn't find proper documentation to configure it.
Hi @chethankumar, setting up CloudWatch might require custom setup, by configuring clusters with an init script. You might want to consider contacting your account team to guide you through the process. Additionally, please find the post which could be use...
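Beyond the cluster init-script route mentioned above, a hedged sketch of pushing a custom metric to CloudWatch directly from a notebook or job with boto3. The namespace, metric, and dimension names are placeholders, and the cluster's instance profile (or other AWS credentials) must allow cloudwatch:PutMetricData.

```python
# Hedged sketch: publish a custom metric to CloudWatch from a Databricks job with boto3.
# Namespace, metric name, and dimensions are placeholders.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

cloudwatch.put_metric_data(
    Namespace="Databricks/CustomJobs",
    MetricData=[
        {
            "MetricName": "RowsProcessed",
            "Dimensions": [{"Name": "JobName", "Value": "nightly_ingest"}],
            "Value": 123456,
            "Unit": "Count",
        }
    ],
)
```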
- 543 Views
- 1 reply
- 0 kudos
Where can I quickly test custom metric queries?
I'm working on adapting some custom metrics to add to the tables' dashboard. Right now when I throw in a test query, I need to refresh metrics and it takes 5-10 minutes to let me know I've probably forgotten a parenthesis somewhere. Where can I test t...
Hi @HeckGunderson, have you tried running the code in a notebook within your Databricks workspace? https://docs.databricks.com/en/notebooks/index.html You can also do it via the CLI using the SQL connector. https://docs.databricks.com/ja/dev-tools/python-sql-...
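Following the SQL connector suggestion, a minimal sketch that runs a candidate metric query against a SQL warehouse with databricks-sql-connector, so a missing parenthesis surfaces in seconds rather than after a metrics refresh. The hostname, HTTP path, token, and table name are placeholders.

```python
# Hedged sketch: validate a metric query against a SQL warehouse with the
# databricks-sql-connector package (pip install databricks-sql-connector).
# server_hostname, http_path, access_token, and the table are placeholders.
from databricks import sql

query = """
    SELECT count(*) AS null_ids
    FROM main.analytics.orders
    WHERE order_id IS NULL
"""

with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",
    http_path="/sql/1.0/warehouses/abcdef1234567890",
    access_token="<personal-access-token>",
) as conn:
    with conn.cursor() as cursor:
        cursor.execute(query)   # syntax errors are raised here immediately
        print(cursor.fetchall())
```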
- 2837 Views
- 0 replies
- 0 kudos
Problem when using SynapseML LightGBM with Feature Store
I'm running a model using LGBM with Spark, within a Pipeline, but when I log it to MLflow using the Feature Store log_model function, I can't replicate the pipeline in prediction. When I execute: predict = fs.score_batch(logged_model, df_pred) display(...
- 3462 Views
- 2 replies
- 2 kudos
Databricks Materialized View - DLT Serverless Incremental
I'm currently working with Delta Live Tables, utilizing materialized views and serverless computing. While testing the incremental updates of materialized views, I've observed that deleting a record from the source table triggers a complete refresh o...
EDIT: My Delta Lake table contains 136 columns. I initially tested with fewer columns, and both the updates and deletes were applied incrementally without issues. Specifically, I tested with 34 columns, and everything worked fine. However, when I inc...
- 1112 Views
- 2 replies
- 3 kudos
Can't download resources from Databricks Academy courses
I seem to be unable to download hands-on lab resources from the Databricks Academy courses. No links for such files (like .dbc, zip, slides, etc.) exist. It seems the problem is with the new UI. It was fine before. Please help.
There was a recent change: students are no longer able to download dbc files. The slides are still available in Academy, but you can only view them from within Academy; you cannot download them. Hope this helps, Louis.
- 14593 Views
- 5 replies
- 3 kudos
Resolved! Databricks Monitoring
Hi everyone, can someone suggest the best native job monitoring tool available in Databricks to fulfill my needs? We need to monitor the following: number of failed jobs and their names (for the last 24 hours), tables that are not getting data, latest inge...
You can use the Databricks API to collect all the required information: https://docs.databricks.com/api/workspace/jobs/list Load the output into a Delta table, use Databricks dashboards to display this data, and schedule the job for loading the databri...
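A minimal sketch of that approach (the host, secret scope, and target table are placeholders): list recent runs through /api/2.1/jobs/runs/list, keep the failures from the last 24 hours, and append them to a Delta table that a dashboard can query.

```python
# Hedged sketch: collect failed job runs from the last 24 hours via the Jobs API
# and persist them to a Delta table. Host, token secret, and table are placeholders.
import time
import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"
token = dbutils.secrets.get("monitoring", "pat")
since_ms = int((time.time() - 24 * 3600) * 1000)

resp = requests.get(
    f"{host}/api/2.1/jobs/runs/list",
    headers={"Authorization": f"Bearer {token}"},
    params={"completed_only": "true", "start_time_from": since_ms, "limit": 25},
)
resp.raise_for_status()

failed = [
    {
        "job_id": run["job_id"],
        "run_name": run.get("run_name"),
        "result_state": run["state"].get("result_state"),
        "start_time": run["start_time"],
    }
    for run in resp.json().get("runs", [])
    if run["state"].get("result_state") not in (None, "SUCCESS")
]

if failed:
    (spark.createDataFrame(failed)
          .write.mode("append")
          .saveAsTable("main.monitoring.failed_job_runs"))
```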