- 193 Views
- 2 replies
- 1 kudos
How to tag/cost-track Databricks Data Profiling?
We recently started using the Data Profiling/Lakehouse Monitoring feature from Databricks: https://learn.microsoft.com/en-us/azure/databricks/data-quality-monitoring/data-profiling/. Data Profiling uses serverless compute for running the profilin...
Hi @szymon_dybczak, thanks for the quick reply. But it seems serverless budget policies cannot be applied to data profiling/monitoring jobs: https://learn.microsoft.com/en-us/azure/databricks/data-quality-monitoring/data-profiling/ Serverless budget po...
- 164 Views
- 1 reply
- 1 kudos
Wheel File name is changed after using Databricks Asset Bundle Deployment on Github Actions
Hi Team, I am deploying to the Databricks workspace using GitHub and DAB. I have noticed that during deployment, the wheel file name is converted to all lowercase letters (e.g., pyTestReportv2.whl becomes pytestreportv2.whl). This issue does not...
Hi @Nisha_Tech It seems like a Git issue rather than Databricks or DAB. There is a Git configuration parameter that decides the upper/lower case of the file names deployed. Please refer here: https://github.com/desktop/desktop/issues/2672#issuecomme...
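The parameter in question is git's `core.ignorecase`. A minimal sketch (illustrative file names, run in a throwaway repository) showing that with it disabled, git records the mixed-case wheel name exactly as committed:

```python
import os
import subprocess
import tempfile

# Throwaway repository for the demonstration.
repo = tempfile.mkdtemp()

def git(*args: str) -> str:
    """Run a git command inside the demo repository and return stdout."""
    return subprocess.run(["git", *args], cwd=repo, check=True,
                          capture_output=True, text=True).stdout

git("init", "-q")
git("config", "core.ignorecase", "false")  # track case-only renames
open(os.path.join(repo, "pyTestReportv2.whl"), "w").close()
git("add", "pyTestReportv2.whl")
git("-c", "user.email=ci@example.com", "-c", "user.name=ci",
    "commit", "-qm", "add wheel")
print(git("ls-files"))  # prints the mixed-case name as committed
```

With `core.ignorecase` left at `true` (the default on case-insensitive filesystems), a rename that only changes letter case can go unnoticed, which matches the lowercase-wheel symptom.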
- 1296 Views
- 2 replies
- 0 kudos
I can't create a compute resource beyond "SQL Warehouse", "Vector Search" and "Apps"?
None of the LLMs even understand why I can't create a compute resource. I was using Community (now Free Edition) until yesterday, when it became apparent that I needed the paid version, so I upgraded. I've even got my AWS account connected, which was ...
I have a similar issue. How can I upgrade?
- 3883 Views
- 2 replies
- 0 kudos
Cluster Auto Termination Best Practices
Are there any recommended practices to set cluster auto termination for cost optimization?
Hello! We use a Databricks Job to adjust the cluster’s auto-termination setting based on the day and time. During weekdays from 6 AM to 6 PM, we set auto_termination_minutes to None, which keeps the cluster running. Outside those hours, and on weekend...
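The schedule logic described in the reply can be sketched as a small helper; the 30-minute off-hours timeout, and the job that would apply the value through the Clusters API, are assumptions rather than details from the original post:

```python
from datetime import datetime
from typing import Optional

def autotermination_minutes(now: datetime) -> Optional[int]:
    """Weekdays 6 AM-6 PM: return None so the cluster never
    auto-terminates; otherwise return a short idle timeout.
    The 30-minute value is illustrative."""
    is_weekday = now.weekday() < 5          # Mon=0 .. Fri=4
    in_business_hours = 6 <= now.hour < 18  # 6 AM to 6 PM
    if is_weekday and in_business_hours:
        return None
    return 30

# A scheduled Databricks Job would pass this value to the
# Clusters "edit" API for the target cluster.
print(autotermination_minutes(datetime(2024, 1, 3, 10)))  # Wednesday 10 AM -> None
```

This keeps the policy in one testable function, separate from the API call that applies it.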
- 216 Views
- 0 replies
- 2 kudos
Webinar on Unity Catalog plus giveaway
Building Trust in Data: Why Governance Matters More Than Ever. Data has become the heartbeat of every organisation, driving decisions, shaping strategies, and fuelling innovation. Yet, even in data-rich companies, there’s a quiet problem that every lead...
- 236 Views
- 3 replies
- 0 kudos
Migration from Oracle DB in Linux server to Azure Databricks
Can anyone provide the steps for this migration?
Can anyone share high-level steps to migrate from Oracle Container Database 19c to Azure Databricks?
- 541 Views
- 1 reply
- 0 kudos
Issue with UCX Assessment Installation via Automation Script - Schema Not Found Error
Hello, I'm encountering an issue while installing UCX Assessment via an automation script in Databricks. When running the script, I get the following error: 13:38:06 WARNING [databricks.labs.ucx.hive_metastore.tables] {listing_tables_0} failed-table-c...
The error occurs because the automation script explicitly sets WORKSPACE_GROUPS="<ALL>" and DATABASES="<ALL>", which the UCX installer interprets literally as a schema called "ALL", instead of using the special meaning that the manual prompt does when...
- 2782 Views
- 1 reply
- 0 kudos
Usability bug in SPARKSQL
Hi Databricks & Team, Spark cluster: 16.3. Being a Databricks partner I am unable to raise a support ticket, hence I am posting this here. PySpark is good at rendering multiple results in a single cell; refer to the screenshot below (Screenshot 1). However, SPAR...
Databricks notebooks currently support multiple outputs per cell in Python (pyspark) but do not provide the same behavior for SQL cells. When running several SQL statements in a single notebook cell, Databricks will only render the output from the la...
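A common workaround is to run the statements from a Python cell and render each result separately. A minimal sketch (the naive splitting assumes no semicolons inside string literals or comments):

```python
def split_statements(sql_text: str) -> list[str]:
    """Naively split a SQL script on semicolons. Ignores semicolons
    embedded in strings or comments, so this is illustrative only."""
    return [s.strip() for s in sql_text.split(";") if s.strip()]

stmts = split_statements("SELECT 1 AS a; SELECT 2 AS b;")
print(stmts)  # ['SELECT 1 AS a', 'SELECT 2 AS b']

# In a Databricks notebook, each result can then be rendered:
# for s in stmts:
#     display(spark.sql(s))
```

Each `display` call produces its own output table, which sidesteps the one-result-per-SQL-cell limitation.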
- 663 Views
- 2 replies
- 0 kudos
Workflow entry point not working on runtime 16.4-LTS
Hello all, I'm developing Python code that is packaged as a wheel and installed inside a Docker image. However, since this program requires numpy >= 2.0, I'm forced to use runtime 16.4-LTS. When I try to run it as a workflow on Databricks I'm e...
Databricks Runtime 16.4-LTS introduced some changes that affect workflow configuration, environment management, and, notably, support for Python third-party libraries like NumPy. The main issue you’re encountering with NumPy >= 2.0 in Databricks work...
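As a point of reference, a wheel entry-point function that can also be invoked directly with `python -m` (module and function names below are hypothetical) gives you a fallback if the job's entry-point lookup misbehaves on a newer runtime:

```python
# Hypothetical contents of my_wheel/main.py; the [project.scripts]
# entry in pyproject.toml would reference "my_wheel.main:run".
def run() -> int:
    """Program entry point; returns a process exit code."""
    print("hello from my_wheel")
    return 0

if __name__ == "__main__":
    raise SystemExit(run())
```

With this shape, a Python-script task running `python -m my_wheel.main` behaves the same as the wheel task's declared entry point.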
- 4300 Views
- 7 replies
- 0 kudos
Databricks apps - Volumes and Workspace - FileNotFound issues
I have a Databricks App I need to integrate with volumes using local Python os functions. I've set up a simple test: def __init__(self, config: ObjectStoreConfig): self.config = config # Ensure our required paths are created ...
I am also running into this issue. @Alberto_Umana can you advise on what to do here? It seems like a lot of people are having this issue.
- 282 Views
- 3 replies
- 3 kudos
Resolved! How to update Databricks official documentation?
Databricks provides excellent documentation, but in some rare cases I found it could be improved. For example, in the documentation for Data Profiling requirements (https://docs.databricks.com/aws/en/data-quality-monitoring/data-profiling/#requ...
@Charuvil, you can send the feedback to the Databricks team via email link below: Send us feedback
- 2997 Views
- 5 replies
- 2 kudos
Resolved! CLUSTER BY AUTO with PySpark
I can find documentation to enable automatic liquid clustering with SQL code: CLUSTER BY AUTO. But how do I do this with PySpark? I know I can do it with spark.sql("ALTER TABLE CLUSTER BY AUTO") but ideally I want to pass it as an .option(). Thanks in...
This is supported now in DBR 16.4+ for both the DataFrameWriterV1 and DataFrameWriterV2 APIs, and also for the DLT and Data Streaming APIs. More details are here: https://docs.databricks.com/aws/en/delta/clustering. Basically, using the option `.option("cl...
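Per the Databricks clustering docs, the writer option is `clusterByAuto`. A sketch of its use (assumes DBR 16.4+ and a Delta target; the table name is hypothetical, and the write itself only runs inside a Databricks session):

```python
def write_with_auto_clustering(df, table_name: str) -> None:
    """Write a DataFrame as a Delta table with automatic liquid
    clustering enabled (DataFrameWriterV1 style)."""
    (df.write
       .format("delta")
       .option("clusterByAuto", "true")  # enable CLUSTER BY AUTO
       .saveAsTable(table_name))

# Usage inside a Databricks notebook (not runnable locally):
# write_with_auto_clustering(df, "main.default.my_table")
```

This achieves the same effect as `ALTER TABLE ... CLUSTER BY AUTO` without a separate `spark.sql` call.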
- 313 Views
- 3 replies
- 3 kudos
where do I find the AI/BI dashboard?
I only see Dashboards under the SQL header in the left menu, but I expected a Data header. I get the feeling that under SQL, the dashboard lacks functionality. Best regards, Stefaan
@Neirynck I'd highly recommend checking out Databricks' YouTube channel or Demo Center when you're exploring parts of the product; I'll link them below. As for your question, here's a video on AI/BI that actually sold Databricks to me: https...
- 11008 Views
- 16 replies
- 6 kudos
Unable to Login - Account Verification Loop
I'm having trouble logging in to my Databricks account at databricks.com. Here's what happens: I enter my email address and password. I receive an account verification code via email. I enter the verification code on the login page. Instead of logging me...
Same problem here. I tried 10 different resend codes and nothing works. It keeps telling me the code is invalid.
- 110 Views
- 0 replies
- 1 kudos
Where should we host the next Meetup in the US?
Hi all, I collaborate with a Databricks partner, and we've started organizing in-person Meetups in the States. We want to arrange another one, possibly in December or late January. Where do you think most people would be interested in joining? It's fr...