- 648 Views
- 1 reply
- 0 kudos
Requirements for Managed Iceberg tables with Unity Catalog
Does Databricks support creating native Apache Iceberg managed tables in Unity Catalog, or is this possible only in private preview? What are the requirements?
Hello @zent! Databricks now fully supports creating Apache Iceberg managed tables in Unity Catalog, and this capability is available in Public Preview (not just private preview). These managed Iceberg tables can be read and written by Databricks and ...
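If the preview is enabled in your workspace, creating a managed Iceberg table is a plain CREATE TABLE with the Iceberg format. A minimal sketch, assuming a Unity Catalog catalog and schema you can write to (the catalog, schema, table, and column names below are illustrative, not from the thread):

```python
# Minimal sketch: create a Unity Catalog managed table in Apache Iceberg format.
# Assumes the managed Iceberg Public Preview is enabled and that `main.default`
# is a catalog/schema you own; table and column names are illustrative.
spark.sql("""
    CREATE TABLE main.default.orders_iceberg (
        order_id BIGINT,
        amount   DECIMAL(10, 2)
    )
    USING ICEBERG
""")
```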
- 2519 Views
- 2 replies
- 1 kudos
Resolved! New Regional Group Request
Hello! How may I request and/or create a new Regional Group for the DMV Area (DC, Maryland, Virginia)? Thank you, Anton. @DB_Paul @Sujitha
Is there a group you already created?
- 1324 Views
- 3 replies
- 3 kudos
Resolved! How to be a part of Databricks Groups
Hello, I am part of a Databricks Community Crew LATAM, where we have reached 300 connected people and have run 3 events, one per month. We want to be part of Databricks Groups but we don't know how to do that. If somebody can help me I will a...
Hi Ana, Thanks for reaching out! I won’t be attending DAIS this time, but we do have a Databricks Community booth set up near the Expo Hall. My colleague @Sujitha will be there. Do stop by to say hi and learn about all the exciting things we have go...
- 157 Views
- 0 replies
- 0 kudos
How is your experience with dbx in 2025?
- 2535 Views
- 2 replies
- 0 kudos
How to "Python versions in the Spark Connect client and server are different. " in UDF
I've read all the relevant articles but none has a solution I could understand. Sorry, I'm new to this. I have a simple UDF to demonstrate the problem: df = spark.createDataFrame([(1, 1.0, 'a'), (1, 2.0, 'b'), (2, 3.0, 'c'), (2, 5.0, 'd'), (2, 10.0, 'e')]...
Hi @Dimitry, the error you're seeing indicates that the Python version in your notebook (3.11) doesn't match the version used by Databricks Serverless, which is typically Python 3.12. Since Serverless environments use a fixed Python version, this mis...
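A quick way to confirm the mismatch is to compare the interpreter on the client with the one the UDF executes under. A minimal diagnostic sketch, assuming an active Spark Connect / Databricks Connect session bound to `spark` (the helper UDF name is illustrative):

```python
import sys

from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

# Version of the local (client) interpreter driving the Spark Connect session.
print("client Python:", sys.version.split()[0])

@udf(returnType=StringType())
def server_python(_):
    # Runs on the server side, so this reports the interpreter the UDF uses there.
    import sys
    return sys.version.split()[0]

# If client and server match, this prints the server version; if they differ,
# the call fails with the same version-mismatch error, confirming the cause.
spark.range(1).select(server_python("id").alias("server_python")).show()
```

The usual fix is to recreate the local virtual environment with the same minor Python version the serverless runtime reports (3.12 in the reply above).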
- 705 Views
- 1 reply
- 1 kudos
Databricks Dashboard run from Job issue
Hello, I am trying to trigger a Databricks dashboard via a workflow task. 1. When I deploy the job triggering the dashboard task via the local "Deploy bundle" command, deployment is successful. 2. When I try to deploy to a different environment via CI/CD while ...
Hi @anilsampson, the error means your dashboard_task is not properly nested under the tasks section. It should be indented like this:
tasks:
  - task_key: dashboard_task
    dashboard_task:
      dashboard_id: ${resources.dashboards.nyc_taxi_trip_analysis.id}
      warehouse_id: ${var.warehouse_...
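For context, here is a sketch of where that block sits in a bundle's job definition. The dashboard resource reference follows the snippet above; the job name and `var.warehouse_id` are hypothetical placeholders standing in for the truncated value:

```yaml
# Hypothetical databricks.yml fragment: the dashboard_task block must be nested
# under an item of the job's `tasks` list.
resources:
  jobs:
    dashboard_refresh_job:          # illustrative job name
      name: dashboard_refresh_job
      tasks:
        - task_key: dashboard_task
          dashboard_task:
            dashboard_id: ${resources.dashboards.nyc_taxi_trip_analysis.id}
            warehouse_id: ${var.warehouse_id}   # hypothetical bundle variable
```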
- 3999 Views
- 6 replies
- 2 kudos
In Databricks deployment, .py files are getting converted to notebooks
A critical issue has arisen that is impacting our deployment planning for our client. We have encountered a challenge with our Azure CI/CD pipeline integration, specifically concerning the deployment of Python files (.py). Despite our best efforts, w...
Another option is Databricks Asset Bundles.
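If you go the Asset Bundles route, a minimal databricks.yml sketch looks like the following (bundle name, paths, and target are illustrative, not from the thread). Plain .py modules synced this way are uploaded as workspace files, so they are not imported as notebooks unless they carry the "# Databricks notebook source" header:

```yaml
# Hypothetical minimal bundle config: files matched under `sync` are uploaded
# as-is to the workspace for each target.
bundle:
  name: my_project             # illustrative bundle name

sync:
  include:
    - src/**                   # plain .py modules under src/ stay .py files

targets:
  dev:
    mode: development
    workspace:
      host: https://<your-workspace-url>   # placeholder
```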
- 1213 Views
- 1 reply
- 2 kudos
Resolved! Cannot run merge statement in the notebook
Hi all, I'm trialing Databricks for running complex Python integration scripts. There will be different data sources (MS SQL, CSV files, etc.) that I need to push to a target system via GraphQL. So I selected Databricks over MS Fabric as it can handle comple...
Hi @Dimitry, the issue you're seeing is due to delta.enableRowTracking = true. This feature adds hidden _metadata columns, which serverless compute doesn't support; that's why the MERGE fails there. Try this: you can disable row tracking with ALTER...
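A minimal sketch of that statement, run before retrying the MERGE (the three-level table name below is illustrative):

```python
# Hypothetical table name; disables Delta row tracking so the table no longer
# carries the hidden row-tracking metadata mentioned in the reply.
spark.sql("""
    ALTER TABLE main.default.my_target_table
    SET TBLPROPERTIES ('delta.enableRowTracking' = 'false')
""")
```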
- 1047 Views
- 2 replies
- 0 kudos
feature store
I need to build a feature store for the data science team that will return one big DataFrame after one-hot encoding almost every dimension, plus joins and group-bys. Should I create one feature store for the final output that contains all the relevant data, or create featur...
Here are some things to consider: The best practice for designing a feature store in your scenario depends on balancing scalability, maintainability, and the dynamic nature of some dimensions like doctor names. Here's an outlined recommendation bas...
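As one concrete illustration of the per-dimension option, here is a hedged sketch using the Unity Catalog feature engineering client; the package usage is standard, but the table, column, and dimension names are assumptions, not something from the thread:

```python
from databricks.feature_engineering import FeatureEngineeringClient

# Hypothetical per-dimension feature table (doctor dimension), one row per doctor_id.
doctor_features_df = spark.createDataFrame(
    [(1, 1, 0, 12), (2, 0, 1, 3)],
    ["doctor_id", "specialty_cardiology", "specialty_oncology", "visits_last_90d"],
)

fe = FeatureEngineeringClient()
fe.create_table(
    name="main.features.doctor_features",   # illustrative UC table name
    primary_keys=["doctor_id"],
    df=doctor_features_df,
    description="One-hot encoded and aggregated doctor-dimension features",
)
```

Keeping one table per dimension like this lets training jobs join only the features they need instead of materializing one wide table up front.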
- 1418 Views
- 2 replies
- 0 kudos
Databricks DLT ADLS Access issue
We have a DLT pipeline configured with an SPN inside the notebook, which was working fine. After the credentials expired, we created a new one and updated it in the notebook. Now the pipeline is not able to read from ADLS. The SPN and my user ID have co...
Hi @VigneshJaisanka, the issue likely comes from a permissions or configuration mismatch. Here are a few things worth checking: make sure the SPN is set as the pipeline owner and has the necessary permissions on the ADLS resource. If you're using Unity ...
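The question mentions the SPN being configured inside the notebook. If that is done through Spark configuration for ABFS, a minimal sketch of the rotated-credential setup looks like this; the storage account, tenant ID, and secret scope/key names are illustrative, and reading the new secret from a secret scope avoids hard-coding it. In a DLT pipeline these settings are often placed in the pipeline configuration rather than the notebook:

```python
# Hypothetical names throughout; standard ABFS OAuth (client credentials) settings.
storage_account = "mystorageaccount"
tenant_id = "<tenant-id>"

client_id = dbutils.secrets.get(scope="adls", key="spn-client-id")
client_secret = dbutils.secrets.get(scope="adls", key="spn-client-secret")  # rotated secret

base = f"{storage_account}.dfs.core.windows.net"
spark.conf.set(f"fs.azure.account.auth.type.{base}", "OAuth")
spark.conf.set(
    f"fs.azure.account.oauth.provider.type.{base}",
    "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
)
spark.conf.set(f"fs.azure.account.oauth2.client.id.{base}", client_id)
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{base}", client_secret)
spark.conf.set(
    f"fs.azure.account.oauth2.client.endpoint.{base}",
    f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
)
```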
- 475 Views
- 1 reply
- 0 kudos
Delta Sharing & UC: Understanding the Initial Empty Predicate Query
We're testing our Delta Sharing server with Unity Catalog (UC) and noticed a behavior where a simple query like SELECT COUNT(1) FROM table_name WHERE col1 = 'value' triggers two /query requests to our server. The initial request arrives with empty pre...
The initial /query request during a Delta Sharing operation with Unity Catalog serves a critical purpose in the query lifecycle. It is intended to retrieve the schema and basic metadata of the table, which helps in query planning and optimization. Th...
- 1247 Views
- 2 replies
- 0 kudos
Migration of Power BI reports from Synapse to Databricks SQL (DBSQL)
We have 250 Power BI reports built on top of Azure Synapse, and we are now migrating from Azure Synapse to Databricks SQL (DBSQL). How should we plan the cutover and what is the strategy for Power BI? I'm just seeking high-level points we have to take care of for planning. Any techie ...
While your account Solution Architect (SA) will be able to guide you, you can also check what peers did here: https://community.databricks.com/t5/warehousing-analytics/migrate-azure-synapse-analytics-data-to-databricks/td-p/90663 and here: http...
- 9849 Views
- 15 replies
- 6 kudos
Unable to Log In - Account Verification Loop
I'm having trouble logging in to my Databricks account at databricks.com. Here's what happens: I enter my email address and password. I receive an account verification code via email. I enter the verification code on the login page. Instead of logging me...
Same problem. Has anyone found a solution yet?
- 2236 Views
- 3 replies
- 1 kudos
Resolved! Delta Live Table Pipeline
I get the following error message when trying to create a Delta Live Table pipeline: com.databricks.pipelines.common.errors.deployment.DeploymentException: Failed to launch pipeline cluster 1207-112912-8e84v9h5: Encountered Quota Exhaustion issue in ...
- 1604 Views
- 1 reply
- 0 kudos
Databricks User Group Meetups
Are there any Databricks User Group Meetups in Charlotte?
I am interested in participating in this group if it is available.