- 623 Views
- 2 replies
- 0 kudos
DLT Pipeline Design
I am new to DLT and trying to understand the process. My bronze table will receive incremental data from SAP in real time. In my bronze table, we are not maintaining any history and any data older than 2 weeks will be deleted. This data from bronze w...
The scenario you mentioned can be efficiently managed using a Delta Live Table (DLT) combined with a CDC (Change Data Capture) flow. From what I understand, the bronze table is always in append mode. Check out the link below for more details. Hope th...
- 514 Views
- 6 replies
- 2 kudos
Execute Immediate not working to fetch table name based on year
I am trying to pass the year as an argument so it can be used in the table name. Ex: there are tables like Claims_total_2021, Claims_total_2022 and so on till 2025. Now I want to pass the year as a parameter, say 2024, and it must fetch the table Claims...
Hi, I believe the solution shared by Martison would fix this issue. In Databricks SQL, when using EXECUTE IMMEDIATE, the SQL string must be a single variable or single string literal, not an inline expression using string concatenation ('...' || clai...
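To illustrate the point in the reply above, a minimal Python sketch of the same idea: build the whole statement as one string variable first, then execute that single variable, rather than concatenating inline. The helper name and the year-range check are hypothetical; the `Claims_total_` prefix comes from the question.

```python
def build_claims_query(claim_year: int) -> str:
    """Build the full statement as one string before executing it."""
    # Validate the parameter so only the expected table names can be produced.
    if claim_year not in range(2021, 2026):
        raise ValueError(f"unexpected year: {claim_year}")
    return f"SELECT * FROM Claims_total_{claim_year}"

# Assign the statement to a single variable first, then pass that one
# variable to EXECUTE IMMEDIATE (or spark.sql) -- not an inline
# '...' || claim_year expression.
stmt = build_claims_query(2024)
```

The same shape works in Databricks SQL: declare a STRING variable, SET it to the concatenated text, and pass only that variable to EXECUTE IMMEDIATE.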
- 605 Views
- 6 replies
- 20 kudos
Resolved! Databricks Community Innovators - Program
Hi, I'd like to know what's happened with the Databricks Community Innovators program? https://community.databricks.com/t5/databricks-community-innovators/bg-p/databricks-community-news-members Is this still alive? I've applied and emailed: databrick...
Hi Mandy! Thanks for the introduction and update. I'm really looking forward to being a part of everything moving forward. Just echoing what @TheOC mentioned above, I'm happy to help/assist in whatever way I can. Absolutely buzzing based off the info ...
- 1104 Views
- 11 replies
- 43 kudos
Databricks Monthly Spotlight - Discontinued?
Hi, I'm curious what's happened with the Databricks Monthly Spotlight? https://community.databricks.com/t5/databricks-community-innovators/bg-p/databricks-community-news-members I can see there hasn't been anyone in the spotlight since April 2025. Has th...
Hi all, I just wanted to jump into this thread to say how motivating this kind of conversation is. It's great that we've got passion from users and Databricks employees alike to ensure integrity in the Community - it's really reassuring as someone start...
- 307 Views
- 3 replies
- 3 kudos
Python
Guys, I am not able to access PySpark in the Free Edition. Can somebody help me?
@szymon_dybczak it may be that @SinchBhat has the notebook set to SQL as the default language? They could also, by mistake, be using the SQL editor. @SinchBhat, could you provide some screenshots of what your UI looks like, please? It'll help to resolve the ...
- 421 Views
- 1 replies
- 0 kudos
Workflow entry point not working on runtime 16.4-LTS
Hello all, I'm developing Python code that is packaged as a wheel and installed inside a Docker image. However, since this program requires numpy >= 2.0, I'm forced to use runtime 16.4-LTS. When I try to run it as a workflow on Databricks I'm e...
- 244 Views
- 3 replies
- 2 kudos
Resolved! Not able to login Databricks
I am trying to log in to Databricks, but when I enter the OTP it says something went wrong.
Hi @joypillai, do you want to log in to Databricks Free Edition? Could you provide a screenshot?
- 274 Views
- 1 replies
- 1 kudos
Best practices for reducing noise in data quality monitoring?
Hi all, We've been improving our data quality monitoring for several pipelines, but we keep running into the same problem: too many alerts, most of which aren't actionable. Over time, it becomes harder to trust them. Right now, we're doing: Freshness c...
Hi @Sifflet, This is genuinely complex, and while you mentioned alerting and monitoring, in my experience the biggest lever for reducing noise is to treat problems at the source (i.e., in the transformation layer). Make the transformations enforce the co...
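Echoing the reply above about enforcing constraints in the transformation layer rather than alerting downstream, here is a minimal Python sketch. The function name, rule names, and quarantine shape are all hypothetical, not from the thread: the idea is simply that bad rows are partitioned out at transform time, so monitors only alert on what slips through.

```python
def enforce_rules(rows, rules):
    """Partition rows into (valid, quarantined) instead of raising alerts.

    `rules` maps a rule name to a predicate that returns True for good rows.
    Quarantined rows carry the names of the rules they failed, so one daily
    summary of the quarantine can replace many individual alerts.
    """
    valid, quarantined = [], []
    for row in rows:
        failed = [name for name, check in rules.items() if not check(row)]
        if failed:
            quarantined.append({**row, "_failed_rules": failed})
        else:
            valid.append(row)
    return valid, quarantined

rules = {
    "amount_non_negative": lambda r: r["amount"] >= 0,
    "has_customer_id": lambda r: r.get("customer_id") is not None,
}
good, bad = enforce_rules(
    [{"amount": 10, "customer_id": 1}, {"amount": -5, "customer_id": None}],
    rules,
)
```

The same pattern maps directly onto DLT expectations or SQL constraints if you are on Databricks; the plain-Python version above is just to show the shape.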
- 612 Views
- 1 replies
- 0 kudos
DLT Pipeline
I am working on a DLT pipeline and have one question. As explained on this page (https://docs.databricks.com/aws/en/dlt/tutorial-pipelines?language=Python), we will end up creating 3 tables, i.e. customer_cdc_bronze, customer_cdc_clean and customer. Thi...
With this optimized approach, I would suggest creating a view for the Clean table:
1. Bronze Table: raw CDC data (full storage)
2. Clean View: no physical storage - computed on demand
3. Silver Table: final processed data with SCD2 history
Result: ~67% storage...
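A sketch of that layout in DLT Python, for illustration only: this is declarative pipeline configuration that runs only inside a DLT pipeline, the table and view names follow the tutorial cited in the question, and the source path is a hypothetical placeholder.

```python
import dlt
from pyspark.sql.functions import col

@dlt.table(comment="Bronze: raw CDC data landed from the source (full storage)")
def customer_cdc_bronze():
    # Hypothetical Auto Loader source path -- replace with your own location.
    return (spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "json")
            .load("/Volumes/main/default/customer_cdc/"))

@dlt.view(comment="Clean: cleansing logic only, no physical storage")
def customer_cdc_clean():
    # A view is recomputed on demand, so the cleansing step stores nothing.
    return dlt.read_stream("customer_cdc_bronze").where(col("id").isNotNull())

# Silver: final table with SCD2 history applied from the clean view.
dlt.create_streaming_table("customer")

dlt.apply_changes(
    target="customer",
    source="customer_cdc_clean",
    keys=["id"],
    sequence_by=col("operation_date"),
    stored_as_scd_type=2,  # keep full change history in the silver table
)
```

The only change from the tutorial's three-table layout is that `customer_cdc_clean` is declared with `@dlt.view` instead of `@dlt.table`, which is what removes the intermediate storage.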
- 10302 Views
- 7 replies
- 0 kudos
Issue with Visual Studio Code Databricks extension
Hello, I successfully installed the extension and connected it to my Databricks account. But when I try to select the repo (which already exists under Repos in my Databricks account) for syncing, I don't see it. I use Azure DevOps (Git repo) as s...
I installed VS Code 2 (a strange name and version from MS), and then I could see all the options. Thanks, in case someone else is also facing this issue.
- 5092 Views
- 3 replies
- 3 kudos
Connect to Onelake using Service Principal, Unity Catalog and Databricks Access Connector
We are trying to connect Databricks to OneLake to read data from a Fabric workspace into Databricks, using a notebook. We also use Unity Catalog. We are able to read data from the workspace with a Service Principal like this: from pyspark.sql.types i...
One thing you can check is whether the Databricks Access Connector has Storage Blob Data Contributor access on the Data Lake.
- 241 Views
- 1 replies
- 0 kudos
Prakash Hinduja (Geneva) How to fix SQL errors like INVALID_IDENTIFIER when running workflows?
I am Prakash Hinduja, a global financial strategist with deep roots in India and a current base in Geneva, Switzerland. I have been running into SQL errors like INVALID_IDENTIFIER when executing workflows in Databricks. I've checked for typos a...
Hello @prakashhinduja2, are you getting this error with your code: SQLSTATE 42602? Does your code have any unquoted identifiers?
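Following up on the unquoted-identifiers question above: identifiers containing spaces, dashes, or other special characters must be backtick-quoted in Databricks SQL, or they raise INVALID_IDENTIFIER (SQLSTATE 42602). A small sketch of the usual fix; the helper name and the sample column are hypothetical.

```python
def quote_identifier(name: str) -> str:
    """Backtick-quote a SQL identifier, escaping any embedded backticks."""
    return "`" + name.replace("`", "``") + "`"

# A column name with a dash would otherwise raise INVALID_IDENTIFIER:
sql = f"SELECT {quote_identifier('order-id')} FROM sales"
```

Plain alphanumeric/underscore names don't need quoting, so this helper is only needed where identifiers come from user input or external schemas.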
- 677 Views
- 3 replies
- 1 kudos
Resolved! Not able to create Databricks Compute in Central US
I created my Databricks account in Central US and I am not able to create compute, so I need help creating compute.
If everything looks fine, opening a support ticket with Databricks or your cloud provider (Azure/AWS) would be the quickest way to resolve this and get your compute set up.
- 886 Views
- 2 replies
- 2 kudos
Resolved! Databricks Claude Access Error - Permission Denied
I'm using databricks-claude-sonnet-3.7 through Azure Databricks, and it was working until yesterday, but when I accessed it now, I got this error: Error: 403 {"error_code":"PERMISSION_DENIED","message":"PERMISSION_DENIED: Endpoint databricks-claude-s...
@szymon_dybczak Thank you! I'll wait a couple days and try again. Much appreciated!
- 1555 Views
- 2 replies
- 1 kudos
Pandas API on Spark creates huge query plans
Hello, I have a piece of code written in PySpark and the Pandas API on Spark. On comparing the query plans, I see the Pandas API on Spark creates huge query plans whereas the PySpark plan is a tiny one. Furthermore, with the Pandas API on Spark, we see a lot of incon...
@FRB1984 could you provide some examples? I'm curious. My first thoughts would be around the shuffling. Check this out: https://spark.apache.org/docs/3.5.4/api/python/user_guide/pandas_on_spark/best_practices.html . There's an argument to be made abo...