- 760 Views
- 6 replies
- 2 kudos
Execute Immediate not working to fetch table name based on year
I am trying to pass the year as an argument so it can be used in the table name. For example, there are tables like Claims_total_2021, Claims_total_2022, and so on up to 2025. Now I want to pass the year as a parameter, say 2024, and it must fetch the table Claims...
Hi, I believe the solution shared by Martison would fix this issue. In Databricks SQL, when using EXECUTE IMMEDIATE, the SQL string must be a single variable or single string literal, not an inline expression using string concatenation ('...' || clai...
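For illustration, a minimal sketch of that pattern run from a notebook via spark.sql (assumes a runtime where SQL session variables and EXECUTE IMMEDIATE are available; the Claims_total_ prefix comes from the question, the variable names are made up):

```python
# Hedged sketch: build the whole statement into a session variable first,
# then hand EXECUTE IMMEDIATE that single variable (not an inline '...' || year).
claim_year = "2024"  # the value the asker wants to pass as a parameter

spark.sql("DECLARE OR REPLACE VARIABLE stmt STRING")
spark.sql(f"SET VARIABLE stmt = 'SELECT * FROM Claims_total_' || '{claim_year}'")

df = spark.sql("EXECUTE IMMEDIATE stmt")  # returns the result of the built query
df.show()
```

The same three statements can also be run directly in the SQL editor without the Python wrapper.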
- 850 Views
- 6 replies
- 20 kudos
Resolved! Databricks Community Innovators - Program
Hi, I'd like to know what's happened with the Databricks Community innovators program? https://community.databricks.com/t5/databricks-community-innovators/bg-p/databricks-community-news-members Is this still alive? I've applied and emailed: databrick...
Hi Mandy! Thanks for the introduction and update. I'm really looking forward to being a part of everything moving forward. Just echoing what @TheOC mentioned above, I'm happy to help/assist in whatever way I can. Absolutely buzzing based off the info ...
- 1423 Views
- 11 replies
- 43 kudos
Databricks Monthly Spotlight - Discontinued?
Hi, I'm curious what's happened with the Databricks Monthly Spotlight? https://community.databricks.com/t5/databricks-community-innovators/bg-p/databricks-community-news-members I can see there hasn't been anyone in the spotlight since April 2025. Has th...
Hi all, I just wanted to jump in this thread to say how motivating this kind of conversation is. It's great that we've got passion from users and Databricks employees alike to ensure integrity in the Community - it's really reassuring as someone start...
- 431 Views
- 3 replies
- 3 kudos
Python
Guys, I am not able to access PySpark in the Free Edition. Can somebody help me?
@szymon_dybczak it may be that @SinchBhat has the notebook set to SQL as the default language? They could also, by mistake, be using the SQL editor. @SinchBhat, could you provide some screenshots of what your UI looks like, please? It'll help to resolve the ...
- 391 Views
- 3 replies
- 2 kudos
Resolved! Not able to login Databricks
I am trying to log in to Databricks, but when I enter the OTP it says something went wrong.
Hi @joypillai, do you want to log in to the Databricks Free Edition? Could you provide a screenshot?
- 383 Views
- 1 replies
- 1 kudos
Best practices for reducing noise in data quality monitoring?
Hi all, We've been improving our data quality monitoring for several pipelines, but we keep running into the same problem: too many alerts, most of which aren't actionable. Over time, it becomes harder to trust them. Right now, we're doing: Freshness c...
Hi @Sifflet, This is genuinely complex—and while you mentioned alerting and monitoring, in my experience the biggest lever to reduce noise is to treat problems at the source (i.e., in the transformation layer). Make the transformations enforce the co...
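For example, a hedged sketch of pushing the checks into the transformation itself with DLT expectations (the table, column, and rule names are assumptions, not from the thread):

```python
import dlt

# Invalid rows are dropped and counted in the pipeline's data quality metrics,
# so they surface as one expectation trend instead of a stream of separate alerts.
@dlt.table(name="orders_clean")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")
@dlt.expect_or_drop("non_negative_amount", "amount >= 0")
@dlt.expect("recent_event", "event_ts >= current_date() - INTERVAL 7 DAYS")  # warn only
def orders_clean():
    return spark.readStream.table("orders_bronze")
```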
- 792 Views
- 1 replies
- 0 kudos
DLT Pipeline
I am working on a DLT pipeline and have one question. As explained on this page (https://docs.databricks.com/aws/en/dlt/tutorial-pipelines?language=Python), we will end up creating 3 tables, i.e. customer_cdc_bronze, customer_cdc_clean and customer. Thi...
With this optimized approach, I would suggest creating a view for the Clean table:
1. Bronze Table: Raw CDC data (full storage)
2. Clean View: No physical storage - computed on-demand
3. Silver Table: Final processed data with SCD2 history
Result: ~67% storage...
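For illustration, a sketch of that layout in DLT Python (the three object names follow the tutorial in the question; the source path and columns are assumptions):

```python
import dlt
from pyspark.sql import functions as F

@dlt.table(name="customer_cdc_bronze", comment="Raw CDC feed, stored as-is")
def customer_cdc_bronze():
    return (spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "json")
            .load("/Volumes/main/default/raw/customers_cdc"))  # assumed source path

@dlt.view(name="customer_cdc_clean", comment="No physical storage, computed on demand")
def customer_cdc_clean():
    return dlt.read_stream("customer_cdc_bronze").where(F.col("id").isNotNull())

dlt.create_streaming_table("customer")

dlt.apply_changes(
    target="customer",
    source="customer_cdc_clean",
    keys=["id"],
    sequence_by=F.col("operation_date"),
    stored_as_scd_type=2,  # keep full SCD2 history in the final table
)
```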
- 11125 Views
- 7 replies
- 0 kudos
Issue with Visual Studio Code Databricks extension
Hello, I successfully installed the extension and connected it to my Databricks account. But when I try to select the repo (which already exists under Repos in my Databricks account) for syncing, I don't see it. I use Azure DevOps (Git repo) as s...
I installed VS Code 2 (a strange name and version from MS) and then I could see all the options. Thanks - posting this in case someone else is also facing this issue.
- 310 Views
- 1 replies
- 0 kudos
Prakash Hinduja (Geneva): How to fix SQL errors like INVALID_IDENTIFIER when running workflows?
I am Prakash Hinduja, a global financial strategist with deep roots in India and a current base in Geneva, Switzerland. I have been running into SQL errors like INVALID_IDENTIFIER when executing workflows in Databricks. I've checked for typos a...
Hello @prakashhinduja2, are you getting an error with this code? SQLSTATE: 42602. Does your code have any unquoted identifiers?
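For reference, a small hedged example of the quoting rule (catalog, table, and column names here are hypothetical): identifiers that start with a digit or contain special characters must be backtick-quoted, otherwise the parser raises INVALID_IDENTIFIER (SQLSTATE 42602).

```python
# All object names below are placeholders, for illustration only.
spark.sql("""
    SELECT t.`claim-id`, t.amount
    FROM main.finance.`2024_claims` AS t
    WHERE t.`claim-id` IS NOT NULL
""").show()
```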
- 858 Views
- 3 replies
- 1 kudos
Resolved! Not able to create Databricks Compute in Central US
I created my Databricks account in Central US and I am not able to create compute, so I need help creating compute.
If everything looks fine, opening a support ticket with Databricks or your cloud provider (Azure/AWS) would be the quickest way to resolve this and get your compute set up.
- 1248 Views
- 2 replies
- 2 kudos
Resolved! Databricks Claude Access Error - Permission Denied
I'm using databricks-claude-sonnet-3.7 through Azure Databricks, and it was working until yesterday, but when I accessed it now, I got this error: Error: 403 {"error_code":"PERMISSION_DENIED","message":"PERMISSION_DENIED: Endpoint databricks-claude-s...
@szymon_dybczak Thank you! I'll wait a couple days and try again. Much appreciated!
- 1713 Views
- 2 replies
- 1 kudos
Pandas API on Spark creates huge query plans
Hello, I have a piece of code written in PySpark and Pandas API on Spark. On comparing the query plans, I see the Pandas API on Spark creates huge query plans whereas the PySpark plan is a tiny one. Furthermore, with Pandas API on Spark, we see a lot of incon...
@FRB1984 could you provide some examples? I'm curious. My first thoughts would be around the shuffling. Check this out: https://spark.apache.org/docs/3.5.4/api/python/user_guide/pandas_on_spark/best_practices.html . There's an argument to be made abo...
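As one concrete illustration of those best practices (table and column names are invented), inspecting the plan and truncating the lineage with a local checkpoint can keep iterative pandas-on-Spark transformations from growing the plan indefinitely:

```python
import pyspark.pandas as ps

psdf = ps.read_table("main.default.claims")   # assumed Unity Catalog table
psdf = psdf[psdf["amount"] > 0]
psdf["amount_usd"] = psdf["amount"] * 1.1

psdf.spark.explain()                  # show the current (possibly huge) query plan
psdf = psdf.spark.local_checkpoint()  # materialize and cut the lineage before further steps
```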
- 630 Views
- 1 replies
- 1 kudos
Resolved! Table Counts
Hello, My company loads a lot of tables into a Databricks schema. I would like to build a dashboard on what has been loaded, but SQL commands like select * from information_schema do not work. Instead we have SHOW TABLES {FROM} LIKE {}; and that fails...
Just trying to rule out some of the lower-hanging stuff. When you run your SQL statements, i.e. select * from information_schema:
- Are you using the correct namespace syntax, i.e. {catalog_here}.information_schema?
- Are you using Unity Catalog?
Example of th...
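For example (with a placeholder catalog name), a fully qualified information_schema query that could back a table-count dashboard under Unity Catalog:

```python
# Under Unity Catalog, information_schema lives inside each catalog,
# so the catalog must be part of the namespace; "my_catalog" is a placeholder.
spark.sql("""
    SELECT table_schema, COUNT(*) AS table_count
    FROM my_catalog.information_schema.tables
    GROUP BY table_schema
    ORDER BY table_count DESC
""").show()
```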
- 704 Views
- 3 replies
- 0 kudos
What permissions are needed to fix [INSUFFICIENT_PERMISSIONS] User does not have permission to SELECT
Hi, I am getting the following error in Databricks when running a SELECT query: [INSUFFICIENT_PERMISSIONS] Insufficient privileges: User does not have permission SELECT on any file. SQLSTATE: 42501
Context:
- Environment: Unity Catalog enabled
- I am tryin...
If you’re getting an “Insufficient Permissions” error in Databricks, it usually means your user is missing one or more privileges required for the action you’re trying to perform. In Unity Catalog, for example, querying a view in dedicated compute mo...
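For illustration, a hedged sketch of the grants a SELECT on a Unity Catalog table typically needs (principal and object names are placeholders; run as the object owner or a metastore admin). Note that the "on any file" wording in the error often points at a query touching a path or a legacy hive_metastore table rather than a UC table.

```python
# Placeholders throughout; SELECT on a Unity Catalog table needs the full
# three-level chain of privileges: catalog -> schema -> table.
for stmt in [
    "GRANT USE CATALOG ON CATALOG main TO `data_readers`",
    "GRANT USE SCHEMA ON SCHEMA main.finance TO `data_readers`",
    "GRANT SELECT ON TABLE main.finance.claims TO `data_readers`",
]:
    spark.sql(stmt)
```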
- 1031 Views
- 4 replies
- 2 kudos
Resolved! Adding new columns to a table causes previous jobs to fail
Hello community! I'm new to Databricks and currently working on a project structured in Bronze / Silver / Gold layers using Delta Lake and Change Data Feed. I recently added 3 new columns to a table and initially applied these changes via PySpark SQ...
Hello @leticialima__, good day. Can you please share the error observed in the driver log? Is it [Errno 13] Permission denied, or No such file or directory? Please let me know the error in the driver log. Thank you.
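While waiting on those logs, a hedged sketch of the usual schema-evolution pattern for this scenario (all table and column names are invented, not from the thread): add the new columns explicitly, then let downstream appends merge the evolved schema instead of failing on the mismatch.

```python
# Placeholders throughout, for illustration only.
spark.sql("""
    ALTER TABLE main.silver.claims
    ADD COLUMNS (channel STRING, region STRING, priority INT)
""")

df_updates = spark.read.table("main.bronze.claims_changes")

(df_updates.write
    .format("delta")
    .mode("append")
    .option("mergeSchema", "true")  # evolve the target schema if the source adds columns
    .saveAsTable("main.silver.claims"))
```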