- 10048 Views
- 11 replies
- 10 kudos
Azure Synapse versus Databricks SQL endpoint performance comparison
Has anyone done this and can you share details? I have a sample SQL query which ran on a Large SQL endpoint in 8 min and on Synapse at the 1000 DWU setting in 1 hr. On a Small SQL endpoint it took 34 min. What's the equivalent SQL endpoint compute for Synapse at 1000 DWU? I know th...
- 12961 Views
- 9 replies
- 4 kudos
Resolved! How to get usage statistics from Databricks or SQL Databricks?
Hi, I am looking for a way to get usage statistics from Databricks (Data Science & Engineering and SQL persona). For example: I created a table. I want to know how many times a specific user queried that table. How many times a pipeline was triggered?...
- 4 kudos
You can use System Tables, now available in Unity Catalog metastore, to create the views you described. https://docs.databricks.com/en/admin/system-tables/index.html
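As a rough illustration of the system-tables approach above, the sketch below counts Databricks SQL activity per user from the audit log system table. It assumes the audit schema (system.access.audit) is enabled in your metastore; check column and action names against the docs linked in the reply.

```python
# Hypothetical sketch: count how often each user issued Databricks SQL commands,
# using the Unity Catalog audit log system table (assumes system.access.audit is enabled).
counts = spark.sql("""
    SELECT user_identity.email AS user_email,
           action_name,
           COUNT(*)            AS times_run
    FROM   system.access.audit
    WHERE  service_name = 'databrickssql'              -- assumption: DBSQL audit events
      AND  event_date  >= current_date() - INTERVAL 30 DAYS
    GROUP  BY user_identity.email, action_name
    ORDER  BY times_run DESC
""")
counts.show(truncate=False)
```

A similar query filtered on a table name in request_params could approximate "how many times user X queried table Y", at the cost of parsing the submitted statement text.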
- 11226 Views
- 4 replies
- 3 kudos
Connect to Databricks SQL Endpoint using a programming language
Hi, I would like to know whether there are options available to connect to a Databricks SQL endpoint using a programming language like Java/Scala/C#. I can see the JDBC URL, but would like to know whether it can be treated like any other JDBC conn...
- 3 kudos
I found a similar question on Stack Overflow: https://stackoverflow.com/questions/77477103/ow-to-properly-connect-to-azure-databricks-warehouse-from-c-sharp-net-using-jdb
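Besides JDBC/ODBC, there are native connectors. A minimal Python sketch with the databricks-sql-connector package (pip install databricks-sql-connector) might look like the following; the hostname, HTTP path and token are placeholders.

```python
# Minimal sketch using the databricks-sql-connector package (placeholder credentials).
from databricks import sql

with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",  # placeholder workspace host
    http_path="/sql/1.0/warehouses/abcdef1234567890",               # placeholder warehouse HTTP path
    access_token="dapiXXXXXXXXXXXXXXXX",                            # placeholder personal access token
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("SELECT current_catalog(), current_schema()")
        print(cursor.fetchall())
```

For Java/Scala/C#, the same endpoint is reachable through the Databricks JDBC/ODBC drivers using the JDBC URL shown in the warehouse's connection details.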
- 2208 Views
- 1 reply
- 2 kudos
How to prevent users from scheduling SQL queries?
We have noticed that users can schedule SQL queries, but we currently haven't found a way to list these scheduled queries (they do not show up in the Jobs pane). Therefore, we don't know when people have scheduled them. The only way is to look at t...
- 2 kudos
@Paulo Rijnberg: In Databricks, you can use the following approaches to prevent users from scheduling SQL queries and to receive notifications when such queries are scheduled: cluster-level permissions, the Jobs API, notification hooks, and audit logs and monitori...
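One way to at least inventory existing schedules (not from this thread, and the response fields are an assumption; check the REST API reference for your workspace) is to page through the legacy SQL Queries API and look for a non-empty schedule attribute:

```python
# Hypothetical sketch: list Databricks SQL queries via the legacy Queries API and
# flag the ones that appear to carry a schedule. Hostname, token and field names are assumptions.
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"   # placeholder workspace URL
TOKEN = "dapiXXXXXXXXXXXXXXXX"                                  # placeholder personal access token
headers = {"Authorization": f"Bearer {TOKEN}"}

page = 1
while True:
    resp = requests.get(
        f"{HOST}/api/2.0/preview/sql/queries",
        headers=headers,
        params={"page": page, "page_size": 100},
    )
    resp.raise_for_status()
    body = resp.json()
    for query in body.get("results", []):
        if query.get("schedule"):               # assumed field present on scheduled queries
            print(query.get("id"), query.get("name"), query.get("schedule"))
    if page * 100 >= body.get("count", 0):
        break
    page += 1
```

Running this periodically (or watching the audit logs) gives visibility even if outright prevention isn't configured.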
- 5436 Views
- 6 replies
- 5 kudos
Migration of Databricks Jobs, SQL dashboards and Alerts from a lower environment to a higher environment?
I want to move Databricks Jobs, SQL dashboards, Queries and Alerts from a lower environment to a higher environment. How can we move them?
- 5 kudos
Hi @Shubham Agagwral, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answ...
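While waiting for a full answer, a rough sketch of the usual approach: export each object's definition from the lower environment through the REST API and recreate it in the higher one. The example below only covers jobs via the Jobs 2.1 API; dashboards, queries and alerts have their own endpoints, and all hostnames, tokens and the job id are placeholders.

```python
# Hypothetical sketch: copy one job definition from a lower to a higher workspace
# using the Jobs 2.1 API. Hostnames, tokens and the job id are placeholders.
import requests

SRC_HOST, SRC_TOKEN = "https://adb-lower-env.azuredatabricks.net", "dapiSRCXXXXXXXX"
DST_HOST, DST_TOKEN = "https://adb-higher-env.azuredatabricks.net", "dapiDSTXXXXXXXX"
JOB_ID = 123   # placeholder job id in the lower environment

# Read the job settings from the source workspace.
job = requests.get(
    f"{SRC_HOST}/api/2.1/jobs/get",
    headers={"Authorization": f"Bearer {SRC_TOKEN}"},
    params={"job_id": JOB_ID},
).json()

# Recreate the job in the target workspace from its settings block.
created = requests.post(
    f"{DST_HOST}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {DST_TOKEN}"},
    json=job["settings"],
)
created.raise_for_status()
print("Created job:", created.json()["job_id"])
```

Tools like Terraform or the Databricks CLI wrap the same APIs if you prefer not to script the calls yourself.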
- 2013 Views
- 1 reply
- 3 kudos
How to change the compression codec of files written by a SQL warehouse?
Hi, I'm currently starting to use SQL Warehouse, and most of our lake uses a compression codec other than snappy. How can I set the SQL warehouse to use a codec like gzip or zstd on CREATE, INSERT, etc.? Tried this: set spark.sql.parquet.compre...
- 3 kudos
Hi @Alejandro Martinez, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question. Thanks.
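In case it helps while this is open: on a regular cluster (not a SQL warehouse, which may not allow arbitrary Spark confs) the Parquet codec can usually be switched with the standard Spark setting before writing, roughly as sketched below; the table and output path are placeholders.

```python
# Rough sketch on a standard cluster: switch the Parquet compression codec for writes.
# A SQL warehouse may reject arbitrary Spark confs, so this is shown for a notebook cluster.
spark.conf.set("spark.sql.parquet.compression.codec", "zstd")

(spark.table("people_db.some_table")          # placeholder source table
      .write
      .format("parquet")
      .mode("overwrite")
      .save("abfss://container@account.dfs.core.windows.net/path/zstd_copy"))  # placeholder path
```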
- 11163 Views
- 7 replies
- 3 kudos
Resolved! AWS Glue and Databricks
Hello, we're receiving an error when running Glue jobs to try and connect to and read from a Databricks SQL endpoint. An error occu...
- 3 kudos
Hello @Vidula Khanna, @Debayan Mukherjee, I wanted to give you an update that might be helpful for your future customers. We worked with @Pavan Kumar Chalamcharla and, through lots of trial and error, we figured out a combination that works for SQL e...
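For anyone hitting the same error, the working combination usually boils down to a Spark JDBC read with the Databricks JDBC driver jar on the Glue job's classpath. A hedged PySpark sketch follows; the driver class, URL values, token and table are placeholders to check against the driver documentation.

```python
# Hypothetical sketch of reading from a Databricks SQL endpoint inside a Glue/PySpark job.
# Assumes the Databricks JDBC driver jar is attached to the job; all values are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

jdbc_url = (
    "jdbc:databricks://adb-1234567890123456.7.azuredatabricks.net:443/default;"
    "transportMode=http;ssl=1;httpPath=/sql/1.0/warehouses/abcdef1234567890;"
    "AuthMech=3;UID=token;PWD=dapiXXXXXXXXXXXXXXXX"
)

df = (spark.read.format("jdbc")
      .option("url", jdbc_url)
      .option("driver", "com.databricks.client.jdbc.Driver")   # newer Databricks driver class
      .option("dbtable", "samples.nyctaxi.trips")              # placeholder table
      .load())
df.show(5)
```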
- 1478 Views
- 0 replies
- 2 kudos
Availability of SQL Warehouse to the Data Science and Engineering persona
Hi all, we can now use SQL Warehouse in our notebook execution. It's in preview now and will soon be GA.
- 3213 Views
- 2 replies
- 1 kudos
Resolved! How/where can I set credentials for Databricks SQL to create an external table?
I've tried this code in Databricks SQL: CREATE TABLE people_db.GLAccount USING PARQUET LOCATION "abfss://datamesh@dlseu2dtaedwetldtlak9.dfs.core.windows.net/PricingAnalysis/rdv_60_134.vGLAccount.parquet" But I'm getting an "Invalid configuration v...
- 1 kudos
You can define 'data access configuration' in the admin panel. Go to SQL warehouse settings -> Data Access configuration: https://learn.microsoft.com/en-us/azure/databricks/sql/admin/data-access-configuration
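If it helps, the Data Access Configuration box takes Spark/Hadoop key-value pairs. A hedged example for an Azure service principal against ADLS Gen2 is sketched below as a Python dict purely for readability; the client id, tenant and secret scope/key are placeholders, and the real values go into the SQL warehouse admin settings rather than into code.

```python
# Placeholder key-value pairs for the SQL warehouse "Data Access Configuration" box,
# shown as a dict only for readability. Replace the client id, tenant and secret scope/key.
data_access_conf = {
    "fs.azure.account.auth.type.dlseu2dtaedwetldtlak9.dfs.core.windows.net": "OAuth",
    "fs.azure.account.oauth.provider.type.dlseu2dtaedwetldtlak9.dfs.core.windows.net":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id.dlseu2dtaedwetldtlak9.dfs.core.windows.net":
        "<service-principal-application-id>",
    "fs.azure.account.oauth2.client.secret.dlseu2dtaedwetldtlak9.dfs.core.windows.net":
        "{{secrets/my_scope/my_sp_secret}}",   # Databricks secret reference (placeholder scope/key)
    "fs.azure.account.oauth2.client.endpoint.dlseu2dtaedwetldtlak9.dfs.core.windows.net":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

for key, value in data_access_conf.items():
    print(f"{key} {value}")
```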
- 3601 Views
- 1 reply
- 9 kudos
Refreshing SQL Dashboard
You can schedule the dashboard to automatically refresh at an interval. At the top of the page, click Schedule. If the dashboard already has a schedule, you see Scheduled instead of Schedule. Select an interval, such as Every 1 h...
- 11917 Views
- 5 replies
- 2 kudos
How to query object ID in Databricks SQL warehouse using only SQL?
I can see on the Databricks SQL warehouse Data tab that clusters, catalogs and schemas have a unique ID. User-created tables, views and functions must have a unique ID too, but it is not exposed to the user as far as I can tell. I need to retrieve the...
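Not from the thread, but one partial workaround that is plain SQL: for Delta tables, DESCRIBE DETAIL returns an id column with the table's unique identifier, and a SQL warehouse can run it directly. A small sketch, with a placeholder table name:

```python
# Sketch: fetch the unique id of a Delta table with plain SQL (works from a notebook, or
# run the DESCRIBE DETAIL statement by itself in the SQL editor). Table name is a placeholder.
detail = spark.sql("DESCRIBE DETAIL people_db.some_delta_table")
table_id = detail.select("id").first()["id"]
print("Delta table id:", table_id)
```

This covers Delta tables only; views and functions don't expose an equivalent id this way.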
- 5380 Views
- 1 reply
- 0 kudos
Spark UI SQL/Dataframe tab missing queries
Hi, recently I have been having some problems viewing the query plans in the Spark UI SQL/DataFrame tab. I would expect to see large query plans in the SQL tab, where we can observe details of the query such as the rows read/written/shuffled. However, I s...
- 0 kudos
@Koray Beyaz: This issue may be related to a change in the default behavior of the Spark UI in recent versions of Databricks Runtime. In earlier versions, the Spark UI would display the full query plan for SQL and DataFrame operations in the SQL/Dat...
- 2335 Views
- 3 replies
- 2 kudos
Resolved! About SQL workspace option
I can't find the SQL workspace option in my free Community Edition.
- 2 kudos
Hi @Machireddy Nikitha, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best an...
- 3970 Views
- 4 replies
- 6 kudos
Databricks SQL Option
Hello everyone, I want to ask how I can get the SQL option on Databricks, because I am doing the Data Analyst courses and I do not have that option, just Data Engineering and Machine Learning. I am connecting through Microsoft Azure. Waiting for your...
- 6 kudos
@Luis Carcaño You can switch your persona from Data Science and Engineering to SQL, but this is only possible if you have a Premium Databricks workspace. To do so, please create a new workspace with a Premium tier.
- 5730 Views
- 4 replies
- 6 kudos
SQL query execution plan: explain and optimize performance of a query run
When executing a SQL query in the Databricks SQL warehouse editor, what are the best practices to optimize the execution plan and get results faster?
- 6 kudos
Hi @vinay kumar, hope all is well! Just wanted to check in to see if you were able to resolve your issue. Would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks...
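Until there's a full write-up here, one concrete starting point (my suggestion, not from this thread) is to prefix the statement with EXPLAIN FORMATTED and look for full table scans, large shuffles and missing partition filters, then check the query profile in the SQL editor. From a notebook the same inspection looks roughly like this, with a placeholder query:

```python
# Sketch: inspect the optimized plan of a query before tuning it. The query is a placeholder.
plan = spark.sql("""
    EXPLAIN FORMATTED
    SELECT o.customer_id, SUM(o.amount) AS total
    FROM   people_db.orders o
    JOIN   people_db.customers c ON o.customer_id = c.id
    WHERE  o.order_date >= '2023-01-01'
    GROUP  BY o.customer_id
""")
print(plan.first()[0])
```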
Labels: Aad (1), Access (1), Access control (1), ADLS Gen (1), API (3), AWS (1), Azure (3), Azure databricks (4), Azure Databricks SQL (1), Azure SQL DB (1), Azure synapse (1), Batch Processing (1), Best Data Warehouse (1), Best practice (1), Bi (5), Bigquery (1), Billing and Cost Management (1), Broadcast variable (1), Bug (1), Business Intelligence (2), Cache (1), Caching (1), Catalyst (1), CD Pipeline (1), Certification (1), Certification Voucher (1), Class (1), Cloud Fetch (1), Cluster (3), Cluster config (1), Cluster Metrics (1), ClusterSize (1), Code (1), ConcurrentQueries (1), Connect (1), Credential passthrough (1), CSV (1), CustomKeyVault (1), DAIS2023 (1), Dashboard (1), Dashboards (1), Data Engineering (1), Data Ingestion & connectivity (2), Data Science (2), databricks (1), Databricks Certification (1), Databricks Certification Voucher (1), Databricks Cluster (3), Databricks JDBC (1), Databricks notebook (1), Databricks Runtime (1), Databricks SQL (21), Databricks SQL Alerts (1), Databricks SQL Analytics (1), Databricks SQL Connector (1), Databricks SQL Endpoints (2), Databricks Table Usage (1), Databricks workspace (1), DatabrickSQL (1), Dataset (1), DBeaver (1), DBR (2), DBSQL (12), DBSQL Queries (1), Dbu (1), Delta (5), Delta Live Table Pipeline (1), Delta Live Tables (2), Delta Pipeline (1), Delta table (1), Delta Tables (1), Different Types (1), DLT (3), E2 (1), Endpoint (7), Error (1), Error Message (2), ETL Process (1), External Data Sources (1), External Hive (1), External Table (1), File (1), Files (1), Global Temp Views (1), Glue (1), Gpu (1), Group (1), Hive (1), Hive Table (1), Import (1), Jdbc (6), Jdbc connection (2), Job Cluster (1), Key (1), Library (1), Limit (1), LTS (1), LTS ML (1), Metadata (1), Migration (1), Multi Cluster Load Balancer (1), Mysql (2), NodeJS (1), Notebook (2), Odbc (3), Oracle (1), OracleDBPackage (1), PARAMETER VALUE (1), Parquet (1), Party Libraries (1), Password (1), Performance (2), Permissions (1), Photon (2), Pip (1), Possible (1), PostgresSQL (1), Powerbi (7), Prod Workspace (1), Programming language (1), Pyspark (1), Python (6), Python Dataframe (1), Query (6), Query History (1), Query Parameters (1), Query Snippets (1), Row Level Security (1), Row Limit (1), Schedule (1), Schema (1), ServiceNow User (1), Session (1), Simba Odbc Connector (1), SKU (1), Spark (2), Spark sql (1), Sparkcontext (1), Special Characters (1), SQL (40), SQL Dashboard (3), SQL Databricks (1), SQL Endpoint (3), SQL Endpoints (5), SQL Option (1), SQL Queries (3), Sql query (3), SQL Query Execution Plan (1), Sql table (1), Sql Warehouse (5), Sql Workbench (2), SQL Workspace Option (2), Sqlanalytics (1), Sqlexecutionexception (1), Sqlserver (1), SRC (1), Ssl (1), ST (1), String Agg (1), Structfield (1), Structured streaming (1), Summit22 (1), Table (1), Table Pipeline (1), Temporary View (1), Trying (1), UI SQL (1), Unity Catalogue (1), Usage (1), Usge Statistics (1), Value Pair (1), Version (1), Version Queries (1), Visualization (1), Vnet Injection (1), Works (1), Workspace (1), Workspace SKU (1), Writing (1), Xml (1), Yarn (2), Zip file (1)