- 2329 Views
- 5 replies
- 1 kudos
getArgument works fine in interactive cluster 10.4 LTS, raises error in job cluster 10.4 LTS
Hello, I am trying to use the getArgument() function in a spark.sql query. It works fine if I run the notebook via an interactive cluster, but gives an error when executed via a job run in an instance pool. Query: OPTIMIZE <table> WHERE date = replace(re...
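A likely cause: in a job run the widget only exists if the job passes a matching notebook parameter, so getArgument() inside the SQL text has nothing to resolve. A minimal sketch of a common workaround, assuming a hypothetical job parameter named `run_date` (the table and parameter names here are illustrative, not from the original post): read the value in Python and build the SQL string yourself.

```python
# Hypothetical sketch: read the job's notebook parameter in Python and
# splice it into the OPTIMIZE statement, instead of calling getArgument()
# inside the SQL text.

def build_optimize_query(table: str, run_date: str) -> str:
    """Build the OPTIMIZE statement with the date parameter inlined."""
    return f"OPTIMIZE {table} WHERE date = '{run_date}'"

# Inside the notebook (Databricks only; dbutils is not available locally):
# run_date = dbutils.widgets.get("run_date")   # parameter passed by the job
# spark.sql(build_optimize_query("my_schema.my_table", run_date))
```

The job must declare `run_date` (or whatever name you choose) in its notebook parameters for `dbutils.widgets.get` to find it.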
Hi @Kaniz_Fatma, would you be able to respond to my last comment? I couldn't manage to get it working yet. Thank you in advance.
- 1778 Views
- 1 replies
- 0 kudos
AWS Databricks VS AWS EMR
Hi, which services should I use for a data lake implementation? Is there any cost comparison between Databricks and AWS EMR? Which one is best to choose?
@AH that depends on the use case. If your implementation involves a data lake, ML, or data engineering tasks, it is better to go with Databricks, as it has a good UI, good governance for your data lake using Unity Catalog, and good consumer tool su...
- 1620 Views
- 1 replies
- 1 kudos
Resolved! System billing usage table - Usage column
Hello experts, could someone please explain what exactly is contained in the usage column of the system.billing.usage table? We ran specific queries in a cluster trying to calculate the cost, and we observe that the DBUs shown in the system table are ...
@elgeo both should be the same, unless we somehow pick the wrong plan's DBU price. The usage column has complete information, including the SKU name, DBU units, etc. If you use the Azure Databricks calculator and compare, you should see similar results.
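To illustrate the reply above, here is a rough sketch of the cost arithmetic: sum DBUs per SKU from rows shaped like those in system.billing.usage, then multiply by a per-DBU price. The SKU names and prices below are made up, and the real table schema should be checked against the documentation; this only shows the calculation, not the actual rates.

```python
# Illustrative sketch (not the exact system table schema): estimate cost by
# summing usage_quantity (DBUs) per sku_name and applying an assumed price.
from collections import defaultdict

ASSUMED_PRICE_PER_DBU = {"PREMIUM_JOBS_COMPUTE": 0.30}  # placeholder rate

def estimate_cost(rows):
    """rows: iterable of dicts with 'sku_name' and 'usage_quantity' (DBUs)."""
    dbus = defaultdict(float)
    for r in rows:
        dbus[r["sku_name"]] += r["usage_quantity"]
    return {sku: qty * ASSUMED_PRICE_PER_DBU[sku] for sku, qty in dbus.items()}

sample = [
    {"sku_name": "PREMIUM_JOBS_COMPUTE", "usage_quantity": 10.0},
    {"sku_name": "PREMIUM_JOBS_COMPUTE", "usage_quantity": 5.0},
]
# estimate_cost(sample): 15 DBUs at the assumed rate
```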
- 2585 Views
- 5 replies
- 2 kudos
Roadmap on export menu option for SQL Query and Dashboard Types in Workspace
Are there plans for an export option for SQL Query and SQL Dashboard in the Workspace explorer screen, similar to notebooks? Background: we need a way to export and back up any queries and dashboards to save design work and move from staging environments ...
The best option would be to have them directly under a Git repo (especially dashboards).
- 901 Views
- 2 replies
- 0 kudos
how to save variables in one notebook to be imported into another?
Say I have a list of values, dictionaries, and variable names in `notebook1.ipynb` that I'd like to re-use / import in another `notebook2.ipynb`. For example, in `notebook1.ipynb` I have the following: `var1 = "dallas"`, `var_lst = [100, 200, 300, 400, ...`
You can use %run ./notebook2 after defining the variables in notebook1, so notebook2 will use the variables defined in notebook1.
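The %run approach works because the target notebook executes in the same session and so sees the caller's variables. When the notebooks run as separate jobs and don't share a session, one alternative (a sketch of a generic pattern, not a Databricks feature) is serializing the variables to a shared file, for example JSON on a DBFS path; a local temp file stands in for that path here.

```python
# Sketch of an alternative to %run: "notebook1" serializes its variables to
# a shared location (a JSON file here; on Databricks this could be a DBFS
# path), and "notebook2" reads them back. Paths and names are illustrative.
import json
import os
import tempfile

def save_vars(path, **variables):
    """Write the given variables to a JSON file."""
    with open(path, "w") as f:
        json.dump(variables, f)

def load_vars(path):
    """Read the variables back as a dict."""
    with open(path) as f:
        return json.load(f)

# "notebook1" side:
path = os.path.join(tempfile.gettempdir(), "shared_vars.json")
save_vars(path, var1="dallas", var_lst=[100, 200, 300, 400])

# "notebook2" side:
vars_ = load_vars(path)
# vars_["var1"] is "dallas" again
```

Note that JSON only round-trips basic types (strings, numbers, lists, dicts); anything else would need a different serializer.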
- 1536 Views
- 1 replies
- 0 kudos
Can Error Messages Be Un-Redacted?
Is there a way to un-redact the logging of error messages? Alternatively, it would be nice to have access to the source code of the involved classes, like com.databricks.backend.common.util.CommandLineHelper or com.databricks.util.UntrustedUtils. I'm getting t...
Hi @floringrigoriu, you can reach out to Databricks support by filing a support ticket.
- 1185 Views
- 1 replies
- 0 kudos
Retrieve DBU per query executed
Hello experts, do you know how we can retrieve the DBUs consumed by a specific query? Thank you.
I couldn't find a metadata table. However, a workaround is to take the DBU rate of the current cluster (retrieve it either online or, to be more accurate, from the right-hand side of the compute page) and multiply it by the time in minutes that the query took ...
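The workaround above amounts to simple arithmetic; a sketch, with every rate below a placeholder rather than a real price:

```python
# Back-of-the-envelope estimate: DBU-per-hour rate, times runtime in hours,
# times an assumed per-DBU price. All numbers are placeholders.

def query_cost(dbu_per_hour: float, runtime_minutes: float, price_per_dbu: float) -> float:
    dbus_consumed = dbu_per_hour * (runtime_minutes / 60.0)
    return dbus_consumed * price_per_dbu

# e.g. a 4 DBU/hour cluster and a 30-minute query consume 2 DBUs;
# at an assumed $0.55/DBU that is about $1.10:
# query_cost(4, 30, 0.55)
```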
- 1522 Views
- 2 replies
- 1 kudos
Running a notebook job from a remote GitHub repository fails, but a Python-script job type does not
Hi all, I am trying to run a notebook from a remote repository, but the job fails. I set up the job as follows: ... my project structure is as such: ... but the output I get is like such: ... The thing is, if I set the job type to "Python Script" I don't encounter this...
@NielsMH if you want to run your jobs based on job name, please use the new preview feature Databricks released, Databricks Asset Bundles (DABs); there you can run a job based on its job name. As for the remote repo, are you using GitHub Actions or the API? Loo...
- 946 Views
- 2 replies
- 0 kudos
Query Load/ Warehouse Load missing
For observability, warehouse load and query load percentage are two major requirements. How do I fetch those details?
Hi @FatemaMalu, you can fetch the warehouse load and query load percentage using the Databricks REST API. Specifically, you can use the SQL Warehouses API to get the status of your SQL warehouses. https://docs.databricks.com/api/workspace/warehouses
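A hedged sketch of calling the warehouses endpoint from the linked docs. The host, token, and the `warehouses` key in the response are assumptions to verify against the documentation; the exact load metrics you get back depend on the API version.

```python
# Sketch: list SQL warehouses via the REST API. Authentication uses a
# personal access token; host and token values are placeholders.
import json
import urllib.request

def warehouses_url(host: str) -> str:
    """Endpoint per the SQL Warehouses API docs linked above."""
    return f"https://{host}/api/2.0/sql/warehouses"

def list_warehouses(host: str, token: str):
    req = urllib.request.Request(
        warehouses_url(host),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        # The "warehouses" key is assumed from the docs' sample response.
        return json.load(resp).get("warehouses", [])

# list_warehouses("adb-1234567890123456.7.azuredatabricks.net", "<token>")
```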
- 1045 Views
- 2 replies
- 1 kudos
Query Hash missing
From the following Databricks API, /api/2.0/preview/sql/queries, query_hash is missing from the actual response, but the sample response in the API documentation has it: { "count": 0, "page": 0, "page_size": 0, "results": [ { ...
Hi @FatemaMalu, Did you try the other fields, such as the id field to uniquely identify a query and its associated metadata?
- 1166 Views
- 2 replies
- 1 kudos
Databricks cluster automated
Is there any way to automatically start a Databricks cluster when an event occurs, such as the cluster terminating for some reason, and have the cluster restart automatically thereafter? It should avoid a manual start.
Hi @hari007, would you like to check this Databricks documentation on the query you have asked? https://docs.databricks.com/en/clusters/clusters-manage.html#restart-a-cluster
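For the automation part, a sketch of a helper that an external trigger (a monitor, a scheduled function, etc.) could call when it detects a termination event, using the Clusters API start endpoint. The host, token, and cluster ID are placeholders, and whether an event-driven trigger fits your setup is a separate design question.

```python
# Sketch: build a POST request to the Clusters API start endpoint
# (/api/2.0/clusters/start takes a cluster_id in the JSON body).
import json
import urllib.request

def start_cluster_request(host: str, token: str, cluster_id: str) -> urllib.request.Request:
    return urllib.request.Request(
        f"https://{host}/api/2.0/clusters/start",
        data=json.dumps({"cluster_id": cluster_id}).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# To actually send it (placeholders, Databricks workspace required):
# urllib.request.urlopen(start_cluster_request(host, token, "0101-123456-abcd123"))
```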
- 1736 Views
- 3 replies
- 1 kudos
Power BI - Azure Databricks Connector shows error: AAD is not set up for domain
Hi team, what I would like to do is understand what is required for the Power BI gateway to use single sign-on (AAD) with Databricks. Is that something you could have encountered before and know the fix for? I currently get a message from Power BI that AAD is not ...
None of the links in the response are valid
- 496 Views
- 0 replies
- 0 kudos
Small files and discrepancy in S3 vs catalog
Hello all, I'm in the process of optimizing my tables and I'm running into a confusing situation. I have a table named "trace_messages_fg_streaming_event". If I navigate to the Databricks catalog, it shows these stats: Size: 6.7GB, Files: 464. But when I look ...
- 11304 Views
- 4 replies
- 3 kudos
Facing Issues with Databricks JDBC Connectivity after Idle time
Hello team, I am using a commons (commons-dbcp2) DataSource, which supports default connection pooling, in a Spring Java application (REST services that fetch Databricks data via JdbcTemplate). Initially all works fine and I can get the data from Databricks via...
I am seeing the same issue with Hikari. When a pooled connection is created and the Databricks cluster is then terminated (or restarted), the HikariDataSource retains a stale session handle. Why does connection.isValid() return true when executing any qu...
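The usual mitigation in both dbcp2 and Hikari is to validate a borrowed connection with a real round trip before handing it out (dbcp2's validationQuery with testOnBorrow, Hikari's connectionTestQuery), since a driver's isValid() may not detect a server-side session that died. A language-agnostic sketch of that test-on-borrow pattern, with stand-in connection objects rather than real JDBC handles:

```python
# Sketch of "test on borrow": probe each idle connection with a cheap real
# query (e.g. SELECT 1) before returning it; discard connections whose
# probe fails and fall back to creating a fresh one.

class SimplePool:
    def __init__(self, factory, probe):
        self._factory = factory   # creates a fresh connection
        self._probe = probe       # runs a cheap query; raises if stale
        self._idle = []

    def borrow(self):
        while self._idle:
            conn = self._idle.pop()
            try:
                self._probe(conn)   # a real round trip to the server
                return conn
            except Exception:
                pass                # stale handle: drop it, try the next
        return self._factory()      # nothing usable: open a new connection

    def give_back(self, conn):
        self._idle.append(conn)
```

In the real pools this is configuration, not code you write: set testOnBorrow/validationQuery on dbcp2's BasicDataSource, or connectionTestQuery on HikariConfig.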
- 1314 Views
- 2 replies
- 0 kudos
Validating record counts in SQL Server database tables against migrated Azure Data Lake Gen2 data
We are migrating our project from on-premise to Azure, so the on-premise database is the SQL Server we are using, and Azure Data Lake Gen2 is the storage location where we currently store data. So far we are validating the record count of each...
Hi @sai_sathya, automating the validation of record counts between SQL Server database tables and Azure Data Lake Gen2 can be done with PySpark code. A PySpark script can connect to the SQL Server database, retrieve record counts for each table, ...
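Once per-table counts have been collected from both sides, the comparison step can be as simple as the sketch below. How the counts are fetched (JDBC reads from SQL Server, spark.read on ADLS paths) is a separate step, and the table names here are illustrative.

```python
# Sketch: given per-table counts from the source (SQL Server) and the
# target (the migrated Data Lake copies), report every mismatch.

def compare_counts(source_counts: dict, target_counts: dict) -> dict:
    """Return {table: (source, target)} for every table that disagrees."""
    mismatches = {}
    for table in source_counts.keys() | target_counts.keys():
        src = source_counts.get(table)   # None if missing on this side
        tgt = target_counts.get(table)
        if src != tgt:
            mismatches[table] = (src, tgt)
    return mismatches

# compare_counts({"orders": 100, "items": 50}, {"orders": 100, "items": 49})
# -> {"items": (50, 49)}
```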