Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

llvu
by New Contributor III
  • 2329 Views
  • 5 replies
  • 1 kudos

getArgument works fine in interactive cluster 10.4 LTS, raises error in job run on instance pool 10.4 LTS

Hello, I am trying to use the getArgument() function in a spark.sql query. It works fine if I run the notebook on an interactive cluster, but it gives an error when executed via a job run on an instance pool. Query: OPTIMIZE <table> WHERE date = replace(re...

Latest Reply
llvu
New Contributor III
  • 1 kudos

Hi @Kaniz_Fatma, would you be able to respond to my last comment? I couldn't manage to get it working yet. Thank you in advance.

4 More Replies
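A minimal sketch of a common workaround for this error, assuming the date value arrives as a job parameter: read it with dbutils.widgets.get in Python and build the SQL statement explicitly instead of calling getArgument() inside the query. The widget name and table name below are hypothetical placeholders, not taken from the original job.

# Read the job parameter in Python instead of calling getArgument() in SQL.
# "run_date" and "events" are hypothetical placeholder names.
run_date = dbutils.widgets.get("run_date")  # e.g. "2024-01-31"

# Build the statement explicitly so it resolves the same way on a job cluster.
spark.sql(f"OPTIMIZE events WHERE date = '{run_date}'")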
AH
by New Contributor III
  • 1778 Views
  • 1 reply
  • 0 kudos

AWS Databricks vs. AWS EMR

Hi, which services should I use for a data lake implementation? Is there any cost comparison between Databricks and AWS EMR? Which one is best to choose?

Latest Reply
karthik_p
Esteemed Contributor
  • 0 kudos

@AH That depends on the use case. If your implementation involves data lake, ML, and data engineering tasks, it is better to go with Databricks: it has a good UI, good governance for your data lake through Unity Catalog, and good consumer tool su...

elgeo
by Valued Contributor II
  • 1620 Views
  • 1 reply
  • 1 kudos

Resolved! System billing usage table - Usage column

Hello experts, could someone please explain what exactly is contained in the usage column of the system.billing.usage table? We ran specific queries in a cluster trying to calculate the cost, and we observe that the DBUs shown in the system table are ...

Latest Reply
karthik_p
Esteemed Contributor
  • 1 kudos

@elgeo Both should be the same, unless somehow the proper plan's DBU price was not picked up. The usage column has complete information such as the SKU name, DBU units, etc. If you use the Azure Databricks calculator and compare, you should see similar results.

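For anyone wanting to cross-check the usage column themselves, here is a rough sketch that aggregates DBUs per day and SKU from the billing system table. The column names follow the documented schema, but verify them in your workspace before relying on the numbers.

# Aggregate DBUs per day and SKU from the billing system table.
df = spark.sql("""
    SELECT usage_date,
           sku_name,
           SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    GROUP BY usage_date, sku_name
    ORDER BY usage_date DESC
""")
df.show(truncate=False)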
RyanHager
by Contributor
  • 2585 Views
  • 5 replies
  • 2 kudos

Roadmap on export menu option for SQL Query and Dashboard Types in Workspace

Are there plans for an export option for SQL Query and SQL Dashboard in the Workspace explorer screen, similar to notebooks? Background: we need a way to export and back up queries and dashboards to preserve design work and move from staging environments ...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 2 kudos

The best option would be to have them just under a Git repo (especially dashboards).

4 More Replies
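Until a native export exists, one possible stopgap (a sketch, not an official export feature) is to page through the preview SQL Queries API, which is also discussed further down this page, and save each query definition as JSON for backup. The host and token environment variables are placeholders for your own workspace.

import json
import os
import requests

host = os.environ["DATABRICKS_HOST"]  # e.g. https://<workspace>.cloud.databricks.com
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

page = 1
while True:
    resp = requests.get(
        f"{host}/api/2.0/preview/sql/queries",
        headers=headers,
        params={"page": page, "page_size": 25},
    )
    resp.raise_for_status()
    results = resp.json().get("results", [])
    if not results:
        break
    # Write each query definition to its own JSON file as a simple backup.
    for q in results:
        with open(f"backup_query_{q['id']}.json", "w") as f:
            json.dump(q, f, indent=2)
    page += 1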
kll
by New Contributor III
  • 901 Views
  • 2 replies
  • 0 kudos

How to save variables in one notebook to be imported into another?

Say, I have a list of values, dictionaries, variable names in `notebook1.ipynb` that I'd like to re-use / import in another `notebook2.ipynb`. For example, in `notebook1.ipynb`, I have the following:   var1 = "dallas" var_lst = [100, 200, 300, 400, ...

Latest Reply
Krishnamatta
New Contributor III
  • 0 kudos

You can use %run ./notebook1 inside notebook2 after defining the variables in notebook1, so notebook2 will use the variables defined in notebook1.

1 More Replies
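A minimal sketch of that approach, using the names from the question; note that in a Databricks notebook the %run magic must be the only content of its cell.

# notebook1: define the shared values.
var1 = "dallas"
var_lst = [100, 200, 300, 400]

# notebook2: pull notebook1's definitions into the current session.
# %run ./notebook1        <- run this in its own cell
print(var1)     # -> "dallas"
print(var_lst)  # -> [100, 200, 300, 400]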
floringrigoriu
by New Contributor II
  • 1536 Views
  • 1 reply
  • 0 kudos

Can error messages be un-redacted?

Is there a way to un-redact the logging of error messages? Alternatively, it would be nice to have access to the source code of the involved classes, such as com.databricks.backend.common.util.CommandLineHelper or com.databricks.util.UntrustedUtils. I'm getting t...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @floringrigoriu , you can reach out to Databricks support by filing a support ticket.

elgeo
by Valued Contributor II
  • 1185 Views
  • 1 reply
  • 0 kudos

Retrieve DBU per query executed

Hello experts, do you know how we can retrieve the DBUs consumed by a specific query? Thank you.

Latest Reply
elgeo
Valued Contributor II
  • 0 kudos

I couldn't find a metadata table. However, a workaround is to take the DBU rate of the current cluster (retrieve it either online or, to be more accurate, from the compute page on the right) and multiply it by the time in minutes that the query took ...

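A back-of-the-envelope version of that workaround, with hypothetical numbers; take the real DBU rate from the cluster's compute page and the price per DBU from your plan and SKU.

cluster_dbu_per_hour = 8.75     # hypothetical rate from the compute page
price_per_dbu = 0.55            # hypothetical $/DBU for your plan and SKU
query_runtime_minutes = 12      # measured runtime of the query

dbus_consumed = cluster_dbu_per_hour * (query_runtime_minutes / 60)
estimated_cost = dbus_consumed * price_per_dbu
print(f"~{dbus_consumed:.2f} DBUs, ~${estimated_cost:.2f}")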
NielsMH
by New Contributor III
  • 1522 Views
  • 2 replies
  • 1 kudos

Running a notebook job from a remote GitHub repository fails, but it does not fail for the Python script job type

Hi all, I am trying to run a notebook from a remote repository, but the job fails. I set up the job as shown in the attached screenshots; my project structure and the job output are also attached. The thing is, if I set the job type to "Python Script" I don't encounter this...

Attachments: job-setup.png, folder_structure.png, job_output.png
Latest Reply
karthik_p
Esteemed Contributor
  • 1 kudos

@NielsMH If you want to run your jobs based on job name, please use the new preview feature that Databricks released, Databricks Asset Bundles (DABs); there you can run your job based on its name. Regarding the remote repo, are you using GitHub Actions or the API? Loo...

1 More Replies
FatemaMalu
by New Contributor II
  • 946 Views
  • 2 replies
  • 0 kudos

Query Load/ Warehouse Load missing

For observability, warehouse load and query load percentage are two major requirements. How do I fetch those details?

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @FatemaMalu, you can fetch the warehouse load and query load percentage using the Databricks REST API. Specifically, you can use the Warehouses API to get the status of your SQL warehouses: https://docs.databricks.com/api/workspace/warehouses

1 More Replies
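A small sketch of that suggestion: list the SQL warehouses and their current state via the Warehouses API linked above. The host and token are placeholders, and the exact fields returned (including any load or health metrics) should be checked against the payload for your API version.

import os
import requests

host = os.environ["DATABRICKS_HOST"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

resp = requests.get(f"{host}/api/2.0/sql/warehouses", headers=headers)
resp.raise_for_status()
for wh in resp.json().get("warehouses", []):
    # Inspect the full dictionary to see which load/health fields are exposed.
    print(wh["id"], wh["name"], wh["state"])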
FatemaMalu
by New Contributor II
  • 1045 Views
  • 2 replies
  • 1 kudos

Query Hash missing

From the following Databricks API, /api/2.0/preview/sql/queries, query_hash is missing from the actual response, but the sample response in the API documentation has it: { "count": 0, "page": 0, "page_size": 0, "results": [ { ...

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Hi @FatemaMalu, did you try the other fields, such as the id field, to uniquely identify a query and its associated metadata?

1 More Replies
hari007
by New Contributor II
  • 1166 Views
  • 2 replies
  • 1 kudos

Automated Databricks cluster restart

Is there any way to automatically start a Databricks cluster when an event occurs, such as the cluster terminating for some reason, and have the cluster restart automatically thereafter? The goal is to avoid a manual start.

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Hi @hari007, would you like to check this Databricks documentation on the question you asked? https://docs.databricks.com/en/clusters/clusters-manage.html#restart-a-cluster

1 More Replies
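A sketch of the "start it again" half of such automation, assuming an external scheduler or monitoring job triggers it: call the Clusters API start endpoint for the terminated cluster. The cluster ID, host, and token are placeholders, and the event detection itself still has to come from your own monitoring.

import os
import requests

host = os.environ["DATABRICKS_HOST"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
cluster_id = "0123-456789-abcdefgh"   # placeholder cluster ID

# Ask Databricks to start the terminated cluster again.
resp = requests.post(
    f"{host}/api/2.0/clusters/start",
    headers=headers,
    json={"cluster_id": cluster_id},
)
resp.raise_for_status()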
laeforceable
by New Contributor II
  • 1736 Views
  • 3 replies
  • 1 kudos

Power BI - Azure Databricks connector shows error: AAD is not set up for domain

Hi team, what I would like to do is understand what is required for the Power BI gateway to use single sign-on (AAD) with Databricks. Is that something you have encountered before and know the fix for? I currently get a message from Power BI that AAD is not ...

Attachment: image.png
Latest Reply
TimAtGSB
New Contributor II
  • 1 kudos

None of the links in the response are valid.

2 More Replies
bfrank1972
by New Contributor III
  • 496 Views
  • 0 replies
  • 0 kudos

Small files and discrepancy in S3 vs catalog

Hello all, I'm in the process of optimizing my tables and I'm running into a confusing situation. I have a table named "trace_messages_fg_streaming_event". If I navigate to the Databricks catalog, it shows these stats: Size: 6.7 GB, Files: 464. But when I look ...

Attachment: bfrank1972_0-1697559008309.png
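One way to narrow down such a discrepancy (a sketch using the table name from the post) is to compare the catalog stats with what the Delta log currently references; files sitting in S3 that are not part of the current table version, for example old files left behind by OPTIMIZE and not yet removed by VACUUM, would explain a gap.

# Report the file count and size for the current version of the Delta table.
detail = spark.sql(
    "DESCRIBE DETAIL trace_messages_fg_streaming_event"
).select("numFiles", "sizeInBytes")
detail.show(truncate=False)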
AbhiJ
by New Contributor III
  • 11304 Views
  • 4 replies
  • 3 kudos

Facing Issues with Databricks JDBC Connectivity after Idle time

Hello team, I am using a Commons (commons-dbcp2) DataSource, which supports default connection pooling, in a Spring Java application (REST services that fetch Databricks data via JdbcTemplate). Initially everything works fine and I can get the data from Databricks via...

Latest Reply
ash42
New Contributor II
  • 3 kudos

I am seeing the same issue with Hikari. When a pooled connection is created and then the Databricks cluster is terminated (or restarted), the HikariDataSource retains a stale session handle. Why does connection.isValid() return true when executing any qu...

3 More Replies
sai_sathya
by New Contributor III
  • 1314 Views
  • 2 replies
  • 0 kudos

Validating record counts of SQL Server database tables against migrated Azure Data Lake Gen2 data

We are migrating our project from on-premises to Azure. The on-premises database is SQL Server, and Azure Data Lake Gen2 is the storage location where we currently store data. So far we are validating the record count of each...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @sai_sathya, automating the process of validating record counts between SQL Server database tables and Azure Data Lake Gen2 can be done with PySpark code. A PySpark script can connect to the SQL Server database, retrieve record counts for each table, ...

1 More Replies
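A hedged PySpark sketch of the approach described in that reply: read each table's count from SQL Server over JDBC and compare it with the count of the migrated copy in ADLS Gen2. The JDBC URL, credentials, table list, storage path, and file format are placeholders to adapt to your environment.

# Compare source (SQL Server) and target (ADLS Gen2) record counts per table.
jdbc_url = "jdbc:sqlserver://<server>:1433;databaseName=<db>"
props = {
    "user": "<user>",
    "password": "<password>",
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
}

tables = ["dbo.customers", "dbo.orders"]   # hypothetical table list
for table in tables:
    source_count = spark.read.jdbc(jdbc_url, table, properties=props).count()
    target_count = (
        spark.read.format("delta")   # or "parquet", depending on the landing format
        .load(f"abfss://<container>@<account>.dfs.core.windows.net/{table}")
        .count()
    )
    status = "OK" if source_count == target_count else "MISMATCH"
    print(f"{table}: source={source_count}, target={target_count} [{status}]")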

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.
