by AP • New Contributor III
- 1352 Views
- 2 replies
- 2 kudos
Hi, I am trying to take advantage of the treasure trove of information that the metastore contains and take some actions to improve performance. In my case, the metastore is managed by Databricks; we don't use an external metastore. How can I connect to ...
Latest Reply
@AKSHAY PALLERLA​ You can get the JDBC/ODBC information from the cluster configuration. On the cluster configuration page, under Advanced Options, there is a JDBC/ODBC tab. Click on that tab and it should give you the details you are looking ...
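A minimal sketch of what you can do with the values from that JDBC/ODBC tab: assemble a JDBC connection string. The hostname and HTTP path below are hypothetical placeholders; substitute the values from your own workspace.

```python
# Sketch: assembling a Databricks JDBC URL from the values shown on the
# cluster's JDBC/ODBC tab. The hostname and HTTP path below are
# hypothetical placeholders -- use the values from your own workspace.
def build_jdbc_url(hostname: str, http_path: str, port: int = 443) -> str:
    """Build a JDBC connection string for a Databricks cluster."""
    return (
        f"jdbc:databricks://{hostname}:{port}/default;"
        f"transportMode=http;ssl=1;httpPath={http_path};"
        f"AuthMech=3"  # AuthMech=3 = username "token" + personal access token
    )

url = build_jdbc_url(
    "adb-1234567890123456.7.azuredatabricks.net",           # placeholder host
    "sql/protocolv1/o/1234567890123456/0123-456789-abcdefgh",  # placeholder path
)
print(url)
```

Pass the resulting URL (plus `UID=token` and your personal access token as `PWD`) to any JDBC client.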
1 More Replies
- 2341 Views
- 6 replies
- 5 kudos
tl;dr: A cell that executes purely on the head node stops printing output during execution, but the output still shows up in the cluster logs. After the cell executes, Databricks does not notice the cell is finished and gets stuck. When trying to canc...
Latest Reply
Since that library works on pandas, the problem may be that it doesn't support pandas on Spark. On the local version, you probably use non-distributed pandas. You can check the behavior by switching between:
import pandas as pd
import pyspark.pandas as pd
5 More Replies
by 165036 • New Contributor III
- 1001 Views
- 1 replies
- 1 kudos
Summary of the problem: When mounting an S3 bucket via Terraform, the creation process frequently times out (running beyond 10 minutes). When I check the Log4j logs in the GP cluster, I see the following error message repeated: ```22/07/26 05:54:43 ER...
Latest Reply
Solved. See here: https://github.com/databricks/terraform-provider-databricks/issues/1500
- 655 Views
- 1 replies
- 0 kudos
I'm running a Java application that registers a CSV table with Hive and then checks the number of rows imported. It's done in several steps: Statement stmt = con.createStatement(); .... stmt.execute( "CREATE TABLE ( <definition> < > ); ..... ResultSet rs...
Latest Reply
@Reto Matter​ Are you running a jar job or using dbconnect to run Java code? Please share how you are trying to make the connection, along with the full exception stack trace.
by 624398 • New Contributor III
- 1149 Views
- 4 replies
- 2 kudos
Hey all, my aim is to validate a given SQL string without actually running it. I thought I could use the `EXPLAIN` statement to do so. So I tried using the `databricks-sql-connector` for Python to explain a query, and so determine whether it's valid or ...
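A runnable illustration of the EXPLAIN-as-validation idea. The question targets `databricks-sql-connector`; this sketch uses stdlib `sqlite3` instead so it runs anywhere, since the pattern is the same: execute `EXPLAIN <query>` and treat a compilation error as "invalid SQL" without executing the query itself. The table and queries are illustrative.

```python
# Validate SQL by running EXPLAIN and catching compile errors.
# sqlite3 stands in for databricks-sql-connector here; with the latter you
# would call cursor.execute(f"EXPLAIN {query}") and inspect errors instead.
import sqlite3

def is_valid_sql(conn: sqlite3.Connection, query: str) -> bool:
    """Return True if the query compiles, without executing it."""
    try:
        conn.execute(f"EXPLAIN {query}")  # compiles the query, returns plan rows
        return True
    except sqlite3.Error:
        return False

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (x INTEGER)")
print(is_valid_sql(conn, "SELECT x FROM t"))   # True
print(is_valid_sql(conn, "SELEC x FROM t"))    # False (syntax error)
```

Note that EXPLAIN still resolves table and column names, so it rejects unknown identifiers as well as syntax errors; behavior on Databricks may differ in detail (some Spark versions return analysis errors inside the plan text rather than raising).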
Latest Reply
Hi @Nativ Issac​, we haven't heard from you since the last response from @Hubert Dudek​, and I was checking back to see if his suggestions helped you. Otherwise, if you have a solution, please do share it with the community, as it can be helpful to o...
3 More Replies
- 926 Views
- 4 replies
- 3 kudos
Hey guys, we're considering Delta Lake as the storage for our project and have a couple of questions. The first one is: what's the pricing for Delta Lake? I can't seem to find a page that says x amount costs y. The second question is more technical: if we...
Latest Reply
Delta Lake itself is free; it is a file format. But you will have to pay for storage and compute, of course. If you want to use Databricks with Delta Lake, it will not be free unless you use the Community Edition. Depending on what you are planning to...
3 More Replies
- 956 Views
- 3 replies
- 1 kudos
is DLT supported for Scala? Any reference implementations or wikis to get started?
Latest Reply
Hi @Karthik Munipalle​, Delta Live Tables queries can be implemented in Python or SQL. Here are a few articles that best explain DLT. Please have a look:
https://docs.databricks.com/data-engineering/delta-live-tables/index.html
https://databricks.com/...
2 More Replies
by Yagao • New Contributor
- 1517 Views
- 5 replies
- 2 kudos
Can anyone show me a use case of how to run Python within a SQL query?
Latest Reply
To run Python within a SQL query, you first define a Python function and then register it as a UDF. Once that is done, you can call that UDF within a SQL query. Please take a look at the documentation here: https://docs.databricks.com/s...
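The define-then-register pattern from the reply, shown with stdlib `sqlite3` so it runs anywhere. On Databricks the analogous step is `spark.udf.register("squared", squared)` followed by `spark.sql("SELECT squared(...) ...")`; the function and values below are illustrative.

```python
# Register a plain Python function as a SQL UDF, then call it from SQL.
# sqlite3's create_function plays the role of spark.udf.register here.
import sqlite3

def squared(x: int) -> int:
    """Plain Python function to expose inside SQL."""
    return x * x

conn = sqlite3.connect(":memory:")
conn.create_function("squared", 1, squared)  # register under a SQL-visible name
result = conn.execute("SELECT squared(7)").fetchone()[0]
print(result)  # 49
```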
4 More Replies
- 719 Views
- 3 replies
- 4 kudos
I am using init scripts and would like to be able to control the version of a component that we release internally and frequently. We are now manually updating a dbfs requirement.txt file but I think that this problem may have been encountered befor...
Latest Reply
You can programmatically create cluster templates in JSON files and include config JSON files with the libraries needed. Cluster deployment in that scenario needs to be controlled via the API: https://docs.databricks.com/dev-tools/api/latest/clusters.html
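A hedged sketch of such a template, as the JSON body for the Clusters API (`POST /api/2.0/clusters/create`). The field names match the documented API; the specific values (DBR version, node type, init-script path) are hypothetical placeholders.

```python
# Build a cluster template for the Databricks Clusters API as JSON.
# Field names (cluster_name, spark_version, node_type_id, num_workers,
# init_scripts) are real API fields; the values are placeholders.
import json

cluster_template = {
    "cluster_name": "pinned-component-version",   # illustrative name
    "spark_version": "11.3.x-scala2.12",          # example DBR version
    "node_type_id": "i3.xlarge",                  # example AWS node type
    "num_workers": 2,
    "init_scripts": [
        # Init script that installs the internally released component;
        # the script itself can read a versioned requirements file on DBFS.
        {"dbfs": {"destination": "dbfs:/databricks/init/install_component.sh"}}
    ],
}

payload = json.dumps(cluster_template, indent=2)
print(payload)
```

Versioning the template (and the requirements file it points at) in source control lets you bump the component version in one place and redeploy via the API.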
2 More Replies
by Edel • New Contributor II
- 619 Views
- 2 replies
- 2 kudos
Just want to know if you have a benchmark or some tests comparing Oracle ADWC vs Delta Lake for data warehousing.
Latest Reply
Hi @Edelweiss Kammermann​, check this article out for a comparison:
https://www.trustradius.com/compare-products/databricks-lakehouse-platform-vs-oracle-autonomous-data-warehouse
1 More Replies
- 1125 Views
- 2 replies
- 0 kudos
My dashboard uses Athena as its data source for availability (I don't need to fire up the cluster and manually refresh the data), but it requires me to create the tables manually. Wondering if there is a similar method like .saveAsTable() to crea...
Latest Reply
Hi @Howard Zhang​, Here's a fantastic article for your use case. Please have a read.
1 More Replies
- 1613 Views
- 3 replies
- 2 kudos
While connecting Databricks and Grafana, I have gone through the following approach. Install Grafana Agent in Databricks clusters from the Databricks console --> not working, since the system is not booted with systemd as the init system. Since Spark 3 has Pro...
Latest Reply
Hi @Avinash Goje​, we haven't heard from you since the last response from @Hubert Dudek​, and I was checking back to see if his suggestions helped you. Otherwise, if you have a solution, please do share it with the community, as it can be helpful to o...
2 More Replies