Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

User16765131552
by Contributor III
  • 6517 Views
  • 5 replies
  • 1 kudos

How to register a JDBC Spark dialect in Python?

I am trying to read from a Databricks table, using the JDBC URL from a cluster in the workspace. I am getting this error: java.sql.SQLDataException: [Simba][JDBC](10140) Error converting value to int. After these statements: jdbcConnUrl = "jdbc:spark:...

Latest Reply
KKDataEngineer
New Contributor III
  • 1 kudos

Is there a solution for this?

4 More Replies
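
The thread title asks how to register a JDBC dialect from Python. As a rough sketch (not the thread's accepted answer): the read itself goes through the Simba driver, and a custom dialect, which must exist as a JVM class (e.g. Scala compiled into a jar attached to the cluster), can then be registered through the py4j gateway. The workspace URL, token, table name, and the com.example.DatabricksDialect class below are all placeholders:

```python
# Minimal sketch, assuming placeholder host, HTTP path, token, and table name.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("jdbc-read").getOrCreate()

df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:spark://<workspace-host>:443/default;"
                   "transportMode=http;ssl=1;httpPath=<http-path>;AuthMech=3")
    .option("dbtable", "my_schema.my_table")  # hypothetical table name
    .option("user", "token")
    .option("password", "<personal-access-token>")
    .option("driver", "com.simba.spark.jdbc.Driver")
    .load()
)

# A custom JdbcDialect has to live on the JVM side; from Python it can then
# be registered via the py4j gateway. 'com.example.DatabricksDialect' is a
# hypothetical class packaged in a cluster-attached jar.
jvm = spark.sparkContext._jvm
jvm.org.apache.spark.sql.jdbc.JdbcDialects.registerDialect(
    jvm.com.example.DatabricksDialect()
)
```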
MaheshDR
by New Contributor II
  • 2314 Views
  • 2 replies
  • 0 kudos

Informatica Cloud mapping with Databricks connection failing with java.util.NoSuchElementException

Hi Team, when we tried to configure our source as a Databricks table with a Databricks connection in Informatica Cloud, we received the error below. We already tried the suggestions mentioned in the community post below, which seems to be a similar error to our...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Mahesh D, hope all is well! Just wanted to check in on whether you were able to resolve your issue; if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

1 More Replies
gideont
by New Contributor III
  • 3653 Views
  • 2 replies
  • 2 kudos

Resolved! Spark SQL UPDATE really slow

I try to use Spark as much as possible but am seeing some performance regression; hoping to get some direction on how to use it correctly. I've created a Databricks table using spark.sql: spark.sql('select * from example_view ').write.mode('overwr...

[attachment: image.png]
Latest Reply
Pat
Honored Contributor III
  • 2 kudos

Hi @Vincent Doe, updates are available in Delta tables, but under the hood you are updating Parquet files. That means each update needs to find the files where the records are stored, rewrite them to a new version, and make the new files the current v...

1 More Replies
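
To make Pat's point concrete, here is a minimal sketch, assuming the ambient spark session of a Databricks notebook and an existing Delta table named example_table with columns id and amount: each UPDATE rewrites the affected Parquet files into a new table version, which DESCRIBE HISTORY makes visible.

```python
# Sketch of the rewrite-on-update behaviour described above; 'spark' is the
# session predefined in a Databricks notebook, 'example_table' is hypothetical.
spark.sql("UPDATE example_table SET amount = amount * 2 WHERE id = 42")

# Every UPDATE creates a new table version: the touched Parquet files are
# rewritten and the new files become current, which is why row-level updates
# cost more than plain appends.
spark.sql("DESCRIBE HISTORY example_table") \
    .select("version", "operation", "operationMetrics") \
    .show(truncate=False)
```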
akshay1
by New Contributor II
  • 1846 Views
  • 0 replies
  • 2 kudos

Data unloading to S3 bucket from Databricks.

Hi, I am completely new to Databricks and have a task to unload data from a Databricks table to an S3 location using Java/SQL. Is this possible? If yes, can you please help me?

Mendes
by New Contributor
  • 3100 Views
  • 2 replies
  • 0 kudos
Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Danilo Mendes, table schemas are stored in the default Azure Databricks internal metastore, and you can also configure and use external metastores. Ingest data into Azure Databricks, and access data in Apache Spark formats and from external data sources....

1 More Replies
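
The reply mentions configuring an external metastore; as a sketch, that is done with Spark configuration set at cluster startup. The values below are placeholders following the documented external Hive metastore settings, and the exact metastore version depends on the runtime:

```
spark.sql.hive.metastore.version 2.3.9
spark.sql.hive.metastore.jars builtin
spark.hadoop.javax.jdo.option.ConnectionURL jdbc:sqlserver://<server>:1433;database=<metastore-db>
spark.hadoop.javax.jdo.option.ConnectionDriverName com.microsoft.sqlserver.jdbc.SQLServerDriver
spark.hadoop.javax.jdo.option.ConnectionUserName <user>
spark.hadoop.javax.jdo.option.ConnectionPassword <password>
```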
User16790091296
by Contributor II
  • 960 Views
  • 1 reply
  • 0 kudos
Latest Reply
Ryan_Chynoweth
Esteemed Contributor
  • 0 kudos

You have a couple of options for writing data into a data warehouse. Some DWs have dedicated connectors that allow for high performance between Databricks and the DW (for example, there is a Spark connector for Snowflake and for Azure Synapse DW). Some data w...

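
As a sketch of the dedicated-connector route Ryan describes, here is a hedged example writing a DataFrame to Snowflake with its Spark connector; the source table and every connection value are placeholders:

```python
# Hedged sketch: write a DataFrame to Snowflake via the dedicated Spark
# connector. All connection values below are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.table("my_schema.my_table")  # hypothetical source table

sf_options = {
    "sfURL": "<account>.snowflakecomputing.com",
    "sfUser": "<user>",
    "sfPassword": "<password>",
    "sfDatabase": "<database>",
    "sfSchema": "<schema>",
    "sfWarehouse": "<warehouse>",
}

(
    df.write.format("snowflake")  # short name for net.snowflake.spark.snowflake
    .options(**sf_options)
    .option("dbtable", "TARGET_TABLE")
    .mode("overwrite")
    .save()
)
```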
User16790091296
by Contributor II
  • 1724 Views
  • 1 reply
  • 0 kudos

How to read a Databricks table via Databricks api in Python?

Using Python 3, I am trying to compare an Excel (.xlsx) sheet to an identical Spark table in Databricks. I want to avoid doing the comparison in Databricks, so I am looking for a way to read the Spark table via the Databricks API. Is this possible? How c...

Latest Reply
sajith_appukutt
Honored Contributor II
  • 0 kudos

What is the format of the table? If it is Delta, you could use the Python bindings for the native Rust API (delta-rs) to read the table from your Python code and do the comparison, bypassing the metastore.

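
A minimal sketch of that suggestion, using the delta-rs Python bindings (pip install deltalake) and pandas; the table path and Excel file name are placeholders, and reading .xlsx with pandas also needs openpyxl installed:

```python
# Read the Delta table directly with delta-rs, bypassing the metastore, and
# compare it against the Excel sheet in pandas. Paths are placeholders.
import pandas as pd
from deltalake import DeltaTable

table_df = DeltaTable("/path/to/delta_table").to_pandas()
excel_df = pd.read_excel("comparison.xlsx")  # requires openpyxl for .xlsx

# Simple full-frame diff: rows present in only one side. Real comparisons
# may need explicit column alignment and dtype normalization first.
diff = (
    table_df.merge(excel_df, how="outer", indicator=True)
    .query("_merge != 'both'")
)
print(diff)
```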