Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Rexton
by New Contributor
  • 5227 Views
  • 3 replies
  • 2 kudos

AWS Databricks PySpark - Unable to connect to Azure MySQL - Shows "SSL Connection is required"

Even after specifying SSL options, I am unable to connect to MySQL. What could have gone wrong? Has anyone experienced similar issues? df_target_master = spark.read.format("jdbc")\.option("driver", "com.mysql.jdbc.Driver")\.option("url", host_url)\.optio...

Latest Reply
a2barbosa
New Contributor II
  • 2 kudos

Hey, here is the solution: the correct option for SSL is "useSSL", not just "ssl". The code below should work: df_target_master = spark.read.format("jdbc")\.option("driver", "com.mysql.jdbc.Driver")\.option("url", host_url)\.option("dbtable", supply_ma...

2 More Replies
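A minimal sketch of the fix described in the reply above: pass useSSL (and related SSL properties) as JDBC options rather than an option named ssl. The server URL, secret scope, user name, and table name below are placeholders, not values from the thread.

# Hedged sketch: reading an Azure MySQL table over SSL from a Databricks notebook.
host_url = "jdbc:mysql://<server-name>.mysql.database.azure.com:3306/<database>"  # placeholder
db_user = "<user>"                                                                # placeholder; single-server Azure MySQL may expect <user>@<server-name>
db_password = dbutils.secrets.get(scope="my-scope", key="mysql-password")         # assumption: a secret scope exists

df_target_master = (
    spark.read.format("jdbc")
    .option("driver", "com.mysql.jdbc.Driver")
    .option("url", host_url)
    .option("dbtable", "supply_master")   # placeholder table name
    .option("user", db_user)
    .option("password", db_password)
    .option("useSSL", "true")             # the working key is useSSL, not ssl
    .option("requireSSL", "true")         # assumption: server enforces SSL
    .load()
)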
Swapnil1998
by New Contributor III
  • 947 Views
  • 0 replies
  • 0 kudos

How to query a MySQL Table from Databricks?

I want to query a MySQL table using Databricks rather than reading the complete table with the dbtable option, which would help with incremental loads. remote_table = (spark.read .format("jdbc") .option("driver", driver) .option("url", URL) .option("quer...

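One way to do what the post describes is Spark's JDBC "query" option, which pushes a SELECT down to MySQL so only new rows are read (note that "query" and "dbtable" cannot be combined). A rough sketch; the connection values, table name, and watermark column are assumptions.

driver = "com.mysql.cj.jdbc.Driver"                                     # assumption
URL = "jdbc:mysql://<mysql-host>:3306/<database>"                       # placeholder
user = "<user>"                                                         # placeholder
password = dbutils.secrets.get(scope="my-scope", key="mysql-password")  # assumption
last_loaded = "2024-01-01 00:00:00"  # assumption: watermark tracked outside this job

# Only rows changed since the watermark are pulled from MySQL.
remote_table = (
    spark.read.format("jdbc")
    .option("driver", driver)
    .option("url", URL)
    .option("query", f"SELECT * FROM orders WHERE updated_at > '{last_loaded}'")
    .option("user", user)
    .option("password", password)
    .load()
)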
Gim
by Contributor
  • 1535 Views
  • 1 replies
  • 1 kudos

Incremental loading from Delta tables to MySQL database

I was thinking of using Azure Data Factory to orchestrate this, but is it possible to do incremental loads from tables in Delta format to a MySQL database? My Delta table sources will be coming from ADLS storage. Would I need to convert them firs...

Latest Reply
Debayan
Databricks Employee
  • 1 kudos

Hi, you can refer to https://community.databricks.com/s/question/0D53f00001UCQJ7CAP/delta-tables-incremental-backup-method. Please let us know if you need any further clarification.

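Not a definitive answer to the thread, but one common pattern is to enable Change Data Feed on the Delta table in ADLS and push only the changed rows to MySQL over JDBC; no file conversion is needed because Spark reads Delta directly. The path, JDBC URL, table names, and version tracking below are assumptions, and CDF must be enabled on the source table (delta.enableChangeDataFeed = true).

# Read only the rows that changed since the last processed table version.
changes = (
    spark.read.format("delta")
    .option("readChangeFeed", "true")
    .option("startingVersion", 42)   # assumption: last processed version tracked externally
    .load("abfss://lake@<storageaccount>.dfs.core.windows.net/silver/orders")  # placeholder path
)

inserts_and_updates = changes.filter("_change_type IN ('insert', 'update_postimage')")

# Append the changed rows to the MySQL target table over JDBC.
(inserts_and_updates
    .drop("_change_type", "_commit_version", "_commit_timestamp")
    .write.format("jdbc")
    .option("url", "jdbc:mysql://<mysql-host>:3306/<database>")   # placeholder
    .option("dbtable", "orders_mirror")                           # placeholder
    .option("user", "<user>")
    .option("password", dbutils.secrets.get(scope="my-scope", key="mysql-password"))
    .mode("append")
    .save())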
AmanSehgal
by Honored Contributor III
  • 3723 Views
  • 2 replies
  • 10 kudos

Migrating data from delta lake to RDS MySQL and ElasticSearch

There are mechanisms (like DMS) to get data from RDS to the delta lake and store it in Parquet format, but is it possible to do the reverse in AWS? I want to send data from the data lake to MySQL RDS tables in batch mode. And the next step is to send th...

Latest Reply
AmanSehgal
Honored Contributor III
  • 10 kudos

@Kaniz Fatma and @Hubert Dudek - writing to MySQL RDS is relatively simple. I'm still finding ways to export data into Elasticsearch.

1 More Replies
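A rough sketch of both legs discussed in the thread, assuming the Delta table path, RDS endpoint, and index name are placeholders, and that the elasticsearch-spark (es-hadoop) connector library is installed on the cluster for the second write.

df = spark.read.format("delta").load("s3://my-lake/silver/events")  # placeholder source path

# Leg 1: batch write from the data lake to MySQL RDS over JDBC.
(df.write.format("jdbc")
    .option("url", "jdbc:mysql://<rds-endpoint>:3306/<database>")   # placeholder
    .option("dbtable", "events")                                    # placeholder
    .option("user", "<user>")
    .option("password", dbutils.secrets.get(scope="my-scope", key="mysql-password"))
    .mode("append")
    .save())

# Leg 2: write the same DataFrame to Elasticsearch (requires the es-hadoop connector on the cluster).
(df.write.format("org.elasticsearch.spark.sql")
    .option("es.nodes", "<es-domain-endpoint>")   # placeholder
    .option("es.port", "443")
    .option("es.nodes.wan.only", "true")          # common setting for managed/VPC Elasticsearch
    .mode("append")
    .save("events-index"))                        # placeholder index name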
User16753724663
by Valued Contributor
  • 1111 Views
  • 1 replies
  • 0 kudos

Unable to use on-prem MySQL server as we are not able to resolve the hostname

While connecting from a notebook, it returns the error "unable to resolve name".

Latest Reply
User16753724663
Valued Contributor
  • 0 kudos

Since we are unable to resolve the hostname, it points towards a DNS issue. We can use a custom DNS via an init script and add it to the cluster: %scala dbutils.fs.put("/databricks/<directory>/dns-masq.sh",""" #!/bin/bash #####################################...

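As a lighter-weight variant of the dnsmasq approach in the reply above, a cluster init script can simply map the on-prem hostname to a known IP address. The DBFS path, hostname, and IP below are placeholders; the resulting script still has to be attached under the cluster's init scripts configuration so it runs before the JDBC connection is attempted.

# Write a cluster init script that adds a static hosts entry for the on-prem MySQL server.
dbutils.fs.put(
    "/databricks/init-scripts/onprem-mysql-hosts.sh",   # placeholder path
    """#!/bin/bash
# Map the on-prem MySQL hostname to its IP so the JDBC driver can resolve it
echo "10.0.0.25  mysql.corp.internal" >> /etc/hosts
""",
    True,  # overwrite if the script already exists
)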