- 5611 Views
- 3 replies
- 2 kudos
Even after specifying SSL options, I am unable to connect to MySQL. What could have gone wrong? Has anyone experienced similar issues? df_target_master = spark.read.format("jdbc")\.option("driver", "com.mysql.jdbc.Driver")\.option("url", host_url)\.optio...
Latest Reply
Hey, here is the solution: the correct option for SSL is "useSSL", not just "ssl". The code below should work: df_target_master = spark.read.format("jdbc")\.option("driver", "com.mysql.jdbc.Driver")\.option("url", host_url)\.option("dbtable", supply_ma...
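For reference, a minimal sketch of assembling the JDBC reader options with "useSSL" enabled. The helper function, URL, table, and credentials below are illustrative placeholders, not taken from the thread:

```python
# Illustrative helper (not from the thread): builds the option map for a
# Spark JDBC read against MySQL with SSL enabled.
def mysql_jdbc_options(host_url, dbtable, user, password):
    return {
        "driver": "com.mysql.jdbc.Driver",
        "url": host_url,
        "dbtable": dbtable,
        "user": user,
        "password": password,
        # "useSSL" is the MySQL Connector/J property name; "ssl" is ignored.
        "useSSL": "true",
    }

# With an active SparkSession, the read would look like:
# df_target_master = (spark.read.format("jdbc")
#     .options(**mysql_jdbc_options(
#         "jdbc:mysql://host:3306/db", "supply_master", "user", "pw"))
#     .load())
```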
2 More Replies
- 1060 Views
- 0 replies
- 0 kudos
I wanted to query a MySQL table from Databricks using a pushed-down query rather than reading the complete table via the dbtable option, which would help with incremental loads. remote_table = (spark.read .format("jdbc") .option("driver", driver) .option("url", URL) .option("quer...
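A sketch of how the "query" option can push a watermark filter down to MySQL so that only new rows are transferred; the table name, column name, and watermark value below are assumptions for illustration. Note that "query" and "dbtable" are mutually exclusive JDBC options:

```python
# Illustrative sketch (names are assumptions): build a pushdown query that
# selects only rows newer than the last processed watermark.
def incremental_query(table, watermark_col, last_value):
    return f"SELECT * FROM {table} WHERE {watermark_col} > '{last_value}'"

# With an active SparkSession the read would be:
# remote_table = (spark.read
#     .format("jdbc")
#     .option("driver", driver)
#     .option("url", URL)
#     .option("query", incremental_query("orders", "updated_at", "2023-01-01"))
#     .option("user", user)
#     .option("password", password)
#     .load())
```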
- 1267 Views
- 0 replies
- 6 kudos
In the docs it is mentioned that "if you use Azure Database for MySQL as an external metastore, you must change the value of the lower_case_table_names property from 1 (the default) to 2 in the server-side database configuration." However "lower_case_tab...
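To check what the server is actually using before pointing the metastore at it, the variable can be inspected with a query like the one below (a sketch; note that lower_case_table_names cannot be changed at runtime, so on Azure Database for MySQL it has to be set through the server parameters configuration before tables are created):

```sql
-- Check the current value (1 is the default; the Databricks docs require 2).
SHOW VARIABLES LIKE 'lower_case_table_names';
```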
- 1745 Views
- 1 replies
- 1 kudos
I was thinking of using Azure Data Factory to orchestrate this, but is it possible to do incremental loads from tables in Delta format into a MySQL database? My Delta table sources will be coming from ADLS storage. Would I need to convert them firs...
Latest Reply
Hi, you can refer to: https://community.databricks.com/s/question/0D53f00001UCQJ7CAP/delta-tables-incremental-backup-method. Please let us know if you need any further clarification.
- 4002 Views
- 2 replies
- 10 kudos
There are mechanisms (like DMS) to get data from RDS to delta lake and store the data in parquet format, but is it possible to reverse of this in AWS?I want to send data from data lake to MySQL RDS tables in batch mode.And the next step is to send th...
Latest Reply
@Kaniz Fatma and @Hubert Dudek - writing to MySQL RDS is relatively simple. I'm still finding ways to export data into Elasticsearch.
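For the MySQL RDS direction, a minimal sketch of a batch JDBC write from a Delta source; the paths, host, table name, and credentials are placeholders, not from the thread:

```python
# Illustrative helper (names are placeholders): option map for writing a
# DataFrame out to a MySQL RDS table over JDBC.
def jdbc_write_options(url, dbtable, user, password):
    return {
        "driver": "com.mysql.jdbc.Driver",
        "url": url,
        "dbtable": dbtable,
        "user": user,
        "password": password,
    }

# With an active SparkSession:
# (spark.read.format("delta").load("/mnt/lake/supply_master")
#      .write.format("jdbc")
#      .options(**jdbc_write_options("jdbc:mysql://rds-host:3306/db",
#                                    "supply_master", "user", "pw"))
#      .mode("append")   # or "overwrite" for a full refresh
#      .save())
```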
1 More Replies
- 1245 Views
- 1 replies
- 0 kudos
While connecting from the notebook, it returns an "unable to resolve name" error.
Latest Reply
Since we are unable to resolve the hostname, it points towards a DNS issue. We can set up custom DNS with an init script added to the cluster:%scala
dbutils.fs.put("/databricks/<directory>/dns-masq.sh","""
#!/bin/bash
#####################################...
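The script in the reply is truncated, so its actual body is unknown. The following is a sketch, under the assumption that it installs dnsmasq and forwards lookups to a custom DNS server; the package commands, config line, and DFS path are all assumptions for illustration:

```python
# Illustrative sketch (the real script is truncated in the thread; this body
# is an assumption): compose an init script that installs dnsmasq and points
# the cluster at a custom DNS server.
def dns_init_script(dns_server_ip):
    return "\n".join([
        "#!/bin/bash",
        "# Install dnsmasq and forward lookups to a custom DNS server.",
        "apt-get update && apt-get install -y dnsmasq",
        f"echo 'server={dns_server_ip}' >> /etc/dnsmasq.conf",
        "service dnsmasq restart",
    ])

# In a notebook (path and IP are placeholders), the script would then be
# written to DFS and attached to the cluster as an init script:
# dbutils.fs.put("/databricks/<directory>/dns-masq.sh",
#                dns_init_script("10.0.0.2"), True)
```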