APol
New Contributor II
since 09-08-2022
06-26-2023

User Stats

  • 1 Post
  • 0 Solutions
  • 0 Kudos given
  • 1 Kudos received

User Activity

Hi. I assume it could be a concurrency issue (a read thread from Databricks and a write thread from another system). From the start: I read 12-16 CSV files (approximately 250 MB each) into a DataFrame. df = spark.read.option("header", "False").opti...
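The preview cuts the read off at ".opti...", so the remaining options are not visible. A minimal sketch of what such a multi-file CSV read typically looks like is below; the inferSchema option and the input path are assumptions for illustration, not taken from the original post.

```python
# Sketch of the truncated read, assuming standard CSV options.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Read the 12-16 CSV files (~250 MB each) into a single DataFrame.
df = (
    spark.read
    .option("header", "false")      # files have no header row, as in the post
    .option("inferSchema", "true")  # assumed; the actual options are cut off
    .csv("/mnt/source/*.csv")       # hypothetical path to the CSV files
)
```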