ninjadev999
New Contributor II
since 02-11-2022
06-26-2023

User Stats

  • 4 Posts
  • 0 Solutions
  • 0 Kudos given
  • 0 Kudos received

User Activity

I'm reading a huge CSV file containing 39,795,158 records and writing it into a MSSQL Server table on Azure Databricks. The Databricks notebook is running on a cluster with 56 GB memory, 16 cores, and 12 workers. This is my code in Python and PySpark: from ...
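The poster's own code is truncated above, so the snippet below is only a minimal sketch of the general pattern described in the post: reading a large CSV with PySpark and writing it to SQL Server through Spark's JDBC writer. The file path, server name, database, table, credentials, and tuning values are all placeholders, not details from the original post.

```python
from pyspark.sql import SparkSession

# Sketch only; all paths, connection details, and table names are placeholders.
spark = SparkSession.builder.appName("csv-to-mssql").getOrCreate()

# Read the large CSV. For very large files, supplying an explicit schema
# avoids an extra pass over the data that schema inference would require.
df = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "false")
    .csv("/mnt/data/huge_file.csv")  # placeholder path
)

# Write to SQL Server via JDBC (Azure Databricks ships with the MSSQL JDBC driver).
(
    df.write
    .format("jdbc")
    .option("url", "jdbc:sqlserver://<server>.database.windows.net:1433;database=<db>")  # placeholder
    .option("dbtable", "dbo.target_table")  # placeholder
    .option("user", "<user>")
    .option("password", "<password>")
    .option("batchsize", 10000)    # larger JDBC batches reduce round trips
    .option("numPartitions", 12)   # roughly match the worker count for parallel writes
    .mode("append")
    .save()
)
```

With tens of millions of rows, the write throughput is usually governed by the JDBC `batchsize` and the number of partitions writing in parallel, so those are the first knobs to adjust for a cluster of this size.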