2 weeks ago
Hi there @Tiwarisk,
@Tiwarisk wrote: I am writing a file using this but the data type of columns get changed while reading.
If this is the main issue, you can explicitly specify your table schema like this:
from pyspark.sql.types import StructType, StructField, StringType, IntegerType, DoubleType

schema = StructType([
    StructField("column1", StringType(), True),
    StructField("column2", IntegerType(), True),
    StructField("column3", DoubleType(), True)
])
Then you can read the Excel file like this (in PySpark, to match the schema definition above):

# Read the Excel file with the specified schema
df = (spark.read
    .format("com.crealytics.spark.excel")
    .option("header", "true")
    .schema(schema)  # specify the schema here
    .load(path))
After this, the write step shouldn't cause trouble. The reason the types change in the first place is that when reading or writing with the `com.crealytics.spark.excel` format, Excel doesn't natively support all Spark data types, so the conversion isn't always lossless.
@Tiwarisk wrote:df.write.format("com.crealytics.spark.excel").option("header", "true").mode("overwrite").save(path)
2 weeks ago
I checked the library you are using to write to Excel and it seems there is a new version available that has improved data type handling.
https://github.com/crealytics/spark-excel
To use the V2 implementation, just change your format from .format("com.crealytics.spark.excel") to .format("excel").
Check the GitHub README for details. If your DataFrame has the same data types as the Excel table, I'm hoping this gives you some more luck.
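For illustration, here is a sketch of the V2 read/write path. This is a cluster-dependent connector snippet, not standalone code: it assumes the com.crealytics:spark-excel package is attached to your cluster, and `schema` and `path` are the placeholders from earlier in the thread.

```python
# Assumes the com.crealytics:spark-excel package is installed on the cluster.
# With the V2 implementation, the short name "excel" is used for both read and write.
df = (spark.read
    .format("excel")
    .option("header", "true")
    .schema(schema)  # same explicit schema as above
    .load(path))

(df.write
    .format("excel")
    .option("header", "true")
    .mode("overwrite")
    .save(path))
```

Keeping the explicit schema on the read side is still the safest way to make the round trip type-stable.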
2 weeks ago
Do you need to write the data again in Excel format? Do you need it in that format? If yes, when reading the Excel file back, are you inferring the schema of the file?
2 weeks ago
Yes, inferSchema is true.