yesterday
I am using Databricks version 15.4 and getting the below error while reading from JDBC and writing to an AWS S3 location:
yesterday - last edited yesterday
Hi @flashmav1 ,
In the first case it didn't work because your number has 14 digits before the decimal point, and DECIMAL(23,10) only allows 13 digits before the decimal point. In general, when you work with Decimal you deal with two things (see the sketch after this list):
- Precision (23): the total number of digits in the number - in your case 23
- Scale (10): the number of digits after the decimal point - in your case 10
- Digits available for the integer part: 23 - 10 = 13
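Here's a minimal sketch of the overflow, assuming a Databricks notebook where spark is already available and a made-up 14-digit value:

from pyspark.sql import functions as F
from pyspark.sql.types import DecimalType

# Hypothetical value with 14 digits before the decimal point
df = spark.createDataFrame([("12345678901234.5",)], ["value_str"])

# DECIMAL(23,10) can only hold 13 integer digits, so the cast overflows:
# Spark returns null here (or raises an overflow error if ANSI mode is enabled)
df.select(F.col("value_str").cast(DecimalType(23, 10)).alias("as_decimal")).show(truncate=False)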
Regarding your second attempt: according to the docs, one reason this error can appear is when some value in the source column has an actual precision greater than 38. So you can try to query your source database - maybe you have some weird outliers in the data.
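For example, something like this would count values whose integer part doesn't fit into 13 digits (table/column names are placeholders, and the exact SQL depends on your source database):

outliers = (spark.read.format("jdbc")
    .option("url", jdbcUrl)
    .option("query", "SELECT COUNT(*) AS cnt FROM your_table WHERE ABS(your_column) >= 10000000000000")
    .load())
outliers.show()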
You can also try pushing the casting down to your source and see if that works. Start by casting to a string:
df = (spark.read.format("jdbc")
    .option("url", jdbcUrl)
    .option("query", "SELECT CAST(your_column AS STRING) AS your_column FROM your_table")
    .load())
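If you need the column back as a decimal before writing to S3, you could cast it on the Spark side to a type wide enough for 14 integer digits - a rough sketch, assuming a Parquet write and a placeholder bucket/path:

from pyspark.sql import functions as F

# Cast the string column back to a decimal wide enough for a 14-digit integer part
df = df.withColumn("your_column", F.col("your_column").cast("decimal(24,10)"))

# Write out to S3 (path is a placeholder)
df.write.mode("overwrite").parquet("s3://your-bucket/your-path/")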
yesterday
I think it is related to the type of the number before the conversion to Decimal. Could you check what type it is?
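For example, something like this would show what type Spark infers for the column from the JDBC metadata (connection options and table name are placeholders):

df = (spark.read.format("jdbc")
    .option("url", jdbcUrl)
    .option("dbtable", "your_table")
    .load())

# Prints the inferred schema, including the decimal precision/scale
df.printSchema()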
10 hours ago
I was working on the Databricks PySpark code side because the error looked like a type issue in PySpark, but the actual issue was reading the 14-digit integer value from the source. So casting to string (in my case NVARCHAR) worked.
10 hours ago
Cool, glad that it worked!
yesterday - last edited yesterday
I am able to read it like this but unable to write to the S3 location.