I am attempting to apply Mosaic's `grid_pointascellid` function to a Spark DataFrame with `lat` and `lon` columns.
```
import pyspark.sql.functions as F
import mosaic as mos

mos.enable_mosaic(spark, dbutils)

# Create a Spark DataFrame with lat and lon columns
df = spark.createDataFrame([
    ("point1", 10.0, 20.0),
    ("point2", 30.0, 40.0),
    ("point3", 50.0, 60.0),
], ["name", "lat", "lon"])
df.show()

# Index each point at resolution 7
df1 = df.withColumn("id", mos.grid_pointascellid(mos.st_point(df["lon"], df["lat"]), F.lit(7)))
df1.show()
```
The schema of the actual DataFrame (where the error occurs) is:
```
|-- name: string (nullable = true)
|-- lat: decimal(9,6) (nullable = true)
|-- lng: decimal(9,6) (nullable = true)
```
Exception:
```
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 251.0 failed 4 times, most recent failure: Lost task 0.3 in stage 251.0 (TID 15016) (10.138.221.80 executor 50): java.lang.ClassCastException: org.apache.spark.sql.types.Decimal cannot be cast to java.lang.Double
```
I was following the docs here: https://databrickslabs.github.io/mosaic/api/spatial-indexing.html?highlight=grid_boundary#grid-point...
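A possible workaround, judging from the `Decimal cannot be cast to java.lang.Double` in the stack trace: the real columns are `decimal(9,6)`, while `st_point` apparently expects `DoubleType` coordinates. A sketch (untested on a cluster, and assuming Mosaic is imported as `mos`) that casts the coordinates to double first:

```python
import pyspark.sql.functions as F
import mosaic as mos

mos.enable_mosaic(spark, dbutils)

# Cast the decimal(9,6) columns to double before building the point
# geometry, so st_point receives DoubleType coordinates instead of Decimal.
df1 = df.withColumn(
    "id",
    mos.grid_pointascellid(
        mos.st_point(F.col("lon").cast("double"), F.col("lat").cast("double")),
        F.lit(7),
    ),
)
```

Note also that the posted schema names the longitude column `lng`, while the snippet references `df["lon"]`; whichever name the real DataFrame uses is the one to cast.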