I'm trying to verify which partition each row gets assigned to.
I'm running something like this:
from pyspark.sql.functions import spark_partition_id

# Read a small sample, force it into two partitions,
# and tag each row with the partition it lands in.
df = spark.read.table("some.uc.table").limit(10)
df = df.repartition(2)
df = df.withColumn("partitionid", spark_partition_id())
display(df)
This results in:
Insufficient privileges: User does not have permission SELECT on anonymous function
This really seems like a bug that needs to be fixed.
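For context, the per-partition check I'm ultimately after looks something like the sketch below. The table name is just a placeholder, and this assumes spark_partition_id() is actually permitted on the cluster, which is exactly what fails for me:

from pyspark.sql.functions import spark_partition_id, count

# Sketch only: count how many of the sampled rows end up in each partition.
# "some.uc.table" is a placeholder; this assumes spark_partition_id() is allowed.
df = spark.read.table("some.uc.table").limit(10).repartition(2)
per_partition = (
    df.withColumn("partitionid", spark_partition_id())
      .groupBy("partitionid")
      .agg(count("*").alias("rows"))
)
display(per_partition)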