Hi,
when I run the simple code below against my Unity Catalog tables on a Shared cluster, it works fine.
But on a Single User cluster I get:
Failed to acquire a SAS token for list on /__unitystorage/schemas/1bb5b053-ac96-471b-8077-8288c56c8a20/tables/90a7830e-2d62-4b51-8e7b-64de6d367859/_delta_log due to java.util.concurrent.ExecutionException: org.apache.spark.sql.AnalysisException: 403: Your token is missing the required scopes for this endpoint.
Please advise!
%python
from pyspark.sql import SparkSession
sql_query = f"SELECT * from z_product_engineering_mpe_mit_trace.trace_api.lot_at_operation limit 2"
# Create a Spark session
spark = SparkSession.builder \
.appName("Python Spark SQL") \
.getOrCreate()
# Read data from Databricks table into a DataFrame
df = spark.sql(sql_query)
display(df)
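
For reference, in a Databricks notebook a spark session is already pre-configured, so the builder step only returns the existing session via getOrCreate(); a minimal equivalent of the same query (same catalog/schema/table as above) is:

%python
# Use the notebook's built-in spark session directly
df = spark.sql(
    "SELECT * FROM z_product_engineering_mpe_mit_trace.trace_api.lot_at_operation LIMIT 2"
)
display(df)

The error is the same regardless of which form I use; it only depends on the cluster access mode (Shared vs Single User).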