Running a SQL command on a Single User cluster vs. a Shared cluster
01-20-2024 10:31 PM
Hi,
When I run the simple code below against my Unity Catalog table on a Shared cluster, it works fine. But on a Single User cluster I get:
Failed to acquire a SAS token for list on /__unitystorage/schemas/1bb5b053-ac96-471b-8077-8288c56c8a20/tables/90a7830e-2d62-4b51-8e7b-64de6d367859/_delta_log due to java.util.concurrent.ExecutionException: org.apache.spark.sql.AnalysisException: 403: Your token is missing the required scopes for this endpoint.
Please advise!
%python
from pyspark.sql import SparkSession

sql_query = "SELECT * FROM z_product_engineering_mpe_mit_trace.trace_api.lot_at_operation LIMIT 2"

# Get (or create) the Spark session
spark = SparkSession.builder \
    .appName("Python Spark SQL") \
    .getOrCreate()

# Read data from the Databricks table into a DataFrame
df = spark.sql(sql_query)
display(df)
01-21-2024 11:05 PM
Hi, could you please review the access-mode limitations documented here: https://docs.databricks.com/en/compute/access-mode-limitations.html . Single User and Shared access modes support different Unity Catalog features, which can explain why the same query succeeds on one cluster type and fails on the other. Please let us know if this helps.
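Since the behavior differs by cluster access mode, a first diagnostic step is to confirm which access mode the failing cluster actually uses. One way is to inspect the `data_security_mode` field in the response of the Databricks Clusters API (`GET /api/2.0/clusters/get`). The sketch below parses a sample response offline; the payload shape is assumed from the public API docs, and the sample values are illustrative, not taken from this thread:

```python
import json

# Illustrative payload shaped like a Clusters API (clusters/get) response.
# Field names are assumed from the Databricks REST API documentation.
sample_response = json.dumps({
    "cluster_id": "0000-000000-example0",          # hypothetical cluster ID
    "data_security_mode": "SINGLE_USER",           # or "USER_ISOLATION" for Shared
})

def access_mode(clusters_get_json: str) -> str:
    """Extract the access mode from a clusters/get JSON response.

    SINGLE_USER    -> "Single user" access mode
    USER_ISOLATION -> "Shared" access mode
    """
    payload = json.loads(clusters_get_json)
    return payload.get("data_security_mode", "NONE")

print(access_mode(sample_response))  # SINGLE_USER
```

In a real workspace you would fetch the response with an authenticated request to the Clusters API for the cluster in question, then compare the reported mode against the limitations page linked above.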