Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Can't query Delta tables, token missing required scope

Miro_ta
New Contributor III

Hello,
I've correctly set up a stream from Kinesis, but I can't read anything from my Delta table.

I'm actually reproducing the demo from Frank Munz: https://github.com/fmunz/delta-live-tables-notebooks/tree/main/motion-demo

and I'm running the following script, which is named M-Magn SSS.py:

 

# Databricks notebook source
# MAGIC %md
# MAGIC ![Name of the image](https://raw.githubusercontent.com/fmunz/motion/master/img/magn.png)

# COMMAND ----------

# MAGIC %sql
# MAGIC USE demo_frank.motion;
# MAGIC DESCRIBE TABLE sensor;

# COMMAND ----------

from pyspark.sql.functions import window

# spark.sql.shuffle.partitions defaults to 200; lower it to match the number of cores
spark.conf.set("spark.sql.shuffle.partitions", 30)

# 3-second watermark on the time column, averaged over 1-second tumbling windows,
# newest windows first
display(
    spark.readStream.format("delta")
    .table("sensor")
    .withWatermark("time", "3 seconds")
    .groupBy(window("time", "1 second"))
    .avg("magn")
    .orderBy("window", ascending=False)
    .limit(30)
)

 

I've created the pipeline successfully, and I'm now trying to look at my Delta table, but Databricks raises the following error when I run this:

 

 

%sql
USE bigdatalab.motion;
DESCRIBE TABLE sensor;

 

Error in SQL statement: ExecutionException: org.apache.spark.sql.AnalysisException: 403: Your token is missing the required scopes for this endpoint.

I don't know what the cause of the problem could be; I've tried to set up all the authorizations, but it doesn't work.

I hope you can help me. I'm using the 14-day Databricks on AWS trial.

 

2 ACCEPTED SOLUTIONS


Miro_ta
New Contributor III

Not really, but in my case I solved it by using the Hive metastore instead of Unity Catalog; maybe it's because I'm on a trial version.


Brosenberg
Databricks Employee

At this time, Delta Live Tables in Unity Catalog can only be read using Shared compute or a SQL Warehouse (support for reading from Assigned compute is on the roadmap).

To read the table using Assigned compute (e.g. Personal Compute), first make a copy of the table using Shared compute or a SQL Warehouse (i.e. not in a DLT pipeline). You can then use Assigned compute to read from the copy.


8 REPLIES

Miro_ta
New Contributor III

If I query any other table that is not a Delta table, I don't get any error.

trejas
New Contributor II

Did you figure out an answer to this? I'm seeing the same thing on a Delta streaming table.

Miro_ta
New Contributor III

Not really, but in my case I solved it by using the Hive metastore instead of Unity Catalog; maybe it's because I'm on a trial version.
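For anyone who wants to try the same workaround, here is a minimal sketch. It assumes the pipeline's target schema was pointed at the legacy Hive metastore rather than Unity Catalog; the schema and table names are illustrative, taken from the thread:

```sql
-- Workaround sketch: query through the legacy Hive metastore catalog
-- instead of a Unity Catalog catalog. Names are illustrative.
USE CATALOG hive_metastore;
USE motion;
DESCRIBE TABLE sensor;
```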

jlb0001
New Contributor III

I'm running into a similar situation: DESCRIBE <table> on a table in UC works fine, but running DESCRIBE EXTENDED <table> on that same table gives me that same error from a Databricks notebook.

Strangely, I don't get this error if I switch the notebook to use a Serverless SQL Warehouse instead of a personal compute cluster.

So I personally definitely have whatever access is needed, but somehow this compute instance cannot access just the extended properties?

ExecutionException: org.apache.spark.sql.AnalysisException: 403: Your token is missing the required scopes for this endpoint.

What permissions must the compute instance have to access extended properties?

udi_azulay
New Contributor II

I am having a similar issue; any update on this one?

nag_kanchan
New Contributor III

I am facing the same issue, but I cannot move my data to the hive_metastore. Does anyone have a solution for Unity Catalog?

@Retired_mod 

Brosenberg
Databricks Employee

At this time, Delta Live Tables in Unity Catalog can only be read using Shared compute or a SQL Warehouse (support for reading from Assigned compute is on the roadmap).

To read the table using Assigned compute (e.g. Personal Compute), first make a copy of the table using Shared compute or a SQL Warehouse (i.e. not in a DLT pipeline). You can then use Assigned compute to read from the copy.
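A minimal sketch of that copy step, using the catalog and schema names from the question (a `sensor_copy` table name is assumed for illustration). Run it on Shared compute or a SQL Warehouse, outside the DLT pipeline:

```sql
-- Materialize a plain Delta copy of the DLT table
-- (catalog/schema/table names are illustrative).
CREATE OR REPLACE TABLE bigdatalab.motion.sensor_copy AS
SELECT * FROM bigdatalab.motion.sensor;

-- The copy can then be queried from Assigned (e.g. Personal) compute:
DESCRIBE TABLE bigdatalab.motion.sensor_copy;
```

Note that the copy is a snapshot: it will not pick up new data written by the pipeline unless you re-run the copy step.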

holly
Databricks Employee

Hello, I also had this issue. It was because I was trying to read a DLT table with a Machine Learning Runtime. At the time of writing, Machine Learning Runtimes are not compatible with shared access mode, so I ended up setting up two clusters, one with MLR in assigned access mode and one with a standard runtime in shared access mode, and then played a fun game of dependency Jenga.

I was only setting up a demo, so it was no big deal. If you're working on something in production, I'm not sure I could recommend this approach unless it's a last resort.
