Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Writing to Foreign catalog

Fatimah-Tariq
New Contributor III

I have a notebook job where I do some processing and write tables to a foreign catalog. It has been running successfully for about a year. The job is scheduled and runs on a job cluster with DBR 16.2.

Recently, I had to add a new notebook to the job that does almost the same operations. But while testing this notebook standalone on an interactive cluster created with DBR 16.4 (because 16.2 is deprecated and no longer available for interactive clusters), it throws an error on the code block where I write to the foreign catalog: "SparkConnectGrpcException: (java.lang.SecurityException) PERMISSION_DENIED: Only READ credentials can be retrieved for foreign tables."
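For context, a minimal sketch of the kind of write that hits this error. The catalog, schema, and table names below are assumptions for illustration, not taken from my actual job; `spark` is the session a Databricks notebook provides.

```python
# Hypothetical sketch (assumed names) of a write to a Lakehouse-Federation
# (foreign) catalog. The three-level name points at a table in the foreign
# catalog, not a regular Unity Catalog managed table.
FOREIGN_TARGET = "foreign_catalog.some_schema.some_table"  # assumed name

def write_to_foreign_catalog(spark, df, target: str = FOREIGN_TARGET) -> None:
    """Overwrite a table in a foreign catalog.

    On a DBR 16.4 interactive cluster this kind of call raises
    SparkConnectGrpcException / PERMISSION_DENIED ("Only READ credentials
    can be retrieved for foreign tables"), while the same code runs fine
    on the scheduled job's older DBR 16.2 job cluster.
    """
    df.write.mode("overwrite").saveAsTable(target)
```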

Can anyone please help me understand this scenario and suggest ways to test my new notebook in an interactive environment?

Please note that the new notebook does not throw an error on this write statement when I add it to my scheduled job (I believe that's because the job cluster for this scheduled job still uses the older DBR version, and it runs all my other notebooks successfully too). But I need to test my code in an interactive environment before adding it to the prod job. Any help is appreciated.
