Access Foreign Catalog using Python in Notebook

kashy
New Contributor III

Hello - I have a foreign catalog which I can access fine in SQL. However, I can't access it from a Python notebook.

i.e. this works just fine in a notebook attached to a Pro SQL Warehouse:

%sql
USE CATALOG <my_foreign_catalog_name>;
USE SCHEMA public;
SELECT * from bookings;

 

However, this, running on a Shared Cluster on 13.3 LTS (includes Apache Spark 3.4.1, Scala 2.12):
spark.sql("USE CATALOG <my_foreign_catalog_name>")
spark.sql("USE SCHEMA public")
display(spark.table("bookings"))
 
Gives me this error:
org.apache.spark.sql.connector.catalog.CatalogNotFoundException: Catalog 'my_foreign_catalog_name' plugin class not found: spark.sql.catalog.my_foreign_catalog_name is not defined
 
 
If I run:
display(spark.sql("SHOW CATALOGS"))
 
I can't see the foreign catalogs I have created in Unity Catalog; it just shows me 'spark_catalog'.
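(As an aside, not from the thread: the SHOW CATALOGS check above can be wrapped in a small helper so a notebook can test catalog visibility programmatically before running queries. The helper below is illustrative; the catalog name is a placeholder.)

```python
# Hypothetical helper (not part of the original post): run SHOW CATALOGS
# and check whether a given catalog name is visible to the current session.
def catalog_visible(spark, name: str) -> bool:
    """Return True if `name` appears in the output of SHOW CATALOGS."""
    rows = spark.sql("SHOW CATALOGS").collect()
    return any(row[0] == name for row in rows)

# On a correctly configured Unity Catalog cluster (assumption, not verified here):
# catalog_visible(spark, "my_foreign_catalog_name")  # expected True once fixed
```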
 
Any idea what needs tweaking?
1 ACCEPTED SOLUTION

Accepted Solutions

kashy
New Contributor III

thanks @Debayan 

I resolved this - I had to remove this option on the cluster and it works now:

Enable credential passthrough for user-level data access
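(Not from the thread, just a sketch: once the cluster is fixed, fully qualified three-part names avoid relying on USE CATALOG / USE SCHEMA session state altogether. The helper and names below are illustrative assumptions.)

```python
# Hypothetical helper (not part of the original post): build a backtick-quoted
# three-part identifier so queries don't depend on USE CATALOG / USE SCHEMA.
def three_part_name(catalog: str, schema: str, table: str) -> str:
    """Return the identifier as `catalog`.`schema`.`table`, each part quoted."""
    return ".".join(f"`{part}`" for part in (catalog, schema, table))

# Usage on the cluster (assumption; requires Unity Catalog access to work):
# df = spark.table(three_part_name("my_foreign_catalog_name", "public", "bookings"))
```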


2 REPLIES

Debayan
Esteemed Contributor III

Hi, are you using this on a single-user cluster?

Also, please tag @Debayan in your next response so that I get notified.

kashy
New Contributor III

thanks @Debayan 

I resolved this - I had to remove this option on the cluster and it works now:

Enable credential passthrough for user-level data access