Hello - I have a foreign catalog which I can access fine in SQL. However, I can't access it from a Python notebook.
i.e. this works just fine in a notebook attached to a Pro SQL Warehouse:
%sql
USE CATALOG <my_foreign_catalog_name>;
USE SCHEMA public;
SELECT * FROM bookings;
However, the equivalent code, running on a Shared Cluster on 13.3 LTS (includes Apache Spark 3.4.1, Scala 2.12):
spark.sql("USE CATALOG <my_foreign_catalog_name>")
spark.sql("USE SCHEMA public")
display(spark.table("bookings"))
gives me this error:
org.apache.spark.sql.connector.catalog.CatalogNotFoundException: Catalog 'my_foreign_catalog_name' plugin class not found: spark.sql.catalog.my_foreign_catalog_name is not defined
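For completeness, this is a sketch of the fully qualified form I would also expect to work (same placeholder catalog/schema/table names as above; I am only showing what I am trying to achieve):
# Same read, but using the fully qualified three-level name
# instead of USE CATALOG / USE SCHEMA
df = spark.table("<my_foreign_catalog_name>.public.bookings")
display(df)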
If I run:
display(spark.sql("SHOW CATALOGS"))
I can't see the foreign catalogs I have created in Unity Catalog; it just shows me 'spark_catalog'.
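For reference, a sketch of an equivalent check through the PySpark Catalog API (available on Spark 3.4, and which I assume should surface the same list as SHOW CATALOGS):
# List the catalogs visible to this cluster (PySpark 3.4+)
for cat in spark.catalog.listCatalogs():
    print(cat.name)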
Any idea what needs tweaking?