I am trying to use the Spark session of the compute in a Java JAR to run queries against Unity Catalog tables, but I get an error. This is my code:
SparkSession spark = SparkSession.builder()
.appName("Databricks Query Example")
.config("spark.master", "local")
// .config("spark.databricks.sql.initial.catalog.name", "110166_ctg_dev")
// .config("spark.jars.packages", "io.delta:delta-spark_2.12:3.2.0,io.unitycatalog:unitycatalog-spark:0.2.0-SNAPSHOT")
// .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
// .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
// .config("spark.sql.catalog.110166_ctg_dev", "io.unitycatalog.connectors.spark.UCSingleCatalog")
// .config("spark.sql.defaultCatalog", "110166_ctg_dev")
.config("spark.jars.packages", "io.delta:delta-core_2.12:2.4.0")
.config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
.config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
.config("spark.local.ip", "127.0.0.1")
.config("spark.driver.host", "127.0.0.1")
.getOrCreate();
// // Example query
// Dataset<Row> df = spark.read().json("path/to/your/json/file");
// df.createOrReplaceTempView("table");
spark.sql("SHOW CATALOGS").show();
Dataset<Row> result = spark.sql("select * from XXXXXX.YYYYYYYYYYY.ZZZZZZZZZZ");
result.show();
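For context, SHOW CATALOGS only lists spark_catalog, so the three-part table name is being resolved by the default session catalog, which only accepts single-part namespaces. My understanding is that a catalog named 110166_ctg_dev would need to be registered explicitly, roughly along the lines of the commented-out configs above. A sketch of what I believe the builder should look like (the connector class, artifact coordinates, and the uri/token option names are assumptions based on the open-source Unity Catalog Spark connector, not verified):

```java
// Sketch only: register "110166_ctg_dev" as a Unity Catalog catalog so that
// three-part names like 110166_ctg_dev.schema.table resolve against it
// instead of falling through to spark_catalog.
SparkSession spark = SparkSession.builder()
        .appName("Databricks Query Example")
        .config("spark.master", "local")
        .config("spark.jars.packages",
                "io.delta:delta-spark_2.12:3.2.0,io.unitycatalog:unitycatalog-spark:0.2.0-SNAPSHOT")
        .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
        .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
        // Register the UC connector for the catalog name used in queries:
        .config("spark.sql.catalog.110166_ctg_dev", "io.unitycatalog.connectors.spark.UCSingleCatalog")
        // Assumed options: endpoint and credentials for the UC server
        .config("spark.sql.catalog.110166_ctg_dev.uri", "https://<workspace-host>")
        .config("spark.sql.catalog.110166_ctg_dev.token", "<access-token>")
        .config("spark.sql.defaultCatalog", "110166_ctg_dev")
        .getOrCreate();
```

With the catalog registered, `SHOW CATALOGS` should list 110166_ctg_dev alongside spark_catalog, which is how I'd verify the registration took effect before running the SELECT.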
This is the error we are getting:
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
+-------------+
|      catalog|
+-------------+
|spark_catalog|
+-------------+
Exception in thread "main" org.apache.spark.sql.AnalysisException: [REQUIRES_SINGLE_PART_NAMESPACE] spark_catalog requires a single-part namespace, but got `110166_ctg_dev`.`trusted__ccb__cpe__pat__jdx_db`.
    at org.apache.spark.sql.errors.QueryCompilationErrors$.requiresSinglePartNamespaceError(QueryCompilationErrors.scala:1336)
    at org.apache.spark.sql.execution.datasources.v2.V2SessionCatalog$TableIdentifierHelper.asTableIdentifier(V2SessionCatalog.scala:245)
    at org.apache.spark.sql.execution.datasources.v2.V2SessionCatalog.loadTable(V2SessionCatalog.scala:75)
    at org.apache.spark.sql.connector.catalog.DelegatingCatalogExtension.loadTable(DelegatingCatalogExtension.java:73)