Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Using the compute's Spark session from a Java JAR to run queries against Unity Catalog tables

hemprasad
New Contributor II

I am trying to use the Spark session of the compute in a Java JAR to run queries against Unity Catalog tables. I get the following error:

 

SparkSession spark = SparkSession.builder()
        .appName("Databricks Query Example")
        .config("spark.master", "local")
        // Earlier attempts with the Unity Catalog Spark connector, left commented out:
        // .config("spark.databricks.sql.initial.catalog.name", "110166_ctg_dev")
        // .config("spark.jars.packages", "io.delta:delta-spark_2.12:3.2.0,io.unitycatalog:unitycatalog-spark:0.2.0-SNAPSHOT")
        // .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
        // .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
        // .config("spark.sql.catalog.110166_ctg_dev", "io.unitycatalog.connectors.spark.UCSingleCatalog")
        // .config("spark.sql.defaultCatalog", "110166_ctg_dev")
        .config("spark.jars.packages", "io.delta:delta-core_2.12:2.4.0")
        .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
        .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
        .config("spark.local.ip", "127.0.0.1")
        .config("spark.driver.host", "127.0.0.1")
        .getOrCreate();

// Example query against a local file, also left commented out:
// Dataset<Row> df = spark.read().json("path/to/your/json/file");
// df.createOrReplaceTempView("table");

// List the catalogs the session can see, then query a three-part table name.
spark.sql("SHOW CATALOGS").show();
Dataset<Row> result = spark.sql("select * from XXXXXX.YYYYYYYYYYY.ZZZZZZZZZZ");
result.show();

 

The following is the error we are getting:

 

SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
+-------------+
|      catalog|
+-------------+
|spark_catalog|
+-------------+
Exception in thread "main" org.apache.spark.sql.AnalysisException: [REQUIRES_SINGLE_PART_NAMESPACE] spark_catalog requires a single-part namespace, but got `110166_ctg_dev`.`trusted__ccb__cpe__pat__jdx_db`.
        at org.apache.spark.sql.errors.QueryCompilationErrors$.requiresSinglePartNamespaceError(QueryCompilationErrors.scala:1336)
        at org.apache.spark.sql.execution.datasources.v2.V2SessionCatalog$TableIdentifierHelper.asTableIdentifier(V2SessionCatalog.scala:245)
        at org.apache.spark.sql.execution.datasources.v2.V2SessionCatalog.loadTable(V2SessionCatalog.scala:75)
        at org.apache.spark.sql.connector.catalog.DelegatingCatalogExtension.loadTable(DelegatingCatalogExtension.java:73)
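Reading the trace: SHOW CATALOGS returned only `spark_catalog`, so no catalog named `110166_ctg_dev` is registered with the session. Spark therefore routes the whole three-part name to the default session catalog and hands it `110166_ctg_dev`.`trusted__ccb__cpe__pat__jdx_db` as a two-part namespace, which V2SessionCatalog rejects. A standalone sketch of that resolution rule (plain Java, no Spark dependency; the method and catalog list below are illustrative, not Spark's actual API):

```java
import java.util.Arrays;
import java.util.List;

public class NamespaceDemo {
    // Illustrative only: mimics how Spark splits a multi-part table name.
    // If the first part matches a registered catalog, it is stripped off and
    // the rest becomes namespace + table for that catalog. Otherwise the
    // session catalog (spark_catalog) receives ALL leading parts as the
    // namespace -- and V2SessionCatalog only accepts a one-part namespace.
    static String resolve(List<String> registeredCatalogs, String ident) {
        String[] parts = ident.split("\\.");
        String catalog;
        String[] namespaceAndTable;
        if (registeredCatalogs.contains(parts[0])) {
            catalog = parts[0];
            namespaceAndTable = Arrays.copyOfRange(parts, 1, parts.length);
        } else {
            catalog = "spark_catalog";
            namespaceAndTable = parts;
        }
        int namespaceLen = namespaceAndTable.length - 1;
        if (catalog.equals("spark_catalog") && namespaceLen > 1) {
            return "REQUIRES_SINGLE_PART_NAMESPACE: got " + namespaceLen + "-part namespace";
        }
        return catalog + " loads table with " + namespaceLen + "-part namespace";
    }

    public static void main(String[] args) {
        // No UC catalog registered: the default session catalog rejects the name.
        System.out.println(resolve(List.of(), "110166_ctg_dev.my_schema.my_table"));
        // → REQUIRES_SINGLE_PART_NAMESPACE: got 2-part namespace

        // With the catalog registered, the first part is stripped and resolution succeeds.
        System.out.println(resolve(List.of("110166_ctg_dev"), "110166_ctg_dev.my_schema.my_table"));
        // → 110166_ctg_dev loads table with 1-part namespace
    }
}
```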

 

1 REPLY

samantha789
New Contributor II

@hemprasad wrote:

I am trying to use the Spark session of the compute in a Java JAR to run queries against Unity Catalog tables …

 


 



The [REQUIRES_SINGLE_PART_NAMESPACE] error occurs because no catalog named 110166_ctg_dev is registered in your session (SHOW CATALOGS lists only spark_catalog). Spark therefore resolves the three-part name against the default spark_catalog, whose V2SessionCatalog accepts only a single-part namespace, i.e. `schema`.`table`. To make the three-part name resolve, register the Unity Catalog catalog with the session, along the lines of the spark.sql.catalog.110166_ctg_dev settings you already have commented out, or run the JAR on a Databricks cluster where Unity Catalog is configured for you. Review the Unity Catalog documentation for the exact connector settings.
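If the goal is for the session to resolve 110166_ctg_dev as a real catalog, the commented-out settings in the original post are the right direction. A hedged sketch of the relevant configuration, in spark-defaults style (the connector class and Maven coordinates are copied from the post's commented-out attempt; the `.uri` and `.token` option names follow the open-source Unity Catalog Spark connector, and the endpoint/token values are placeholders, not verified settings):

```
spark.jars.packages                      io.delta:delta-spark_2.12:3.2.0,io.unitycatalog:unitycatalog-spark:0.2.0-SNAPSHOT
spark.sql.extensions                     io.delta.sql.DeltaSparkSessionExtension
spark.sql.catalog.spark_catalog          org.apache.spark.sql.delta.catalog.DeltaCatalog
spark.sql.catalog.110166_ctg_dev         io.unitycatalog.connectors.spark.UCSingleCatalog
spark.sql.catalog.110166_ctg_dev.uri     https://<your-uc-endpoint>
spark.sql.catalog.110166_ctg_dev.token   <access-token>
spark.sql.defaultCatalog                 110166_ctg_dev
```

With a catalog of that name registered, SHOW CATALOGS should list 110166_ctg_dev, and a three-part `select * from 110166_ctg_dev.<schema>.<table>` resolves against it instead of falling through to spark_catalog.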
