Data Governance
Reading an Athena table created on top of S3 in Databricks

HemanthRatakond
New Contributor II

Hi,

We have a Databricks workspace that uses the AWS Glue Data Catalog as its metastore. When I try to read an Athena table that was created on top of S3 data, I get the following error:

com.databricks.backend.common.rpc.SparkDriverExceptions$SQLExecutionException: java.lang.RuntimeException: java.lang.ClassNotFoundException: org.openx.data.jsonserde.JsonSerDe
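For context, `org.openx.data.jsonserde.JsonSerDe` is the Hive SerDe that Athena commonly records in a JSON table's metadata; Spark needs that class on the cluster classpath to deserialize the underlying files. Roughly, the SerDe's job is to turn each JSON line in the S3 objects into a named-column row. A minimal stdlib sketch of that idea (an illustration only, not the actual SerDe API):

```python
import json

def deserialize_json_lines(raw_lines):
    """Roughly what a JSON SerDe does for Hive/Athena tables:
    turn each JSON text line of the underlying files into a row (dict),
    skipping blank lines."""
    return [json.loads(line) for line in raw_lines if line.strip()]

rows = deserialize_json_lines(['{"id": 1, "name": "a"}', '{"id": 2, "name": "b"}'])
# Each row is now addressable by column name, e.g. rows[0]["name"]
```

When the class named in the table metadata is missing from the cluster, Spark cannot construct this deserializer, which is exactly the `ClassNotFoundException` above.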


daniel_sahal
Honored Contributor III

@Hemanth Ratakonda​ Can you paste the full error message? The one you've pasted looks truncated.

HemanthRatakond
New Contributor II

@Daniel Sahal​ Here is the full stack trace:

    at org.apache.hadoop.hive.ql.plan.TableDesc.getDeserializerClass(TableDesc.java:79)
    at org.apache.spark.sql.hive.execution.HiveTableScanExec.addColumnMetadataToConf(HiveTableScanExec.scala:127)
    at org.apache.spark.sql.hive.execution.HiveTableScanExec.hadoopConf$lzycompute(HiveTableScanExec.scala:104)
    at org.apache.spark.sql.hive.execution.HiveTableScanExec.hadoopConf(HiveTableScanExec.scala:101)
    at org.apache.spark.sql.hive.execution.HiveTableScanExec.hadoopReader$lzycompute(HiveTableScanExec.scala:113)
    at org.apache.spark.sql.hive.execution.HiveTableScanExec.hadoopReader(HiveTableScanExec.scala:108)
    at org.apache.spark.sql.hive.execution.HiveTableScanExec.$anonfun$doExecute$2(HiveTableScanExec.scala:214)
    at org.apache.spark.util.Utils$.withDummyCallSite(Utils.scala:2952)
    at org.apache.spark.sql.hive.execution.HiveTableScanExec.doExecute(HiveTableScanExec.scala:214)
    at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:232)
    at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:276)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:165)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:272)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:228)
    at org.apache.spark.sql.execution.collect.Collector$.collect(Collector.scala:121)
    at org.apache.spark.sql.execution.collect.Collector$.collect(Collector.scala:133)
    at org.apache.spark.sql.execution.qrc.InternalRowFormat$.collect(cachedSparkResults.scala:120)
    at org.apache.spark.sql.execution.qrc.InternalRowFormat$.collect(cachedSparkResults.scala:108)
    at org.apache.spark.sql.execution.qrc.InternalRowFormat$.collect(cachedSparkResults.scala:90)
    at org.apache.spark.sql.execution.qrc.ResultCacheManager.$anonfun$computeResult$1(ResultCacheManager.scala:528)
    at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:80)
    at org.apache.spark.sql.execution.qrc.ResultCacheManager.collectResult$1(ResultCacheManager.scala:520)
    at org.apache.spark.sql.execution.qrc.ResultCacheManager.computeResult(ResultCacheManager.scala:540)
    at org.apache.spark.sql.execution.qrc.ResultCacheManager.$anonfun$getOrComputeResultInternal$1(ResultCacheManager.scala:395)
    at scala.Option.getOrElse(Option.scala:189)
    at org.apache.spark.sql.execution.qrc.ResultCacheManager.getOrComputeResultInternal(ResultCacheManager.scala:388)
    at org.apache.spark.sql.execution.qrc.ResultCacheManager.getOrComputeResult(ResultCacheManager.scala:286)
    at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeCollectResult$1(SparkPlan.scala:438)
    at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:80)
    at org.apache.spark.sql.execution.SparkPlan.executeCollectResult(SparkPlan.scala:435)
    at org.apache.spark.sql.Dataset.collectResult(Dataset.scala:3471)
    at org.apache.spark.sql.Dataset.$anonfun$collectResult$1(Dataset.scala:3462)
    at org.apache.spark.sql.Dataset.$anonfun$withAction$3(Dataset.scala:4344)
    at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:789)
    at org.apache.spark.sql.Dataset.$anonfun$withAction$2(Dataset.scala:4342)
    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$8(SQLExecution.scala:245)
    at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:414)
    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$1(SQLExecution.scala:190)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1003)
    at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:144)
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:364)
    at org.apache.spark.sql.Dataset.withAction(Dataset.scala:4342)
    at org.apache.spark.sql.Dataset.collectResult(Dataset.scala:3461)
    at com.databricks.backend.daemon.driver.OutputAggregator$.withOutputAggregation0(OutputAggregator.scala:267)
    at com.databricks.backend.daemon.driver.OutputAggregator$.withOutputAggregation(OutputAggregator.scala:101)
    at com.databricks.backend.daemon.driver.SQLDriverLocal.executeSql(SQLDriverLocal.scala:115)
    at com.databricks.backend.daemon.driver.SQLDriverLocal.repl(SQLDriverLocal.scala:145)
    at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$23(DriverLocal.scala:731)
    at com.databricks.unity.EmptyHandle$.runWith(UCSHandle.scala:103)
    at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$20(DriverLocal.scala:714)
    at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:401)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
    at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:158)
    at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:399)
    at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:396)
    at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:64)
    at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:444)
    at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:429)
    at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:64)
    at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:691)
    at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:622)
    at scala.util.Try$.apply(Try.scala:213)
    at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:614)
    at com.databricks.backend.daemon.driver.DriverWrapper.executeCommandAndGetError(DriverWrapper.scala:533)
    at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:568)
    at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:438)
    at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:381)
    at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:232)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.ClassNotFoundException: org.openx.data.jsonserde.JsonSerDe
    at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at org.apache.hadoop.hive.ql.plan.TableDesc.getDeserializerClass(TableDesc.java:76)
    ... 68 more
    at com.databricks.backend.daemon.driver.SQLDriverLocal.executeSql(SQLDriverLocal.scala:130)
    at com.databricks.backend.daemon.driver.SQLDriverLocal.repl(SQLDriverLocal.scala:145)
    at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$23(DriverLocal.scala:731)
    at com.databricks.unity.EmptyHandle$.runWith(UCSHandle.scala:103)
    at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$20(DriverLocal.scala:714)
    at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:401)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
    at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:158)
    at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:399)
    at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:396)
    at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:64)
    at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:444)
    at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:429)
    at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:64)
    at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:691)
    at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:622)
    at scala.util.Try$.apply(Try.scala:213)
    at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:614)
    at com.databricks.backend.daemon.driver.DriverWrapper.executeCommandAndGetError(DriverWrapper.scala:533)
    at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:568)
    at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:438)
    at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:381)
    at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:232)
    at java.lang.Thread.run(Thread.java:750)
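For anyone hitting the same `Caused by: java.lang.ClassNotFoundException`: the table's Glue metadata points at the OpenX JSON SerDe, which ships with Athena but not with the Databricks runtime. The usual remedy is to put the SerDe jar on the cluster, either by installing it as a cluster library (the `org.openx.data:json-serde` Maven coordinates are a common source, but verify the artifact and version yourself) or by adding it at session level. A hedged sketch of the session-level route, with a hypothetical DBFS jar path:

```python
# Hypothetical location of an OpenX JSON SerDe jar uploaded to DBFS;
# adjust the path to wherever you actually stored the jar.
SERDE_JAR = "dbfs:/FileStore/jars/json-serde-jar-with-dependencies.jar"

def add_jar_statement(jar_path: str) -> str:
    """Build the Spark SQL ADD JAR statement that puts a jar on the session classpath."""
    return f"ADD JAR '{jar_path}'"

# On a Databricks cluster one would then run (sketch, assumes a live `spark` session):
#   spark.sql(add_jar_statement(SERDE_JAR))
#   display(spark.table("my_db.my_athena_table"))
print(add_jar_statement(SERDE_JAR))
```

Installing the jar as a cluster library is generally the more reliable route, since executors also need the class when scanning the table. If you only need the data and not the Hive table definition, bypassing the SerDe entirely with `spark.read.json(...)` on the table's S3 location is another option.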
