Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

I am getting this error: com.databricks.backend.common.rpc.DatabricksExceptions$SQLExecutionException: com.databricks.rpc.UnknownRemoteException: Remote exception occurred:

TeachingWithDat
New Contributor II

I am teaching a class for BYU Idaho, and every table in every database for my class has stopped working. We keep getting this error:

com.databricks.backend.common.rpc.DatabricksExceptions$SQLExecutionException: com.databricks.rpc.UnknownRemoteException: Remote exception occurred:

Databricks Premium on AWS

3 REPLIES

Debayan
Esteemed Contributor III

Hi @Databricks University Alliance, could you please paste the whole error snippet here?

pc
New Contributor II

com.databricks.backend.common.rpc.DatabricksExceptions$SQLExecutionException: org.apache.spark.sql.AnalysisException:
The query operator `UpdateCommandEdge` contains one or more unsupported expression types Aggregate, Window or Generate.
Invalid expressions: [avg(spark_catalog.eds_us_lake_cdp.cdp_job_log.Duration) OVER (PARTITION BY spark_catalog.eds_us_lake_cdp.cdp_job_log.job_id ORDER BY spark_catalog.eds_us_lake_cdp.cdp_job_log.Job_Start_Date_Time ASC NULLS FIRST RANGE BETWEEN INTERVAL '29' DAY PRECEDING AND CURRENT ROW), avg(spark_catalog.eds_us_lake_cdp.cdp_job_log.Duration)];
UpdateCommandEdge Delta[version=433, s3://tpc-aws-ted-dev-edpp-lake-cdp-us-east-1/eds_us_lake_cdp/cdp_job_log/delta], [Job_Run_Id#10299, Job_Id#10300, Batch_Run_Id#10301, Tidal_Job_No#10302, Source_Layer#10303, Source_Object_Location#10304, Source_Object_Name#10305, Target_Layer#10306, Target_Object_Location#10307, Target_Object_Name#10308, Status#10309, Status_Source#10310, Step_Control_Log#10311, Job_Scheduled_Date_Time#10312, Job_Start_Date_Time#10313, Job_End_Date_Time#10314, Error_Description#10315, Source_Record_Count#10316, Target_Record_Count#10317, MD5_HASH#10318, User_Id#10319, Created_Date_Time#10320, Duration#10321, round(avg(Duration#10321) windowspecdefinition(job_id#10300, Job_Start_Date_Time#10313 ASC NULLS FIRST, specifiedwindowframe(RangeFrame, -INTERVAL '29' DAY, currentrow$())), 2)]
+- SubqueryAlias spark_catalog.eds_us_lake_cdp.cdp_job_log
   +- Relation eds_us_lake_cdp.cdp_job_log[Job_Run_Id#10299,Job_Id#10300,Batch_Run_Id#10301,Tidal_Job_No#10302,Source_Layer#10303,Source_Object_Location#10304,Source_Object_Name#10305,Target_Layer#10306,Target_Object_Location#10307,Target_Object_Name#10308,Status#10309,Status_Source#10310,Step_Control_Log#10311,Job_Scheduled_Date_Time#10312,Job_Start_Date_Time#10313,Job_End_Date_Time#10314,Error_Description#10315,Source_Record_Count#10316,Target_Record_Count#10317,MD5_HASH#10318,User_Id#10319,Created_Date_Time#10320,Duration#10321,Average_Run#10322] parquet

at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.failAnalysis(CheckAnalysis.scala:60)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.failAnalysis$(CheckAnalysis.scala:59)
at org.apache.spark.sql.catalyst.analysis.Analyzer.failAnalysis(Analyzer.scala:221)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.$anonfun$checkAnalysis$2(CheckAnalysis.scala:623)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.$anonfun$checkAnalysis$2$adapted(CheckAnalysis.scala:105)
at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:358)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.$anonfun$checkAnalysis$1(CheckAnalysis.scala:105)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:80)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.checkAnalysis(CheckAnalysis.scala:100)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.checkAnalysis$(CheckAnalysis.scala:100)
at org.apache.spark.sql.catalyst.analysis.Analyzer.checkAnalysis(Analyzer.scala:221)
at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$1(Analyzer.scala:275)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:331)
at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:272)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$analyzed$1(QueryExecution.scala:128)
at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:80)
at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:268)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:265)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:968)
at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:265)
at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:129)
at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:126)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:118)
at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:103)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:968)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:101)
at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:803)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:968)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:798)
at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:695)
at com.databricks.backend.daemon.driver.SQLDriverLocal.$anonfun$executeSql$1(SQLDriverLocal.scala:91)
at scala.collection.immutable.List.map(List.scala:297)
at com.databricks.backend.daemon.driver.SQLDriverLocal.executeSql(SQLDriverLocal.scala:37)
at com.databricks.backend.daemon.driver.SQLDriverLocal.repl(SQLDriverLocal.scala:145)
at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$13(DriverLocal.scala:634)
at com.databricks.logging.Log4jUsageLoggingShim$.$anonfun$withAttributionContext$1(Log4jUsageLoggingShim.scala:33)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:94)
at com.databricks.logging.Log4jUsageLoggingShim$.withAttributionContext(Log4jUsageLoggingShim.scala:31)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:205)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:204)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:59)
at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:240)
at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:225)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:59)
at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:611)
at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:615)
at scala.util.Try$.apply(Try.scala:213)
at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:607)
at com.databricks.backend.daemon.driver.DriverWrapper.executeCommandAndGetError(DriverWrapper.scala:526)
at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:561)
at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:431)
at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:374)
at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:225)
at java.lang.Thread.run(Thread.java:748)
at com.databricks.backend.daemon.driver.SQLDriverLocal.executeSql(SQLDriverLocal.scala:130)
at com.databricks.backend.daemon.driver.SQLDriverLocal.repl(SQLDriverLocal.scala:145)
at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$13(DriverLocal.scala:634)
at com.databricks.logging.Log4jUsageLoggingShim$.$anonfun$withAttributionContext$1(Log4jUsageLoggingShim.scala:33)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:94)
at com.databricks.logging.Log4jUsageLoggingShim$.withAttributionContext(Log4jUsageLoggingShim.scala:31)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:205)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:204)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:59)
at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:240)
at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:225)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:59)
at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:611)
at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:615)
at scala.util.Try$.apply(Try.scala:213)
at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:607)
at com.databricks.backend.daemon.driver.DriverWrapper.executeCommandAndGetError(DriverWrapper.scala:526)
at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:561)
at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:431)
at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:374)
at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:225)
at java.lang.Thread.run(Thread.java:748)
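
For anyone hitting the same AnalysisException: the root cause is the inner org.apache.spark.sql.AnalysisException, not the UnknownRemoteException wrapper. Delta Lake's UPDATE command rejects aggregate, window, and generator expressions, which is exactly what the UpdateCommandEdge operator is reporting. A common workaround is to compute the windowed value in an ordinary SELECT and write it back with MERGE. The sketch below reconstructs the failing pattern from the plan above; the table and column names come from the error itself, but Job_Run_Id as the unique key is an assumption, so substitute the table's real key column(s).

-- Failing pattern (hypothetical reconstruction from the plan above):
-- a window function inside UPDATE ... SET is not supported by Delta.
UPDATE eds_us_lake_cdp.cdp_job_log
SET Average_Run = ROUND(AVG(Duration) OVER (
      PARTITION BY Job_Id
      ORDER BY Job_Start_Date_Time ASC NULLS FIRST
      RANGE BETWEEN INTERVAL '29' DAY PRECEDING AND CURRENT ROW), 2);

-- Workaround sketch: compute the 30-day rolling average in a plain
-- SELECT, then MERGE it back. Job_Run_Id is assumed to uniquely
-- identify a row; replace it with the table's actual key.
MERGE INTO eds_us_lake_cdp.cdp_job_log AS t
USING (
  SELECT Job_Run_Id,
         ROUND(AVG(Duration) OVER (
           PARTITION BY Job_Id
           ORDER BY Job_Start_Date_Time ASC NULLS FIRST
           RANGE BETWEEN INTERVAL '29' DAY PRECEDING AND CURRENT ROW), 2) AS new_avg
  FROM eds_us_lake_cdp.cdp_job_log
) AS s
ON t.Job_Run_Id = s.Job_Run_Id
WHEN MATCHED THEN UPDATE SET t.Average_Run = s.new_avg;

Because the windowed aggregation now runs inside a read-only subquery rather than inside the UPDATE itself, the analyzer accepts it; overwriting the table from a DataFrame that carries the computed column would work for the same reason.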

Kaniz_Fatma
Community Manager

Hi @Databricks University Alliance, we haven't heard from you since @Debayan Mukherjee's last response, and I was checking back to see if his suggestion helped.

If you have found a solution, please share it with the community, as it may help others.
