Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Query execution after establishing Databricks to Information Design Tool JDBC Connection

Magesh2798
New Contributor II

Hello all,

I have created a JDBC connection from Information Design Tool to Databricks using an access token generated for a Databricks service principal.

However, it throws the error below when I run a query against Databricks data in the Information Design Tool business layer.

Error:

[Databricks][JDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: 42000, Query: SELECT
c***, Error message from Server: org.apache.hive.service.cli.HiveSQLException: Error running query: java.lang.reflect.InvocationTargetException
at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:57)
at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:715)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.unity.EmptyHandle$.runWith(UCSHandle.scala:128)
at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:559)
at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:403)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:420)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
at com.databricks.spark.util.PublicDBLogging.withAttributionContext(DatabricksSparkUsageLogger.scala:27)
at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:472)
at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:455)
at com.databricks.spark.util.PublicDBLogging.withAttributionTags(DatabricksSparkUsageLogger.scala:27)
at com.databricks.spark.util.PublicDBLogging.withAttributionTags0(DatabricksSparkUsageLogger.scala:72)
at com.databricks.spark.util.DatabricksSparkUsageLogger.withAttributionTags(DatabricksSparkUsageLogger.scala:172)
at com.databricks.spark.util.UsageLogging.$anonfun$withAttributionTags$1(UsageLogger.scala:491)
at com.databricks.spark.util.UsageLogging$.withAttributionTags(UsageLogger.scala:603)
at com.databricks.spark.util.UsageLogging$.withAttributionTags(UsageLogger.scala:612)
at com.databricks.spark.util.UsageLogging.withAttributionTags(UsageLogger.scala:491)
at com.databricks.spark.util.UsageLogging.withAttributionTags$(UsageLogger.scala:489)
at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withAttributionTags(SparkExecuteStatementOperation.scala:67)
at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.$anonfun$withLocalProperties$11(ThriftLocalProperties.scala:190)
at com.databricks.spark.util.IdentityClaim$.withClaim(IdentityClaim.scala:48)
at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:185)
at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties$(ThriftLocalProperties.scala:71)
at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:67)
at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:381)
at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:367)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1899)
at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2.run(SparkExecuteStatementOperation.scala:415)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.GeneratedMethodAccessor238.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.databricks.spark.util.DbfsReflectionUtils$.resolveDbfsV2Path(DbfsReflectionUtils.scala:73)
at com.databricks.spark.util.DbfsReflectionUtils$.getRootFileSystem(DbfsReflectionUtils.scala:137)
at com.databricks.spark.util.DbfsReflectionUtils$.getRootFileSystemName(DbfsReflectionUtils.scala:157)
at org.apache.spark.api.python.PythonSecurityUtils$.checkPathFileSystemSafety(PythonSecurityUtils.scala:136)
at com.databricks.sql.transaction.tahoe.DeltaLog$.apply(DeltaLog.scala:1090)
at com.databricks.sql.transaction.tahoe.DeltaLog$.forTable(DeltaLog.scala:990)
at com.databricks.sql.transaction.tahoe.catalog.DeltaTableV2.$anonfun$deltaLog$2(DeltaTableV2.scala:117)
at com.databricks.sql.transaction.tahoe.DeltaLog$.$anonfun$withAdditionalSnapshotInitializationUsageLogData$1(DeltaLog.scala:860)
at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)
at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:105)
at com.databricks.sql.transaction.tahoe.DeltaLog$.withAdditionalSnapshotInitializationUsageLogData(DeltaLog.scala:861)
at com.databricks.sql.transaction.tahoe.catalog.DeltaTableV2.$anonfun$deltaLog$1(DeltaTableV2.scala:117)
at com.databricks.sql.transaction.tahoe.catalog.DeltaTableV2$.withEnrichedUnsupportedTableException(DeltaTableV2.scala:490)
at com.databricks.sql.transaction.tahoe.catalog.DeltaTableV2.deltaLog$lzycompute(DeltaTableV2.scala:116)
at com.databricks.sql.transaction.tahoe.catalog.DeltaTableV2.deltaLog(DeltaTableV2.scala:113)
at com.databricks.sql.transaction.tahoe.catalog.DeltaTableV2.$anonfun$initialSnapshot$7(DeltaTableV2.scala:190)
at com.databricks.sql.transaction.tahoe.DeltaLog$.$anonfun$withAdditionalSnapshotInitializationUsageLogData$1(DeltaLog.scala:860)
at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)
at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:105)
at com.databricks.sql.transaction.tahoe.DeltaLog$.withAdditionalSnapshotInitializationUsageLogData(DeltaLog.scala:861)
at com.databricks.sql.transaction.tahoe.catalog.DeltaTableV2.$anonfun$initialSnapshot$6(DeltaTableV2.scala:189)
at scala.Option.orElse(Option.scala:447)
at com.databricks.sql.transaction.tahoe.catalog.DeltaTableV2.$anonfun$initialSnapshot$1(DeltaTableV2.scala:189)
at com.databricks.sql.transaction.tahoe.catalog.DeltaTableV2$.withEnrichedUnsupportedTableException(DeltaTableV2.scala:490)
at com.databricks.sql.transaction.tahoe.catalog.DeltaTableV2.initialSnapshot$lzycompute(DeltaTableV2.scala:197)
at com.databricks.sql.transaction.tahoe.catalog.DeltaTableV2.initialSnapshot(DeltaTableV2.scala:163)
at com.databricks.sql.transaction.tahoe.catalog.DeltaTableV2.$anonfun$tableSchema$2(DeltaTableV2.scala:222)
at scala.Option.getOrElse(Option.scala:189)
at com.databricks.sql.transaction.tahoe.catalog.DeltaTableV2.tableSchema$lzycompute(DeltaTableV2.scala:222)
at com.databricks.sql.transaction.tahoe.catalog.DeltaTableV2.tableSchema(DeltaTableV2.scala:220)
at com.databricks.sql.transaction.tahoe.catalog.DeltaTableV2.schema(DeltaTableV2.scala:227)
at org.apache.spark.sql.connector.catalog.Table.columns(Table.java:68)
at com.databricks.sql.transaction.tahoe.catalog.DeltaTableV2.columns(DeltaTableV2.scala:66)
at org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation$.create(DataSourceV2Relation.scala:234)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.$anonfun$createRelation$2(Analyzer.scala:1648)
at scala.Option.map(Option.scala:230)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.$anonfun$createRelation$1(Analyzer.scala:1599)
at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.org$apache$spark$sql$catalyst$analysis$Analyzer$ResolveRelations$$record(Analyzer.scala:1902)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.org$apache$spark$sql$catalyst$analysis$Analyzer$ResolveRelations$$createRelation(Analyzer.scala:1599)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anon$3.$anonfun$submit$7(Analyzer.scala:1845)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:83)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anon$3.$anonfun$submit$6(Analyzer.scala:1845)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1175)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withThreadLocalCaptured$6(SQLExecution.scala:769)
at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withThreadLocalCaptured$5(SQLExecution.scala:769)
at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withThreadLocalCaptured$4(SQLExecution.scala:769)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withThreadLocalCaptured$3(SQLExecution.scala:768)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withThreadLocalCaptured$2(SQLExecution.scala:767)
at org.apache.spark.sql.execution.SQLExecution$.withOptimisticTransaction(SQLExecution.scala:789)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withThreadLocalCaptured$1(SQLExecution.scala:766)
at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604)
at org.apache.spark.util.threads.SparkThreadLocalCapturingRunnable.$anonfun$run$1(SparkThreadLocalForwardingThreadPoolExecutor.scala:134)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.spark.util.IdentityClaim$.withClaim(IdentityClaim.scala:48)
at org.apache.spark.util.threads.SparkThreadLocalCapturingHelper.$anonfun$runWithCaptured$4(SparkThreadLocalForwardingThreadPoolExecutor.scala:91)
at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:45)
at org.apache.spark.util.threads.SparkThreadLocalCapturingHelper.runWithCaptured(SparkThreadLocalForwardingThreadPoolExecutor.scala:90)
at org.apache.spark.util.threads.SparkThreadLocalCapturingHelper.runWithCaptured$(SparkThreadLocalForwardingThreadPoolExecutor.scala:67)
at org.apache.spark.util.threads.SparkThreadLocalCapturingRunnable.runWithCaptured(SparkThreadLocalForwardingThreadPoolExecutor.scala:131)
at org.apache.spark.util.threads.SparkThreadLocalCapturingRunnable.run(SparkThreadLocalForwardingThreadPoolExecutor.scala:134)
... 3 more
Caused by: com.databricks.backend.daemon.data.common.InvalidMountException: Error while using path /mnt/databrickstowebi/client6000/pficfootnoteallocationsummary/_delta_log for resolving path '/client6000/pficfootnoteallocationsummary/_delta_log' within mount at '/mnt/databrickstowebi'.
at com.databricks.backend.daemon.data.common.InvalidMountException$.apply(DataMessages.scala:733)
at com.databricks.backend.daemon.data.filesystem.MountEntryResolver.resolve(MountEntryResolver.scala:114)
at com.databricks.backend.daemon.data.client.DBFSV2.resolve(DatabricksFileSystemV2.scala:104)
... 73 more
Caused by: com.google.common.util.concurrent.UncheckedExecutionException: com.databricks.common.client.DatabricksServiceHttpClientException: RESOURCE_DOES_NOT_EXIST: Refresh token not found for userId: Some(3790767436975024)
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2199)
at com.google.common.cache.LocalCache.get(LocalCache.java:3932)
at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4721)
at com.databricks.backend.daemon.driver.credentials.CachingCredentialStore.get(CachingCredentialStore.scala:60)
at com.databricks.backend.daemon.driver.credentials.OAuthTokenRefresherClient.refreshToken(OAuthTokenRefresherClient.scala:82)
at com.databricks.backend.daemon.driver.credentials.OAuthTokenRefresherClient.newToken(OAuthTokenRefresherClient.scala:131)
at org.apache.spark.credentials.RuntimeCredential.getOrRefresh(CredentialContext.scala:51)
at org.apache.spark.credentials.CredentialContext$.$anonfun$getCredentialFromStore$2(CredentialContext.scala:233)
at scala.Option.map(Option.scala:230)
at org.apache.spark.credentials.CredentialContext$.$anonfun$getCredentialFromStore$1(CredentialContext.scala:232)
at scala.Option.map(Option.scala:230)
at org.apache.spark.credentials.CredentialContext$.getCredentialFromStore(CredentialContext.scala:230)
at org.apache.spark.credentials.CredentialContext$.$anonfun$getCredential$6(CredentialContext.scala:225)
at scala.Option.flatMap(Option.scala:271)
at org.apache.spark.credentials.CredentialContext$.$anonfun$getCredential$3(CredentialContext.scala:225)
at scala.Option.orElse(Option.scala:447)
at org.apache.spark.credentials.CredentialContext$.getCredential(CredentialContext.scala:225)
at com.databricks.backend.daemon.data.client.adl.AdlGen2UpgradeCredentialContextTokenProvider.getToken(AdlGen2UpgradeCredentialContextTokenProvider.scala:30)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.services.AbfsClient.getAccessToken(AbfsClient.java:1371)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.services.AbfsRestOperation.executeHttpOperation(AbfsRestOperation.java:306)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.services.AbfsRestOperation.completeExecute(AbfsRestOperation.java:238)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.services.AbfsRestOperation.lambda$execute$0(AbfsRestOperation.java:211)
at org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.measureDurationOfInvocation(IOStatisticsBinding.java:494)
at org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.trackDurationOfInvocation(IOStatisticsBinding.java:465)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.services.AbfsRestOperation.execute(AbfsRestOperation.java:209)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.services.AbfsClient.getPathStatus(AbfsClient.java:979)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.getFileStatus(AzureBlobFileSystemStore.java:1128)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.getFileStatus(AzureBlobFileSystem.java:956)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.getFileStatus(AzureBlobFileSystem.java:946)
at com.databricks.common.filesystem.LokiABFS.getFileStatusNoCache(LokiABFS.scala:52)
at com.databricks.common.filesystem.LokiABFS.getFileStatus(LokiABFS.scala:42)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1862)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.exists(AzureBlobFileSystem.java:1525)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.FallbackEncryptionContextProvider.lambda$manifestFileExists$0(FallbackEncryptionContextProvider.java:52)
at java.util.concurrent.ConcurrentHashMap.computeIfAbsent(ConcurrentHashMap.java:1660)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.FallbackEncryptionContextProvider.manifestFileExists(FallbackEncryptionContextProvider.java:48)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.FallbackEncryptionContextProvider.initialize(FallbackEncryptionContextProvider.java:38)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.initialize(AzureBlobFileSystem.java:238)
at com.databricks.common.filesystem.LokiABFS.initialize(LokiABFS.scala:36)
at com.databricks.common.filesystem.LokiFileSystem$.$anonfun$getLokiFS$1(LokiFileSystem.scala:149)
at com.databricks.common.filesystem.FileSystemCache.getOrCompute(FileSystemCache.scala:46)
at com.databricks.common.filesystem.LokiFileSystem$.getLokiFS(LokiFileSystem.scala:146)
at com.databricks.common.filesystem.LokiFileSystem.initialize(LokiFileSystem.scala:211)
at com.databricks.backend.common.util.HadoopFSUtil$.getLokiABFSForMount(HadoopFSUtil.scala:714)
at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2Factory.createFileSystem(DatabricksFileSystemV2Factory.scala:113)
at com.databricks.backend.daemon.data.filesystem.MountEntryResolver.$anonfun$resolve$2(MountEntryResolver.scala:82)
at com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:573)
at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:669)
at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:687)
at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:216)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
at com.databricks.common.util.locks.LoggedLock$.withAttributionContext(LoggedLock.scala:89)
at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:472)
at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:455)
at com.databricks.common.util.locks.LoggedLock$.withAttributionTags(LoggedLock.scala:89)
at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:664)
at com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:582)
at com.databricks.common.util.locks.LoggedLock$.recordOperationWithResultTags(LoggedLock.scala:89)
at com.databricks.logging.UsageLogging.recordOperation(UsageLogging.scala:573)
at com.databricks.logging.UsageLogging.recordOperation$(UsageLogging.scala:542)
at com.databricks.common.util.locks.LoggedLock$.recordOperation(LoggedLock.scala:89)
at com.databricks.common.util.locks.LoggedLock$.withLock(LoggedLock.scala:163)
at com.databricks.common.util.locks.PerKeyLock.withLock(PerKeyLock.scala:42)
at com.databricks.backend.daemon.data.filesystem.MountEntryResolver.resolve(MountEntryResolver.scala:79)
... 74 more
Caused by: com.databricks.common.client.DatabricksServiceHttpClientException: RESOURCE_DOES_NOT_EXIST: Refresh token not found for userId: Some(3790767436975024)
at com.databricks.common.client.DatabricksServiceHttpClientException.copy(DBHttpClient.scala:1407)
at com.databricks.common.client.RawDBHttpClient.getResponseBody(DBHttpClient.scala:1253)
at com.databricks.common.client.RawDBHttpClient.$anonfun$httpRequestInternal$1(DBHttpClient.scala:1199)
at com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:573)
at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:669)
at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:687)
at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:216)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
at com.databricks.common.client.RawDBHttpClient.withAttributionContext(DBHttpClient.scala:604)
at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:472)
at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:455)
at com.databricks.common.client.RawDBHttpClient.withAttributionTags(DBHttpClient.scala:604)
at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:664)
at com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:582)
at com.databricks.common.client.RawDBHttpClient.recordOperationWithResultTags(DBHttpClient.scala:604)
at com.databricks.logging.UsageLogging.recordOperation(UsageLogging.scala:573)
at com.databricks.logging.UsageLogging.recordOperation$(UsageLogging.scala:542)
at com.databricks.common.client.RawDBHttpClient.recordOperation(DBHttpClient.scala:604)
at com.databricks.common.client.RawDBHttpClient.httpRequestInternal(DBHttpClient.scala:1185)
at com.databricks.common.client.RawDBHttpClient.entityEnclosingRequestInternal(DBHttpClient.scala:1174)
at com.databricks.common.client.RawDBHttpClient.getInternal(DBHttpClient.scala:1123)
at com.databricks.common.client.RawDBHttpClient.get(DBHttpClient.scala:689)
at com.databricks.common.client.RawDBHttpClient.getWithHeaders(DBHttpClient.scala:717)
at com.databricks.common.client.RawDBHttpClient.get(DBHttpClient.scala:661)
at com.databricks.backend.daemon.driver.credentials.OAuthTokenRefresherClient.$anonfun$refreshToken$2(OAuthTokenRefresherClient.scala:91)
at com.databricks.common.client.DBHttpClient$.retryWithDeadline(DBHttpClient.scala:376)
at com.databricks.backend.daemon.driver.credentials.OAuthTokenRefresherClient.reliably(OAuthTokenRefresherClient.scala:52)
at com.databricks.backend.daemon.driver.credentials.OAuthTokenRefresherClient.$anonfun$refreshToken$1(OAuthTokenRefresherClient.scala:91)
at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4724)
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3522)
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2315)
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2278)
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2193)
... 140 more
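For what it's worth, the innermost cause in the trace ("RESOURCE_DOES_NOT_EXIST: Refresh token not found for userId") suggests the failure happens while resolving passthrough credentials for the DBFS mount, not in the JDBC handshake itself. One way to rule out the token side is to authenticate with the Databricks JDBC driver's documented OAuth machine-to-machine properties for a service principal instead of a pre-generated access token. This is only a sketch, not my working setup; the workspace host, HTTP path, client ID, and secret are placeholders:

```
jdbc:databricks://<workspace-host>:443/default;transportMode=http;ssl=1;
httpPath=<warehouse-http-path>;
AuthMech=11;Auth_Flow=1;
OAuth2ClientId=<service-principal-application-id>;
OAuth2Secret=<service-principal-oauth-secret>
```

(Shown on multiple lines for readability; the actual URL must be a single line.) If this also fails with the same "Refresh token not found" cause, the problem likely lies with the mount's credential passthrough rather than the connection itself.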

@Retired_mod, any help on this issue would be appreciated.

If anyone has faced the same issue, please let me know how you solved it.

Thanks in advance

#Information Design tool #Databricksconnection #Queryexecutionerror

