I've been following this documentation to set up a Salesforce Data Cloud connection in Databricks:
https://learn.microsoft.com/en-us/azure/databricks/query-federation/salesforce-data-cloud
I've added the client ID, client secret, and scope, and it seems to connect because I'm taken to the next screen, where I can create a catalog from the connection. However, when I click 'Test connection' I see the following error:
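For reference, here is roughly the equivalent of what I configured in the UI, expressed as SQL. This is only a sketch: I'm assuming the SQL option names (`client_id`, `client_secret`, `scope`) mirror the UI fields, and the connection name, catalog name, and scope values are placeholders, not my real ones.

```sql
-- Sketch of the connection as I understand it from the UI setup.
-- Option names and scope values are assumptions, not verified syntax.
CREATE CONNECTION salesforce_dc_conn
  TYPE salesforce_data_cloud
  OPTIONS (
    client_id '<consumer-key>',
    client_secret '<consumer-secret>',
    scope '<scope-from-connected-app>'
  );

-- The next screen then creates a foreign catalog from this connection:
CREATE FOREIGN CATALOG salesforce_dc
  USING CONNECTION salesforce_dc_conn;
```

The failure happens at the 'Test connection' step, i.e. presumably when Databricks first tries to query Data Cloud through the connection above.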
summary: UnityCatalogServiceException: [RequestId=413ba3a5-fd89-4340-a1f9-ba321b863492 ErrorClass=INTERNAL_ERROR]

Stack frames from the error payload:

com.databricks.sql.managedcatalog.UnityCatalogServiceException: [RequestId=413ba3a5-fd89-4340-a1f9-ba321b863492 ErrorClass=INTERNAL_ERROR]
    at com.databricks.managedcatalog.ErrorDetailsHandler.wrapServiceException(ErrorDetailsHandler.scala:51)
    at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$4(QueryExecution.scala:358)
    at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:166)
    at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$3(QueryExecution.scala:358)
    at java.lang.Thread.run(Thread.java:750)

The rest of the payload ("arguments", "sqlProps", etc.) is empty: sqlState, errorClass, and stackTrace are all null.