Salesforce to Databricks connection
11-07-2024 09:16 AM
I've been following this documentation to set up a Salesforce Data Cloud connection to Databricks:
https://learn.microsoft.com/en-us/azure/databricks/query-federation/salesforce-data-cloud
I've added the client ID, client secret, and scope, and it seems to connect, because I'm taken to the next screen where I can create a catalog from the connection. However, when I click 'Test connection' I see the following error:
summary: UnityCatalogServiceException: [RequestId=413ba3a5-fd89-4340-a1f9-ba321b863492 ErrorClass=INTERNAL_ERROR]

    com.databricks.sql.managedcatalog.UnityCatalogServiceException: [RequestId=413ba3a5-fd89-4340-a1f9-ba321b863492 ErrorClass=INTERNAL_ERROR]
        at com.databricks.managedcatalog.ErrorDetailsHandler.wrapServiceException(ErrorDetailsHandler.scala:51)
        at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$4(QueryExecution.scala:358)
        at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:166)
        at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$3(QueryExecution.scala:358)
        at java.lang.Thread.run(Thread.java:750)
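For reference, the UI flow above maps to SQL that can be run from a notebook, which sometimes returns a more specific error than the 'Test connection' button. A minimal sketch with placeholder names (sfdc_conn, sfdc_catalog), assuming the connection was already created per the linked doc; the dataspace option follows that doc's Salesforce Data Cloud example and should be verified against the current documentation:

```python
# Probe an existing Salesforce Data Cloud connection from a Databricks
# notebook. The names sfdc_conn and sfdc_catalog are placeholders.

# Create a foreign catalog on top of the connection. Verify the exact
# OPTIONS key against the current Salesforce Data Cloud federation doc.
spark.sql("""
    CREATE FOREIGN CATALOG IF NOT EXISTS sfdc_catalog
    USING CONNECTION sfdc_conn
    OPTIONS (dataspace 'default')
""")

# Listing schemas exercises the connection end to end and often surfaces
# a more descriptive error than the UI test button.
display(spark.sql("SHOW SCHEMAS IN sfdc_catalog"))
```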
a month ago
Hi @KristiLogos, good day!
We understand that you were facing the above error while trying to create a connection with Salesforce Data Cloud, but since it's been a while, we wanted to check whether you are still facing the issue or whether it's been resolved for you.
If you are still facing the issue, can you please check the requirements mentioned in the documentation you linked: https://learn.microsoft.com/en-us/azure/databricks/query-federation/salesforce-data-cloud
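As a quick sanity check, you could also inspect what the connection actually stored. A minimal sketch, assuming a placeholder connection name sfdc_conn:

```python
# Show the connection's stored metadata (type, options, owner).
# Secret values are redacted in the output.
display(spark.sql("DESCRIBE CONNECTION sfdc_conn"))
```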
Your follow-ups are appreciated here.
Kudos,
Ayushi