
Expose Delta table data to Salesforce - OData?

Ruby8376
Valued Contributor

Hi, looking for suggestions to stream on-demand data from Databricks Delta tables to Salesforce.

Is OData a good option?

8 REPLIES

Ruby8376
Valued Contributor

@-werners- @Anonymous can you help?

-werners-
Esteemed Contributor III

So do you want Databricks to push data into Salesforce, or do you want Salesforce to pull data?
If you want to push, you could write microbatches to Salesforce (if that is possible, of course).
For pull, you might want to use the SQL API of Databricks SQL, or use some kind of pub/sub tool like Kafka where all changes are published and can be fetched later on.
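For the pull option, a minimal sketch of an on-demand query against a Delta table through the Databricks SQL Statement Execution API could look like the following; the workspace URL, warehouse ID, token and table name are placeholders, not values from this thread.

```python
# Minimal sketch: pull rows from a Delta table on demand via the
# Databricks SQL Statement Execution API (REST).
# WORKSPACE_URL, WAREHOUSE_ID, TOKEN and the table name are placeholders.
import requests

WORKSPACE_URL = "https://<your-workspace>.cloud.databricks.com"
WAREHOUSE_ID = "<sql-warehouse-id>"
TOKEN = "<databricks-token>"

def query_delta_table(sql_text: str):
    resp = requests.post(
        f"{WORKSPACE_URL}/api/2.0/sql/statements/",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={
            "warehouse_id": WAREHOUSE_ID,
            "statement": sql_text,
            "wait_timeout": "30s",  # wait synchronously up to 30 seconds
        },
        timeout=60,
    )
    resp.raise_for_status()
    payload = resp.json()
    if payload["status"]["state"] != "SUCCEEDED":
        raise RuntimeError(f"Statement state: {payload['status']['state']}")
    # With the default inline disposition, rows come back as arrays of values.
    return payload["result"]["data_array"]

rows = query_delta_table("SELECT * FROM main.sales.orders LIMIT 100")
```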

So perhaps some more info on 'Salesforce': is it Salesforce Cloud or the database?

Ruby8376
Valued Contributor

@-werners- we don't want to persist data on SF Cloud, just show it in the UI. It will be pulled on demand from the Salesforce side using external objects. Is OData a good option?

-werners-
Esteemed Contributor III

I would not look at OData, as Databricks itself has no out-of-the-box OData integration, so you would have to build that yourself.
Instead I would look at the Databricks SQL API: the UI you are talking about makes a REST call to the query API and Databricks SQL returns the data.
Something like that?

Ruby8376
Valued Contributor

Even for this case, there would be some custom development. Salesforce doesn't natively support sending SQL queries directly to external databases or APIs. Therefore, we would either need middleware between Salesforce and Databricks, or we would need to develop custom Apex code on Salesforce (to authenticate, send requests, process responses, etc.).

Ruby8376
Valued Contributor

Hi @-werners-

So, we have decided on this flow: Salesforce Connect <-> APIM <-> Web app (OData) <-> Databricks SQL API <- Delta tables
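For the Web app (OData) hop in that flow, a minimal sketch of a small HTTP service that forwards requests to Databricks through the Python SQL connector might look like this; it is not a full OData implementation, and the endpoint path, environment variables and table name are assumptions.

```python
# Minimal sketch of the "Web app (OData)" layer: a small HTTP service that
# Salesforce Connect (via APIM) can call, which in turn queries the Delta
# table through the Databricks SQL connector. Endpoint path, env vars and
# the table name are hypothetical.
import os
from fastapi import FastAPI
from databricks import sql  # pip install databricks-sql-connector

app = FastAPI()

def databricks_connection():
    return sql.connect(
        server_hostname=os.environ["DATABRICKS_HOST"],   # workspace hostname
        http_path=os.environ["DATABRICKS_HTTP_PATH"],    # HTTP path of the SQL warehouse
        access_token=os.environ["DATABRICKS_TOKEN"],
    )

@app.get("/odata/Orders")
def get_orders(top: int = 100):
    # Map a simple $top-style parameter onto a LIMIT; a real OData layer
    # would also translate $filter, $select, $skip, etc.
    with databricks_connection() as conn, conn.cursor() as cursor:
        cursor.execute(f"SELECT * FROM main.sales.orders LIMIT {int(top)}")
        columns = [c[0] for c in cursor.description]
        return {"value": [dict(zip(columns, row)) for row in cursor.fetchall()]}
```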

 

2 questions here:

1. Can you share a link/documentation on how we can integrate the Databricks SQL API with the Delta tables? Do we need a serverless SQL warehouse for sure, or can we just use a Databricks SQL endpoint for querying?

2. For authentication, we have chosen OAuth machine-to-machine authentication as per the Microsoft recommendation. In this process, do we use an external table to share the same Delta table? OAuth machine-to-machine (M2M) authentication | Databricks on AWS
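For question 2, below is a minimal sketch of wiring OAuth M2M (service principal) credentials into the Databricks SQL connector, assuming a service principal with a client ID/secret has already been created; the host, HTTP path and table name are placeholders.

```python
# Minimal sketch: OAuth machine-to-machine (service principal) auth with the
# Databricks SQL connector, following the linked M2M doc. Host, HTTP path,
# client ID/secret and table name are placeholders.
import os
from databricks import sql                                        # pip install databricks-sql-connector
from databricks.sdk.core import Config, oauth_service_principal   # pip install databricks-sdk

def credential_provider():
    # Service principal credentials created for the M2M flow.
    config = Config(
        host=f"https://{os.environ['DATABRICKS_HOST']}",
        client_id=os.environ["DATABRICKS_CLIENT_ID"],
        client_secret=os.environ["DATABRICKS_CLIENT_SECRET"],
    )
    return oauth_service_principal(config)

with sql.connect(
    server_hostname=os.environ["DATABRICKS_HOST"],
    http_path=os.environ["DATABRICKS_HTTP_PATH"],  # HTTP path of the SQL warehouse
    credentials_provider=credential_provider,
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("SELECT * FROM main.sales.orders LIMIT 10")
        for row in cursor.fetchall():
            print(row)
```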

 

@-werners-  Please help

 

-werners-
Esteemed Contributor III

I see. Is there a possibility in SF to define an external location/data source?
Just guessing here, as these types of packages are really good at isolating data, not integrating it.
