Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Databricks to Salesforce Core (Not cloud)

sdurai
New Contributor

Hi,

Is there a native connector available in Databricks to connect to Salesforce Core (not Salesforce Data Cloud)? If there is no native connector, what are the recommended approaches to connect to Salesforce Core?

Thanks,

Subashini

2 ACCEPTED SOLUTIONS

szymon_dybczak
Esteemed Contributor III

Hi @sdurai ,

Here you will find a list of all the Salesforce products that the Salesforce ingestion connector supports:

Salesforce ingestion connector FAQs | Databricks on AWS

If you don't want to use the managed connector, another approach you can take is bulk extraction via the Salesforce APIs.
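To make the bulk-extraction route concrete, here is a minimal sketch against the public Salesforce REST query endpoint. The instance URL, API version, and token handling are illustrative assumptions, not a prescribed setup; only the URL building and page flattening are shown as runnable code, with the network call left as a comment.

```python
# Hedged sketch: extracting records from Salesforce via the REST query API.
# The endpoint shape follows the public Salesforce REST API; instance URL,
# API version, and access token are illustrative assumptions.
import urllib.parse


def soql_query_url(instance_url: str, api_version: str, soql: str) -> str:
    """Build the REST query endpoint URL for a SOQL statement."""
    return (
        f"{instance_url}/services/data/v{api_version}/query"
        f"?q={urllib.parse.quote(soql)}"
    )


def collect_records(pages: list) -> list:
    """Flatten paginated query responses: each page carries 'records' and,
    until 'done' is true, a 'nextRecordsUrl' to follow for the next page."""
    records = []
    for page in pages:
        records.extend(page.get("records", []))
    return records


# Network call sketched only; run with a real OAuth access token:
# import requests
# page = requests.get(
#     soql_query_url("https://myorg.my.salesforce.com", "60.0",
#                    "SELECT Id, Name FROM Account"),
#     headers={"Authorization": f"Bearer {access_token}"},
# ).json()
```

From there you can land the flattened records as files or write them to a Delta table on Databricks compute.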


Ashwin_DSA
Databricks Employee

Hi @sdurai,

Yes. Databricks has a native Salesforce connector for core Salesforce (Sales Cloud / Service Cloud / Platform objects) via Lakeflow Connect - Salesforce ingestion connector. It lets you create fully managed, incremental pipelines from Salesforce Platform data into Unity Catalog tables, using Bulk API 2.0 / REST under the hood.

Here are some docs for your reference. As I'm not sure which cloud you are on, I have shared both AWS and Azure links. 

Salesforce ingestion overview: AWS & Azure

Setting up ingestion (UI + Bundles): AWS & Azure

Salesforce connector FAQs (product coverage, APIs, limits): HERE - Same as what @szymon_dybczak has shared.

Lakehouse Federation - Salesforce Data 360 (query federation): HERE

Lakehouse Federation - Salesforce Data 360 File Sharing: HERE
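As a rough illustration of the Bundles route mentioned in the setup docs above, a managed Salesforce ingestion pipeline can be declared in a Databricks Asset Bundle along these lines. Every name here is a placeholder, and the exact resource schema may vary by release, so treat this as a sketch and verify against the setup documentation:

```yaml
# Illustrative bundle fragment (databricks.yml); names are placeholders.
resources:
  pipelines:
    salesforce_ingest:
      name: salesforce-ingestion-pipeline
      ingestion_definition:
        connection_name: my_salesforce_connection   # Unity Catalog connection
        objects:
          - table:
              source_schema: objects
              source_table: Account
              destination_catalog: main             # placeholder catalog
              destination_schema: sfdc_raw          # placeholder schema
```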

If the managed connector isn't available in your workspace, common alternatives are either exporting from Salesforce to cloud storage and ingesting with Auto Loader / standard file-based connectors, or building a custom connector in Databricks using the Salesforce REST/Bulk APIs or an OSS tool running on Databricks compute.

Hope this helps.

If this answer resolves your question, could you mark it as "Accept as Solution"? That helps other users quickly find the correct fix.

Regards,
Ashwin | Delivery Solution Architect @ Databricks
Helping you build and scale the Data Intelligence Platform.
***Opinions are my own***


4 REPLIES


Thank you @szymon_dybczak 


My cloud is in Azure. Thank you for the details.