Wednesday
Hi,
Is there a native connector available to connect Salesforce core (not cloud) to Databricks? If there is no native connector, what are the recommended approaches to connect to Salesforce core?
Thanks,
Subashini
yesterday
Hi @sdurai ,
Here you will find a list of all the Salesforce products that the Salesforce ingestion connector supports:
Salesforce ingestion connector FAQs | Databricks on AWS
If you don't want to use the managed connector, another approach you can take is bulk extraction via the Salesforce APIs.
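To illustrate the bulk-extraction route, here is a minimal sketch against the Salesforce Bulk API 2.0 query-job endpoints. The instance URL, access token, and SOQL query are placeholders; in practice you would obtain the token through your org's OAuth flow and add proper error handling, backoff, and paging of result sets.

```python
# Sketch: extract records via the Salesforce Bulk API 2.0 (query jobs).
# INSTANCE_URL, ACCESS_TOKEN, and the SOQL below are illustrative placeholders.

INSTANCE_URL = "https://your-org.my.salesforce.com"  # placeholder
API_VERSION = "v61.0"
ACCESS_TOKEN = "..."  # obtain via your org's OAuth flow


def bulk_query_job_payload(soql: str) -> dict:
    """Build the JSON body for creating a Bulk API 2.0 query job."""
    return {"operation": "query", "query": soql}


def run_bulk_query(soql: str) -> str:
    """Create a query job, poll until it finishes, and return the CSV results."""
    import requests  # third-party; imported lazily so the sketch loads without it
    import time

    headers = {
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Content-Type": "application/json",
    }
    base = f"{INSTANCE_URL}/services/data/{API_VERSION}/jobs/query"

    # 1) Create the query job
    job = requests.post(base, json=bulk_query_job_payload(soql), headers=headers).json()

    # 2) Poll job state (production code should add a timeout)
    while True:
        state = requests.get(f"{base}/{job['id']}", headers=headers).json()["state"]
        if state in ("JobComplete", "Failed", "Aborted"):
            break
        time.sleep(5)

    # 3) Download the CSV results (large jobs return results in pages)
    results = requests.get(f"{base}/{job['id']}/results", headers=headers)
    return results.text


# Example usage:
# csv_text = run_bulk_query("SELECT Id, Name FROM Account")
```

From there you can land the CSV output in cloud storage and load it into Databricks with your preferred file-based ingestion.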
yesterday - last edited yesterday
Hi @sdurai,
Yes. Databricks has a native connector for core Salesforce (Sales Cloud / Service Cloud / Platform objects): the Lakeflow Connect Salesforce ingestion connector. It lets you create fully managed, incremental pipelines from Salesforce Platform data into Unity Catalog tables, using the Bulk API 2.0 / REST APIs under the hood.
Here are some docs for your reference. As I'm not sure which cloud you are on, I have shared both AWS and Azure links.
Salesforce ingestion overview: AWS & Azure
Setting up ingestion (UI + Bundles): AWS & Azure
Salesforce connector FAQs (product coverage, APIs, limits): HERE - Same as what @szymon_dybczak has shared.
Lakehouse Federation - Salesforce Data 360 (query federation): HERE
Lakehouse Federation - Salesforce Data 360 File Sharing: HERE
If the managed connector isn't available in your workspace, common alternatives are either exporting from Salesforce to cloud storage and ingesting with Auto Loader / standard file-based connectors, or building a custom connector in Databricks using the Salesforce REST/Bulk APIs or an OSS tool running on Databricks compute.
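For the export-to-cloud-storage alternative, here is a minimal Auto Loader sketch, assuming the Salesforce exports land as CSV files in a storage path. The paths, catalog/schema/table names, and the choice of CSV are all placeholders for illustration; `spark` is the session Databricks provides in a notebook or job.

```python
# Sketch: incrementally ingest Salesforce export files with Auto Loader.
# All paths and table names below are illustrative placeholders.


def autoloader_options(schema_location: str) -> dict:
    """cloudFiles options for picking up newly landed CSV export files."""
    return {
        "cloudFiles.format": "csv",
        "cloudFiles.schemaLocation": schema_location,  # schema tracking/evolution
        "cloudFiles.inferColumnTypes": "true",
        "header": "true",
    }


def ingest_salesforce_exports(spark, source_path: str, target_table: str,
                              checkpoint: str, schema_location: str):
    """Stream new export files from cloud storage into a Unity Catalog table."""
    return (spark.readStream
            .format("cloudFiles")
            .options(**autoloader_options(schema_location))
            .load(source_path)
            .writeStream
            .option("checkpointLocation", checkpoint)
            .trigger(availableNow=True)  # process available files, then stop
            .toTable(target_table))


# Example usage (inside a Databricks notebook, where `spark` exists):
# ingest_salesforce_exports(
#     spark,
#     "abfss://raw@youraccount.dfs.core.windows.net/salesforce/accounts/",
#     "main.salesforce.accounts_bronze",
#     "abfss://raw@youraccount.dfs.core.windows.net/_checkpoints/accounts/",
#     "abfss://raw@youraccount.dfs.core.windows.net/_schemas/accounts/")
```

The `availableNow` trigger makes this behave like a scheduled incremental batch job; drop it if you want a continuously running stream.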
Hope this helps.
If this answer resolves your question, could you mark it as "Accept as Solution"? That helps other users quickly find the correct fix.
yesterday
Thank you @szymon_dybczak
yesterday
My cloud is in Azure. Thank you for the details.