07-09-2025 01:58 AM - edited 07-09-2025 01:59 AM
Can we connect Salesforce with Databricks?
I want to do both push and pull activity between Databricks and Salesforce.
Are there any challenges while using ODBC?
Please help me with a detailed approach.
Thanks a ton.
07-09-2025 02:35 AM
Hi @LakehouseOMG14 ,
Yes, you can connect Databricks with Salesforce. If you want to access Delta tables from Salesforce, you can leverage zero-copy federation as described in the article below.
Set Up a Databricks Data Federation Connection | Data Cloud Integrations: Home Page | Data Cloud Int...
To access Salesforce data from Databricks you can use:
- Databricks Lakehouse Federation - Salesforce Data Cloud Connector.
- Databricks LakeFlow Connect - Salesforce Ingestion Connector.
You can find more details below:
- Introducing Salesforce Connectors for Lakehouse Federation and LakeFlow Connect | Databricks Blog
- Ingest data from Salesforce - Azure Databricks | Microsoft Learn
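Once a Lakehouse Federation connection and its foreign catalog are registered, Salesforce Data Cloud objects can be queried like any other Unity Catalog table. A minimal sketch meant for a Databricks notebook; the catalog, schema, and object names (`salesforce_dc`, `objects`, `Account`) are placeholders for whatever you actually register, not names from the articles above:

```python
# Sketch: read a federated Salesforce Data Cloud object from a Databricks notebook.
# `salesforce_dc.objects.Account` is a hypothetical foreign-catalog path --
# substitute the catalog/schema created for your federation connection.

def federated_query(catalog: str, schema: str, table: str, limit: int = 10) -> str:
    """Build a simple federated read against a Salesforce object."""
    return f"SELECT Id, Name FROM {catalog}.{schema}.{table} LIMIT {limit}"

# Inside a notebook, `spark` is provided by the runtime:
# df = spark.sql(federated_query("salesforce_dc", "objects", "Account"))
# display(df)
```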
07-09-2025 04:15 AM
Thanks, Sir.
Can we extract or pull data from Salesforce to a Delta lake using Databricks?
What could the architectural challenges be? Kindly share, as I am implementing this for the first time.
07-09-2025 04:47 AM
Hi @LakehouseOMG14 ,
If you want to extract data from Salesforce and load it into a Delta table, you can use LakeFlow Connect.
There are some requirements you must meet to use that approach:
- Your workspace must be enabled for Unity Catalog.
- Serverless compute must be enabled for your workspace. See Enable serverless compute.
- If you plan to use an existing connection: you must have USE CONNECTION privileges or ALL PRIVILEGES on the connection object.
- You must have USE CATALOG privileges on the target catalog.
- You must have USE SCHEMA and CREATE TABLE privileges on an existing schema, or CREATE SCHEMA privileges on the target catalog.
All of this is described in the article below, where you will also find examples of pipeline definitions:
Ingest data from Salesforce - Azure Databricks | Microsoft Learn
You can also take a look at the video below:
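For the LakeFlow Connect route, the managed ingestion pipeline can also be created programmatically via the Pipelines REST API (`/api/2.0/pipelines`). A hedged sketch only: the `ingestion_definition` shape follows the public docs, but verify the exact field names against your workspace's API version, and all pipeline, connection, catalog, and schema names below are placeholders:

```python
import json

def salesforce_ingestion_payload(name, connection_name, dest_catalog, dest_schema, tables):
    """Assemble a pipeline spec that ingests Salesforce objects into Delta tables.
    Field names mirror the documented `ingestion_definition` shape (verify per docs)."""
    return {
        "name": name,
        "ingestion_definition": {
            "connection_name": connection_name,  # Unity Catalog connection to Salesforce
            "objects": [
                {
                    "table": {
                        "source_schema": "objects",
                        "source_table": t,
                        "destination_catalog": dest_catalog,
                        "destination_schema": dest_schema,
                    }
                }
                for t in tables
            ],
        },
    }

payload = salesforce_ingestion_payload(
    "salesforce-ingest",         # placeholder pipeline name
    "my_salesforce_connection",  # placeholder connection name
    "main", "salesforce_raw",    # placeholder destination catalog/schema
    ["Account", "Contact"],
)
# Then POST to <workspace-url>/api/2.0/pipelines with a bearer token, e.g.:
# requests.post(f"{host}/api/2.0/pipelines",
#               headers={"Authorization": f"Bearer {token}"}, json=payload)
print(json.dumps(payload, indent=2))
```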
08-04-2025 12:58 PM
Hi Team / @szymon_dybczak ,
When looking to set up a connector between Data Cloud and Databricks, can anyone give the exact step breakdown?
I have the connection URL, but what is the path?
08-04-2025 02:49 PM - edited 08-04-2025 02:56 PM
Hi @ManojkMohan ,
Client ID and Client Secret are attributes of a service principal. A service principal is a specialized identity in Azure Databricks designed for automation and programmatic access. Service principals give automated tools and scripts API-only access to Azure Databricks resources, providing greater security than using user accounts.
So, as a first step, your Databricks workspace or account admin needs to create a service principal for you. Pay attention: according to Salesforce documentation, Client ID/Client Secret authentication is only supported using Azure Databricks-managed service principals. Microsoft Entra ID-managed principals are not supported.
The connection URL is just the URL of your Databricks workspace.
The HTTP path is a connection detail of your compute. How to find it is described in the documentation entry below:
https://docs.databricks.com/aws/en/integrations/compute-details
Edit: Here's a video that you can check. It shows more or less how to configure the connector.
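To make the two fields concrete: for a SQL warehouse the HTTP path takes the form `/sql/1.0/warehouses/<warehouse-id>`, and the connection URL is just `https://<workspace-host>`. A small sketch with placeholder values; confirm the real values on the warehouse's Connection details tab rather than constructing them by hand:

```python
def connector_fields(workspace_host: str, warehouse_id: str) -> dict:
    """Assemble the connection details the Salesforce connector asks for.
    /sql/1.0/warehouses/<id> is the standard SQL-warehouse HTTP path shape;
    verify both values on the warehouse's Connection details tab."""
    return {
        "connection_url": f"https://{workspace_host}",
        "http_path": f"/sql/1.0/warehouses/{warehouse_id}",
    }

# Placeholder workspace host and warehouse id:
fields = connector_fields("adb-1234567890123456.7.azuredatabricks.net", "abc123def456")
```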
08-04-2025 03:36 PM
I have resolved it. Here are the steps:
Step 1: Create a Service Principal
1. Log in to your Databricks workspace and navigate to the Admin Settings page by clicking your email in the bottom-left corner and selecting "Admin Settings".
2. Go to the Identity and access tab and click on Service principals.
3. Click the Add service principal button.
4. Give your service principal a descriptive name, for example, Salesforce Data Cloud Connector, and click Add.
Step 2: Get the Client ID
The Client ID is the unique identifier for the service principal you just created.
1. After creating the service principal, you will be taken back to the list. Click on the name of the service principal you just created (e.g., Salesforce Data Cloud Connector).
2. On the configuration page for the service principal, you will see a field labeled Application ID. This is your Client ID.
3. Copy this Application ID value. You will paste it into the "Client Id" field in Salesforce.
Step 3: Generate the Client Secret
The Client Secret is like a password for your service principal.
1. While still on the configuration page for your service principal, scroll down to the OAuth secrets section.
2. Click the Generate secret button.
3. Crucial step: a dialog box will appear displaying your new secret. You must copy this secret now and store it in a secure location. You will not be able to see this value again after you close this window.
4. This generated value is your Client Secret. You will paste it into the "Client Secret" field in Salesforce.
Step 4: Grant Permissions to the SQL Warehouse
By default, a new service principal has no permissions. You must grant it permission to use the specific SQL warehouse that Salesforce will connect to.
1. Navigate to the SQL Warehouses page in the SQL persona of your workspace.
2. Click on the name of the SQL warehouse you want to use.
3. Click the Permissions button at the top-right.
4. In the permissions dialog, find and select the service principal you created (e.g., Salesforce Data Cloud Connector).
5. Grant it the Can use permission.
6. Click Add/Update.
Summary: What to Fill In Salesforce
- Client Id: the Application ID you copied in Step 2.
- Client Secret: the OAuth secret you generated and saved in Step 3.
After completing these steps, your Salesforce connector will have the correct credentials and permissions to securely authenticate with and query your Databricks workspace.
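Before pasting the credentials into Salesforce, it can be worth verifying them independently from Python. A sketch using the `databricks-sql-connector` OAuth machine-to-machine flow with the SDK's `oauth_service_principal` credentials provider; the hostname, HTTP path, and credential values are placeholders you must replace:

```python
# Verify the Step 2/3 credentials against the warehouse before handing them to
# Salesforce. Third-party deps: pip install databricks-sql-connector databricks-sdk
SERVER_HOSTNAME = "adb-1234567890123456.7.azuredatabricks.net"  # placeholder
HTTP_PATH = "/sql/1.0/warehouses/abc123def456"                  # placeholder
CLIENT_ID = "<application-id-from-step-2>"                      # placeholder
CLIENT_SECRET = "<oauth-secret-from-step-3>"                    # placeholder

def oauth_config_kwargs(host: str, client_id: str, client_secret: str) -> dict:
    """Arguments for the SDK's OAuth machine-to-machine Config."""
    return {"host": f"https://{host}",
            "client_id": client_id,
            "client_secret": client_secret}

def check_connection() -> int:
    """Run SELECT 1 through the warehouse using OAuth M2M; returns 1 on success."""
    from databricks import sql
    from databricks.sdk.core import Config, oauth_service_principal

    def credentials_provider():
        return oauth_service_principal(
            Config(**oauth_config_kwargs(SERVER_HOSTNAME, CLIENT_ID, CLIENT_SECRET)))

    with sql.connect(server_hostname=SERVER_HOSTNAME,
                     http_path=HTTP_PATH,
                     credentials_provider=credentials_provider) as conn:
        with conn.cursor() as cursor:
            cursor.execute("SELECT 1")
            return cursor.fetchone()[0]

# check_connection() should return 1 once the Client ID/Secret and the
# "Can use" grant from Step 4 are all in place.
```

If this call succeeds, the same Client ID, Client Secret, URL, and HTTP path will work in the Salesforce connector.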