Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Salesforce with Databricks connectivity

LakehouseOMG14
New Contributor II

Can we connect Salesforce with Databricks?

I want to do both push and pull activity between Databricks and Salesforce.

Are there any challenges when using ODBC?

Please help me with a detailed approach.

Thanks a ton.

1 ACCEPTED SOLUTION

Accepted Solutions

szymon_dybczak
Esteemed Contributor III

Hi @LakehouseOMG14 ,

Yes, you can connect Databricks with Salesforce. If you want to access Delta tables from Salesforce, you can leverage zero-copy federation as described in the article below.

Set Up a Databricks Data Federation Connection | Data Cloud Integrations: Home Page | Data Cloud Int...

To access Salesforce data from Databricks you can use:

- Databricks Lakehouse Federation - Salesforce Data Cloud Connector

- Databricks LakeFlow Connect - Salesforce Ingestion Connector

You can find more details below:

Introducing Salesforce Connectors for Lakehouse Federation and LakeFlow Connect | Databricks Blog

Ingest data from Salesforce - Azure Databricks | Microsoft Learn
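As a quick sketch of the federation route: once a connection and a foreign catalog over Salesforce Data Cloud exist, its objects are queried like any other Unity Catalog table. The catalog, schema, and object names below are hypothetical placeholders, not names from the articles above:

```python
# Hypothetical names: `salesforce_cat` is a foreign catalog created over a
# Salesforce Data Cloud connection. Adjust catalog/schema/object to your setup.
def federated_query(catalog: str, schema: str, obj: str, limit: int = 10) -> str:
    """Build a simple read-only query against a federated Salesforce object."""
    return f"SELECT * FROM {catalog}.{schema}.{obj} LIMIT {limit}"

query = federated_query("salesforce_cat", "sfdc", "account_dlm")
# In a Databricks notebook you would run: df = spark.sql(query)
print(query)
```

With zero-copy federation the query is pushed to Salesforce Data Cloud at read time, so there is no separate copy of the data to keep in sync.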

View solution in original post

7 REPLIES


LakehouseOMG14
New Contributor II

@szymon_dybczak 

Thanks, sir.

Can we extract or pull data from Salesforce into a Delta lake using Databricks?

What could the architectural challenges be? Kindly share, as I am implementing this for the first time.

 

Hi @LakehouseOMG14 ,

If you want to extract data from Salesforce and load it into a Delta table, you can use Lakeflow Connect.
There are some requirements you must meet to use this approach:

  • Your workspace must be enabled for Unity Catalog.

  • Serverless compute must be enabled for your workspace. See Enable serverless compute.

  • If you plan to use an existing connection: You must have USE CONNECTION privileges or ALL PRIVILEGES on the connection object.

  • You must have USE CATALOG privileges on the target catalog.

  • You must have USE SCHEMA and CREATE TABLE privileges on an existing schema or CREATE SCHEMA privileges on the target catalog.

  • Create a Salesforce user that Databricks can use to retrieve data. Make sure that the user has API access and access to all of the objects that you plan to ingest.
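The privilege requirements above map to Unity Catalog grants along these lines. All names here ("ingest_sp", "my_sfdc_connection", "main", "sfdc_raw") are placeholders; run the resulting statements in a Databricks SQL editor:

```python
# Sketch: generate the Unity Catalog GRANT statements that satisfy the
# connection/catalog/schema requirements listed above. Principal and object
# names are placeholders.
def ingestion_grants(principal: str, connection: str,
                     catalog: str, schema: str) -> list[str]:
    return [
        f"GRANT USE CONNECTION ON CONNECTION {connection} TO `{principal}`",
        f"GRANT USE CATALOG ON CATALOG {catalog} TO `{principal}`",
        f"GRANT USE SCHEMA, CREATE TABLE ON SCHEMA {catalog}.{schema} TO `{principal}`",
    ]

for stmt in ingestion_grants("ingest_sp", "my_sfdc_connection", "main", "sfdc_raw"):
    print(stmt)
```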

All of that is described in the article below, where you will also find example pipeline definitions:

Ingest data from Salesforce - Azure Databricks | Microsoft Learn

You can also take a look at this video:

Get Data Into Databricks - Lakeflow Connect with Salesforce
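To make the pipeline-definition part concrete, here is a sketch of the request body you would send to the Pipelines API to create a managed Salesforce ingestion pipeline. The field shape follows the Microsoft Learn article linked above, but exact field names can differ by release, and the connection, catalog, and schema names are hypothetical placeholders:

```python
import json

# Sketch of a Lakeflow Connect managed ingestion pipeline spec.
# Assumptions: "my_sfdc_connection" is an existing Salesforce connection in
# Unity Catalog; destination catalog/schema already exist.
def salesforce_ingestion_spec(connection_name: str,
                              dest_catalog: str,
                              dest_schema: str,
                              source_tables: list[str]) -> dict:
    return {
        "name": f"sfdc-ingest-{dest_schema}",
        "ingestion_definition": {
            "connection_name": connection_name,
            "objects": [
                {
                    "table": {
                        "source_schema": "objects",
                        "source_table": t,
                        "destination_catalog": dest_catalog,
                        "destination_schema": dest_schema,
                    }
                }
                for t in source_tables
            ],
        },
    }

spec = salesforce_ingestion_spec("my_sfdc_connection", "main", "sfdc_raw",
                                 ["Account", "Opportunity"])
print(json.dumps(spec, indent=2))
```

Each entry under "objects" maps one Salesforce object to a destination table, so adding more objects to ingest is just a matter of extending the list.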

ManojkMohan
Contributor III

HI Team / @szymon_dybczak  

When looking to set up a connector between Data Cloud and Databricks, can anyone give the exact step breakdown for:

  1. client id
  2. client secret
  3. url
  4. http path

Salesforce Databricks connector.jpg



The connection URL is

ManojkMohan_0-1754337369748.png

but what is the path?

szymon_dybczak
Esteemed Contributor III

Hi @ManojkMohan ,

Client ID and Client Secret are attributes of a service principal. A service principal is a specialized identity in Azure Databricks designed for automation and programmatic access. Service principals give automated tools and scripts API-only access to Azure Databricks resources, providing greater security than using user accounts.

So, as a first step, your Databricks workspace or account admin needs to create a service principal for you. Pay attention: according to Salesforce documentation, Client ID/Client Secret authentication is only supported with Azure Databricks-managed service principals. Microsoft Entra ID-managed principals are not supported.

The connection URL is just the URL of your Databricks workspace.

The HTTP path is a connection detail of your compute. How to find it is described in the documentation entry below:

https://docs.databricks.com/aws/en/integrations/compute-details
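For a SQL warehouse, the HTTP path follows a fixed pattern, so once you know the warehouse ID you can recognize it in the Connection details tab. The warehouse ID below is a made-up placeholder:

```python
# The HTTP path shown in a SQL warehouse's "Connection details" tab follows
# this pattern; the warehouse ID here is a placeholder, copy your real one.
def warehouse_http_path(warehouse_id: str) -> str:
    return f"/sql/1.0/warehouses/{warehouse_id}"

print(warehouse_http_path("1234567890abcdef"))
```

This HTTP path is used together with the workspace URL (e.g. an adb-*.azuredatabricks.net hostname) when configuring the Salesforce connector.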

Edit: Here's a video that you can check. It shows, more or less, how to configure the connector:

https://youtu.be/DxTEOccMrXE

ManojkMohan
Contributor III

I have resolved it:

Step 1: Create a Service Principal
Log in to your Databricks Workspace and navigate to the Admin Settings page by clicking your email in the bottom-left corner and selecting "Admin Settings".

Go to the Identity and access tab and click on Service principals.

Click the Add service principal button.

Give your Service Principal a descriptive name, for example, Salesforce Data Cloud Connector, and click Add.

Step 2: Get the Client ID
The Client ID is the unique identifier for the Service Principal you just created.

After creating the service principal, you will be taken back to the list. Click on the name of the service principal you just created (e.g., Salesforce Data Cloud Connector).

On the configuration page for the service principal, you will see a field labeled Application ID. This is your Client ID.

Copy this Application ID value. You will paste this into the "Client Id" field in Salesforce.

Step 3: Generate the Client Secret
The Client Secret is like a password for your Service Principal.

While still on the configuration page for your service principal, scroll down to the OAuth secrets section.

Click the Generate secret button.

Crucial Step: A dialog box will appear displaying your new secret. You must copy this secret now and store it in a secure location. You will not be able to see this value again after you close this window.

This generated value is your Client Secret. You will paste this into the "Client Secret" field in Salesforce.

Step 4: Grant Permissions to the SQL Warehouse
By default, a new Service Principal has no permissions. You must grant it permission to use the specific SQL Warehouse that Salesforce will connect to.

Navigate to the SQL Warehouses page in the SQL persona of your workspace.
Click on the name of the SQL Warehouse you want to use.
Click the Permissions button at the top-right.
In the permissions dialog, find and select the Service Principal you created (e.g., Salesforce Data Cloud Connector).
Grant it the Can use permission.
Click Add/Update.
Summary: What to Fill In Salesforce

Salesforce Field | What to Enter
Client Id        | The Application ID you copied in Step 2.
Client Secret    | The OAuth Secret you generated and saved in Step 3.
After completing these steps, your Salesforce connector will have the correct credentials and permissions to securely authenticate with and query your Databricks workspace.
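Steps 1 through 3 above can also be done programmatically. As a sketch, this is the request body for creating the service principal through the workspace SCIM API (POST /api/2.0/preview/scim/v2/ServicePrincipals); the display name is just an example, and the OAuth secret still has to be generated separately (in the UI or via the service principal secrets API):

```python
import json

# Sketch of the SCIM request body for creating a workspace service principal.
# The display name is an example; the response's "applicationId" field is the
# Client ID that Salesforce asks for.
payload = {
    "schemas": ["urn:ietf:params:scim:schemas:core:2.0:ServicePrincipal"],
    "displayName": "Salesforce Data Cloud Connector",
    "active": True,
}
print(json.dumps(payload, indent=2))
```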

ManojkMohan_0-1754346883258.png

ManojkMohan_1-1754346964076.png

 
