Warehousing & Analytics

Migrate Azure Synapse Analytics data to Databricks

Akshay_Petkar
New Contributor III

I have to migrate data from Azure Synapse Analytics to Databricks. Could anyone share the different approaches for migrating the data, and which of them is the best to use?

Accepted Solution

Gail207Martinez
New Contributor II

Hello!

Migrating data from Azure Synapse Analytics to Databricks can be done in several ways:

- Azure Data Factory (ADF): configure a pipeline to copy data from Synapse SQL to Azure Data Lake Storage (ADLS), then load the landed files into Databricks (see the second sketch below).
- Direct connector: use the Azure Synapse connector built into Azure Databricks to query Synapse and load the data directly, which avoids managing an intermediate copy step yourself (see the first sketch below).
- Delta Lake: store the migrated data as Delta tables in Databricks, which gives you ACID transactions and efficient queries on the migrated data.
- Custom ETL: write your own Python or Scala scripts in Databricks that read from Synapse and write to Databricks.

The best approach depends on your specific requirements, but for large data volumes and frequent transfers, Azure Data Factory combined with Delta Lake in Databricks is often recommended for its scalability and efficiency.
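For the direct-connector route, here is a minimal PySpark sketch, assuming it runs in an Azure Databricks notebook (where `spark` is predefined) and uses the Azure Synapse connector that ships with Azure Databricks. The workspace URL, storage account, staging container, and table names are placeholders for illustration:

```python
# Staging location the Synapse connector uses to move data between
# Synapse and Spark (placeholder path in ADLS Gen2)
temp_dir = "abfss://staging@mystorageaccount.dfs.core.windows.net/synapse-tmp"

df = (
    spark.read
    .format("com.databricks.spark.sqldw")  # Azure Synapse connector
    .option("url", "jdbc:sqlserver://myworkspace.sql.azuresynapse.net:1433;database=mydb")
    .option("tempDir", temp_dir)
    # Forward the cluster's storage credentials so Synapse can write the staged copy
    .option("forwardSparkAzureStorageCredentials", "true")
    .option("dbTable", "dbo.sales")  # or .option("query", "SELECT ...")
    .load()
)

# Persist the result as a Delta table in Databricks
df.write.format("delta").mode("overwrite").saveAsTable("migrated.sales")
```

Note that the connector stages data through the tempDir path in ADLS, so the cluster needs access to that storage account.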

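And a sketch of the ADF + ADLS path, assuming an ADF copy pipeline has already exported the Synapse tables as Parquet files to ADLS; the path and table name are again placeholders:

```python
# Location where the ADF copy activity landed the exported table (placeholder)
landing_path = "abfss://landing@mystorageaccount.dfs.core.windows.net/synapse-export/sales/"

# Read the files ADF copied out of Synapse
df = spark.read.parquet(landing_path)

# Write them as a managed Delta table so downstream queries get ACID
# guarantees and Delta's query optimizations
df.write.format("delta").mode("overwrite").saveAsTable("migrated.sales")
```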

