Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Low-code ETL in Databricks

kazinahian
New Contributor III

Hello everyone,

I work as a Business Intelligence practitioner, using tools like Alteryx and other low-code solutions to build ETL processes and data pipelines for my dashboards and reports. I'm currently exploring Azure Databricks, where I typically access different tables to create my dashboards. Are there any low-code ETL data pipeline tools available within Azure?

Thank you!

1 ACCEPTED SOLUTION

Kaniz_Fatma
Community Manager

Hi @kazinahian, in the Azure ecosystem you have a few options for building ETL (Extract, Transform, Load) data pipelines, including low-code solutions.

Let's explore some relevant tools:

  1. Azure Data Factory:

    • Purpose: Azure Data Factory is a cloud-based data integration service that allows you to create, schedule, and manage data-driven workflows.
    • Features:
      • Data Movement: It automates data movement from various sources to destinations, including Azure SQL Database, Azure Data Lake Storage, and more.
      • Data Transformation: You can use Data Factory to transform data using mapping, filtering, and other transformations.
      • Orchestration: It orchestrates complex workflows involving multiple data sources and destinations.
    • Low-Code Aspect: While it provides a visual interface for designing pipelines, you can also write custom code (e.g., Python, SQL) within activities; a minimal Python SDK sketch appears after this list.
    • Learn More: Azure Data Factory Documentation
  2. Azure Databricks:

    • Purpose: Azure Databricks is a collaborative analytics platform that combines Apache Spark with Azure services.
    • Features:
      • ETL Pipelines: You can create ETL pipelines using Databricks notebooks, which allow you to write Spark code (Scala, Python, or SQL).
      • Scalability: Databricks scales horizontally, making it suitable for big data workloads.
      • Integration: It integrates well with other Azure services like Azure Data Lake Storage and Azure SQL Data Warehouse (now Azure Synapse Analytics).
    • Low-Code Aspect: While it involves coding, Databricks provides a user-friendly interface for notebook development; see the PySpark sketch after this list.
    • Learn More: Run your first ETL workload on Azure Databricks
  3. SQL Server Integration Services (SSIS):

    • Purpose: Although not purely an Azure service, SSIS is widely used for ETL tasks.
    • Features:
      • Visual Design: SSIS offers a visual design environment for creating ETL packages.
      • Extensibility: You can write custom scripts or use built-in components.
      • Integration with Azure: You can run SSIS packages on Azure SQL Database or Azure SQL Managed Instance.
    • Low-Code Aspect: SSIS is primarily a visual tool, but it also supports custom scripting.
    • Learn More: SQL Server Integration Services
  4. Other Tools in Azure Marketplace:

    • Third-Party Options: The Azure Marketplace also lists low-code ETL products from vendors such as Fivetran, Matillion, and Informatica that integrate with Azure data services and Databricks.
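
To make the "custom code" side of Data Factory concrete, here is a minimal sketch along the lines of the ADF Python quickstart, using the azure-mgmt-datafactory SDK to publish and trigger a copy pipeline. All resource names (subscription, resource group, factory, datasets) are hypothetical placeholders, and it assumes the factory, linked services, and datasets already exist; treat it as an illustration of the SDK shape, not a ready-to-run deployment.

```python
# Minimal sketch: define and run an ADF copy pipeline with the Python SDK.
# Assumes the factory, linked services, and both datasets already exist;
# every name below is a hypothetical placeholder.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobSink,
    BlobSource,
    CopyActivity,
    DatasetReference,
    PipelineResource,
)

SUBSCRIPTION_ID = "<subscription-id>"  # placeholder
RG_NAME = "rg-bi-demo"                 # placeholder resource group
DF_NAME = "adf-bi-demo"                # placeholder data factory

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# One Copy activity: blob-to-blob, referencing pre-existing datasets.
copy_activity = CopyActivity(
    name="CopySalesData",
    inputs=[DatasetReference(type="DatasetReference", reference_name="RawSalesDataset")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="CleanSalesDataset")],
    source=BlobSource(),
    sink=BlobSink(),
)

# Publish the pipeline, then trigger a run.
pipeline = PipelineResource(activities=[copy_activity])
adf_client.pipelines.create_or_update(RG_NAME, DF_NAME, "SalesCopyPipeline", pipeline)
run = adf_client.pipelines.create_run(RG_NAME, DF_NAME, "SalesCopyPipeline", parameters={})
print(f"Started pipeline run: {run.run_id}")
```

In practice most people build this same pipeline in the ADF visual designer; the SDK route matters mainly when you want pipelines versioned and deployed as code.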

Remember that the choice of tool depends on your specific requirements, data volume, and complexity. Feel free to explore these options and find the one that best fits your needs! 😊
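
Since notebooks are the main ETL surface in Databricks, here is a minimal PySpark sketch of the read-transform-write pattern that typically feeds a dashboard table. The storage path, column names, and target table are hypothetical placeholders, and it assumes it runs in a Databricks notebook where `spark` is predefined.

```python
# Minimal Databricks-notebook ETL sketch (PySpark). The storage path, column
# names, and target table are hypothetical placeholders; `spark` is provided
# automatically in a Databricks notebook.
from pyspark.sql import functions as F

# Extract: read raw CSV files from ADLS Gen2.
raw = (
    spark.read.format("csv")
    .option("header", "true")
    .load("abfss://raw@<storage-account>.dfs.core.windows.net/sales/")
)

# Transform: type the columns and drop invalid rows.
cleaned = (
    raw.withColumn("order_date", F.to_date("order_date"))
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount") > 0)
)

# Load: write a Delta table that dashboards and reports can query.
cleaned.write.format("delta").mode("overwrite").saveAsTable("reporting.sales_clean")
```

A notebook like this can be scheduled as a Databricks job, so the reporting table refreshes without manual runs.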

 


