<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic How to Use BladeBridge for Redshift to Databricks Migration? in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/how-to-use-bladebridge-for-redshift-to-databricks-migration/m-p/120607#M46202</link>
    <description>&lt;P&gt;Hi all,&lt;/P&gt;&lt;P&gt;I have a set of Redshift queries that I need to migrate to Databricks using &lt;STRONG&gt;BladeBridge&lt;/STRONG&gt;, but I have never used BladeBridge before and can’t find any clear documentation or steps on how to use it within the Databricks environment.&lt;/P&gt;&lt;P&gt;If anyone has already implemented BladeBridge for a Redshift (or any other warehouse) to Databricks conversion, I’d really appreciate it if you could share:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;&lt;P&gt;Your experience or approach&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;Any documentation or links that helped you get started&lt;/P&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;Thanks in advance for your support!&lt;/P&gt;</description>
    <pubDate>Fri, 30 May 2025 12:35:49 GMT</pubDate>
    <dc:creator>Akshay_Petkar</dc:creator>
    <dc:date>2025-05-30T12:35:49Z</dc:date>
    <item>
      <title>How to Use BladeBridge for Redshift to Databricks Migration?</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-use-bladebridge-for-redshift-to-databricks-migration/m-p/120607#M46202</link>
      <description>&lt;P&gt;Hi all,&lt;/P&gt;&lt;P&gt;I have a set of Redshift queries that I need to migrate to Databricks using &lt;STRONG&gt;BladeBridge&lt;/STRONG&gt;, but I have never used BladeBridge before and can’t find any clear documentation or steps on how to use it within the Databricks environment.&lt;/P&gt;&lt;P&gt;If anyone has already implemented BladeBridge for a Redshift (or any other warehouse) to Databricks conversion, I’d really appreciate it if you could share:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;&lt;P&gt;Your experience or approach&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;Any documentation or links that helped you get started&lt;/P&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;Thanks in advance for your support!&lt;/P&gt;</description>
      <pubDate>Fri, 30 May 2025 12:35:49 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-use-bladebridge-for-redshift-to-databricks-migration/m-p/120607#M46202</guid>
      <dc:creator>Akshay_Petkar</dc:creator>
      <dc:date>2025-05-30T12:35:49Z</dc:date>
    </item>
    <item>
      <title>Re: How to Use BladeBridge for Redshift to Databricks Migration?</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-use-bladebridge-for-redshift-to-databricks-migration/m-p/120621#M46211</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/88335"&gt;@Akshay_Petkar&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Migrating Amazon Redshift SQL to Databricks (especially Delta Lake or Unity Catalog-backed systems) using BladeBridge is a practical but sparsely documented use case. Since BladeBridge is a commercial tool with limited public documentation, here is a consolidated response based on real-world usage patterns, typical migration steps, and best practices gathered from enterprise implementations.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;BladeBridge for Redshift to Databricks Migration&lt;/STRONG&gt;&lt;BR /&gt;BladeBridge operates as a code translation framework that automates SQL/ETL conversion with configurable rule engines. For Redshift to Databricks (SQL/Delta/Unity Catalog), this typically means:&lt;BR /&gt;- Parsing Redshift SQL (DDL, DML, views, functions)&lt;BR /&gt;- Translating syntax, data types, and warehouse-specific constructs to Spark SQL / Databricks SQL&lt;BR /&gt;- Packaging the output as Databricks notebooks, dbt models, or SQL scripts&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Step-by-Step Migration Flow&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Step 1: Set Up BladeBridge&lt;/STRONG&gt;&lt;BR /&gt;- Get access to the BladeBridge environment (via your enterprise license or BladeBridge-managed services).&lt;BR /&gt;- Work with BladeBridge support to enable Redshift as the source and Databricks (Delta Lake/Spark SQL) as the target.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Step 2: Extract Redshift Code&lt;/STRONG&gt;&lt;BR /&gt;- Use BladeBridge’s metadata extractor or CLI to scan your Redshift warehouse. This typically includes:&lt;BR /&gt;1. Stored procedures&lt;BR /&gt;2. UDFs&lt;BR /&gt;3. Views&lt;BR /&gt;4. Complex SQL queries&lt;BR /&gt;5. ETL control logic (if embedded)&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Step 3: Define Mapping Rules&lt;/STRONG&gt;&lt;BR /&gt;BladeBridge uses a rule-based translation engine. You or the BladeBridge team will:&lt;BR /&gt;- Map Redshift-specific functions and objects (e.g., DISTINCT ON, ENCODE, STL_* system tables) to Databricks-compatible alternatives.&lt;BR /&gt;- Handle data type conversions (SUPER, GEOMETRY, etc. → struct/JSON or other compatible formats).&lt;BR /&gt;- Replace Redshift-specific syntax with Spark/Databricks equivalents.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Step 4: Generate Target Code&lt;/STRONG&gt;&lt;BR /&gt;BladeBridge will generate:&lt;BR /&gt;1. Spark SQL / Databricks SQL scripts&lt;BR /&gt;2. Optionally, PySpark or Scala code if procedural logic needs translation&lt;BR /&gt;3. Notebooks (.dbc or .ipynb)&lt;BR /&gt;4. dbt-compatible models (if configured)&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Step 5: Validation &amp;amp; QA&lt;/STRONG&gt;&lt;BR /&gt;- BladeBridge offers data diffing / validation capabilities to compare Redshift and Databricks output.&lt;BR /&gt;- Integrate with Great Expectations or Delta Live Tables expectations if needed.&lt;BR /&gt;- Unit tests and volume-based data checks are essential post-conversion.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Step 6: Deployment&lt;/STRONG&gt;&lt;BR /&gt;- Load the converted code into Databricks (via the Workspace API, Git sync, or notebooks).&lt;BR /&gt;- Use Databricks Jobs or Workflows to orchestrate the converted SQL pipelines.&lt;BR /&gt;- Set up access permissions if you're using Unity Catalog.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Ask BladeBridge for:&lt;/STRONG&gt;&lt;BR /&gt;- The Redshift → Databricks conversion mapping guide&lt;BR /&gt;- The rule engine customization manual&lt;BR /&gt;- CLI/SDK usage docs&lt;BR /&gt;- Also check with your Databricks TAM (if enterprise); they often co-pilot BladeBridge-based migrations.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;My Suggestion:&lt;/STRONG&gt;&lt;BR /&gt;If this is your first time, ask BladeBridge to:&lt;BR /&gt;- Do a pilot migration of 10–20 complex queries&lt;BR /&gt;- Provide documentation for custom rules&lt;BR /&gt;- Clarify how visible the translation logic is, so you can tune it in-house later&lt;/P&gt;</description>
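The rule-based translation described in Step 3 can be sketched in a few lines of Python. This is purely illustrative: BladeBridge's actual rule engine is proprietary and configuration-driven, and the rewrite rules below (GETDATE() → current_timestamp(), dropping DISTKEY/SORTKEY/DISTSTYLE) are hypothetical examples of the kind of mapping involved.

```python
import re

# Hypothetical Redshift -> Databricks SQL rewrite rules; a real BladeBridge
# rule set is far richer and externally configurable.
RULES = [
    (re.compile(r"\bGETDATE\(\)", re.IGNORECASE), "current_timestamp()"),
    (re.compile(r"\bNVL\(", re.IGNORECASE), "coalesce("),
    # Redshift distribution/sort attributes have no Delta equivalent: drop them.
    (re.compile(r"\s*DISTSTYLE\s+\w+", re.IGNORECASE), ""),
    (re.compile(r"\s*DISTKEY\s*\([^)]*\)", re.IGNORECASE), ""),
    (re.compile(r"\s*SORTKEY\s*\([^)]*\)", re.IGNORECASE), ""),
]

def translate(redshift_sql: str) -> str:
    """Apply each rewrite rule in order to one Redshift SQL statement."""
    out = redshift_sql
    for pattern, replacement in RULES:
        out = pattern.sub(replacement, out)
    return out
```

The point of keeping rules as data rather than code is exactly what the reply describes: the mapping can be reviewed, extended, and tuned in-house without touching the engine.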
      <pubDate>Fri, 30 May 2025 16:23:42 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-use-bladebridge-for-redshift-to-databricks-migration/m-p/120621#M46211</guid>
      <dc:creator>lingareddy_Alva</dc:creator>
      <dc:date>2025-05-30T16:23:42Z</dc:date>
    </item>
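The data diffing mentioned in Step 5 boils down to comparing fingerprints of the Redshift and Databricks result sets. A minimal sketch of that idea, assuming rows are fetched as plain tuples (BladeBridge's own validation tooling is proprietary; the function names here are made up for illustration):

```python
import hashlib
from typing import Iterable, Tuple

def table_fingerprint(rows: Iterable[Tuple]) -> Tuple[int, str]:
    """Return (row_count, order-independent checksum) for one result set.

    Each row is hashed and the digests are XOR-folded, so row order does
    not matter. Note: duplicate rows cancel under XOR; a production diff
    would compare multisets instead.
    """
    count = 0
    acc = 0
    for row in rows:
        count += 1
        digest = hashlib.sha256(repr(row).encode()).digest()
        acc ^= int.from_bytes(digest[:16], "big")
    return count, format(acc, "032x")

def tables_match(redshift_rows, databricks_rows) -> bool:
    """True when both result sets have identical counts and checksums."""
    return table_fingerprint(redshift_rows) == table_fingerprint(databricks_rows)
```

Row-count plus checksum is the cheap first pass; per-row diffs are only worth computing for the tables that fail it.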
    <item>
      <title>Re: How to Use BladeBridge for Redshift to Databricks Migration?</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-use-bladebridge-for-redshift-to-databricks-migration/m-p/128931#M48378</link>
      <description>&lt;P&gt;Dear&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/24053"&gt;@lingareddy_Alva&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;Thank you so much for sharing these steps &amp;amp; specifics. Much appreciated!&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Context&lt;/STRONG&gt;:&lt;/P&gt;&lt;P&gt;I have just started exploring BladeBridge for AWS Redshift to Databricks migration.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;"BladeBridge operates as a code translation framework," and it also supports many other activities as part of the end-to-end data migration process.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;The reconcile step in the official documentation &lt;A href="https://databrickslabs.github.io/lakebridge/docs/reconcile/dataflow_example/" target="_self"&gt;here&lt;/A&gt; shows reconciliation between source and target.&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;However, there is no mention of the actual data flow / data migration / Redshift UNLOAD step anywhere.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;&lt;STRONG&gt;Question&lt;/STRONG&gt;: &lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Is the actual data movement or I/O from source (Redshift) to target (Databricks) handled by BladeBridge out of the box (OOB), or not?&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;Thanks &amp;amp; Regards,&lt;/P&gt;&lt;P&gt;Dileep&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 20 Aug 2025 03:31:53 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-use-bladebridge-for-redshift-to-databricks-migration/m-p/128931#M48378</guid>
      <dc:creator>ddharma</dc:creator>
      <dc:date>2025-08-20T03:31:53Z</dc:date>
    </item>
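Whatever BladeBridge covers out of the box, the standard building blocks for moving the data itself are the ones the question names: Redshift's UNLOAD to S3 (as Parquet) followed by Databricks' COPY INTO. A minimal sketch that only builds the two statements; the bucket, IAM role ARN, and table names are hypothetical placeholders:

```python
def unload_sql(table: str, s3_prefix: str, iam_role: str) -> str:
    """Redshift side: export a table to S3 as Parquet files."""
    return (
        f"UNLOAD ('SELECT * FROM {table}') "
        f"TO '{s3_prefix}/{table}/' "
        f"IAM_ROLE '{iam_role}' "
        f"FORMAT AS PARQUET"
    )

def copy_into_sql(table: str, s3_prefix: str) -> str:
    """Databricks side: load the exported Parquet files into a Delta table."""
    return (
        f"COPY INTO {table} "
        f"FROM '{s3_prefix}/{table}/' "
        f"FILEFORMAT = PARQUET"
    )

# Example (placeholder bucket and role):
stmt = unload_sql("sales", "s3://my-bucket/export",
                  "arn:aws:iam::123456789012:role/redshift-unload")
```

Orchestration of these per-table statements (and the cutover/backfill sequencing) is typically a separate workstream from the code translation itself.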
  </channel>
</rss>

