<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Seeking Best Approach for Bulk Migration of LUA/Exasol Scripts to Databricks PySpark in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/seeking-best-approach-for-bulk-migration-of-lua-exasol-scripts/m-p/150407#M53413</link>
    <description>&lt;P class="p8i6j01 paragraph"&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/36892"&gt;@Phani1&lt;/a&gt;,&lt;/P&gt;
&lt;P class="p1"&gt;After some research, I don't believe there’s a Databricks-native, one-click tool to bulk-convert Lua/Exasol to PySpark.&lt;/P&gt;
&lt;P class="p1"&gt;Databricks AI Assistant is great for interactive refactoring, but as you said, it’s not really a bulk‑migration engine. Lakebridge (per your note) is geared toward SQL conversion, not Lua procedural logic, so it will only cover part of the problem.&amp;nbsp;&lt;/P&gt;
&lt;P class="p1"&gt;For this kind of migration, I’d recommend:&lt;/P&gt;
&lt;UL class="ul1"&gt;
&lt;LI class="li1"&gt;Classify each script into SQL-heavy vs. Lua‑heavy logic.&lt;/LI&gt;
&lt;LI class="li1"&gt;For SQL-heavy parts, use migration tooling (e.g., Lakebridge, where applicable) or an LLM to translate to Databricks / Spark SQL, then standardise and lint.&lt;/LI&gt;
&lt;LI class="li1"&gt;For Lua-heavy control flow, use an LLM to generate PySpark/DBSQL skeletons, but treat that as a starting point and enforce code review + tests.&lt;/LI&gt;
&lt;LI class="li1"&gt;Wrap the whole thing in a small migration factory: version control, automated conversion, and data reconciliation (parallel runs on Exasol vs Databricks to compare row counts/aggregates).&lt;/LI&gt;
&lt;/UL&gt;
&lt;P class="p1"&gt;For large estates, the most effective approach is usually to combine this approach with a Databricks PSA/migration partner that&amp;nbsp;already has patterns and accelerators for Exasol migrations. The best next step is to go via your Databricks account team so they can connect you with the right migration partner/PS team for your scale and timelines.&lt;/P&gt;
&lt;P class="p1"&gt;&lt;FONT size="2" color="#FF6600"&gt;&lt;STRONG&gt;&lt;I&gt;If this answer resolves your question, could you mark it as “Accept as Solution”? That helps other users quickly find the correct fix.&lt;/I&gt;&lt;/STRONG&gt;&lt;/FONT&gt;&lt;/P&gt;</description>
    <pubDate>Mon, 09 Mar 2026 19:59:46 GMT</pubDate>
    <dc:creator>Ashwin_DSA</dc:creator>
    <dc:date>2026-03-09T19:59:46Z</dc:date>
    <item>
      <title>Seeking Best Approach for Bulk Migration of LUA/Exasol Scripts to Databricks PySpark</title>
      <link>https://community.databricks.com/t5/data-engineering/seeking-best-approach-for-bulk-migration-of-lua-exasol-scripts/m-p/150384#M53406</link>
      <description>&lt;DIV class=""&gt;Hi All,&lt;/DIV&gt;&lt;DIV class=""&gt;We are planning a &lt;STRONG&gt;bulk migration of LUA Script / Exasol scripts to Databricks native PySpark&lt;/STRONG&gt; and are evaluating the best approach for large-scale automated code conversion and testing in Databricks.&lt;/DIV&gt;&lt;DIV class=""&gt;So far, we have analyzed the following options:&lt;/DIV&gt;&lt;OL class=""&gt;&lt;LI&gt;&lt;DIV class=""&gt;&lt;STRONG&gt;Databricks AI Assistant&lt;/STRONG&gt;&lt;/DIV&gt;&lt;UL class=""&gt;&lt;LI&gt;Useful for interactive development inside notebooks / IDE&lt;/LI&gt;&lt;LI&gt;But it does not seem ideal for &lt;STRONG&gt;bulk migration&lt;/STRONG&gt; use cases&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;LI&gt;&lt;DIV class=""&gt;&lt;STRONG&gt;Lakebridge&lt;/STRONG&gt;&lt;/DIV&gt;&lt;UL class=""&gt;&lt;LI&gt;Based on the documentation, it appears to support &lt;STRONG&gt;SQL script conversion&lt;/STRONG&gt;&lt;/LI&gt;&lt;LI&gt;It does not seem to support &lt;STRONG&gt;LUA scripts&lt;/STRONG&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;LI&gt;&lt;DIV class=""&gt;&lt;STRONG&gt;External LLMs (for example Claude Opus 4.6 or similar models)&lt;/STRONG&gt;&lt;/DIV&gt;&lt;UL class=""&gt;&lt;LI&gt;Calling the model through a &lt;STRONG&gt;REST API from Databricks notebooks&lt;/STRONG&gt;&lt;/LI&gt;&lt;LI&gt;Passing the source scripts for conversion&lt;/LI&gt;&lt;LI&gt;Writing the generated PySpark output into notebooks or files&lt;/LI&gt;&lt;LI&gt;Building a &lt;STRONG&gt;custom automated conversion framework&lt;/STRONG&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;/OL&gt;&lt;DIV class=""&gt;We would like to understand:&lt;/DIV&gt;&lt;UL class=""&gt;&lt;LI&gt;What is the &lt;STRONG&gt;best approach&lt;/STRONG&gt; for &lt;STRONG&gt;bulk conversion&lt;/STRONG&gt; of LUA/Exasol scripts into PySpark on Databricks?&lt;/LI&gt;&lt;LI&gt;Are there any Databricks-native tools, partner solutions, or migration accelerators that support this scenario?&lt;/LI&gt;&lt;/UL&gt;&lt;DIV class=""&gt;Our goal is to find a scalable approach that supports:&lt;/DIV&gt;&lt;UL class=""&gt;&lt;LI&gt;bulk conversion&lt;/LI&gt;&lt;LI&gt;automation&lt;/LI&gt;&lt;LI&gt;code quality review&lt;/LI&gt;&lt;LI&gt;testing/validation in Databricks&lt;/LI&gt;&lt;/UL&gt;&lt;DIV class=""&gt;Any recommendations, architecture suggestions, or lessons learned would be very helpful.&lt;/DIV&gt;&lt;DIV class=""&gt;Thanks in advance!&lt;/DIV&gt;</description>
      <pubDate>Mon, 09 Mar 2026 13:16:05 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/seeking-best-approach-for-bulk-migration-of-lua-exasol-scripts/m-p/150384#M53406</guid>
      <dc:creator>Phani1</dc:creator>
      <dc:date>2026-03-09T13:16:05Z</dc:date>
    </item>
    <item>
      <title>Re: Seeking Best Approach for Bulk Migration of LUA/Exasol Scripts to Databricks PySpark</title>
      <link>https://community.databricks.com/t5/data-engineering/seeking-best-approach-for-bulk-migration-of-lua-exasol-scripts/m-p/150407#M53413</link>
      <description>&lt;P class="p8i6j01 paragraph"&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/36892"&gt;@Phani1&lt;/a&gt;,&lt;/P&gt;
&lt;P class="p1"&gt;After some research, I don't believe there’s a Databricks-native, one-click tool to bulk-convert Lua/Exasol to PySpark.&lt;/P&gt;
&lt;P class="p1"&gt;Databricks AI Assistant is great for interactive refactoring, but as you said, it’s not really a bulk‑migration engine. Lakebridge (per your note) is geared toward SQL conversion, not Lua procedural logic, so it will only cover part of the problem.&amp;nbsp;&lt;/P&gt;
&lt;P class="p1"&gt;For this kind of migration, I’d recommend:&lt;/P&gt;
&lt;UL class="ul1"&gt;
&lt;LI class="li1"&gt;Classify each script into SQL-heavy vs. Lua‑heavy logic.&lt;/LI&gt;
&lt;LI class="li1"&gt;For SQL-heavy parts, use migration tooling (e.g., Lakebridge, where applicable) or an LLM to translate to Databricks / Spark SQL, then standardise and lint.&lt;/LI&gt;
&lt;LI class="li1"&gt;For Lua-heavy control flow, use an LLM to generate PySpark/DBSQL skeletons, but treat that as a starting point and enforce code review + tests.&lt;/LI&gt;
&lt;LI class="li1"&gt;Wrap the whole thing in a small migration factory: version control, automated conversion, and data reconciliation (parallel runs on Exasol vs Databricks to compare row counts/aggregates).&lt;/LI&gt;
&lt;/UL&gt;
&lt;P class="p1"&gt;For large estates, the most effective approach is usually to combine this approach with a Databricks PSA/migration partner that&amp;nbsp;already has patterns and accelerators for Exasol migrations. The best next step is to go via your Databricks account team so they can connect you with the right migration partner/PS team for your scale and timelines.&lt;/P&gt;
&lt;P class="p1"&gt;&lt;FONT size="2" color="#FF6600"&gt;&lt;STRONG&gt;&lt;I&gt;If this answer resolves your question, could you mark it as “Accept as Solution”? That helps other users quickly find the correct fix.&lt;/I&gt;&lt;/STRONG&gt;&lt;/FONT&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 09 Mar 2026 19:59:46 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/seeking-best-approach-for-bulk-migration-of-lua-exasol-scripts/m-p/150407#M53413</guid>
      <dc:creator>Ashwin_DSA</dc:creator>
      <dc:date>2026-03-09T19:59:46Z</dc:date>
    </item>
    <item>
      <title>Re: Seeking Best Approach for Bulk Migration of LUA/Exasol Scripts to Databricks PySpark</title>
      <link>https://community.databricks.com/t5/data-engineering/seeking-best-approach-for-bulk-migration-of-lua-exasol-scripts/m-p/151116#M53586</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/216690"&gt;@Ashwin_DSA&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;Thanks for your response—this really helps validate our approach.&lt;/P&gt;&lt;P&gt;We agree on the need to split SQL-heavy and LUA-heavy logic and are planning a “migration factory” with LLM-assisted conversion, followed by strong validation (parallel runs + data reconciliation) rather than relying purely on code accuracy.&lt;/P&gt;&lt;P&gt;We’ll also explore connecting with Databricks PS/partners as suggested.&lt;/P&gt;&lt;P&gt;Thanks again!&lt;/P&gt;&lt;P&gt;Phani&lt;/P&gt;</description>
      <pubDate>Tue, 17 Mar 2026 08:29:59 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/seeking-best-approach-for-bulk-migration-of-lua-exasol-scripts/m-p/151116#M53586</guid>
      <dc:creator>Phani1</dc:creator>
      <dc:date>2026-03-17T08:29:59Z</dc:date>
    </item>
  </channel>
</rss>

