4 weeks ago
- Databricks AI Assistant
- Useful for interactive development inside notebooks / IDE
- But it does not seem ideal for bulk migration use cases
- Lakebridge
- Based on the documentation, it appears to support SQL script conversion
- It does not seem to support Lua scripts
- External LLMs (for example Claude Opus 4.6 or similar models)
- Calling the model through REST API from Databricks notebooks
- Passing the source scripts for conversion
- Writing the generated PySpark output into notebooks or files
- Building a custom automated conversion framework
- What is the best approach for bulk conversion of Lua/Exasol scripts into PySpark on Databricks?
- Are there any Databricks-native tools, partner solutions, or migration accelerators that support this scenario?
- bulk conversion
- automation
- code quality review
- testing/validation in Databricks
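To make the external-LLM option above concrete, here is a minimal sketch of calling a REST endpoint from a Databricks notebook to convert one Lua script. It assumes an Anthropic-style Messages API; the model id, prompt wording, and token limit are placeholders you would tune for your own setup:

```python
# Hypothetical sketch: convert a Lua script to PySpark via an external LLM REST API.
# Endpoint/model/header values follow the Anthropic Messages API shape; the model
# id and prompt are illustrative assumptions, not a vetted configuration.
import json
import urllib.request

API_URL = "https://api.anthropic.com/v1/messages"
MODEL = "claude-opus-4-6"  # placeholder model id


def build_payload(lua_source: str) -> dict:
    """Build a request asking the model to translate one Lua script to PySpark."""
    prompt = (
        "Convert the following Exasol Lua script to equivalent PySpark. "
        "Return only code.\n\n```lua\n" + lua_source + "\n```"
    )
    return {
        "model": MODEL,
        "max_tokens": 4096,
        "messages": [{"role": "user", "content": prompt}],
    }


def convert_script(lua_source: str, api_key: str) -> str:
    """POST the script to the LLM endpoint and return the generated PySpark text."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(lua_source)).encode(),
        headers={
            "x-api-key": api_key,
            "anthropic-version": "2023-06-01",
            "content-type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Messages API returns a list of content blocks; take the first text block.
    return body["content"][0]["text"]
```

In a bulk run you would loop this over the script inventory, write each result to a file or notebook, and keep the raw LLM output under version control so every conversion is reviewable.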
- Labels: Spark
4 weeks ago
Hi @Phani1,
After some research, I don't believe there’s a Databricks-native, one-click tool to bulk-convert Lua/Exasol to PySpark.
Databricks AI Assistant is great for interactive refactoring, but as you said, it’s not really a bulk‑migration engine. Lakebridge (per your note) is geared toward SQL conversion, not Lua procedural logic, so it will only cover part of the problem.
For this kind of migration, I’d recommend:
- Classify each script into SQL-heavy vs. Lua‑heavy logic.
- For SQL-heavy parts, use migration tooling (e.g., Lakebridge, where applicable) or an LLM to translate to Databricks / Spark SQL, then standardise and lint.
- For Lua-heavy control flow, use an LLM to generate PySpark/DBSQL skeletons, but treat that as a starting point and enforce code review + tests.
- Wrap the whole thing in a small migration factory: version control, automated conversion, and data reconciliation (parallel runs on Exasol vs Databricks to compare row counts/aggregates).
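The reconciliation step above can be sketched as a simple comparison of per-table summary stats pulled from both systems (e.g. via JDBC for Exasol and `spark.sql` on Databricks). The stats dictionaries and tolerance here are illustrative assumptions:

```python
# Hypothetical reconciliation sketch for parallel runs on Exasol vs Databricks.
# Each stats dict maps table name -> {metric name -> value}, e.g. row counts
# and key column aggregates collected from each system beforehand.

def reconcile(exasol_stats: dict, databricks_stats: dict,
              rel_tol: float = 1e-9) -> list:
    """Compare per-table metrics; return a list of mismatch descriptions."""
    mismatches = []
    for table, src in exasol_stats.items():
        tgt = databricks_stats.get(table)
        if tgt is None:
            mismatches.append(f"{table}: missing on Databricks")
            continue
        for metric, src_val in src.items():
            tgt_val = tgt.get(metric)
            if tgt_val is None:
                mismatches.append(f"{table}.{metric}: missing on Databricks")
            elif abs(src_val - tgt_val) > rel_tol * max(abs(src_val),
                                                        abs(tgt_val), 1.0):
                mismatches.append(f"{table}.{metric}: {src_val} != {tgt_val}")
    return mismatches
```

An empty result means the run reconciled; anything else goes back into the review loop for that script before sign-off.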
For large estates, the most effective path is usually to pair this factory approach with a Databricks PSA/migration partner that already has patterns and accelerators for Exasol migrations. The best next step is to go via your Databricks account team so they can connect you with the right migration partner/PS team for your scale and timelines.
If this answer resolves your question, could you mark it as “Accept as Solution”? That helps other users quickly find the correct fix.
Ashwin | Delivery Solution Architect @ Databricks
Helping you build and scale the Data Intelligence Platform.
***Opinions are my own***
3 weeks ago
Hi @Ashwin_DSA,
Thanks for your response—this really helps validate our approach.
We agree on the need to split SQL-heavy and Lua-heavy logic and are planning a “migration factory” with LLM-assisted conversion, followed by strong validation (parallel runs + data reconciliation) rather than relying purely on code accuracy.
We’ll also explore connecting with Databricks PS/partners as suggested.
Thanks again!
Phani