4 weeks ago
Hi @Phani1,
After some research, I don't believe there’s a Databricks-native, one-click tool to bulk-convert Lua/Exasol to PySpark.
Databricks AI Assistant is great for interactive refactoring, but as you said, it’s not really a bulk‑migration engine. Lakebridge (per your note) is geared toward SQL conversion, not Lua procedural logic, so it will only cover part of the problem.
For this kind of migration, I’d recommend:
- Classify each script into SQL-heavy vs. Lua‑heavy logic.
- For SQL-heavy parts, use migration tooling (e.g., Lakebridge, where applicable) or an LLM to translate to Databricks / Spark SQL, then standardise and lint.
- For Lua-heavy control flow, use an LLM to generate PySpark/DBSQL skeletons, but treat that as a starting point and enforce code review + tests.
- Wrap the whole thing in a small migration factory: version control, automated conversion, and data reconciliation (parallel runs on Exasol vs Databricks to compare row counts/aggregates).
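To make the first step concrete, here is a minimal sketch of how you might triage scripts automatically. The regex patterns and the 40% threshold are illustrative assumptions, not a Databricks tool; Exasol Lua scripts typically embed SQL via `query()`/`pquery()`, which this heuristic keys on.

```python
import re

# Hypothetical heuristic: a script counts as "sql-heavy" when a large share of
# its non-blank lines contain embedded SQL (query()/pquery() calls or common
# SQL keywords). Patterns and threshold are illustrative assumptions only.
SQL_HINTS = re.compile(
    r"\b(query|pquery)\s*\(|\b(SELECT|INSERT|UPDATE|MERGE|DELETE)\b",
    re.IGNORECASE,
)

def classify_script(source: str, threshold: float = 0.4) -> str:
    """Classify an Exasol Lua script as 'sql-heavy' or 'lua-heavy'."""
    lines = [ln for ln in source.splitlines() if ln.strip()]
    if not lines:
        return "lua-heavy"
    sql_lines = sum(1 for ln in lines if SQL_HINTS.search(ln))
    return "sql-heavy" if sql_lines / len(lines) >= threshold else "lua-heavy"
```

Running this over the whole estate gives you a first-cut inventory of which scripts can go through SQL tooling and which need manual PySpark rework.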
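For the reconciliation step, the comparison logic can be as simple as the sketch below. The metric dicts stand in for results of the same `COUNT(*)`/`SUM(...)` queries run on Exasol and on Databricks; the metric names and tolerance are assumptions for illustration.

```python
# Minimal reconciliation sketch: given aggregate metrics computed on both
# sides (e.g. row counts and column sums from parallel runs), report which
# metrics disagree beyond a relative tolerance. Names/tolerance are
# illustrative assumptions, not a standard Databricks utility.
def reconcile(exasol_metrics: dict, dbx_metrics: dict, rel_tol: float = 1e-9) -> list:
    """Return the names of metrics that differ beyond rel_tol (or are missing)."""
    mismatches = []
    for name, expected in exasol_metrics.items():
        actual = dbx_metrics.get(name)
        if actual is None:
            mismatches.append(name)          # metric missing on the Databricks side
        elif expected == 0:
            if actual != 0:
                mismatches.append(name)      # avoid division by zero
        elif abs(actual - expected) / abs(expected) > rel_tol:
            mismatches.append(name)
    return mismatches
```

An empty result means the parallel runs agree for that table; anything returned is a candidate for investigation before cutover.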
For large estates, the most effective route is usually to pair this migration-factory setup with a Databricks PSA/migration partner that already has patterns and accelerators for Exasol migrations. The best next step is to go via your Databricks account team so they can connect you with the right migration partner/PS team for your scale and timelines.
If this answer resolves your question, could you mark it as “Accept as Solution”? That helps other users quickly find the correct fix.
Ashwin | Delivery Solution Architect @ Databricks
Helping you build and scale the Data Intelligence Platform.
***Opinions are my own***