Hi @Phani1,
After some research, I don't believe there's a Databricks-native, one-click tool to bulk-convert Lua/Exasol to PySpark.
Databricks AI Assistant is great for interactive refactoring, but as you said, it's not really a bulk-migration engine. Lakebridge (per your note) is geared toward SQL conversion, not Lua procedural logic, so it will only cover part of the problem.
For this kind of migration, I'd recommend:
- Classify each script into SQL-heavy vs. Lua-heavy logic.
- For SQL-heavy parts, use migration tooling (e.g., Lakebridge, where applicable) or an LLM to translate to Databricks / Spark SQL, then standardise and lint.
- For Lua-heavy control flow, use an LLM to generate PySpark/DBSQL skeletons, but treat that as a starting point and enforce code review + tests.
- Wrap the whole thing in a small migration factory: version control, automated conversion, and data reconciliation (parallel runs on Exasol vs Databricks to compare row counts/aggregates).
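To make the first and last steps concrete, here's a minimal Python sketch of the triage and reconciliation ideas above. The keyword lists, thresholds, and function names (`classify_script`, `reconcile_counts`) are illustrative assumptions, not any Databricks or Lakebridge API; in practice you'd run the reconciliation against actual Exasol and Databricks query results.

```python
import re

# Hypothetical heuristic: count SQL statements vs. Lua control-flow
# constructs to triage each script. Keyword lists are illustrative.
SQL_PAT = re.compile(r"\b(SELECT|INSERT|UPDATE|DELETE|MERGE|CREATE|WITH)\b", re.I)
LUA_PAT = re.compile(r"\b(function|for|while|repeat|local|end|pcall)\b")

def classify_script(source: str) -> str:
    """Label a script 'sql-heavy' or 'lua-heavy' by keyword counts."""
    sql_hits = len(SQL_PAT.findall(source))
    lua_hits = len(LUA_PAT.findall(source))
    return "sql-heavy" if sql_hits >= lua_hits else "lua-heavy"

def reconcile_counts(exasol: dict, databricks: dict) -> list:
    """Return table names whose row counts differ between parallel runs.

    Inputs are {table_name: row_count} dicts gathered from each system
    (e.g. via COUNT(*) queries run on both sides after migration).
    """
    return [t for t in exasol if databricks.get(t) != exasol[t]]
```

In a real migration factory you'd extend the reconciliation beyond row counts to checksums or per-column aggregates, but even this coarse check catches most translation bugs early.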
For large estates, it usually pays to pair this factory with a Databricks PSA/migration partner that already has patterns and accelerators for Exasol migrations. The best next step is to go via your Databricks account team so they can connect you with the right migration partner/PS team for your scale and timelines.
If this answer resolves your question, could you mark it as "Accept as Solution"? That helps other users quickly find the correct fix.
Regards,
Ashwin | Delivery Solution Architect @ Databricks
Helping you build and scale the Data Intelligence Platform.
***Opinions are my own***