4 weeks ago
Hi All,
We are planning a bulk migration of Lua/Exasol scripts to Databricks-native PySpark and are evaluating the best approach for large-scale automated code conversion and testing in Databricks.
So far, we have analyzed the following options:
- Databricks AI Assistant
- Useful for interactive development inside notebooks / IDE
- But it does not seem ideal for bulk migration use cases
- Lakebridge
- Based on the documentation, it appears to support SQL script conversion
- It does not seem to support Lua scripts
- External LLMs (for example Claude Opus 4.6 or similar models)
- Calling the model through REST API from Databricks notebooks
- Passing the source scripts for conversion
- Writing the generated PySpark output into notebooks or files
- Building a custom automated conversion framework
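To make the External-LLM option concrete, here is a minimal sketch of what one conversion call from a Databricks notebook might look like. The endpoint URL, model name, and response shape are placeholders for whichever provider you choose (they are not a real API), and a production framework would add batching, retries, and rate limiting:

```python
import json
import urllib.request

# Hypothetical endpoint and model name -- substitute your provider's real values.
API_URL = "https://api.example.com/v1/chat/completions"
MODEL = "example-model"


def build_conversion_prompt(source_code: str, source_dialect: str) -> str:
    """Wrap one source script in a conversion instruction for the LLM."""
    return (
        f"Convert the following {source_dialect} script to idiomatic PySpark.\n"
        "Return only runnable Python code, no commentary.\n\n"
        f"```\n{source_code}\n```"
    )


def convert_script(source_code: str, source_dialect: str, api_key: str) -> str:
    """POST one script to the (hypothetical) endpoint and return the model's reply."""
    payload = json.dumps({
        "model": MODEL,
        "messages": [
            {"role": "user",
             "content": build_conversion_prompt(source_code, source_dialect)},
        ],
    }).encode()
    req = urllib.request.Request(
        API_URL,
        data=payload,
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Response path assumes an OpenAI-style schema; adjust for your provider.
    return body["choices"][0]["message"]["content"]
```

The driver loop would then walk the source repository, call `convert_script` per file, and write each result to a workspace notebook or file for review.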
We would like to understand:
- What is the best approach for bulk conversion of Lua/Exasol scripts into PySpark on Databricks?
- Are there any Databricks-native tools, partner solutions, or migration accelerators that support this scenario?
Our goal is to find a scalable approach that supports:
- bulk conversion
- automation
- code quality review
- testing/validation in Databricks
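For the testing/validation goal, one common pattern is to run the legacy script and the converted PySpark job against the same input and compare the result sets. Below is a minimal, engine-agnostic sketch (plain Python, with results collected as row tuples, e.g. via `DataFrame.collect()`); the function name is our own, not part of any Databricks API:

```python
from collections import Counter


def diff_result_sets(expected, actual):
    """Compare two result sets (lists of row tuples), ignoring row order.

    Returns a dict with a boolean 'match' flag plus the rows that appear
    only in the legacy output ('missing') or only in the new output ('extra').
    Counter-based comparison also catches duplicate-row count differences.
    """
    exp, act = Counter(expected), Counter(actual)
    missing = list((exp - act).elements())  # rows only in the legacy output
    extra = list((act - exp).elements())    # rows only in the PySpark output
    return {"match": not missing and not extra,
            "missing": missing,
            "extra": extra}
```

At scale you would likely complement this with cheaper checks first (row counts, per-column aggregates or checksums) and only fall back to full row diffs on mismatches.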
Any recommendations, architecture suggestions, or lessons learned would be very helpful.
Thanks in advance!
Labels:
- Spark