filipniziol
Esteemed Contributor

Hi @Avinash_Narala,

If it is a lift-and-shift migration, try this:

1. Set up Lakehouse Federation to SQL Server
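As a rough sketch, the federation setup means creating a connection to the SQL Server instance and then a foreign catalog on top of it. The host, credentials, secret scope, and database names below are placeholders you would replace with your own:

```sql
-- Connection to the SQL Server instance (host/user/secret names are placeholders)
CREATE CONNECTION sql_server_conn TYPE sqlserver
OPTIONS (
  host 'your-server.example.com',
  port '1433',
  user 'your_user',
  password secret('your_scope', 'your_key')
);

-- Foreign catalog that exposes the SQL Server database in Unity Catalog
CREATE FOREIGN CATALOG sql_server_catalog_name
USING CONNECTION sql_server_conn
OPTIONS (database 'your_database');
```

After this, the SQL Server tables are queryable as sql_server_catalog_name.schema.table.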

2. Use CTAS statements to copy each table into Unity Catalog

CREATE TABLE catalog_name.schema_name.table_name
AS 
SELECT *
FROM sql_server_catalog_name.sql_server_schema_name.your_sql_server_table

3. Loop through a list of tables in Python for automation

# (source schema, source table, destination schema, destination table)
tables_to_migrate = [
    ("sql_server_schema_name", "SQL_TABLE_A", "unity_catalog_schema", "DEST_TABLE_A"),
    ("sql_server_schema_name", "SQL_TABLE_B", "unity_catalog_schema", "DEST_TABLE_B")
]

# Run one CTAS statement per table pair
for src_schema, src_table, dest_schema, dest_table in tables_to_migrate:
    spark.sql(f"""
      CREATE TABLE my_catalog.{dest_schema}.{dest_table}
      AS
      SELECT *
      FROM sql_server_catalog_name.{src_schema}.{src_table}
    """)
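If you want to review or log the statements before running them, one option is to factor the SQL generation out of the loop into a small helper. The helper below is illustrative (not part of the original answer); it only builds the CTAS string, which you would then pass to spark.sql:

```python
# Hypothetical helper: builds the same CTAS statement the loop above submits.
# Separating SQL generation from execution makes the statements easy to
# print, log, or unit-test without a running cluster.

def build_ctas(dest_catalog: str, dest_schema: str, dest_table: str,
               src_catalog: str, src_schema: str, src_table: str) -> str:
    """Return a CREATE TABLE ... AS SELECT statement for one table copy."""
    return (
        f"CREATE TABLE {dest_catalog}.{dest_schema}.{dest_table} "
        f"AS SELECT * FROM {src_catalog}.{src_schema}.{src_table}"
    )

# Example: build (and inspect) the statement for the first table pair
stmt = build_ctas("my_catalog", "unity_catalog_schema", "DEST_TABLE_A",
                  "sql_server_catalog_name", "sql_server_schema_name", "SQL_TABLE_A")
print(stmt)
```

In the loop you would then call spark.sql(build_ctas(...)) instead of the inline f-string.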
