It seems like you are encountering an issue with the schema mapping when importing external tables to Unity Catalog in Databricks.
Based on the information you've provided, the issue is likely related to the way Databricks interprets the schema of external tables imported from SQL Server.
When you import an external table through Unity Catalog, Databricks attempts to infer the table's schema automatically from the underlying metadata. If that metadata is incomplete, outdated, or inconsistent with the actual table definition in SQL Server, queries against the table can fail or return unexpected results.
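As a first check, it can help to compare the schema Databricks actually inferred with the table definition in SQL Server. A minimal check, assuming your imported table is named my_external_table:
-- Lists each column with the data type Databricks inferred,
-- plus table metadata such as provider and location
DESCRIBE TABLE EXTENDED my_external_table;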
To resolve this, you may need to specify the schema of the external table manually when creating it, rather than relying on Databricks to infer it from the metadata. You can do this by listing the column names and data types explicitly in the CREATE TABLE statement.
Here's an example of how you can define an external table with a specific schema in Databricks:
-- Declare the columns explicitly so Databricks does not infer them
CREATE TABLE my_external_table (
  col1 STRING,
  col2 INT,
  col3 DOUBLE
)
USING <external_table_type>
OPTIONS (<external_table_options>)
LOCATION '<external_table_location>'
COMMENT '<table_description>'
TBLPROPERTIES ( '<table_properties>' );
In this example, the column names and data types are declared explicitly in the CREATE TABLE statement, alongside the external table type, options, location, and table properties. Because the schema is stated up front, Databricks does not need to infer it from the metadata, which keeps the external table definition in Databricks consistent with the actual table schema in SQL Server.
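If you need the authoritative column list to write that CREATE TABLE statement, you can pull it from SQL Server's catalog views on the source side. A sketch, assuming the source table is dbo.my_sql_server_table (adjust the schema and table names to match yours):
-- Returns each column's name, type, length, and nullability,
-- in the order the columns are defined
SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH, IS_NULLABLE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'dbo'
  AND TABLE_NAME = 'my_sql_server_table'
ORDER BY ORDINAL_POSITION;
Note that some SQL Server types have no exact Databricks SQL equivalent, so you may need to translate them when writing the column list (for example, NVARCHAR becomes STRING and DATETIME2 becomes TIMESTAMP).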
If the issue persists, review the Databricks logs for error messages related to the schema inference process, or reach out to the Databricks support team for further help with troubleshooting.