Hi all.
I've been using foreign catalogs attached to Azure SQL databases and have never had problems, except in two situations:
1) Foreign catalogs don't support SQL schema/object names that contain dots, such as [xxxx.yyyy].tablename. The workaround is to create views in a regular schema on the SQL database (see the T-SQL sketch after this list).
2) This is a new issue: when I read a table through a foreign catalog, specifically a binary column serialized as base64 (the contents of a zip file), the result set returns a truncated, wrong value that I then can't decode (illustrated below).
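For issue 1, this is roughly the kind of view I create on the Azure SQL side; the schema and table names are just the placeholders from above:

```sql
-- T-SQL, run on the Azure SQL database: expose the dotted-schema table
-- through a view in a plain schema that the foreign catalog can address
CREATE VIEW dbo.tablename_v AS
SELECT *
FROM [xxxx.yyyy].tablename;
```

The foreign catalog can then reference it with the usual three-level name, e.g. <catalog>.dbo.tablename_v.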
This is my second issue with data type mappings. The first was trying to read a JSON column from a Google BigQuery table, and now this problem reading a SQL varbinary column (mapped to BINARY in Databricks).
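To illustrate issue 2, this is roughly how the truncation shows up for me; the catalog, table, and column names below are hypothetical:

```sql
-- Databricks SQL, reading through the foreign catalog:
-- the reported length is smaller than the real payload, and the
-- base64 string no longer decodes to a valid zip file
SELECT id,
       length(zip_payload) AS len_via_catalog,
       base64(zip_payload) AS zip_b64
FROM azure_sql_catalog.dbo.documents;

-- For comparison, T-SQL run directly against Azure SQL shows the true size:
-- SELECT id, DATALENGTH(zip_payload) FROM dbo.documents;
```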
Using Data Factory I can read the SQL database and copy the data to Parquet files. That works, but it should be possible with Databricks alone.
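In case it helps, the Databricks-only workaround I'm experimenting with in the meantime is to bypass the foreign catalog and read the table over plain Spark JDBC. A minimal sketch, with all connection values as placeholders (credentials would normally come from a secret scope):

```sql
-- Databricks SQL: temporary view over a direct Spark JDBC read,
-- bypassing Lakehouse Federation (all option values are placeholders)
CREATE TEMPORARY VIEW documents_jdbc
USING org.apache.spark.sql.jdbc
OPTIONS (
  url 'jdbc:sqlserver://<server>.database.windows.net:1433;database=<db>',
  dbtable 'dbo.documents',
  user '<user>',
  password '<password>'
);

-- The varbinary column arrives as BINARY; base64() re-encodes it if needed
SELECT id, base64(zip_payload) AS zip_b64
FROM documents_jdbc;
```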
Any help?