Hi Utu,

Try doing something like this: wrap the import within the fixture itself.

```python
import os
import pytest
from pyspark.sql import SparkSession

_local_test = True

@pytest.fixture(scope='session')
def spark():
    if 'DATABRICKS_RUNTIME_VERSION' in os.environ:
        ...
```
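For context, a minimal sketch of the environment check that fixture branches on. The helper name `running_on_databricks` is hypothetical (not a Databricks API); the assumption is that Databricks clusters expose `DATABRICKS_RUNTIME_VERSION` while local machines do not:

```python
import os

def running_on_databricks():
    # Hypothetical helper: Databricks clusters set DATABRICKS_RUNTIME_VERSION,
    # so its presence distinguishes the runtime from a local test machine.
    return 'DATABRICKS_RUNTIME_VERSION' in os.environ
```

Inside the fixture, you would typically return the runtime's existing SparkSession when this is true, and build a `local[*]` session otherwise.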
Hi Gauri,

As far as I can tell (someone from Databricks can confirm): as of now, SSIS is not supported as a source dialect for the transpile command in Databricks Labs LakeBridge. The analyze command supports SSIS for assessment and reporting, but the...
Also try this: add another folder for the file.

```python
csv_file_path = "abfss://storage-dm-int-container@devdomaindmdbxint01.dfs.core.windows.net/raw_data/dummy.csv"
```
Hi,

The input path you provided to .load() overlaps with a path that is managed by Unity Catalog or Delta Live Tables (DLT). This is not allowed: Databricks prevents you from using Auto Loader (cloudFiles) to read from or write to directories that a...
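To illustrate the rule, here is a small sketch of the kind of path-overlap check involved. The function, the container, and the managed root below are all hypothetical examples, not Databricks internals:

```python
def overlaps_managed_root(input_path: str, managed_roots: list[str]) -> bool:
    # True if input_path sits at or under any managed root directory.
    candidate = input_path.rstrip('/') + '/'
    return any(candidate.startswith(root.rstrip('/') + '/') for root in managed_roots)

# Hypothetical example: if a DLT pipeline manages .../raw_data, pointing
# Auto Loader's .load() at a file inside it would be rejected.
managed = ["abfss://container@account.dfs.core.windows.net/raw_data"]
print(overlaps_managed_root(
    "abfss://container@account.dfs.core.windows.net/raw_data/dummy.csv", managed))
```

The practical fix is to point .load() at a location outside any UC- or DLT-managed root.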