05-14-2021 12:07 PM
I've reviewed the COPY INTO docs here: https://docs.databricks.com/spark/latest/spark-sql/language-manual/delta-copy-into.html#examples, but there's only one simple example. I'm looking for additional examples of loading data from CSV, e.g. with an explicitly defined schema vs. an inferred one.
06-21-2021 01:32 PM
Here's an example with a predefined schema:
%sql
CREATE OR REPLACE TABLE copy_into_bronze_test (
  id STRING,
  reading_time TIMESTAMP,
  device_type STRING,
  device_id STRING,
  device_operational_status STRING,
  reading_1 DOUBLE,
  reading_2 DOUBLE,
  reading_3 DOUBLE
)
USING DELTA

%sql
COPY INTO copy_into_bronze_test
FROM (
  SELECT
    _c0 AS id,
    CAST(_c1 AS TIMESTAMP) AS reading_time,
    _c2 AS device_type,
    _c3 AS device_id,
    _c4 AS device_operational_status,
    CAST(_c5 AS DOUBLE) AS reading_1,
    CAST(_c6 AS DOUBLE) AS reading_2,
    CAST(_c7 AS DOUBLE) AS reading_3
  FROM 'dbfs:/FileStore/flight/anand.ladda/*_no_headers.csv'
)
FILEFORMAT = CSV
FORMAT_OPTIONS('sep' = ',', 'inferSchema' = 'false', 'header' = 'false')
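For the schema-inference side of the original question, here is a minimal sketch of the other variant: instead of casting each `_cN` column, you let COPY INTO infer the CSV types via `'inferSchema' = 'true'` in FORMAT_OPTIONS and allow the table schema to evolve with `'mergeSchema' = 'true'` in COPY_OPTIONS. The helper below just renders that statement as a string so the shape is easy to see; the table and path are reused from the example above, and running it via `spark.sql` is an assumption about how you drive it from a notebook.

```python
# Hypothetical helper: renders the schema-inference variant of the COPY INTO
# statement from the example above. 'inferSchema' (FORMAT_OPTIONS) and
# 'mergeSchema' (COPY_OPTIONS) are documented CSV / COPY INTO options.
def copy_into_inferred(table: str, source_path: str) -> str:
    """Build a COPY INTO statement that lets Spark infer the CSV schema."""
    return (
        f"COPY INTO {table}\n"
        f"FROM '{source_path}'\n"
        "FILEFORMAT = CSV\n"
        # header is false because these files have no header row,
        # so inferred columns keep their _c0, _c1, ... names
        "FORMAT_OPTIONS('sep' = ',', 'inferSchema' = 'true', 'header' = 'false')\n"
        "COPY_OPTIONS('mergeSchema' = 'true')"
    )

stmt = copy_into_inferred(
    "copy_into_bronze_test",
    "dbfs:/FileStore/flight/anand.ladda/*_no_headers.csv",
)
print(stmt)
# In a Databricks notebook you would execute it with: spark.sql(stmt)
```

Note the trade-off: with headerless files, inference gets you types but not column names, which is why the predefined-schema version above keeps the explicit CASTs and aliases.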
03-28-2024 12:19 AM
How does this work when you have 200 tables to load, each with 50+ columns? How do you build this code in a generic way? The CSV files do not have headers, and when loaded without the SELECT, the predefined schema is not applied.
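One way to avoid hand-writing 200 statements is to generate them: keep each table's column list (name and type, in file order) in a config, and render the same COPY INTO shape shown above per table. This is only a sketch; `build_copy_into`, the `schemas` dict, and the landing path pattern are all hypothetical names, and in practice you might load the column lists from a config table or JSON file rather than a literal dict.

```python
# Sketch: generate the COPY INTO statement from the example above for any
# table, given its columns as (name, sql_type) pairs in file order
# (_c0, _c1, ...). All names and paths here are illustrative placeholders.
def build_copy_into(table: str, path: str, columns: list) -> str:
    select_list = ",\n    ".join(
        f"CAST(_c{i} AS {typ}) AS {name}"
        for i, (name, typ) in enumerate(columns)
    )
    return (
        f"COPY INTO {table}\n"
        f"FROM (\n  SELECT\n    {select_list}\n"
        f"  FROM '{path}'\n)\n"
        "FILEFORMAT = CSV\n"
        "FORMAT_OPTIONS('sep' = ',', 'header' = 'false', 'inferSchema' = 'false')"
    )

# One entry per table -- for 200 tables this would come from config, not code.
schemas = {
    "copy_into_bronze_test": [
        ("id", "STRING"),
        ("reading_time", "TIMESTAMP"),
        ("reading_1", "DOUBLE"),
    ],
}

for table, cols in schemas.items():
    sql = build_copy_into(table, f"dbfs:/FileStore/landing/{table}/*.csv", cols)
    print(sql)
    # In a Databricks notebook: spark.sql(sql)
```

Because the explicit CASTs are generated from the config, the predefined schema is applied even though the headerless files only expose `_c0`, `_c1`, ... column names.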