09-27-2023 05:27 AM
I am loading a table into a data frame using
df = spark.table(table_name)
Is there a way to load only the required columns? The table has more than 50 columns and I only need a handful of them.
09-27-2023 10:39 PM
@vk217 Simply use the select function, e.g.
df = spark.read.table(table_name).select("col1", "col2", "col3")
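If you have many columns, it can be tidier to keep the names in a list and unpack them into select. A minimal sketch along the same lines (the column names below are placeholders for whichever columns you actually need):
# Hypothetical column names for illustration
cols = ["col1", "col2", "col3"]
df = spark.read.table(table_name).select(*cols)
Only the selected columns are carried into the DataFrame, so downstream operations touch just those fields.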