Issue with Pyspark GroupBy GroupedData
03-22-2023 07:09 AM
Hi Guys,
I am working on streaming data movement from bronze to silver. My bronze table has an entity_name column, and based on that column I need to create multiple silver tables.
I tried the approach below, but it fails with the error "'GroupedData' object has no attribute 'get_group'".
Sample code snippet:

```python
grouped_df = bronze_df.groupBy("entity_name")
entity_names = [row.PrimaryEntityName for row in grouped_df.agg({"entity_name": "first"}).collect()]
for entity_name in entity_names:
    entity_df = grouped_df.get_group(entity_name)
```
I think a where/filter clause could do the job, but efficiency-wise I don't think it would be a good solution. Is there any other approach for doing this?
TIA.
- Labels:
  - Bronze Table
  - Groupby
  - Pyspark
03-22-2023 09:48 PM
@Harun Raseed Basheer :
The issue with your code is that PySpark's groupBy operation returns a GroupedData object, which has no get_group method (that is a pandas API, not a Spark one). Instead, you can use the filter method to filter the bronze_df DataFrame for each entity name and write the resulting DataFrames to separate Silver tables.
Here's an example of how you can modify your code to achieve this:
```python
from pyspark.sql.functions import col

# Group the bronze DataFrame by entity_name
grouped_df = bronze_df.groupBy("entity_name")

# Extract the unique entity names
entity_names = [row.entity_name for row in grouped_df.agg({"entity_name": "first"}).collect()]

# Filter the bronze DataFrame for each entity name and write to a separate Silver table
for entity_name in entity_names:
    entity_df = bronze_df.filter(col("entity_name") == entity_name)
    entity_df.write.format("delta").mode("append").save(f"/mnt/silver/{entity_name}")
```
03-26-2023 10:23 PM
Hi @Harun Raseed Basheer
Thank you for posting your question in our community! We are happy to assist you.
To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?
This will also help other community members who may have similar questions in the future. Thank you for your participation and let us know if you need any further assistance!

