02-27-2022 06:15 AM
Hi,
How do we convert each row of a DataFrame to an array of rows?
Here is our scenario: we need to pass each row of the DataFrame to a function as a dict in order to apply key-level transformations. But because our data is very large, we can't use df.toJSON().collect() to iterate over the rows, since collect() pulls everything into the driver's memory.
Please add your suggestions here.
Thank you
02-27-2022 06:38 AM
To optimize performance, I would write a vectorized pandas function and register it as a UDF: https://spark.apache.org/docs/latest/api/python/reference/api/pyspark.sql.functions.pandas_udf.html
If it is a one-time job, you can also use a basic Spark function like foreach (a sketch follows below): https://spark.apache.org/docs/3.1.1/api/python/reference/api/pyspark.sql.DataFrame.foreach.html
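For illustration, a minimal sketch of the foreach approach. Here transform_dict is a hypothetical stand-in for your key-level transformation, and since foreach is side-effect only (it returns nothing), the result must be written to a sink inside the function:

    def process_row(row):
        d = row.asDict()                  # each Row becomes a plain dict on the executor
        transformed = transform_dict(d)   # hypothetical key-level transformation
        # write `transformed` to a sink here; foreach does not return results

    df.foreach(process_row)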
02-28-2022 02:19 AM
@Hubert Dudek, thank you for the reply.
We are new to Azure Databricks. We are currently using the code below and are looking for an optimized way to do this:
import json

dfJSONString = df.toJSON().collect()  # collects every row's JSON string to the driver
stringList = []
for row in dfJSONString:
    # ==== Unflatten the JSON string ==== #
    jsonString = unflatten(json.loads(row), dictreg[reg.upper()])
    stringList.append(json.dumps(jsonString))
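For reference, the same transformation can stay distributed by mapping over df.toJSON() instead of collecting it. A sketch, assuming unflatten, dictreg, and reg are defined as above (unflatten must also be importable on the executors):

    import json

    mapping = dictreg[reg.upper()]  # resolved once on the driver, shipped with the closure

    transformedJSON = df.toJSON().map(
        lambda row: json.dumps(unflatten(json.loads(row), mapping))
    )
    # transformedJSON is an RDD of JSON strings; it can be written out directly
    # or read back into a DataFrame, e.g. spark.read.json(transformedJSON)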
Thank you
05-04-2022 09:10 AM
You have two options here (the two suggested above: foreach, or a pandas UDF). Option 2, the pandas UDF, is the most performant, though it means operating on batches of many rows at once. Still, a common approach is to define a row-level function and call it from your pandas UDF with .apply. Take a look at the docs and try out some options.
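A sketch of that pattern, assuming each row is first packed into a single JSON column with to_json(struct(...)); transform_row below is a hypothetical stand-in for the actual key-level logic:

    import json
    import pandas as pd
    from pyspark.sql.functions import pandas_udf, to_json, struct

    @pandas_udf("string")
    def transform(rows: pd.Series) -> pd.Series:
        def transform_row(row_json: str) -> str:
            d = json.loads(row_json)  # one row as a dict of column -> value
            # key-level transformations on d go here (hypothetical)
            return json.dumps(d)
        return rows.apply(transform_row)  # row-level function applied per element

    result = df.withColumn("transformed", transform(to_json(struct(*df.columns))))

Because the UDF receives a whole pandas Series per batch, the per-row work runs on the executors and nothing is collected to the driver.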
06-07-2022 09:20 AM
Hi @Sailaja B,
Just a friendly follow-up. Did you see Dan's response? Do you have any follow-up questions, or can you select Dan's reply as the best answer?
05-10-2022 09:39 AM
Hi @Sailaja B,
Just a friendly follow-up. Do you still need help, or did the recommendations from Dan or Hubert help you resolve your issue? Please let us know.