Data Engineering

Trying to use pivot function with pyspark for count aggregate

rbricks007
New Contributor II

I'm trying this code, but I'm getting the following error:

 

testDF = (eventsDF
          .groupBy("user_id")
          .pivot("event_name")
          .count("event_name"))

 

TypeError: _api() takes 1 positional argument but 2 were given

Please guide me on how to fix this error.

1 ACCEPTED SOLUTION


Hi @rbricks007, the error you're encountering in your PySpark code comes from the aggregation you call after pivot, not from pivot itself.

  • After .groupBy(...).pivot(...), the count() method takes no positional arguments, so passing "event_name" to it is what raises TypeError: _api() takes 1 positional argument but 2 were given.
  • Either call .count() with no argument, or count the column explicitly with .agg(F.count("event_name")); a sketch of both options follows below. With that adjustment your code should work without errors.
    Happy coding! 🚀
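
For illustration, here's a minimal, self-contained sketch of both fixes. The sample rows and event names are made up and only stand in for your eventsDF, which is assumed to have user_id and event_name columns.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical sample data standing in for eventsDF.
eventsDF = spark.createDataFrame(
    [("u1", "click"), ("u1", "click"), ("u1", "view"), ("u2", "view")],
    ["user_id", "event_name"],
)

# Option 1: after pivot, call count() with no argument;
# it counts the rows in each (user_id, event_name) cell.
counts1 = (eventsDF
           .groupBy("user_id")
           .pivot("event_name")
           .count())

# Option 2: the same result via an explicit aggregate expression.
counts2 = (eventsDF
           .groupBy("user_id")
           .pivot("event_name")
           .agg(F.count("event_name")))

counts1.show()
# Expected shape (row order may vary):
# +-------+-----+----+
# |user_id|click|view|
# +-------+-----+----+
# |     u1|    2|   1|
# |     u2| null|   1|
# +-------+-----+----+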


 


2 REPLIES

Krishnamatta
New Contributor III

Try this:

from pyspark.sql import functions as F
testDF = (eventsDF
            .groupBy("user_id")
            .pivot("event_name")
            .agg(F.count("event_name")))
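
One optional refinement on the same call: if you already know the set of event names, you can pass them as the second argument to pivot, which saves Spark the extra pass it otherwise makes to discover the distinct values. The event names listed here are just placeholders.

# Passing the pivot values explicitly (placeholder names shown) skips the
# job Spark would otherwise run to find the distinct event_name values.
testDF = (eventsDF
            .groupBy("user_id")
            .pivot("event_name", ["click", "view", "purchase"])
            .agg(F.count("event_name")))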

 

 
