PySpark 3.3.0 exceptAll working on 11.3 LTS but not locally
02-22-2023 03:19 AM
Hello,
Currently I'm in the process of upgrading the DBR version of my jobs to 11.3 LTS. After upgrading the PySpark version on my local machine to 3.3.0, I found that the exceptAll function is broken (it looks like others have a similar problem). It throws an error that a column could not be found (see attached screenshot). What is weird: on a Databricks 11.3 LTS cluster (which, according to the documentation, uses Spark 3.3.0) the same calculation works!
How is that possible? What am I missing?
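The exact code is only in the screenshot, so here is a hypothetical minimal sketch of the kind of call that fails for me locally on vanilla PySpark 3.3.0 (the DataFrame contents and column names are made up for illustration; on 11.3 LTS the same thing runs fine):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[*]").getOrCreate()

df1 = spark.createDataFrame([(1, "a"), (2, "b"), (2, "b")], ["id", "val"])
df2 = spark.createDataFrame([(2, "b")], ["id", "val"])

# exceptAll behaves like SQL's EXCEPT ALL: it keeps duplicates,
# unlike the set-based subtract()
diff = df1.exceptAll(df2)
diff.show()          # on local 3.3.0 this chain fails with a
print(diff.count())  # "column could not be found" analysis error
```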
02-22-2023 03:20 AM
Just for the record: upgrading PySpark to version 3.3.1 fixes the problem, but I would prefer to have the same PySpark version locally as on the LTS runtime...
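For anyone hitting the same thing, a quick way to confirm which PySpark version the local interpreter actually picks up (the pip command in the comment is just one way of upgrading):

```python
# Sanity-check the locally resolved PySpark version
# (upgrade with e.g. `pip install pyspark==3.3.1`)
import pyspark
print(pyspark.__version__)  # expect '3.3.1' after the upgrade
```

Presumably the LTS runtime already ships a backported fix on top of its Spark 3.3.0 base, which would explain why the cluster is unaffected while vanilla 3.3.0 is not.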

04-24-2023 03:03 AM
Hi @Jacek Dembowiak
Hope everything is going great.
Just wanted to check in to see whether you were able to resolve your issue. If so, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please let us know so we can help you.
Cheers!

