Hi!
When I run a notebook on Databricks, it fails with "'JavaPackage' object is not callable", and the traceback points into the pydeequ library:
/local_disk0/.ephemeral_nfs/envs/pythonEnv-3abbb1aa-ee5b-48da-aaf2-18f273299f52/lib/python3.8/site-packages/pydeequ/checks.py in __init__(self, spark_session, level, description, constraints)
91 self._jvm = spark_session._jvm
92 self.level = level
---> 93 self._java_level = self.level._get_java_object(self._jvm)
94 self._check_java_class = self._jvm.com.amazon.deequ.checks.Check
95 self.description = description
/local_disk0/.ephemeral_nfs/envs/pythonEnv-3abbb1aa-ee5b-48da-aaf2-18f273299f52/lib/python3.8/site-packages/pydeequ/checks.py in _get_java_object(self, jvm)
19 return jvm.com.amazon.deequ.checks.CheckLevel.Error()
20 if self == CheckLevel.Warning:
---> 21 return jvm.com.amazon.deequ.checks.CheckLevel.Warning()
22 raise ValueError("Invalid value for CheckLevel Enum")
TypeError: 'JavaPackage' object is not callable
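The cell that triggers this is essentially the quick-start pattern from the pydeequ README; the table and column names below are just placeholders for my real data:

from pyspark.sql import SparkSession
import pydeequ
from pydeequ.checks import Check, CheckLevel
from pydeequ.verification import VerificationSuite

# On Databricks the Spark session already exists; the two config lines are how
# the pydeequ README attaches the Deequ jar when you build the session yourself.
spark = (SparkSession.builder
         .config("spark.jars.packages", pydeequ.deequ_maven_coord)
         .config("spark.jars.excludes", pydeequ.f2j_maven_coord)
         .getOrCreate())

df = spark.createDataFrame([(1, "foo"), (2, "bar")], ["id", "name"])

# This Check(...) call is where the traceback above is raised
check = Check(spark, CheckLevel.Error, "basic checks")

result = (VerificationSuite(spark)
          .onData(df)
          .addCheck(check.isComplete("id"))
          .run())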
My environment: Spark 3.2.0, Scala 2.12.
I believe it has something to do with my Databricks Runtime version, but I don't want to downgrade it.
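From what I understand of the pydeequ README, newer releases choose the Deequ jar based on a SPARK_VERSION environment variable, so I can at least print which coordinate my installed pydeequ expects (the "3.2" value below is just my assumption for this runtime):

import os
from importlib.metadata import version

# Assumption: newer pydeequ releases read SPARK_VERSION to pick the Deequ jar,
# so it has to be set before the import; "3.2" is my guess for this cluster.
os.environ["SPARK_VERSION"] = "3.2"

import pydeequ
print(version("pydeequ"))         # installed pydeequ release
print(pydeequ.deequ_maven_coord)  # Deequ coordinate pydeequ expects on the classpath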
Please help me with this.
Thanks