I'm getting the following error:
AttributeError: 'SparkSession' object has no attribute '_wrapped'
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
<command-2311820097584616> in <cell line: 2>()
1 from sparknlp.training import CoNLL
----> 2 trainingData = CoNLL().readDataset(spark, 'dbfs:/FileStore/Users/tobiasc@slalom.com/HLS/nlp/data/coNLL_2003_eng.train')
3 trainingData.selectExpr(
4 "text",
5 "token.result as tokens",
/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.9/site-packages/sparknlp/training/conll.py in readDataset(self, spark, path, read_as, partitions, storage_level)
141 jdf = self._java_obj.readDataset(jSession, path, read_as, partitions,
142 spark.sparkContext._getJavaStorageLevel(storage_level))
--> 143 return DataFrame(jdf, spark._wrapped)
144
The error occurs when executing this code:
from sparknlp.training import CoNLL
trainingData = CoNLL().readDataset(spark, 'dbfs:/FileStore/eng.train')
trainingData.selectExpr(
"text",
"token.result as tokens",
"pos.result as pos",
"label.result as label"
).show(3, False)
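From the traceback, the failure is inside spark-nlp's own `readDataset`, at `DataFrame(jdf, spark._wrapped)` — and `_wrapped` no longer exists on `SparkSession` in recent PySpark releases, so I suspect a version mismatch between my pyspark and spark-nlp packages. In case it helps, this is how I'm checking the installed versions (a minimal sketch using only the standard library; package names assumed to be `pyspark` and `spark-nlp`):

```python
# Print the installed pyspark and spark-nlp versions, since the
# spark-nlp build has to match the PySpark version on the cluster.
from importlib.metadata import version, PackageNotFoundError

for pkg in ("pyspark", "spark-nlp"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "not installed")
```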
Can anyone help?