Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Yes, I found that solution too. The next problem was `dbutils`: pylint flags it as an undefined name because it is never imported. I found a solution, at least for me, by adding this to my pylintrc file:
Hey @carlos_tasayco, it seems like you are running a test here. If you are running tests outside of a Databricks environment (for example in a CI pipeline), you need to define a Spark session manually. In the DBR (Databricks Runtime) the Spark session is created automatically for you, but when you run code locally or in a CI pipeline you need to build the session yourself and store it in a variable named `spark`.
Check out this thread and look at MarkusFra's reply; he creates a DatabricksSession object in his script, which is what I think you need to do here.
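As a sketch of that approach (assuming `pyspark`, and optionally `databricks-connect`, are installed; the helper name `get_spark` is mine, not from the thread):

```python
def get_spark():
    """Return a Spark session that works both on Databricks and locally/CI."""
    try:
        # Prefer Databricks Connect if it is installed (MarkusFra's approach).
        from databricks.connect import DatabricksSession
        return DatabricksSession.builder.getOrCreate()
    except ImportError:
        # Plain local pyspark fallback, e.g. for a CI pipeline.
        from pyspark.sql import SparkSession
        return SparkSession.builder.getOrCreate()
```

In a test you would then call `spark = get_spark()` once, for example inside a pytest fixture, instead of relying on the `spark` global that only exists in the Databricks runtime.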
Doing this, I avoid the problem.
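For reference, the exact pylintrc lines were lost from the thread, but a common way to silence this warning (an assumption on my part, not necessarily what was used above) is to declare `dbutils` as an additional builtin:

```ini
# Hypothetical pylintrc fragment: tell pylint that `dbutils` is
# injected by the Databricks runtime rather than imported.
[VARIABLES]
additional-builtins=dbutils
```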