08-22-2024 02:42 PM
I am having the same issue. Did you ever determine how to solve this?
08-22-2024 11:56 PM
What type of cluster do you use, and what version of graphframes?
08-23-2024 12:14 AM
I'm running on 14.3 LTS (includes Apache Spark 3.5.0, Scala 2.12), 1 driver (n2-highmem-8) 64 GB memory, 8 cores, and 1-8 workers (n2-highmem-4) 32-256 GB memory, 4-32 cores. My version of graphframes is 0.6.
08-23-2024 12:32 AM
Please install the latest graphframes version.
0.6 only supports Spark 2.
Also, in the latest version the deprecated SQLContext has been removed:
https://github.com/graphframes/graphframes/releases
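On a classic cluster you would typically get a Spark 3-compatible build onto the classpath via its Maven/spark-packages coordinate and only use pip for the thin Python wrapper. A minimal sketch of the idea follows; the exact coordinate below is an assumption, so pick the build matching your Spark/Scala version from the releases page, and on Databricks you would normally attach the same coordinate as a Maven cluster library rather than configure the session in a notebook:

    # Sketch only: pull the graphframes JAR from spark-packages when building the session.
    # The coordinate "graphframes:graphframes:0.8.3-spark3.5-s_2.12" is an assumption;
    # check the releases page for the build that matches your Spark/Scala version.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .config("spark.jars.packages", "graphframes:graphframes:0.8.3-spark3.5-s_2.12")
        .config("spark.jars.repositories", "https://repos.spark-packages.org")
        .getOrCreate()
    )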
08-23-2024 09:04 AM - edited 08-23-2024 09:08 AM
Thanks for the response -werners-. Version 0.8.3 installed via https://pypi.org/project/graphframes-latest/ gives a different error: AttributeError: 'SparkSession' object has no attribute '_sc'. No version above 0.6 is available via %pip install graphframes --upgrade.
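For what it's worth, the attribute named in that traceback (spark._sc) is an internal handle to the JVM-backed SparkContext that the graphframes Python wrapper reaches for, so a quick way to tell whether the problem is the library or the session type is to check whether the attached session exposes it at all. A small diagnostic sketch:

    # Quick check, sketch only: does this session expose a JVM-backed SparkContext?
    # `_sc` is the internal attribute named in the AttributeError; it only exists
    # when the notebook is attached to a session with a local JVM SparkContext.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()
    print(hasattr(spark, "_sc"))  # False here would explain the AttributeError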
08-25-2024 11:18 PM
Serverless compute might be the issue here.
Can you deploy a classic compute cluster and install graphframes?
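Once graphframes is installed on a classic cluster, a quick smoke test on a toy graph should confirm the setup, using only the documented GraphFrame API (the vertex and edge data below are just made-up examples):

    # Smoke test on a tiny graph; run after graphframes is installed on a classic cluster.
    from pyspark.sql import SparkSession
    from graphframes import GraphFrame

    spark = SparkSession.builder.getOrCreate()

    # GraphFrame expects an "id" column on vertices and "src"/"dst" columns on edges.
    vertices = spark.createDataFrame(
        [("a", "Alice"), ("b", "Bob"), ("c", "Carol")], ["id", "name"]
    )
    edges = spark.createDataFrame(
        [("a", "b", "follows"), ("b", "c", "follows")], ["src", "dst", "relationship"]
    )

    g = GraphFrame(vertices, edges)
    g.inDegrees.show()  # expect Bob and Carol each with inDegree 1
    g.edges.filter("relationship = 'follows'").show()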
09-03-2024 03:29 PM
Hi, sorry for the delay. Yes, this was done several weeks ago, but my issue is with deploying it on Serverless. Also, we are currently unable to deploy custom clusters due to a separate issue.
09-03-2024 11:26 PM
Serverless compute has limitations, one of which is installing libraries like this, so at the moment that won't be possible.
Old-school clusters have far more configuration possibilities, so hopefully you can fix the issue you're experiencing with deploying clusters.