Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Can I use GraphFrames on DBR 14.3?

Henrik_
New Contributor III
I get the following error when trying to run GraphFrames on DBR 14.3. Does anyone have an idea of how I can solve this?
 
import pyspark.sql.functions as F
from graphframes import GraphFrame
 
vertices = spark.createDataFrame([
    ("a", "Alice", 34),
    ("b", "Bob", 36),
    ("c", "Charlie", 30),
    ("d", "David", 29),
    ("e", "Esther", 32),
    ("f", "Fanny", 36),
    ("g", "Gabby", 60)],
    ["id", "name", "age"])

edges = spark.createDataFrame([
    ("a", "b", "friend"),
    ("b", "c", "follow"),
    ("c", "b", "follow"),
    ("f", "c", "follow"),
    ("e", "f", "follow"),
    ("e", "d", "friend"),
    ("d", "a", "friend"),
    ("a", "e", "friend")],
    ["src", "dst", "relationship"])

g = GraphFrame(vertices, edges)
display(g)
 
But I get this error: 
 
[ATTRIBUTE_NOT_SUPPORTED] Attribute `sql_ctx` is not supported.
File <command-1696281758644499>, line 22
      1 vertices = spark.createDataFrame([
      2     ("a", "Alice", 34),
      3     ("b", "Bob", 36),
    (...)
      8     ("g", "Gabby", 60)],
      9     ["id", "name", "age"])
     11 edges = spark.createDataFrame([
     12     ("a", "b", "friend"),
     13     ("b", "c", "follow"),
    (...)
     19     ("a", "e", "friend")],
     20     ["src", "dst", "relationship"])
---> 22 g = GraphFrame(vertices, edges)
     23 display(g)

File /local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.10/site-packages/graphframes/graphframe.py:63, in GraphFrame.__init__(self, v, e)
     61 self._vertices = v
     62 self._edges = e
---> 63 self._sqlContext = v.sql_ctx
     64 self._sc = self._sqlContext._sc
     65 self._jvm_gf_api = _java_api(self._sc)

File /databricks/spark/python/pyspark/sql/connect/dataframe.py:1757, in DataFrame.__getattr__(self, name)
   1754 # END-EDGE
   1756 if name not in self.columns:
-> 1757     raise PySparkAttributeError(
   1758         error_class="ATTRIBUTE_NOT_SUPPORTED", message_parameters={"attr_name": name}
   1759     )
   1761 return self._col(name)
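The traceback shows the root cause: graphframes 0.6 reads the legacy `v.sql_ctx` attribute in `GraphFrame.__init__`, while a Spark Connect DataFrame raises `ATTRIBUTE_NOT_SUPPORTED` for any attribute that is not a column. A minimal reproduction of that failure mode, with no Spark required; the class below is a simplified stand-in for `pyspark.sql.connect.dataframe.DataFrame`, not the real implementation:

```python
class ConnectDataFrameStandIn:
    """Toy stand-in for a Spark Connect DataFrame: unknown attributes
    raise, mirroring the __getattr__ in pyspark/sql/connect/dataframe.py."""
    columns = ["id", "name", "age"]

    def __getattr__(self, name):
        if name not in self.columns:
            raise AttributeError(
                f"[ATTRIBUTE_NOT_SUPPORTED] Attribute `{name}` is not supported."
            )
        return name  # the real class returns a Column object here

v = ConnectDataFrameStandIn()
try:
    v.sql_ctx  # the exact attribute access in graphframes 0.6's GraphFrame.__init__
except AttributeError as exc:
    print(exc)  # [ATTRIBUTE_NOT_SUPPORTED] Attribute `sql_ctx` is not supported.
```

Classic (non-Connect) DataFrames still expose `sql_ctx` for backwards compatibility, which is why the same code can work on a classic cluster but not here.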
8 REPLIES

jctanner
New Contributor III

I am having the same issue. Did you ever determine how to solve this?

-werners-
Esteemed Contributor III

What type of cluster do you use, and what version of graphframes?

Henrik_
New Contributor III

I'm running on 14.3 LTS (includes Apache Spark 3.5.0, Scala 2.12): 1 driver (n2-highmem-8, 64 GB Memory, 8 Cores) and 1-8 workers (n2-highmem-4, 32-256 GB Memory, 4-32 Cores). My version of graphframes is 0.6.

-werners-
Esteemed Contributor III

Please install the latest graphframes version.

0.6 only supports Spark 2.
Also, the latest version removes the deprecated SQLContext:
https://github.com/graphframes/graphframes/releases

jctanner
New Contributor III

Thanks for the response, -werners-. Version 0.8.3, installed via https://pypi.org/project/graphframes-latest/, gives a different error: AttributeError: 'SparkSession' object has no attribute '_sc'. No version above 0.6 is available via %pip install graphframes --upgrade.


Note that I am using Serverless compute.

-werners-
Esteemed Contributor III

Serverless compute might be the issue here.
Can you deploy a classic compute cluster and install graphframes?
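For reference, GraphFrames on a classic cluster needs two pieces: the Scala package installed on the cluster and the thin Python wrapper. A sketch of the setup, where the exact Maven coordinate is an assumption for Spark 3.5 / Scala 2.12 (check the graphframes releases page for the artifact matching your runtime):

```
Cluster -> Libraries -> Install new -> Maven
  Coordinates: graphframes:graphframes:0.8.3-spark3.5-s_2.12
  Repository:  https://repos.spark-packages.org/

Then, in a notebook cell, install the Python bindings:
  %pip install graphframes
```

Installing only the pip package without the cluster-side JAR leads to JVM-side failures once GraphFrame methods are called.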

jctanner
New Contributor III

Hi, sorry for the delay. Yes, this was done several weeks ago, but my issue is with deploying it on Serverless. We are also currently unable to deploy custom clusters due to a separate issue.

-werners-
Esteemed Contributor III

Serverless compute has limitations, such as installing libraries, so at the moment that won't be possible.
Old-school clusters have far more configuration options, so hopefully you can fix the issue you're experiencing when deploying clusters.
