02-12-2025 09:25 PM
Hi!
I have connected the workspace to AWS, but when I execute the following in a new notebook:
%python
%pip install glow.py

import glow
from pyspark.sql import SparkSession

# Create a Spark session
spark = (SparkSession.builder
    .appName("Genomics Analysis")
    .getOrCreate())

# Register Glow with the Spark session
glow.register(spark)
I get:
TypeCheckError: argument "session" (pyspark.sql.connect.session.SparkSession) is not an instance of pyspark.sql.session.SparkSession
File <command-7248535564102662>, line 10
5 spark = SparkSession.builder \
6 .appName("Genomics Analysis") \
7 .getOrCreate()
9 # Register Glow with the Spark session
---> 10 glow.register(spark)
Any ideas on how this is supposed to work? AI and googling did not help...
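I assume the problem is the session class the notebook hands out; printing it should make that visible:

# Quick diagnostic: which SparkSession class is the notebook actually using?
print(type(spark))
# On Serverless compute this appears to be pyspark.sql.connect.session.SparkSession
# (a Spark Connect session), not the classic pyspark.sql.session.SparkSession
# that glow.register() expects.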
Accepted Solutions
a month ago
Solved this with the help of colleagues at last. First of all, it won't work in Serverless mode, so a cluster is required. Once the cluster is created in the Compute section, add the two Glow libraries on its Libraries tab: the glow.py package from PyPI and the matching io.projectglow glow-spark3 artifact from Maven.
Then running:
import glow

# On a classic cluster the notebook's built-in `spark` is a
# pyspark.sql.session.SparkSession, so registration succeeds.
glow.register(spark)
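After registration, Glow's data sources are available. A minimal smoke test (the VCF path below is just a placeholder, point it at your own data):

# Read a VCF file through Glow's "vcf" data source (placeholder path):
df = spark.read.format("vcf").load("/path/to/your.vcf.gz")
df.printSchema()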