Is there a way to install Hail on a cluster?
2 weeks ago
Hi all!
I've been trying to install Hail (https://hail.is/) on Databricks with no luck so far. Is there an easy way to make it work? I can't get past the error below; passing the SparkContext explicitly with `hl.init(sc=spark.sparkContext)` did not help either:
```python
import hail as hl
hl.init()
```

```
TypeError: 'JavaPackage' object is not callable

File <command-3243506661262167>, line 3
      1 import hail
      2 get_ipython().system('echo $JAVA_HOME')
----> 3 hail.init_spark()

File /databricks/python/lib/python3.11/site-packages/hail/backend/spark_backend.py:130, in SparkBackend.__init__(self, idempotent, sc, spark_conf, app_name, master, local, log, quiet, append, min_block_size, branching_factor, tmpdir, local_tmpdir, skip_logging_configuration, optimizer_iterations, gcs_requester_pays_project, gcs_requester_pays_buckets, copy_log_on_error)
    128     jhc = hail_package.HailContext.getOrCreate(jbackend, branching_factor, optimizer_iterations)
    129 else:
--> 130     jbackend = hail_package.backend.spark.SparkBackend.apply(
    131         jsc,
    132         app_name,
    133         master,
    134         local,
    135         log,
    136         True,
    137         append,
    138         skip_logging_configuration,
    139         min_block_size,
    140         tmpdir,
    141         local_tmpdir,
    142         gcs_requester_pays_project,
    143         gcs_requester_pays_buckets,
    144     )
    145     jhc = hail_package.HailContext.apply(jbackend, branching_factor, optimizer_iterations)
    147 self._jsc = jbackend.sc()
```
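For completeness, the SparkContext variant I tried looks like this (using the `spark` session that Databricks pre-creates in every notebook):

```python
import hail as hl

# Databricks notebooks come with a SparkSession bound to `spark`;
# pass its SparkContext so Hail attaches to the existing cluster
# instead of trying to start its own.
hl.init(sc=spark.sparkContext)
```

It fails with the same `'JavaPackage' object is not callable` error.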
Cheers!
2 REPLIES
2 weeks ago
You can run `pip install hail` in a notebook cell.
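For example, in a Databricks notebook that would be the `%pip` magic, which installs into the notebook-scoped Python environment:

```python
%pip install hail
```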
2 weeks ago
@SriramMohanty wrote: "You can run `pip install hail` in a notebook cell."
Of course I tried that; it just does not work (I get the same error as above).
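As far as I can tell, `'JavaPackage' object is not callable` means the Hail JVM backend classes never make it onto Spark's classpath, so the Python package alone is not enough. The Hail docs for running on a generic Spark cluster suggest Spark configuration along these lines; note the JAR path is my assumption, based on the pip package shipping `hail-all-spark.jar` under `hail/backend/` (the site-packages prefix comes from the traceback above):

```
# Cluster-level Spark config (sketch; verify the JAR path on your cluster)
spark.jars /databricks/python/lib/python3.11/site-packages/hail/backend/hail-all-spark.jar
spark.driver.extraClassPath /databricks/python/lib/python3.11/site-packages/hail/backend/hail-all-spark.jar
spark.executor.extraClassPath ./hail-all-spark.jar
spark.serializer org.apache.spark.serializer.KryoSerializer
spark.kryo.registrator is.hail.kryo.HailKryoRegistrator
```

Has anyone actually gotten this working on Databricks, e.g. via the cluster's Spark config or an init script?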
Cheers!

