Get the external public IP of the Job Compute cluster
07-17-2024 12:01 PM - last edited on 07-18-2024 12:56 AM by Retired_mod
We just moved our workflow from an "all-purpose compute cluster" to a "job compute cluster", and we need to find the external public IP of the job compute cluster. On the all-purpose compute cluster, we got the IP by attaching a notebook and running "curl". We cannot attach a notebook to a job compute cluster, so how can we get the public IP of the driver node? The "Spark compute UI - Master" page only gives us the internal private IP of the driver node.
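For context, the notebook approach amounts to asking an external echo service which address it sees. A minimal Python sketch of that idea is below; the `checkip.amazonaws.com` endpoint is just one example of such a service, not necessarily the one we used:

```python
import urllib.request


def get_external_ip(endpoint="https://checkip.amazonaws.com", timeout=10):
    """Return the public IP as seen by an external echo service.

    The endpoint is an assumption for illustration; any service that
    echoes the caller's source address back as plain text will do.
    """
    with urllib.request.urlopen(endpoint, timeout=timeout) as resp:
        return resp.read().decode().strip()


if __name__ == "__main__":
    print(get_external_ip())
```

Run as the first task of a job, this would log the driver node's external IP to the task output, which is one way to see it even when you cannot attach an interactive notebook.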
07-17-2024 01:41 PM
I found the following IPs in the cluster JSON:

"driver": {
    "private_ip": "10.*.*.*",
    "public_dns": "172.*.*.*",
    "node_id": "80*****",
    ...
}

The executors have a similar configuration:

"executors": [
    {
        "private_ip": "10.*.*.*",
        "public_dns": "172.*.*.*",
        "node_id": "7a********",
        ...
    }
]

I assume the "public_dns" value "172.*.*.*" is the public IP of the driver/executor nodes. Is that correct?
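One quick way to sanity-check a candidate address is to test whether it falls in an RFC 1918 private range; note that the 172.16.0.0/12 block is private, so a "public_dns" value in that range may not be externally routable at all. A small sketch using the standard library:

```python
import ipaddress


def classify(ip):
    """Label an IPv4/IPv6 address as private or public per the
    stdlib ipaddress module (RFC 1918 ranges etc. count as private)."""
    addr = ipaddress.ip_address(ip)
    return "private" if addr.is_private else "public"


if __name__ == "__main__":
    # The concrete addresses here are illustrative placeholders.
    for candidate in ["10.0.0.1", "172.16.5.4", "8.8.8.8"]:
        print(candidate, "->", classify(candidate))
```

If the masked "172.*.*.*" address lands in 172.16.0.0/12, this check would report it as private.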

