
[JAVA_GATEWAY_EXITED] Java gateway process exited before sending its port number.

roshan_robert
New Contributor

Hi Team,

In a Streamlit app (in Databricks), I get the error below while creating the Spark session. This happens when running the app via the web link.

"[JAVA_GATEWAY_EXITED] Java gateway process exited before sending its port number"

Below is the code used in app.py:

import os
import streamlit as st
import pandas as pd
from databricks import sql
import pytz
from datetime import datetime

from pyspark.sql import SparkSession
from pyspark import SparkContext
from pyspark import SparkConf

# Create SparkSession
try:
    spark = SparkSession.builder \
        .appName("Streamlit App") \
        .config("spark.driver.memory", "4g") \
        .config("spark.executor.memory", "4g") \
        .getOrCreate()
    st.write("Spark Session created successfully!")
    st.write("Spark Version:", spark.version)
except Exception as e:
    st.write(f"Error creating Spark session: {e}")

# Create a sample DataFrame
data = {
    'Name': ['Alice', 'Bob', 'Charlie', 'David'],
    'Age': [25, 30, 35, 40],
    'City': ['New York', 'Los Angeles', 'Chicago', 'Houston']
}
df = pd.DataFrame(data)

# Display the DataFrame in the Streamlit app
st.title('Sample DataFrame')
st.write(df)

# Convert the edited DataFrame to a Spark DataFrame
spark_df = spark.createDataFrame(df)
st.write(spark_df)

2 REPLIES

Walter_C
Databricks Employee

The error you're encountering, "[JAVA_GATEWAY_EXITED] Java gateway process exited before sending its port number", typically occurs when there are issues with the Java configuration or when PySpark is unable to establish a connection with the Java gateway process. Here are some potential solutions to address this issue:

Java Configuration

  1. Java Version: Ensure that you're using a compatible Java version. Spark typically works best with Java 8 or 11. Check your Java version and make sure it's compatible with your Spark version.
  2. JAVA_HOME: Verify that the JAVA_HOME environment variable is correctly set and points to a compatible Java installation.
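
The two checks above can be scripted before the app ever calls SparkSession.builder, so the failure surfaces with a readable message instead of the gateway error. A minimal sketch; the check_java_environment helper is hypothetical and not part of the original code:

```python
import os
import shutil
import subprocess

def check_java_environment():
    """Report whether a usable java binary is visible to this process."""
    java_home = os.environ.get("JAVA_HOME")
    print("JAVA_HOME:", java_home or "<not set>")

    # PySpark locates `java` via JAVA_HOME if set, otherwise via PATH.
    java_bin = shutil.which("java")
    if java_home:
        candidate = os.path.join(java_home, "bin", "java")
        if os.path.isfile(candidate):
            java_bin = candidate

    if java_bin is None:
        print("No java executable found; PySpark cannot start its gateway.")
        return False

    # By convention, `java -version` writes its output to stderr.
    result = subprocess.run([java_bin, "-version"],
                            capture_output=True, text=True)
    print(result.stderr.strip())
    return result.returncode == 0

check_java_environment()
```

If this prints no Java installation, the [JAVA_GATEWAY_EXITED] error is expected, because PySpark cannot launch the JVM-side gateway at all.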

roshan_robert
New Contributor

I tried setting JAVA_HOME explicitly, but it did not work. The issue I am facing occurs only when launching the application via the web link; the same code runs fine in a notebook. Let me know the fix.
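
Since the notebook environment provides a JVM and Spark runtime while the app host may not, one hedged workaround is to query a SQL warehouse through the databricks-sql-connector that app.py already imports, rather than starting a local SparkSession. The query_warehouse helper and the environment-variable names below are illustrative assumptions, not a confirmed fix:

```python
import os
import pandas as pd

def query_warehouse(server_hostname, http_path, access_token,
                    query="SELECT 1 AS ok"):
    # Lazy import so a missing databricks-sql-connector package raises
    # a clear ImportError at call time rather than at app start-up.
    from databricks import sql

    # Run the query on a SQL warehouse; no local JVM or Spark gateway
    # is needed inside the app host.
    with sql.connect(server_hostname=server_hostname,
                     http_path=http_path,
                     access_token=access_token) as conn:
        with conn.cursor() as cur:
            cur.execute(query)
            columns = [desc[0] for desc in cur.description]
            return pd.DataFrame(cur.fetchall(), columns=columns)

# Example call (values would come from the app's environment):
# df = query_warehouse(os.environ["DATABRICKS_SERVER_HOSTNAME"],
#                      os.environ["DATABRICKS_HTTP_PATH"],
#                      os.environ["DATABRICKS_TOKEN"])
```

The resulting pandas DataFrame can be passed straight to st.write, as the sample DataFrame in the question already is.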
