Does Databricks support proxy for BigQuery?

109005
New Contributor III

Hi team, we tried to use the proxy options for the BigQuery Spark connector as mentioned in this documentation. However, we keep getting a "connect timed out" error. The proxy host is working on our end. This made us wonder whether Databricks perhaps does not support proxies?

Please do confirm.
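For context, this is roughly what we are running. The proxy address and table name are from our environment, and the commented username/password options are shown only for completeness; the option names are the ones listed in the connector documentation we followed:

# Proxy settings for the spark-bigquery-connector, set globally on the Spark conf.
# Values are illustrative (our proxy listens on 10.128.0.4:3128).
spark.conf.set("proxyAddress", "http://10.128.0.4:3128")
# spark.conf.set("proxyUsername", "<username>")   # only if the proxy requires auth
# spark.conf.set("proxyPassword", "<password>")

# Per the documentation, the same option can reportedly also be passed on the read itself:
df = (
    spark.read.format("bigquery")
    .option("proxyAddress", "http://10.128.0.4:3128")
    .load("connectivity.product")
)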

6 REPLIES

Prabakar
Esteemed Contributor III

Hi @Ayushi Pandey, Databricks does support proxies. Could you please share the error stack?

109005
New Contributor III

Sure thing. Below is the error stack:

com.google.cloud.spark.bigquery.repackaged.com.google.cloud.bigquery.BigQueryException: Connect to 10.128.0.4:3128 [/10.128.0.4] failed: connect timed out

---------------------------------------------------------------------------
Py4JJavaError                             Traceback (most recent call last)
<command-215529791339289> in <module>
      1 spark.conf.set("proxyAddress", "http://10.128.0.4:3128")
----> 2 df = spark.read.format("bigquery").load("connectivity.product")

/databricks/spark/python/pyspark/sql/readwriter.py in load(self, path, format, schema, **options)
    156             self.options(**options)
    157         if isinstance(path, str):
--> 158             return self._df(self._jreader.load(path))
    159         elif path is not None:
    160             if type(path) != list:

/databricks/spark/python/lib/py4j-0.10.9.1-src.zip/py4j/java_gateway.py in __call__(self, *args)
   1302
   1303         answer = self.gateway_client.send_command(command)
-> 1304         return_value = get_return_value(
   1305             answer, self.gateway_client, self.target_id, self.name)
   1306

/databricks/spark/python/pyspark/sql/utils.py in deco(*a, **kw)
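
To rule out basic network reachability from the cluster, we can also run a quick TCP check from a notebook cell against the same host and port that appear in the error (this is just a debugging aid, not part of the connector):

import socket

# Reachability check from the driver to the proxy (illustrative only;
# host/port taken from the error above). A timeout here would point to a
# network/firewall issue rather than the connector itself.
proxy_host, proxy_port = "10.128.0.4", 3128
try:
    with socket.create_connection((proxy_host, proxy_port), timeout=5):
        print(f"TCP connection to {proxy_host}:{proxy_port} succeeded")
except OSError as e:
    print(f"TCP connection to {proxy_host}:{proxy_port} failed: {e}")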

Prabakar
Esteemed Contributor III

Are you using Community Edition, or are you testing from your own workspace?

109005
New Contributor III

My own workspace.

Vidula
Honored Contributor

Hi @Ayushi Pandey

Hope all is well! Just wanted to check in to see if you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? If not, please let us know if you need more help.

We'd love to hear from you.

Thanks!

109005
New Contributor III

Hi, no, unfortunately this issue was not resolved.
