Databricks IP address
06-19-2024 10:00 PM
I'm trying to call an API from a Databricks notebook, but the call fails with a 403 Forbidden error. I think it's an issue with the IP address.
Can anyone help me figure out which Databricks IPs need to be whitelisted in order to call the API, and where I can find them?
06-19-2024 10:47 PM
Might be worth adding which public cloud you are on, and whether you are using secure cluster connectivity.
06-19-2024 11:00 PM
@jacovangelder My Databricks workspace is hosted on AWS, so I tried adding the region's inbound and outbound IP addresses, but it is not working.
06-19-2024 11:04 PM - edited 06-19-2024 11:16 PM
I don't know your enterprise setup, but it could be that the AWS firewall is blocking the outbound request too. Just whitelisting the AWS Databricks ranges in the application you're posting to might not be enough.
Edit: actually no, that can't be the case, because a 403 is returned by the remote server, which means the request is reaching it.
You might have only whitelisted the control plane IPs and not the compute plane (cluster/VM) IPs.
Which IPs did you use?
06-19-2024 11:40 PM
@jacovangelder Yes, I have whitelisted the control plane IPs, inbound and outbound:
ap-south-1 65.0.37.64/28, 13.232.248.161
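As a quick sanity check, Python's standard `ipaddress` module can tell you whether a given source IP actually falls inside those allowlisted ranges. This is only a sketch using the two ap-south-1 entries above; the test IPs are hypothetical:

```python
import ipaddress

# The two control-plane entries quoted above for ap-south-1
allowlist = [
    ipaddress.ip_network("65.0.37.64/28"),
    ipaddress.ip_network("13.232.248.161/32"),
]

def is_allowlisted(ip: str) -> bool:
    """Return True if the given IP falls inside any allowlisted network."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in allowlist)

# A compute-plane (cluster VM) egress IP is typically NOT in these
# control-plane ranges, which would explain the remote API's 403.
print(is_allowlisted("65.0.37.70"))  # inside the /28 -> True
print(is_allowlisted("10.24.1.5"))   # private cluster IP -> False
```

If the IP your notebook's requests egress from returns `False` here, the remote API's allowlist is the likely culprit.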
06-19-2024 11:54 PM
@jacovangelder
Can you please let me know where I can find the compute plane IP addresses?
06-20-2024 12:15 AM
You can find it in the Spark Master UI in the cluster configuration.
Or you can run `%sh ifconfig`.
Do keep in mind that you might want to whitelist a range, because this IP most likely changes every time you spin up your cluster.
06-20-2024 02:32 AM
@jacovangelder
Will try this, thank you!