06-22-2021 05:55 PM
I want to test out different APIs directly from a Databricks notebook instead of using Postman or curl. Is this possible?
06-22-2021 06:35 PM
If your question is about using the Databricks API from within a Databricks notebook, then the answer is yes. You can orchestrate almost anything and invoke the REST API from a Python notebook, for example with the `requests` library that is already baked into the runtime.
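As a minimal sketch of that approach (assumptions: the workspace URL and API token are pulled from the notebook context via `dbutils`, which is an internal convenience accessor, and the target is the Clusters API 2.0 list endpoint; a personal access token and any other endpoint work the same way):

```python
import requests

# Grab the current workspace URL and an API token from the notebook context.
ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
host = ctx.apiUrl().getOrElse(None)
token = ctx.apiToken().getOrElse(None)

# Call a REST endpoint exactly as you would from Postman or curl.
resp = requests.get(
    f"{host}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
resp.raise_for_status()

for cluster in resp.json().get("clusters", []):
    print(cluster["cluster_name"], cluster["state"])
```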
Any other system or bash-based commands can also be run from a notebook if you precede them with the `%sh` magic command.
10-18-2024 02:25 AM
You cannot consume the API across multiple workspaces because of the isolation constraints. Is Databricks pushing us to run our notebooks outside the platform?
10-18-2024 03:07 AM
@tj-cycyota There are a couple of options: the REST API or the Databricks Python SDK. Please see the link below for more details and examples.
Streamlining Data Engineering: Databricks SDK for Python vs Databricks REST API
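For reference, a minimal sketch of the SDK route (assumption: the `databricks-sdk` package is available, as it is on recent runtimes, otherwise install it with pip; inside a notebook the client authenticates automatically):

```python
from databricks.sdk import WorkspaceClient

# Inside a Databricks notebook, WorkspaceClient() picks up the workspace URL
# and credentials automatically; outside a notebook, set DATABRICKS_HOST and
# DATABRICKS_TOKEN (or use another configured auth method).
w = WorkspaceClient()

# Equivalent to GET /api/2.0/clusters/list, but returning typed objects.
for c in w.clusters.list():
    print(c.cluster_name, c.state)
```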
10-18-2024 05:50 AM
@Panda There is no REST API for Databricks. The "RE" in REST stands for Ready Everywhere. You cannot connect to the API in workspace 1 from a notebook in workspace 2, therefore it is not Ready Everywhere. Workspace 1 cannot resolve the hostname for workspace 2.