It looks like it could be a couple of things. You may not have set up the abfss path as an external location, or you may not have added the Databricks access connector to the IAM role assignments for the container. I am assuming that you are a workspace and catalog admin. A quick sketch of the external-location step is below.
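A minimal sketch, assuming a storage credential (here called `my_adls_credential`) already exists and points at the access connector that holds the Storage Blob Data Contributor role on the container; the location, account, and group names are hypothetical placeholders:

```python
# Register the abfss path as a Unity Catalog external location.
spark.sql("""
  CREATE EXTERNAL LOCATION IF NOT EXISTS raw_landing_zone
  URL 'abfss://raw@mystorageaccount.dfs.core.windows.net/landing'
  WITH (STORAGE CREDENTIAL my_adls_credential)
  COMMENT 'Landing zone for raw files'
""")

# Grant access so non-admin principals can read and write through the location.
spark.sql(
    "GRANT READ FILES, WRITE FILES ON EXTERNAL LOCATION raw_landing_zone TO `data_engineers`"
)

# Sanity check that the path is now reachable through Unity Catalog.
display(dbutils.fs.ls("abfss://raw@mystorageaccount.dfs.core.windows.net/landing"))
```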
Two other things we would generally recommend are 1) AWS Direct Connect and 2) AWS DMS, or any other CDC toolset; there are lots. We also have Query Federation in Unity Catalog now, so you could set up the connection that way, but JDBC/ODBC connections have their limits for high-volume or continuous replication. A sketch of the federation route follows.
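A minimal sketch of the Query Federation setup, assuming a PostgreSQL source (for example on RDS) reachable from the workspace and credentials stored in a secret scope named `rds`; the connection, catalog, database, and table names are hypothetical:

```python
# Define the connection to the external database once, with credentials
# pulled from a Databricks secret scope.
spark.sql("""
  CREATE CONNECTION IF NOT EXISTS rds_pg_conn TYPE postgresql
  OPTIONS (
    host 'my-rds-instance.abc123.us-east-1.rds.amazonaws.com',
    port '5432',
    user secret('rds', 'username'),
    password secret('rds', 'password')
  )
""")

# Expose the remote database as a foreign catalog that is queried like any
# other Unity Catalog object.
spark.sql("""
  CREATE FOREIGN CATALOG IF NOT EXISTS rds_pg
  USING CONNECTION rds_pg_conn
  OPTIONS (database 'sales')
""")

spark.sql("SELECT * FROM rds_pg.public.orders LIMIT 10").show()
```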
I would use widgets in the notebook, which Jobs will pick up as task parameters. SQL in notebooks can use parameters as well, as can the SQL in jobs, now that parameterized queries are supported.
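A minimal sketch: the widget supplies the value interactively in the notebook, and the same notebook runs unchanged as a Jobs task, where the task's `run_date` parameter overrides the widget default. The table and parameter names are hypothetical, and the named parameter markers assume a recent runtime:

```python
# Widget doubles as a job parameter when the notebook runs as a task.
dbutils.widgets.text("run_date", "2024-01-01")
run_date = dbutils.widgets.get("run_date")

# Named parameter markers keep the SQL free of string interpolation.
df = spark.sql(
    "SELECT * FROM sales.orders WHERE order_date = :run_date",
    args={"run_date": run_date},
)
display(df)
```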
One clarification to the point above: the data is stored in your (the customer's) account, in S3. All of the ephemeral workers are spun up in your own account and VPC. Very little information is actually stored in the Databricks account, other than control-plane metadata such as notebooks, job configurations, and logs.
Without knowing everything you are trying to do, the answer is yes, with the Instance Profiles API: https://docs.databricks.com/dev-tools/api/latest/instance-profiles.html. You might also check out the SCIM APIs to associate the instance profile with a group.
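A minimal sketch against the REST APIs, assuming a personal access token in `DATABRICKS_TOKEN`, the workspace URL in `DATABRICKS_HOST`, and an existing group; the ARN and group ID values are hypothetical placeholders:

```python
import os
import requests

host = os.environ["DATABRICKS_HOST"]  # e.g. https://<workspace>.cloud.databricks.com
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
arn = "arn:aws:iam::123456789012:instance-profile/my-data-access-profile"

# 1) Register the instance profile with the workspace (Instance Profiles API).
requests.post(
    f"{host}/api/2.0/instance-profiles/add",
    headers=headers,
    json={"instance_profile_arn": arn},
).raise_for_status()

# 2) Grant it to a group via the SCIM Groups API so every member can use it.
group_id = "123456"
requests.patch(
    f"{host}/api/2.0/preview/scim/v2/Groups/{group_id}",
    headers=headers,
    json={
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": [{"op": "add", "path": "roles", "value": [{"value": arn}]}],
    },
).raise_for_status()
```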