Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Isolated Databricks cluster call from Synapse or Azure Data Factory

alexgv12
New Contributor III

https://learn.microsoft.com/en-us/answers/questions/1919424/isolated-databricks-cluster-call-from-sy...

How can I create a job in Databricks from Synapse or Azure Data Factory with the isolation parameter set? I cannot find any option that allows passing this value as a parameter, and without it I have no access to my Unity Catalog in Databricks.

Captura de pantalla 2024-08-20 122534.png

example:

{
   "num_workers": 1,
   "cluster_name": "...",
   "spark_version": "14.0.x-scala2.12",
   "spark_conf": {
       "spark.hadoop.fs.azure.account.oauth2.client.endpoint": "...",
       "spark.hadoop.fs.azure.account.auth.type": "...",
       "spark.hadoop.fs.azure.account.oauth.provider.type": "...",
       "spark.hadoop.fs.azure.account.oauth2.client.id": "...",
       "spark.hadoop.fs.azure.account.oauth2.client.secret": "..."
   },
   "node_type_id": "...",
   "driver_node_type_id": "...",
   "ssh_public_keys": [],
   "spark_env_vars": {
       "cluster_type": "all-purpose"
   },
   "init_scripts": [],
   "enable_local_disk_encryption": false,
   "data_security_mode": "USER_ISOLATION",
   "cluster_id": "..."
}
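Not an answer from this thread, but a possible workaround sketch: instead of letting the linked service define the cluster, the pipeline could call the Databricks Jobs API 2.1 `runs/submit` endpoint directly (for example from an ADF Web activity or a small script), where `data_security_mode` can be set explicitly on the new cluster. The workspace URL, token, node type, and notebook path below are all placeholders:

```python
import json
import urllib.request

# Placeholders -- substitute your own workspace URL and access token.
WORKSPACE_URL = "https://<your-workspace>.azuredatabricks.net"
TOKEN = "<personal-access-token>"


def build_submit_payload(notebook_path: str) -> dict:
    """Build a Jobs API 2.1 runs/submit payload whose new cluster
    pins data_security_mode, so Unity Catalog is reachable."""
    return {
        "run_name": "adf-triggered-run",
        "tasks": [
            {
                "task_key": "main",
                "notebook_task": {"notebook_path": notebook_path},
                "new_cluster": {
                    "num_workers": 1,
                    "spark_version": "14.0.x-scala2.12",
                    "node_type_id": "Standard_DS3_v2",  # placeholder
                    # The field the linked service does not seem to expose:
                    "data_security_mode": "USER_ISOLATION",
                },
            }
        ],
    }


def submit_run(payload: dict) -> bytes:
    """POST the payload to the Jobs API and return the raw response."""
    req = urllib.request.Request(
        f"{WORKSPACE_URL}/api/2.1/jobs/runs/submit",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()


payload = build_submit_payload("/Workspace/Users/me/my_notebook")
print(payload["tasks"][0]["new_cluster"]["data_security_mode"])
```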

2 REPLIES

-werners-
Esteemed Contributor III

I am not sure what you mean.
Do you want to run a Databricks notebook from Data Factory and pass parameter values from ADF to the notebook?
Or do you want to start a Databricks job?

alexgv12
New Contributor III

alexgv12_0-1724248583821.png

Hi Werner, thanks for your question. I'm sharing the updated linked service in Synapse. Currently we have a pool in Databricks; through the linked service a job is created and an instance is brought up with the resources of our pool. But when the cluster is created we have not identified where to send the isolation parameter, which we need so the cluster can see our Unity Catalog. In our flow we call the notebook using this linked service, and in our code we use Unity Catalog.

alexgv12_1-1724249177197.png

The screenshot above shows the properties the cluster is currently created with; the one below shows the properties I actually need:

alexgv12_2-1724249248963.png

I hope this helps clarify things a little, and thank you for your help.
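One avenue worth checking (an assumption, not confirmed in this thread): the ADF/Synapse Azure Databricks linked service exposes a `policyId` property, and a Databricks cluster policy can pin `data_security_mode` so that every job cluster the linked service spins up inherits it. A minimal policy definition fragment would look like:

```json
{
  "data_security_mode": {
    "type": "fixed",
    "value": "USER_ISOLATION"
  }
}
```

The policy's ID would then be referenced in the linked service configuration, so the isolation mode no longer needs to be passed as a per-run parameter.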
