Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Job Preview in ADF

fjrodriguez
New Contributor III

I have one Spark job that is triggered from ADF as a regular "Python" activity. Now I want to move to the Job activity, which is in Preview.

Normally, at the linked service level I have the Spark config and environment variables needed to execute this script. I cannot find a way to pass these Spark env variables and config through to the job cluster I have created for my new Job activity.

 

Has anyone faced this before?

2 REPLIES

radothede
Valued Contributor II

Hi @fjrodriguez 

my understanding is you've already created a cluster for your job. If that's the case, you can put the Spark configuration and env variables directly on the cluster your job is using.
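For illustration, a job cluster defined in the job's JSON settings (Jobs API 2.1 format) can carry `spark_conf` and `spark_env_vars` directly. The specific keys, values, node type, and runtime version below are placeholders, not taken from the original post:

```json
{
  "job_clusters": [
    {
      "job_cluster_key": "my_job_cluster",
      "new_cluster": {
        "spark_version": "13.3.x-scala2.12",
        "node_type_id": "Standard_DS3_v2",
        "num_workers": 2,
        "spark_conf": {
          "spark.sql.shuffle.partitions": "200"
        },
        "spark_env_vars": {
          "MY_APP_ENV": "production"
        }
      }
    }
  ]
}
```

With this in place, the config travels with the job definition itself, so nothing needs to be forwarded from the ADF linked service.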

 

If for some reason that's not possible, you can still use 'Additional cluster settings' in the ADF linked service, providing the cluster Spark conf and cluster Spark environment variables there.
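In the ADF linked service JSON, these 'Additional cluster settings' correspond to the `newClusterSparkConf` and `newClusterSparkEnvVars` type properties. A hedged sketch (the workspace URL, runtime version, node type, and conf/env values are placeholders):

```json
{
  "name": "AzureDatabricksLS",
  "properties": {
    "type": "AzureDatabricks",
    "typeProperties": {
      "domain": "https://<workspace-url>.azuredatabricks.net",
      "newClusterVersion": "13.3.x-scala2.12",
      "newClusterNodeType": "Standard_DS3_v2",
      "newClusterNumOfWorker": "2",
      "newClusterSparkConf": {
        "spark.sql.shuffle.partitions": "200"
      },
      "newClusterSparkEnvVars": {
        "MY_APP_ENV": "production"
      }
    }
  }
}
```

Note these settings apply to clusters that ADF provisions through the linked service; whether they propagate to a cluster defined inside the Databricks job itself is a separate question.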

 

Best,

Radek.

Hey Radek, thanks for replying.

Yes, indeed this is a feasible solution, but somehow the ADF 'Additional cluster settings' configured at the linked service level are not passed through to the Job compute cluster 😞