Hi!
I assume that ADF is just the trigger, and Databricks accesses ADLS directly to process the data.
ADLS Access:
You create an External Location in your Databricks workspace that acts as a bridge to ADLS. This is done through Catalog Explorer.
To set it up:
1. Create a Storage Credential using a Managed Identity (or Service Principal) that has permissions on your ADLS storage account
2. Create an External Location that links this credential to your specific ADLS path
3. Grant granular permissions (READ FILES, WRITE FILES, etc.) on the External Location to the users or groups that need it
That's it. Now Databricks can read and write to that ADLS path directly.
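If it helps, here's a rough sketch of what those steps look like from a Databricks notebook once the Storage Credential exists in Catalog Explorer. The names (my_adls_cred, my_ext_location, the abfss:// path, the data_engineers group) are placeholders for your own values, not anything specific to your setup:

```python
# Sketch only: assumes a Storage Credential named "my_adls_cred" already exists,
# and that `spark` is the session Databricks provides in a notebook.

# 1) Register the ADLS path as an External Location backed by that credential
spark.sql("""
  CREATE EXTERNAL LOCATION IF NOT EXISTS my_ext_location
  URL 'abfss://mycontainer@mystorageaccount.dfs.core.windows.net/raw'
  WITH (STORAGE CREDENTIAL my_adls_cred)
""")

# 2) Grant read/write on the location to the group that runs the job
spark.sql("""
  GRANT READ FILES, WRITE FILES
  ON EXTERNAL LOCATION my_ext_location
  TO `data_engineers`
""")

# 3) After that, jobs and notebooks can read/write the path directly
df = spark.read.format("parquet").load(
    "abfss://mycontainer@mystorageaccount.dfs.core.windows.net/raw/input/"
)
(df.write.format("delta")
   .mode("overwrite")
   .save("abfss://mycontainer@mystorageaccount.dfs.core.windows.net/raw/output/"))
```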

Reference: https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/external-loca...
Calling a Databricks Job
To trigger a Databricks job from ADF, you need a Databricks linked service (pointing at your workspace, with its authentication) and the Job ID of the job you want to run.
That's the minimum. Everything else is optional, like a warehouse/cluster ID if you don't need serverless, job parameters, etc.
Reference: https://learn.microsoft.com/en-us/azure/data-factory/transform-data-databricks-job
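Under the hood, the ADF activity ends up calling the Databricks Jobs API, so the minimal inputs are the same: workspace URL, authentication, and the Job ID, with job parameters optional. As a rough illustration (not the ADF activity itself), here's a hedged Python sketch against the Jobs run-now endpoint; the host, token, and job ID are placeholders:

```python
import requests

# Placeholders: your workspace URL, a PAT or AAD token, and the Job ID
DATABRICKS_HOST = "https://adb-1234567890123456.7.azuredatabricks.net"
TOKEN = "<personal-access-token-or-aad-token>"
JOB_ID = 123456789

# Minimum payload is just the job_id; job parameters are optional, same as in ADF
resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "job_id": JOB_ID,
        # "job_parameters": {"run_date": "2024-01-01"},  # optional
    },
)
resp.raise_for_status()
print("Triggered run:", resp.json()["run_id"])
```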