Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Databricks Asset Bundles - config data needed by notebook

GiggleByte
New Contributor II

I have this structure:

Folder-1 - the root of the Databricks Asset Bundle directory. The "databricks.yaml" file is in this directory.

Folder-1 / Folder-2 has notebooks. One of the notebooks, "test-notebook", is used for *job* configuration in the databricks.yaml file.

Folder-1 / Folder-2 / configs / config.json is used by "test-notebook". When running the notebook directly in the Databricks UI, I can read the contents of config.json and use the JSON data successfully. I use the path 'configs/config.json'.

However, when I deploy the bundle and run the job, I get a FileNotFound error. From the context of a job, the file configs/config.json is not accessible.

What is the best way to store configuration files within an asset bundle, and how can I ensure that the file is available to the notebook?
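A likely cause: when the deployed job runs, its working directory is not Folder-2, so the relative path 'configs/config.json' no longer resolves. A minimal, self-contained sketch of the underlying fix - building the path from a known base directory instead of relying on the working directory (a temp directory stands in for the deployed bundle folder here; the paths and the "env" key are hypothetical):

```python
import json
import os
import tempfile

# Stand-in for the deployed Folder-2 directory (hypothetical layout):
base_dir = tempfile.mkdtemp()
os.makedirs(os.path.join(base_dir, "configs"))
with open(os.path.join(base_dir, "configs", "config.json"), "w") as f:
    json.dump({"env": "dev"}, f)

# A job's working directory can differ from the notebook's folder,
# so anchor the path to a known base rather than a bare relative path:
config_path = os.path.join(base_dir, "configs", "config.json")
with open(config_path) as f:
    config = json.load(f)
```

In a real notebook, the base directory would be derived from the notebook's own workspace location rather than a temp directory.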

 

Thanks,

Archana 


3 REPLIES

karthik_p
Esteemed Contributor

@GiggleByte Adding a few more inputs to what @Retired_mod mentioned: how are you running the job - CLI, VS Code, GitHub, or GitLab? If I'm not wrong, to run DAB jobs from anywhere other than the Databricks UI, your resources folder should contain your job config converted from JSON to YAML, not the raw JSON.


 

GiggleByte
New Contributor II

I am running the job with the "databricks bundle run" command from an Azure DevOps pipeline.

I tried adding a "files" section to the databricks.yaml file, but when the notebook runs, it is unable to find conf/config1.json.


@Retired_mod - the config files are part of the bundle.

@karthik_p - so, is it mandatory that all config information needed by notebooks be in YAML format? If yes, then how do notebooks access the parameters? For example, if I have a config file called "custom_configs.yaml", will the notebook be able to open "custom_configs.yaml"? Or is there another way to read the config information?
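On the file-availability question, one possible approach (a hedged sketch: the `sync` mapping with `include` glob patterns is part of the Databricks Asset Bundle settings, but verify against the docs for your CLI version) is to list the config files explicitly in databricks.yaml so they are uploaded with the bundle:

```yaml
# Sketch of a databricks.yaml fragment; the paths are glob patterns
# relative to the bundle root (Folder-1 in this thread):
sync:
  include:
    - Folder-2/configs/*.json
```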

Thanks,

Archana

 

karthik_p
Esteemed Contributor

@GiggleByte Yes, based on a demo test that I have done, it works as you said.

The job-settings YAML converted from JSON needs to be placed under the resources folder. That YAML holds the job configuration, and it looks similar to a REST API JSON request converted into YAML form.
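As a concrete illustration of the reply above, a job-definition YAML placed under resources/ might look like the following sketch (the job name, task key, and file names are hypothetical; the notebook path is relative to the bundle root):

```yaml
# Hypothetical resources/test_job.yml for this thread's folder layout:
resources:
  jobs:
    test_job:
      name: test-job
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./Folder-2/test-notebook.py
```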
