02-02-2023 10:37 PM
Hi,
We're using Databricks Runtime 11.3 LTS and executing a Spark Java job on a Job Cluster. To automate the execution of this job, we need to define some environment variables (sourced from bash config files) through a cluster-scoped init script and make them available to the Spark Java job.
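For context, the init script follows roughly the pattern sketched below. This is only an illustrative sketch: the config file paths and variable names (APP_ENV, JOB_CONFIG_PATH) are placeholders, and appending export statements to /databricks/spark/conf/spark-env.sh is just one common way to surface variables to the Spark processes.

```bash
#!/bin/bash
# Illustrative sketch of db_init.sh (paths and variable names are placeholders).
set -euo pipefail

# Source the bash config files that hold our settings (placeholder paths)
source /dbfs/config/app_env.sh
source /dbfs/config/job_env.sh

# Surface selected variables to the Spark JVMs by appending export
# statements to spark-env.sh, which is sourced when the cluster starts.
cat >> /databricks/spark/conf/spark-env.sh <<EOF
export APP_ENV="${APP_ENV}"
export JOB_CONFIG_PATH="${JOB_CONFIG_PATH}"
EOF
```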
While doing this, we realized that we're not able to upload an init script larger than 5KB to the appropriate DBFS location. The documentation states that an init script should not be larger than 64KB. Are there any workspace-level settings/configurations that can help us raise this limit from 5KB to 64KB?
Please let us know if we're missing anything here. Any help in this regard will be highly appreciated.
Thanks in advance!
Regards,
// Rahul
02-03-2023 10:15 AM
Hi @Rahul K, could you please share a screenshot of the error you are getting?
02-13-2023 08:45 AM
Thank you @Lakshay Goel for your response, and apologies for the delayed reply. I'm not able to attach a screenshot here.
Please find below the error I'm getting while uploading a file named db_init.sh to the /databricks/init folder.
Error occurred when processing file db_init.sh: Server responded with 0 code.
02-03-2023 07:26 PM
@Rahul K it looks like that's a documented limitation:
The init script cannot be larger than 64KB. If a script exceeds that size, an error message appears when you try to save.
Also, init scripts cannot be stored on external DBFS mounts. @Lakshay Goel, any other inputs please?
02-13-2023 08:48 AM
Thank you @karthik p for your response. I'm trying to upload a script named db_init.sh (about 10KB in size) to the /databricks/init folder and getting the following error:
Error occurred when processing file db_init.sh: Server responded with 0 code.
If the script were larger than 64KB, that error would be expected, but it's only 10KB. Please let me know if there are any settings that help raise this limit.
02-14-2023 09:17 AM
@Rahul K - Databricks recommends avoiding storing init scripts under /databricks/init (which is now a legacy location) to prevent unexpected behaviour. Try the new global init scripts via the UI, the API, or Terraform, and see if the issue persists.
Reference: https://learn.microsoft.com/en-us/azure/databricks/clusters/init-scripts
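If you want to try the API route, here is a rough sketch of creating a global init script with the Global Init Scripts API (workspace URL, token, and script name are placeholders; the script body is sent base64-encoded, and the documented size limit is 64KB):

```bash
# Hedged sketch: create a global init script via the REST API.
# <your-workspace> and the personal access token are placeholders.
DATABRICKS_HOST="https://<your-workspace>.azuredatabricks.net"
DATABRICKS_TOKEN="<personal-access-token>"

curl -s -X POST "${DATABRICKS_HOST}/api/2.0/global-init-scripts" \
  -H "Authorization: Bearer ${DATABRICKS_TOKEN}" \
  -H "Content-Type: application/json" \
  -d "{
        \"name\": \"db_init\",
        \"script\": \"$(base64 -w0 db_init.sh)\",
        \"enabled\": true,
        \"position\": 0
      }"
```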
02-14-2023 09:28 PM
Thank you @Sundar Raman for your response. Yes, we referred to the same link you shared. We currently use a cluster-scoped script (option #4) rather than a global one (option #3), because the script needs to run only for one cluster, not all clusters in the workspace. We're not using options #1 and #2 as they have been deprecated.
This cluster-scoped init script has been provided from the UI and uploaded/deployed to the DBFS root, as described in the shared link (section "Cluster-scoped init script locations"). I'll check if referencing it from ADLS directly helps address the size limitation.
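If we go the ADLS route, I believe the job cluster spec would reference the script with an abfss destination instead of a DBFS one, roughly like the fragment below (container, storage account, and path are placeholders, and the cluster would also need the Spark config/credentials to read that location):

```bash
# Illustrative fragment of the cluster spec JSON pointing the cluster-scoped
# init script at an ADLS Gen2 (abfss) location. All names are placeholders.
cat > init_scripts_fragment.json <<'EOF'
{
  "init_scripts": [
    {
      "abfss": {
        "destination": "abfss://<container>@<storage-account>.dfs.core.windows.net/init/db_init.sh"
      }
    }
  ]
}
EOF
```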
02-15-2023 11:33 AM
Oh, ok! A quick question, @Rahul K: are you still using /databricks/init, given that it relates to the legacy global path? Have you tried saving the cluster-scoped script to a different location such as /databricks/scripts?
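One way to stage the script under dbfs:/databricks/scripts without going through the UI upload is the Databricks CLI, roughly as below (this assumes the legacy `databricks fs` command group is installed and configured; paths are illustrative):

```bash
# Hedged sketch: copy the init script to dbfs:/databricks/scripts via the CLI.
databricks fs mkdirs dbfs:/databricks/scripts
databricks fs cp ./db_init.sh dbfs:/databricks/scripts/db_init.sh --overwrite
databricks fs ls dbfs:/databricks/scripts
```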
02-15-2023 09:32 PM
Yes @Sundar Raman, but that also gives the same error.
02-17-2023 02:49 AM
@Rahul K, also, can you please confirm the location is not a DBFS mount? Have you tried any other DBR version, and do you still have the same issue there?
02-21-2023 12:09 AM
Yes @Sundar Raman, the location is not a DBFS mount. We tried this in a workspace where DBR 10.4 LTS and 11.3 LTS are being used.
04-10-2023 01:39 AM
Hi @Rahul K
Thank you for posting your question in our community! We are happy to assist you.
To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?
This will also help other community members who may have similar questions in the future. Thank you for your participation and let us know if you need any further assistance!