by
digui
• New Contributor
- 5768 Views
- 3 replies
- 0 kudos
Hi y'all. I'm trying to export metrics and logs to AWS CloudWatch, but while following their tutorial to do so, I ended up facing this error when trying to initialize my cluster with an init script they provided. This is the part where the script fail...
Latest Reply
@digui Did you figure out what to do? We're facing the same issue; the script works for the executors. I was thinking of adding an if that checks whether log4j.properties exists and modifies it only if it does.
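Something along these lines is what I had in mind (the log4j.properties path is an assumption, and newer runtimes use log4j2, so check where your runtime keeps the file):

#!/bin/bash
# Only touch log4j.properties if this node actually has one.
LOG4J_PROPS="/home/ubuntu/databricks/spark/dbconf/log4j/driver/log4j.properties"
if [ -f "$LOG4J_PROPS" ]; then
  # The edits from the CloudWatch tutorial would go here, e.g.:
  # sed -i 's/rootCategory=INFO, console/rootCategory=INFO, console, cloudwatch/' "$LOG4J_PROPS"
  echo "Patched $LOG4J_PROPS"
else
  echo "No log4j.properties on this node; skipping CloudWatch setup" >&2
fi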
2 More Replies
- 20668 Views
- 6 replies
- 2 kudos
Hello, I want to install the ODBC driver (for pyodbc). I have tried to do it using Terraform; however, I think it is impossible. So I want to do it with an init script in my cluster. I have the code from the internet and it works when it is at the beginning of ...
Latest Reply
Actually found this article and am using it to migrate my shell script to the workspace: Cluster-named and cluster-scoped init script migration notebook - Databricks
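For the ODBC driver itself, a rough sketch of an init script that installs it for pyodbc on an Ubuntu-based runtime might look like this (driver version and Ubuntu release are assumptions; follow Microsoft's current instructions for your image):

#!/bin/bash
# Install the Microsoft ODBC driver and unixODBC headers needed by pyodbc.
set -e
curl -sSL https://packages.microsoft.com/keys/microsoft.asc | apt-key add -
curl -sSL https://packages.microsoft.com/config/ubuntu/20.04/prod.list > /etc/apt/sources.list.d/mssql-release.list
apt-get update
ACCEPT_EULA=Y apt-get install -y msodbcsql17 unixodbc-dev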
5 More Replies
- 3197 Views
- 2 replies
- 1 kudos
When installing Notebook-scoped R libraries I don't want to manually specify the custom CRAN mirror each time like this: install.packages("diffdf", repos="my_custom_cran_url"). Instead I want the custom CRAN mirror URL to be used by default so that I don'...
Latest Reply
Got a solution on Stack Overflow for this problem: https://stackoverflow.com/a/76777228/18082636
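One way to get a default CRAN mirror cluster-wide is an init script that sets R's repos option; a rough sketch (the Rprofile.site path and mirror URL are placeholders, see the linked answer for the exact approach):

#!/bin/bash
# Make a custom CRAN mirror the default for install.packages() on every node.
cat >> /usr/lib/R/etc/Rprofile.site <<'EOF'
options(repos = c(CRAN = "https://my-custom-cran.example.com"))
EOF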
1 More Replies
by
glebex
• New Contributor II
- 10182 Views
- 7 replies
- 7 kudos
Greetings all! I am currently facing an issue while accessing workspace files from the init script. As explained in the documentation, it is possible to place an init script inside workspace files (link). This works perfectly fine and the init script i...
Latest Reply
@Gleb Smolnik You might also want to try cloning a GitHub repo in your init script and then storing dependencies like requirements.txt files and other init scripts there. By doing this you can pull a whole slew of init scripts to be utilized in your...
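A rough sketch of that pattern (the repo URL, paths and file names below are hypothetical):

#!/bin/bash
# Pull shared init logic and dependency pins from a Git repo, then apply them.
set -e
git clone --depth 1 https://github.com/example-org/cluster-bootstrap.git /tmp/cluster-bootstrap
# Install pinned Python dependencies kept in the repo.
/databricks/python/bin/pip install -r /tmp/cluster-bootstrap/requirements.txt
# Run any additional init scripts stored alongside them.
bash /tmp/cluster-bootstrap/extra-init.sh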
6 More Replies
- 1851 Views
- 1 reply
- 0 kudos
Similar issue: https://stackoverflow.com/questions/76220211/create-new-databricks-cluster-from-adf-linked-service-with-initscripts-from-abfs
I am trying to create clusters using an ADF linked service where the cluster is configured with an init script. As...
Latest Reply
Hi @Oscar Dyremyhr, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer for you. Thanks.
- 2178 Views
- 2 replies
- 3 kudos
I am working on converting manual global init scripts into a Terraform IaC process for multiple environments. Within Terraform, we are using the resource "databricks_global_init_script" and set the content_base64 with the following: base64encoded(<<-...
Latest Reply
Atanu
Databricks Employee
I am looking into it, @Kristian Foster. Are you able to get it working?
1 More Replies
by
FRG96
• New Contributor III
- 6299 Views
- 0 replies
- 0 kudos
I want to use an init script on an ADLS Gen2 location for my Azure Databricks 11.3 and 12.2 clusters. The init_script.sh is placed in a directory that has spaces in it: https://storageaccount1.blob.core.windows.net/container1/directory%20with%20spaces/su...
- 8261 Views
- 8 replies
- 4 kudos
I am trying to run a cluster-scoped init script through Pulumi. I have referred to this documentation: https://learn.microsoft.com/en-us/azure/databricks/clusters/configure#spark-configuration but it looks like the documentation is not very clear. I ...
Latest Reply
Hi @Sulfikkar Basheer Shylaja, why don't you store the init script on DBFS and just pass the dbfs:/ path of the init script in Pulumi? You could just run this code in a notebook:
%python
dbutils.fs.put("/databricks/init-scripts/set-private-pip-repos...
7 More Replies
- 3953 Views
- 2 replies
- 0 kudos
In an init script or a notebook, we can: pip install --index-url=<our private pypi url> --extra-index-url=https://pypi.org/simple <a module>. In the cluster web UI (Libraries -> Install library), we can give only the URL of our private repository, but n...
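One workaround, since the UI field only takes a single index, is to configure pip itself from an init script so that every install path sees both repositories; a rough sketch (the private URL is a placeholder):

#!/bin/bash
# Point pip at the private index and keep public PyPI as a fallback.
cat > /etc/pip.conf <<'EOF'
[global]
index-url = https://pypi.example.com/simple
extra-index-url = https://pypi.org/simple
EOF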
Latest Reply
Hi @Philippe CRAVE, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answer...
1 More Replies
- 4040 Views
- 4 replies
- 4 kudos
Hi, we're using Databricks Runtime version 11.3 LTS and executing a Spark Java job using a job cluster. To automate the execution of this job, we need to define (source in from bash config files) some environment variables through an init script (clust...
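One workaround sometimes suggested is to have the init script source the bash config files and persist the values somewhere every process on the node can read them; a rough sketch, assuming /etc/environment is honoured on the cluster nodes and that config.env and the variable names are your own:

#!/bin/bash
# Source our bash config file and persist selected variables node-wide.
source /dbfs/configs/config.env
{
  echo "APP_ENV=${APP_ENV}"
  echo "APP_REGION=${APP_REGION}"
} >> /etc/environment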
Latest Reply
Hi @Rahul K, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!
3 More Replies
- 5746 Views
- 11 replies
- 1 kudos
Hi, we're using Databricks Runtime version 11.3 LTS and executing a Spark Java job using a job cluster. To automate the execution of this job, we need to define (source in from bash config files) some environment variables through an init script (clust...
Latest Reply
Hi @Rahul K, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your ...
10 More Replies
by
repcak
• New Contributor III
- 2627 Views
- 1 reply
- 2 kudos
I'm trying to access an init script which is stored on Azure Data Lake Storage Gen2 mounted to DBFS. I mounted the storage to dbfs:/mnt/storage/container/script.sh, and when I try to access it I get an error: Cluster-scoped init script dbfs:/mnt/storage/containe...
Latest Reply
I do not think init scripts saved under a mount point work, and we do not suggest that. If you specify abfss, then the cluster needs to be configured so that it can authenticate and access the ADLS Gen2 folder. Otherwise, the cluster will no...
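For reference, pointing a cluster at an abfss:// init script generally means the cluster's Spark config carries the ADLS Gen2 credentials, for example via a service principal; a rough sketch of those config lines (storage account name, secret scope and tenant ID are placeholders):

fs.azure.account.auth.type.mystorageaccount.dfs.core.windows.net OAuth
fs.azure.account.oauth.provider.type.mystorageaccount.dfs.core.windows.net org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
fs.azure.account.oauth2.client.id.mystorageaccount.dfs.core.windows.net {{secrets/my-scope/client-id}}
fs.azure.account.oauth2.client.secret.mystorageaccount.dfs.core.windows.net {{secrets/my-scope/client-secret}}
fs.azure.account.oauth2.client.endpoint.mystorageaccount.dfs.core.windows.net https://login.microsoftonline.com/<tenant-id>/oauth2/token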
- 7804 Views
- 7 replies
- 0 kudos
Hello, we have an Azure Data Factory pipeline running during the night, and one of the activities calls a Databricks notebook with dynamic DatabricksInstancePoolId, ClusterVersion and Workers. Yesterday, it failed with the following error: Cluster...
Latest Reply
Hi @Rita Fernandes, what are you trying to install in your init script? Only the ODBC driver, or some other libraries/dependencies?
6 More Replies