by kg6ka • New Contributor
- 2751 Views
- 1 replies
- 1 kudos
Hey, guys. I have a question: I have Databricks jobs in a workflow that are linked to my Databricks repo, which contains the necessary scripts for each job. That is, the job is linked to the Databricks repo. The main code is developed in gi...
Latest Reply
Does the user the API token is generated from have the Git credential configured for the Git repo? If not, you can follow the steps here: https://docs.databricks.com/en/repos/get-access-tokens-from-git-provider.html
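If the credential is missing, it can also be registered programmatically. A minimal sketch using the Git Credentials REST API (POST /api/2.0/git-credentials); the host, token, username, and PAT values are all placeholders, and gitHub is just one example provider:

```python
# Minimal sketch: register a Git PAT for the user behind the API token,
# via the Databricks Git Credentials API. All <...> values are placeholders.
import requests

DATABRICKS_HOST = "https://<your-workspace>.cloud.databricks.com"
DATABRICKS_TOKEN = "<api-token>"  # token of the user that runs the jobs

resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.0/git-credentials",
    headers={"Authorization": f"Bearer {DATABRICKS_TOKEN}"},
    json={
        "git_provider": "gitHub",              # or gitLab, azureDevOpsServices, ...
        "git_username": "<git-username>",
        "personal_access_token": "<git-pat>",  # token from your Git provider
    },
)
resp.raise_for_status()
print(resp.json())  # echoes the stored credential entry on success
```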
by Thor • New Contributor III
- 6978 Views
- 1 replies
- 2 kudos
Hello, I'm facing a problem with big tarballs that must be decompressed and fit in memory. To keep Spark from processing too many files at the same time, I changed the following property on my cluster of 8-core VMs: spark.task.cpus 4. This setting is the thresh...
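For context on why this throttles parallelism: the scheduler reserves spark.task.cpus cores per task, so an executor runs at most executor-cores / spark.task.cpus tasks concurrently. A back-of-envelope sketch (the 8-core and 4-cpu figures come from the post):

```python
# How spark.task.cpus bounds per-executor concurrency (figures from the post).
executor_cores = 8   # 8-core VMs
task_cpus = 4        # spark.task.cpus = 4
max_concurrent_tasks = executor_cores // task_cpus
print(max_concurrent_tasks)  # 2 -> at most two tarballs decompressed at once
```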
Latest Reply
Hi @Thor,
Spark does not offer the capability to dynamically modify configuration settings, such as spark.task.cpus, for individual stages or transformations while the application is running. Once a configuration property is set for a Spark applicati...
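In practice that means the value has to be fixed when the session (or Databricks cluster) is created. A minimal PySpark sketch, assuming the property is set once at launch; running another workload at a different setting would need a separate application or job cluster:

```python
# Minimal sketch: spark.task.cpus is fixed at application launch and applies
# to the whole app; it cannot be changed per stage or transformation.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("tarball-decompression")   # hypothetical app name
    .config("spark.task.cpus", "4")     # reserve 4 cores per task, app-wide
    .getOrCreate()
)
# To process other data at spark.task.cpus = 1, launch a separate
# application (or Databricks job cluster) with that configuration instead.
```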