From the audit logs, I found that one workspace user keeps sending tokenLogin events with a token ID that I cannot identify. All of the user's personal access tokens have different IDs.
I use the payload below to submit my job, which includes an init script saved on S3. The instance profile and init script worked on an interactive cluster, but when I moved to a job cluster, the init script could not be configured. {
"new_cluster": {
"spar...
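Since the payload above is truncated, here is a minimal sketch of what a Jobs API `new_cluster` spec with an S3 init script can look like. The bucket name, instance profile ARN, Spark version, and node type are all placeholders, not taken from the original post.

```python
# Hypothetical sketch of a Jobs API new_cluster spec that attaches an
# S3-hosted init script; all concrete values below are placeholders.
def build_new_cluster(bucket: str, profile_arn: str) -> dict:
    """Return a job-cluster spec with an S3 init script attached."""
    return {
        "spark_version": "13.3.x-scala2.12",
        "node_type_id": "i3.xlarge",
        "num_workers": 2,
        "aws_attributes": {
            # The instance profile must grant read access to the bucket,
            # otherwise the job cluster cannot fetch the init script.
            "instance_profile_arn": profile_arn,
        },
        "init_scripts": [
            # Job clusters declare init scripts the same way interactive
            # clusters do: an "s3" object with a destination URI.
            {"s3": {"destination": f"s3://{bucket}/init/install-deps.sh",
                    "region": "us-east-1"}},
        ],
    }

cluster = build_new_cluster(
    "my-bucket", "arn:aws:iam::123456789012:instance-profile/my-profile")
```

If the same spec works interactively but not as a job, comparing this shape against the submitted payload (especially the `aws_attributes` and `init_scripts` nesting) is a reasonable first check.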
When running sparkR.session() I faced the error below:
Spark package found in SPARK_HOME: /databricks/spark
Launching java with spark-submit command /databricks/spark/bin/spark-submit sparkr-shell /tmp/Rtmp5hnW8G/backend_porte9141208532d
Error: Could not f...
That is expected. Single user mode is the legacy Standard mode with UC ACLs enabled. https://docs.databricks.com/en/archive/compute/cluster-ui-preview.html#how-does-backward-compatibility-work-with-these-changes
For your case, you need the Hive table ACL ...
@MonishKumar Could you provide the entire exception? From the one-line error message, I suspect the SSL cipher suites required by the SFTP server are not available on the cluster. You can run the command below to get the cipher suites that the SFTP server require...
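As a starting point for the comparison, a minimal Python sketch can list the cipher suites available on the cluster side. Note this probes the local OpenSSL build's TLS ciphers; plain SFTP runs over SSH, so this only applies directly if the transfer is actually FTPS/TLS.

```python
import ssl

# Minimal sketch: list the TLS cipher suites the local OpenSSL build
# offers, to compare against what the remote server requires. This is
# a local inventory, not a probe of the server itself.
def local_cipher_names() -> list:
    ctx = ssl.create_default_context()
    return [c["name"] for c in ctx.get_ciphers()]

print(local_cipher_names()[:5])  # show the first few available suites
```

Running this in a notebook cell on the cluster and comparing the output against the server's required suites would confirm or rule out the mismatch.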
You can add the JAR by following the steps below.
How to add items to the allowlist: you can add items to the allowlist with Data Explorer or the REST API.
To open the dialog for adding items to the allowlist in Data Explorer, do the following: In your Databricks...
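For the REST API route, a hedged sketch of building the allowlist request is below. The host, token, and JAR path are placeholders, and the endpoint shape is based on the Unity Catalog `artifact-allowlists` API, so verify it against your workspace's API version before using it.

```python
import json
import urllib.request

# Hedged sketch: build (but do not send) a request that sets the JAR
# allowlist via the Unity Catalog artifact-allowlists endpoint.
# Host, token, and path are placeholders; confirm the endpoint and
# method against your workspace's REST API reference.
def allowlist_request(host: str, token: str, jar_prefix: str) -> urllib.request.Request:
    body = {
        "artifact_matchers": [
            {"artifact": jar_prefix, "match_type": "PREFIX_MATCH"},
        ]
    }
    return urllib.request.Request(
        url=f"https://{host}/api/2.1/unity-catalog/artifact-allowlists/LIBRARY_JAR",
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="PUT",
    )
```

Sending the request with `urllib.request.urlopen` (or any HTTP client) then replaces the allowlist for the given artifact type.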
From the error message, you may be exceeding one of the limits below:
- Working branches are limited to 200 MB.
- Individual files are limited to 200 MB.
- Files larger than 10 MB can't be viewed in the Databricks UI.
Databricks recommends that in a repo: The total number of ...
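To find which file is tripping the limit, a small local sketch can walk the working tree and flag anything over a given size. The thresholds mirror the limits quoted above; the helper itself is illustrative, not a Databricks API.

```python
import os

MB = 1024 * 1024

# Sketch: flag files under `root` that exceed a byte limit, e.g. the
# 200 MB per-file Repos limit or the 10 MB UI viewing limit.
def oversized_files(root: str, limit_bytes: int = 200 * MB) -> list:
    """Return paths under `root` whose size exceeds `limit_bytes`."""
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if os.path.getsize(path) > limit_bytes:
                hits.append(path)
    return hits
```

Running it with `limit_bytes=10 * MB` lists the files that will not render in the UI; the default flags files that will fail the hard limit.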
Does the user the API token was generated for have a Git credential configured for the Git repo? If not, you can follow the steps here: https://docs.databricks.com/en/repos/get-access-tokens-from-git-provider.html
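The credential can also be set programmatically. Below is a hedged sketch that builds (but does not send) a request against the `/api/2.0/git-credentials` endpoint; the host, usernames, and tokens are placeholders, and the provider value assumes GitHub.

```python
import json
import urllib.request

# Hedged sketch: create a Git credential for the token's user via the
# git-credentials REST endpoint. All values are placeholders; the same
# configuration can be done in the UI under User Settings > Git integration.
def git_credential_request(host: str, databricks_token: str,
                           git_username: str, git_pat: str) -> urllib.request.Request:
    body = {
        "git_provider": "gitHub",  # assumption: GitHub as the provider
        "git_username": git_username,
        "personal_access_token": git_pat,
    }
    return urllib.request.Request(
        url=f"https://{host}/api/2.0/git-credentials",
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {databricks_token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
```

The Databricks token in the `Authorization` header must belong to the same user whose API token is used for the Repos operations, since Git credentials are per-user.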