Data Engineering
err:setfacl: Option -m: Invalid argument LibraryDownloadManager error

DE-cat
New Contributor III

When starting a Databricks job on a 13.3 LTS (includes Apache Spark 3.4.1, Scala 2.12) cluster, I am seeing a lot of these errors in the log4j output. Any ideas? Thx

23/09/11 13:24:14 ERROR CommandLineHelper$: Command [REDACTED] failed with exit code 2 out: err:setfacl: Option -m: Invalid argument near character 3

java.lang.RuntimeException: CommandLineHelper exception - stack trace
	at com.databricks.backend.common.util.CommandLineHelper$.runJavaProcessBuilder(CommandLineHelper.scala:523)
	at com.databricks.backend.common.util.CommandLineHelper$.runJavaProcessBuilder(CommandLineHelper.scala:418)
	at com.databricks.backend.daemon.driver.LibraryDownloadManager.com$databricks$backend$daemon$driver$LibraryDownloadManager$$fetchLibrary0(LibraryDownloadManager.scala:198)
	at com.databricks.backend.daemon.driver.LibraryDownloadManager$$anon$1.load(LibraryDownloadManager.scala:51)
	at com.databricks.backend.daemon.driver.LibraryDownloadManager$$anon$1.load(LibraryDownloadManager.scala:46)
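For context on the message itself: `setfacl -m` takes ACL entries of the form `[default:]{u|g|o|m}[:qualifier]:perms`, and "Invalid argument near character 3" means the entry string was rejected around its third character. A minimal sketch of that format check (simplified — real setfacl also accepts octal perms and comma-separated entry lists; `is_valid_acl_entry` is a hypothetical helper for illustration, not part of Databricks):

```python
import re

# Simplified pattern for a single setfacl -m entry:
#   [default:]{u|user|g|group|o|other|m|mask}[:qualifier]:perms
# (real setfacl also accepts octal perms and comma-separated lists)
ACL_ENTRY = re.compile(
    r'^(?:default:)?'
    r'(?:u|user|g|group|o|other|m|mask)'
    r'(?::[\w.-]*)?'      # optional qualifier (user/group name or numeric id)
    r':[rwxX-]{1,3}$'     # permission string, e.g. rwx, rw-, r
)

def is_valid_acl_entry(entry: str) -> bool:
    """Rough check of whether setfacl -m would accept this entry."""
    return ACL_ENTRY.match(entry) is not None
```

For example, `is_valid_acl_entry("u:1000:rwx")` is true, while an entry with a stray character near position 3, such as `u:;1000:rwx`, is not — the kind of malformed spec that would produce "Invalid argument near character 3".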

 

2 REPLIES

Kaniz
Community Manager

Hi @DE-cat, to configure an AWS instance connection in Databricks, follow these steps:

1. Create an access policy and a user with access keys in the AWS Console:
  - Go to the IAM service.
  - Click the Policies tab in the sidebar.
  - Click Create policy.
  - In the policy editor, click the JSON tab.
  - Paste the provided access policy into the editor.
  - Click Review policy.
  - Enter a policy name in the Name field.
  - Click Create policy.
  - Click the Users tab in the sidebar.
  - Click Add User.
  - Enter a user name.
  - For Access type, select Programmatic access.
  - Click Next: Permissions.
  - Select Attach existing policies directly.
  - In the Policy type filter, select Customer Managed.
  - Select the checkbox next to the policy you created.
  - Click Next: Review.
  - Click Create user.
  - Click Download .csv to download a CSV file containing the access key ID and secret access key.

2. Configure access keys in your Databricks account:
  - Log in to the Databricks account console as the account owner.
  - Click the AWS Account tab.
  - Select the Deploy to AWS using Access Key radio button.
  - Enter the access key ID and secret access key from the downloaded CSV file.
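For those who prefer the command line, the console steps above map roughly onto the AWS CLI as below. This is a sketch: the policy and user names and the policy JSON path are placeholder assumptions, and each command is echoed rather than executed so it is safe to run without credentials (drop the `echo` wrapper to run for real):

```shell
#!/bin/sh
# Placeholder names -- substitute your own.
POLICY_NAME="databricks-deploy-policy"
USER_NAME="databricks-deploy-user"

run() {
    # Echo instead of executing, so this dry run needs no AWS credentials.
    echo "aws $*"
}

# 1. Create the customer-managed policy from the JSON in the docs
#    (file path is an assumption).
run iam create-policy --policy-name "$POLICY_NAME" \
    --policy-document file://databricks-policy.json

# 2. Create the user and attach the policy directly.
run iam create-user --user-name "$USER_NAME"
run iam attach-user-policy --user-name "$USER_NAME" \
    --policy-arn "arn:aws:iam::ACCOUNT_ID:policy/$POLICY_NAME"

# 3. Generate the access key ID / secret access key
#    (the Download .csv step in the console).
run iam create-access-key --user-name "$USER_NAME"
```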

DE-cat
New Contributor III

I am using DB Azure, nothing to do with AWS. 😎
