Error cluster launch: Security Daemon Registration

FerArribas
Contributor

I have created a workspace in AWS with PrivateLink. When we launch a cluster, we get the following error: "Security Daemon Registration Exception: Failed to set up the spark container due to an error when registering the container to security daemon"

The log (on a cluster instance in the workspace) stays in a loop at:

[ 348.935443] audit: kauditd hold queue overflow
[ 348.940183] audit: kauditd hold queue overflow
[ 348.946439] audit: kauditd hold queue overflow
[ 354.013288] audit: kauditd hold queue overflow
[ 354.019590] audit: kauditd hold queue overflow
[ 354.025712] audit: kauditd hold queue overflow
[ 475.998259] audit: kauditd hold queue overflow
[ 476.004062] audit: kauditd hold queue overflow
[ 476.009494] audit: kauditd hold queue overflow
[ 638.997129] audit: kauditd hold queue overflow
[ 639.005763] audit: kauditd hold queue overflow
[ 639.012236] audit: kauditd hold queue overflow

Thanks,

1 ACCEPTED SOLUTION

Accepted Solutions

Sivaprasad1
Valued Contributor II

@Fernando Arribas Jara​: Is it a Terraform deployment? Do you see any STS endpoint in it? If so, could you please delete it and try again?


3 REPLIES


Kaniz
Community Manager

Hi @Fernando Arribas Jara​, we haven't heard from you since the last response from @Sivaprasad C S​, and I was checking back to see if you have a resolution yet.

If you have a solution, please share it with the community, as it can be helpful to others. Otherwise, we will respond with more details and try to help.

Also, please don't forget to click the "Select As Best" button whenever the information provided helps resolve your question.

FerArribas
Contributor

Thanks @Sivaprasad C S​. It's working now. The problem was that the AWS STS PrivateLink endpoint was not associated with the correct subnet.
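For anyone hitting the same issue, a quick way to spot this kind of mismatch is to list the interface endpoints for the STS service and compare their subnet associations against the workspace subnets. Below is a minimal sketch of that check. It assumes a response shaped like boto3's `ec2.describe_vpc_endpoints()` output; the endpoint and subnet IDs in the sample are made up for illustration, not taken from this thread.

```python
def find_sts_endpoint_subnets(describe_response):
    """Return {endpoint_id: [subnet_ids]} for STS interface endpoints.

    AWS STS service names end in ".sts" (e.g. "com.amazonaws.<region>.sts"),
    and only Interface-type endpoints have subnet associations.
    """
    result = {}
    for ep in describe_response.get("VpcEndpoints", []):
        if (ep.get("ServiceName", "").endswith(".sts")
                and ep.get("VpcEndpointType") == "Interface"):
            result[ep["VpcEndpointId"]] = ep.get("SubnetIds", [])
    return result


# Illustrative sample shaped like a DescribeVpcEndpoints response;
# in practice you would pass boto3's ec2.describe_vpc_endpoints() result.
sample = {
    "VpcEndpoints": [
        {
            "VpcEndpointId": "vpce-sts-example",        # hypothetical ID
            "ServiceName": "com.amazonaws.eu-west-1.sts",
            "VpcEndpointType": "Interface",
            "SubnetIds": ["subnet-aaa"],  # should match the workspace subnets
        },
        {
            "VpcEndpointId": "vpce-s3-example",         # hypothetical ID
            "ServiceName": "com.amazonaws.eu-west-1.s3",
            "VpcEndpointType": "Gateway",               # ignored: not Interface
            "SubnetIds": [],
        },
    ]
}

print(find_sts_endpoint_subnets(sample))  # {'vpce-sts-example': ['subnet-aaa']}
```

If the subnet IDs reported here don't match the subnets the cluster instances launch into, re-associating the STS endpoint with the correct subnets (as in the fix above) should resolve the registration loop.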
