Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

CONTAINER_LAUNCH_FAILURE

Khaja_Zaffer
Contributor

Hello everyone!

I need some help: I'm unable to get a cluster up and running. I tried creating classic compute, but it fails. Is there any limit on using Databricks Community Edition?

Error here: 

{
  "reason": {
    "code": "CONTAINER_LAUNCH_FAILURE",
    "type": "SERVICE_FAULT",
    "parameters": {
      "databricks_error_message": "Failed to launch the Spark container on instance i-08e0677f2fe8bb9b9. [details] X_ContainerLaunchFailure: Failed to launch spark container on instance i-08e0677f2fe8bb9b9. Exception: Could not add container for 0904-140909-n8zwc7e with address 10.172.231.165 \njava.lang.RuntimeException: Timed out with exception after 119 attempts (debugStr = 'waitUntilAttachable')\n\tat com.databricks.backend.common.util.TimeUtils$.retryWithExponentialBackoff0(TimeUtils.scala:217)\n\tat com.databricks.backend.common.util.TimeUtils$.retryWithExponentialBackoff(TimeUtils.scala:144)\n\tat com.databricks.backend.common.util.TimeUtils$.retryWithTimeout(TimeUtils.scala:93)\n\tat com.databricks.backend.daemon.node.container.LxcContainerManager.waitUntilAttachable(LxcContainerManager.scala:1552)\n\tat com.databricks.backend.daemon.node.container.LxcContainerManager.$anonfun$setUpContainer$1(LxcContainerManager.scala:372)\n\tat com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:510)\n\tat com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:616)\n\tat com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:643)\n\tat com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:80)\n\tat com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:336)\n\tat scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)\n\tat com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:332)\n\tat com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:78)\n\tat com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:75)\n\tat com.databricks.backend.daemon.node.container.ContainerManager.withAttributionContext(ContainerManager.scala:93)\n\tat com.databricks.logging.AttributionContextTracing.withAttributionTags(AttributionContextTracing.scala:127)\n\tat com.databricks.logging.AttributionContextTracing.withAttributionTags$(AttributionContextTracing.scala:108)\n\tat com.databricks.backend.daemon.node.container.ContainerManager.withAttributionTags(ContainerManager.scala:93)\n\tat com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:611)\n\tat com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:519)\n\tat com.databricks.backend.daemon.node.container.ContainerManager.recordOperationWithResultTags(ContainerManager.scala:93)\n\tat com.databricks.logging.UsageLogging.recordOperation(UsageLogging.scala:511)\n\tat com.databricks.logging.UsageLogging.recordOperation$(UsageLogging.scala:475)\n\tat com.databricks.backend.daemon.node.container.ContainerManager.recordOperation(ContainerManager.scala:93)\n\tat com.databricks.backend.daemon.node.container.LxcContainerManager.setUpContainer(LxcContainerManager.scala:170)\n\tat com.databricks.backend.daemon.node.NodeDaemon.$anonfun$addContainer$10(NodeDaemon.scala:783)\n\tat com.databricks.backend.common.logging.CmvOneLogger.containerLaunchLoggingWrapper(CmvOneLogger.scala:308)\n\tat com.databricks.backend.daemon.node.NodeDaemon.$anonfun$addContainer$6(NodeDaemon.scala:749)\n\tat com.databricks.backend.common.logging.CmvOneLogger.containerLaunchLoggingWrapper(CmvOneLogger.scala:308)\n\tat com.databricks.backend.daemon.node.NodeDaemon.$anonfun$addContainer$4(NodeDaemon.scala:694)\n\tat 
com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:510)\n\tat com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:616)\n\tat com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:643)\n\tat com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:80)\n\tat com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:336)\n\tat scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)\n\tat com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:332)\n\tat com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:78)\n\tat com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:75)\n\tat com.databricks.common.util.LockHelper$.withAttributionContext(LockHelper.scala:16)\n\tat com.databricks.logging.AttributionContextTracing.withAttributionTags(AttributionContextTracing.scala:127)\n\tat com.databricks.logging.AttributionContextTracing.withAttributionTags$(AttributionContextTracing.scala:108)\n\tat com.databricks.common.util.LockHelper$.withAttributionTags(LockHelper.scala:16)\n\tat com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:611)\n\tat com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:519)\n\tat com.databricks.common.util.LockHelper$.recordOperationWithResultTags(Lo...",
      "instance_id": "i-08e0677f2fe8bb9b9",
      "aws_api_error_code": "X_ContainerLaunchFailure",
      "aws_error_message": "Failed to launch spark container on instance i-08e0677f2fe8bb9b9. Exception: Could not add container for 0904-140909-n8zwc7e with address 10.172.231.165 \njava.lang.RuntimeException: Timed out with exception after 119 attempts (debugStr = 'waitUntilAttachable')\n\tat com.databricks.backend.common.util.TimeUtils$.retryWithExponentialBackoff0(TimeUtils.scala:217)\n\tat com.databricks.backend.common.util.TimeUtils$.retryWithExponentialBackoff(TimeUtils.scala:144)\n\tat com.databricks.backend.common.util.TimeUtils$.retryWithTimeout(TimeUtils.scala:93)\n\tat com.databricks.backend.daemon.node.container.LxcContainerManager.waitUntilAttachable(LxcContainerManager.scala:1552)\n\tat com.databricks.backend.daemon.node.container.LxcContainerManager.$anonfun$setUpContainer$1(LxcContainerManager.scala:372)\n\tat com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:510)\n\tat com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:616)\n\tat com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:643)\n\tat com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:80)\n\tat com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:336)\n\tat scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)\n\tat com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:332)\n\tat com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:78)\n\tat com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:75)\n\tat com.databricks.backend.daemon.node.container.ContainerManager.withAttributionContext(ContainerManager.scala:93)\n\tat com.databricks.logging.AttributionContextTracing.withAttributionTags(AttributionContextTracing.scala:127)\n\tat com.databricks.logging.AttributionContextTracing.withAttributionTags$(AttributionContextTracing.scala:108)\n\tat com.databricks.backend.daemon.node.container.ContainerManager.withAttributionTags(ContainerManager.scala:93)\n\tat com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:611)\n\tat com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:519)\n\tat com.databricks.backend.daemon.node.container.ContainerManager.recordOperationWithResultTags(ContainerManager.scala:93)\n\tat com.databricks.logging.UsageLogging.recordOperation(UsageLogging.scala:511)\n\tat com.databricks.logging.UsageLogging.recordOperation$(UsageLogging.scala:475)\n\tat com.databricks.backend.daemon.node.container.ContainerManager.recordOperation(ContainerManager.scala:93)\n\tat com.databricks.backend.daemon.node.container.LxcContainerManager.setUpContainer(LxcContainerManager.scala:170)\n\tat com.databricks.backend.daemon.node.NodeDaemon.$anonfun$addContainer$10(NodeDaemon.scala:783)\n\tat com.databricks.backend.common.logging.CmvOneLogger.containerLaunchLoggingWrapper(CmvOneLogger.scala:308)\n\tat com.databricks.backend.daemon.node.NodeDaemon.$anonfun$addContainer$6(NodeDaemon.scala:749)\n\tat com.databricks.backend.common.logging.CmvOneLogger.containerLaunchLoggingWrapper(CmvOneLogger.scala:308)\n\tat com.databricks.backend.daemon.node.NodeDaemon.$anonfun$addContainer$4(NodeDaemon.scala:694)\n\tat com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:510)\n\tat 
com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:616)\n\tat com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:643)\n\tat com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:80)\n\tat com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:336)\n\tat scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)\n\tat com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:332)\n\tat com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:78)\n\tat com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:75)\n\tat com.databricks.common.util.LockHelper$.withAttributionContext(LockHelper.scala:16)\n\tat com.databricks.logging.AttributionContextTracing.withAttributionTags(AttributionContextTracing.scala:127)\n\tat com.databricks.logging.AttributionContextTracing.withAttributionTags$(AttributionContextTracing.scala:108)\n\tat com.databricks.common.util.LockHelper$.withAttributionTags(LockHelper.scala:16)\n\tat com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:611)\n\tat com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:519)\n\tat com.databricks.common.util.LockHelper$.recordOperationWithResultTags(LockHelper.scala:16)\n\tat com.databricks.logging.UsageLogging.recordOperation(UsageLogging.scala:511)\n\tat com..."
    }
  }
}
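
In case it helps anyone digging into this later: on a regular workspace you can pull the same launch-failure detail from the cluster's event log via the Clusters API instead of scrolling through the UI. Below is a minimal sketch, assuming the databricks-sdk Python package and a workspace that exposes API access (Community Edition generally doesn't issue API tokens, so treat it as illustrative only); the cluster id is the one from the error above.

# Sketch: inspect recent cluster events to surface launch failures.
# Assumes auth via DATABRICKS_HOST / DATABRICKS_TOKEN or a configured profile.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

cluster_id = "0904-140909-n8zwc7e"  # cluster id taken from the error message above

# clusters.events() iterates over the cluster's event log (newest first);
# failure events carry a reason with a code and parameters.
for event in w.clusters.events(cluster_id=cluster_id):
    print(event.timestamp, event.type)
    if event.details and event.details.reason:
        print("  reason code:", event.details.reason.code)
        print("  parameters:", event.details.reason.parameters)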
2 ACCEPTED SOLUTIONS


Advika
Databricks Employee

Hello @Khaja_Zaffer!

The best way forward is to move to the Free Edition, which offers a more stable environment than the Community Edition.
It looks like you’ve had multiple Free Trial accounts before, which is causing issues when trying to sign up for the Free Edition. In this case, I recommend raising a Support ticket to request the deletion of those expired trial accounts. Once that’s resolved, you should be able to set up a Free Edition account at signup.databricks.com/free.

View solution in original post

Khaja_Zaffer
Contributor

To all:

The legacy Community Edition works fine if you use DBR <= 15.4, for both the general and ML runtimes.

I still think the legacy Community Edition is far better than the Free Edition.

I had been selecting a DBR version > 15.4.

Thank you.

View solution in original post

10 REPLIES 10

Khaja_Zaffer
Contributor

UP!
Unable to start any compute. 😞

szymon_dybczak
Esteemed Contributor III

Hi @Khaja_Zaffer ,

Maybe there's a capacity problem in AWS right now?

Another thing: there is a known limitation in Databricks Community Edition. You need to create a new cluster every time and run it; even within the same session, if your cluster stops due to inactivity or some other reason, you will need to create a new cluster.

So, are you trying to reuse an already created cluster? If so, that's not possible. You need to create a new one.
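
For what it's worth, on a full (non-Community) workspace you can automate that check-and-recreate pattern with the Clusters API. A rough sketch, assuming the databricks-sdk Python package; the cluster name, node type, and DBR version key below are placeholders, not values from this thread:

# Sketch: if the old cluster has terminated, create a fresh one instead of
# trying to restart it (mirrors the Community Edition workflow described above).
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.compute import State

w = WorkspaceClient()

old_cluster_id = "0904-140909-n8zwc7e"  # example id taken from the error in this thread
info = w.clusters.get(cluster_id=old_cluster_id)

if info.state in (State.TERMINATED, State.ERROR):
    # create() is a long-running call; .result() blocks until the cluster is RUNNING.
    new_cluster = w.clusters.create(
        cluster_name="practice-cluster",      # placeholder name
        spark_version="15.4.x-scala2.12",     # placeholder DBR version key
        node_type_id="i3.xlarge",             # placeholder AWS node type
        num_workers=1,
        autotermination_minutes=30,
    ).result()
    print("New cluster:", new_cluster.cluster_id)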

Hello @szymon_dybczak 
I have deleted all previous clusters and created a new one. The cluster is trying its best but is unable to start. I've mostly used it for practising, so I did a good amount of data processing (mostly Kaggle datasets, around 1 GB only). I'm also thinking they might have some capacity limit!

szymon_dybczak
Esteemed Contributor III

I'll try to create one on my end. Maybe this is a global issue. I'll let you know.

Edit: Yep, it appears this is a global issue. I got the same error as you:

[Screenshot: szymon_dybczak_0-1757078668013.png, showing the same CONTAINER_LAUNCH_FAILURE error]

 



szymon_dybczak
Esteemed Contributor III

One question: is there any reason you can't use Databricks Free Edition?

I have all my data uploaded in a volume in Community Edition.

I tried using the Free Edition. When I log in using my email, it says this email already has a few accounts; when I select any of them, it asks me to unlock the account with a debit or credit card, or something like that.

[Screenshot: Khaja_Zaffer_0-1757082262235.png, showing the account unlock prompt]

 

Khaja_Zaffer
Contributor

@Advika can you please help here?

Advika
Databricks Employee

Hello @Khaja_Zaffer!

The best way forward is to move to the Free Edition, which offers a more stable environment than the Community Edition.
It looks like you’ve had multiple Free Trial accounts before, which is causing issues when trying to sign up for the Free Edition. In this case, I recommend raising a Support ticket to request the deletion of those expired trial accounts. Once that’s resolved, you should be able to set up a Free Edition account at signup.databricks.com/free.

BS_THE_ANALYST
Esteemed Contributor

@Khaja_Zaffer , if your current e-mail address is preventing you from using the Free Edition, can't you just sign up with a different e-mail address?

@Advika's advice is the better route to take ☺️.

All the best,
BS

Khaja_Zaffer
Contributor

To all:

The legacy Community Edition works fine if you use DBR <= 15.4, for both the general and ML runtimes.

I still think the legacy Community Edition is far better than the Free Edition.

I had been selecting a DBR version > 15.4.

Thank you.
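
If anyone wants to double-check which runtime keys fall at or below 15.4 (including the ML images), the Clusters API can list them. A small sketch, again assuming the databricks-sdk Python package on a workspace with API access rather than Community Edition:

# Sketch: list available Databricks Runtime versions and keep those <= 15.4,
# mirroring the workaround described above.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

for v in w.clusters.spark_versions().versions or []:
    # Keys look like "15.4.x-scala2.12" or "15.4.x-cpu-ml-scala2.12".
    parts = v.key.split(".")
    try:
        major, minor = int(parts[0]), int(parts[1])
    except (ValueError, IndexError):
        continue  # skip keys that don't start with a numeric version
    if (major, minor) <= (15, 4):
        print(v.key, "-", v.name)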
