
How do I resolve problems when deploying a workspace with the AWS Quickstart CloudFormation template?

User16835756816
Valued Contributor

I am unable to deploy a workspace on AWS using Quickstart from my account console.

Short description:

You might receive one of the following common errors:

  1. Wrong credentials
  2. Elastic IP and VPC limit reached
  3. Region unavailable

Resolution:

Wrong credentials

Failed to create CreateStorageConfiguration and CreateCredentialConfiguration.

Most likely the wrong password was entered in the CloudFormation template. The browser often tries to autofill your AWS credentials, but the template needs your Databricks account credentials.


CreateWorkspace failed

Common reasons:

  • The maximum number of Elastic IP addresses has been reached.
  • The maximum number of VPCs has been reached.

AWS has a default limit of 5 VPCs per region. Go to AWS Console > VPC and check whether you have reached the VPC or Elastic IP limit for that region.

If you have, you might want to consolidate your networks, use a different region, use the customer-managed VPC feature of Databricks, or ask AWS Support to increase your limit. A quick way to check your current usage is sketched below.
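If you prefer to check programmatically instead of clicking through the console, a minimal boto3 sketch along these lines counts the VPCs and Elastic IPs in a region. The region name is a placeholder, and the limits of 5 are the AWS defaults, which may differ if you have requested increases:

```python
# Minimal sketch: count VPCs and Elastic IPs in a region with boto3.
# The limit of 5 is the AWS default for both; your account may have a
# higher quota if you have requested an increase.
import boto3

REGION = "us-west-2"  # placeholder: use the region you deploy to

ec2 = boto3.client("ec2", region_name=REGION)

vpc_count = len(ec2.describe_vpcs()["Vpcs"])
eip_count = len(ec2.describe_addresses()["Addresses"])

print(f"VPCs in {REGION}: {vpc_count} (default limit: 5)")
print(f"Elastic IPs in {REGION}: {eip_count} (default limit: 5)")
```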

Region unavailable

The list of supported AWS regions can be found in the Databricks documentation.

Not able to launch clusters

Error: AWS: unsupported failure

Common reason: the us-west-2d availability zone in US West (Oregon) is often busy, and it is frequently limited on instance types as well.

Solution: on the Cluster page, click Advanced and, under Instances, select a different availability zone.
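If you script cluster creation, the same setting corresponds to the aws_attributes.zone_id field of the Databricks Clusters API. A rough sketch, where the workspace URL, token, runtime version and node type are placeholders to adjust for your workspace:

```python
# Sketch: pin a cluster to a specific availability zone via the Clusters API.
# The workspace URL, token, runtime version and node type are placeholders.
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                        # placeholder

resp = requests.post(
    f"{HOST}/api/2.1/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "cluster_name": "az-test",
        "spark_version": "13.3.x-scala2.12",  # example runtime version
        "node_type_id": "i3.xlarge",          # example instance type
        "num_workers": 1,
        "aws_attributes": {"zone_id": "us-west-2a"},  # pick a less busy AZ
    },
)
resp.raise_for_status()
print(resp.json()["cluster_id"])
```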

Data access via instance profiles

If you have data in S3, you’ll need to configure Databricks to be able to query the data, create tables, and set up your Lakehouse. Commonly this configuration is done via instance profiles that are added to the cluster.
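If you end up wiring this manually, instance profiles are registered with the workspace through the Instance Profiles API before a cluster can use them. A minimal sketch, assuming a hypothetical profile ARN and placeholder workspace URL and token (the Quickstart performs this step for you when the S3 option below is selected):

```python
# Sketch: register an instance profile with the workspace so clusters can use
# it to read S3 data. The workspace URL, token and profile ARN are placeholders;
# the Quickstart performs this step for you when the S3 checkbox is selected.
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"   # placeholder
TOKEN = "<personal-access-token>"                         # placeholder
PROFILE_ARN = "arn:aws:iam::<aws-account-id>:instance-profile/<profile-name>"

resp = requests.post(
    f"{HOST}/api/2.0/instance-profiles/add",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"instance_profile_arn": PROFILE_ARN},
)
resp.raise_for_status()
# The registered profile can then be selected in the cluster UI or passed as
# aws_attributes.instance_profile_arn when creating a cluster via the API.
```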

If you want to automatically set up your instance profile and have a default cluster configured in your workspace, select the “I have data in S3 that I want to query with Databricks” checkbox.

In this case you’ll see a field for the data bucket in the CloudFormation template. Enter the name of the S3 bucket that contains your data.


Multiple data buckets

The Quickstart currently supports data access configuration for one S3 bucket. If you want to query all of your S3 buckets with Databricks, you can enter “*” in the field.

You can always change this configuration later: go to AWS Console > IAM > Roles, select the role that was created for Databricks, and manually edit which buckets it can access. Read more in the AWS documentation.
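To see what the role currently allows before editing anything, a small boto3 sketch like the following lists its inline policies; “databricks-data-access-role” is a hypothetical name, so substitute the role the Quickstart created for you:

```python
# Sketch: list the inline policies of the IAM role Databricks uses for data
# access, to see which buckets it is currently allowed to read.
# "databricks-data-access-role" is a hypothetical name; substitute the role
# the Quickstart created (AWS Console > IAM > Roles).
import json
import boto3

iam = boto3.client("iam")
ROLE_NAME = "databricks-data-access-role"  # placeholder

for policy_name in iam.list_role_policies(RoleName=ROLE_NAME)["PolicyNames"]:
    policy = iam.get_role_policy(RoleName=ROLE_NAME, PolicyName=policy_name)
    print(f"--- {policy_name} ---")
    print(json.dumps(policy["PolicyDocument"], indent=2))
```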

Manually Create a Cross Account Role

Add the cross-account role to Databricks as a credential configuration. Then select manual workspace creation and choose the credential and the storage configuration to create your workspace.

The Quickstart automates all of this by leveraging AWS CloudFormation.
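For reference, the three objects the Quickstart creates map to the Databricks Account API: a credential configuration, a storage configuration, and the workspace itself. A rough sketch of the manual flow, where the account ID, admin credentials, role ARN, bucket name and region are all placeholders:

```python
# Sketch: the manual flow via the Databricks Account API, creating the same
# three objects the Quickstart creates. Account ID, admin credentials, role
# ARN, bucket name and region are placeholders; newer accounts may require
# OAuth instead of basic auth.
import requests

ACCOUNT_ID = "<databricks-account-id>"                  # placeholder
AUTH = ("<account-admin-email>", "<account-password>")  # placeholder
BASE = f"https://accounts.cloud.databricks.com/api/2.0/accounts/{ACCOUNT_ID}"

# 1. Register the cross-account IAM role as a credential configuration.
cred = requests.post(f"{BASE}/credentials", auth=AUTH, json={
    "credentials_name": "my-credentials",
    "aws_credentials": {"sts_role": {
        "role_arn": "arn:aws:iam::<aws-account-id>:role/<cross-account-role>"}},
}).json()

# 2. Register the workspace root bucket as a storage configuration.
storage = requests.post(f"{BASE}/storage-configurations", auth=AUTH, json={
    "storage_configuration_name": "my-root-storage",
    "root_bucket_info": {"bucket_name": "<root-bucket-name>"},
}).json()

# 3. Create the workspace from the credential and storage configurations.
workspace = requests.post(f"{BASE}/workspaces", auth=AUTH, json={
    "workspace_name": "my-workspace",
    "aws_region": "us-west-2",  # placeholder region
    "credentials_id": cred["credentials_id"],
    "storage_configuration_id": storage["storage_configuration_id"],
}).json()

print(workspace)
```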

Further debug an error

Received response status [FAILED] from custom resource. Message returned: See the details in CloudWatch Log Stream.

Navigate to CloudWatch in your AWS Console, then click Logs > Log groups and search for the name of your CloudFormation stack. By default the name will be “databricks-workspace-stack”. Depending on where the error occurred, you might have two log groups and potentially up to two log streams per log group. Click into each and search for “Error”. You should find an event that you can expand to read the details.
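If you prefer to search the logs programmatically, a boto3 sketch along these lines scans the log groups whose name contains the default stack name for events mentioning “Error”; the region is a placeholder:

```python
# Sketch: scan the CloudFormation stack's log groups for "Error" events with
# boto3. "databricks-workspace-stack" is the default stack name; the region
# is a placeholder.
import boto3

logs = boto3.client("logs", region_name="us-west-2")  # placeholder region

paginator = logs.get_paginator("describe_log_groups")
for page in paginator.paginate():
    for group in page["logGroups"]:
        name = group["logGroupName"]
        if "databricks-workspace-stack" not in name:
            continue
        events = logs.filter_log_events(logGroupName=name, filterPattern="Error")
        for event in events["events"]:
            print(name, event["message"][:200])
```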

This will often give you a good pointer. 

If you still cannot solve your issue, please post your question with the error message or a screenshot here in the Databricks Community.

1 ACCEPTED SOLUTION

qasimhassan
Contributor

Really great explanation. The error that I was encountering since yesterday was “Failed to create CreateStorageConfiguration and CreateCredentialConfiguration”. The first step, entering the password manually, helped me solve the issue.

Best,
Qasim Hassan
Data Engineer @Royal Cyber
Databricks UG Pakistan Lead

