
Databricks on AWS - Changes to your Unity Catalog storage credentials

abhishekdas
New Contributor II

Hi

 
Context: 
On June 30, 2023, AWS updated its IAM role trust policy, which requires updating Unity Catalog storage credentials. Databricks previously sent an email communication to customers in March 2023 on this topic and updated the documentation and Terraform templates to reflect the required changes. To give customers time to adjust, we implemented a temporary workaround that allowed storage credentials with the previous role policy settings to continue working. As per our previous email, Unity Catalog storage credentials must be configured with a self-assuming IAM role.
Action Required
You are receiving this email because you used a storage credential configured with an IAM role that does not allow self-assuming, and we ask that you update it before January 20, 2025, at which point your existing credential will stop working.
To update your non-self-assuming storage credentials, please follow the guidance in the Unity Catalog documentation or the Unity Catalog Terraform templates. Please ensure that they are self-assume-capable (see "Step 3: Update the IAM role policy"). If you are unsure how to update your IAM policies to self-assume, please check with your Account Representative.
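
For reference, a self-assuming trust policy roughly looks like the sketch below (following the Unity Catalog documentation referenced above): the role trusts the Databricks Unity Catalog master role and, crucially, its own ARN. The master-role ARN is the same one that appears in the template later in this thread; <YOUR_AWS_ACCOUNT_ID>, <THIS_ROLE_NAME>, and <YOUR_DATABRICKS_ACCOUNT_ID> are placeholders, not values from our setup.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": [
          "arn:aws:iam::414351767826:role/unity-catalog-prod-UCMasterRole-14S5ZJVKOTYTL",
          "arn:aws:iam::<YOUR_AWS_ACCOUNT_ID>:role/<THIS_ROLE_NAME>"
        ]
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": {
          "sts:ExternalId": "<YOUR_DATABRICKS_ACCOUNT_ID>"
        }
      }
    }
  ]
}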
 
We need help here: it looks like the course of action is to update the trust relationship policy of the IAM role <company>-metadata-databricks-access-role, which was created by the CloudFormation stack provided by Databricks itself. We reached out to our AWS support representative, and their response is below. Please advise how exactly to move forward.
 
To begin with, regarding your query of whether to change the CloudFormation stack and re-apply, or to change the policy in the IAM console: based on my analysis, the recommended approach would be to modify the CloudFormation stack and re-apply it. This method ensures that your infrastructure remains in sync with your source code and maintains its integrity. However, given that the CloudFormation stack was provided automatically by Databricks, please contact Databricks support to inquire about the proper procedure for updating the IAM policies within their provided CloudFormation stack. If Databricks allows modifications to their stack, request guidance on how to safely implement these changes; if Databricks does not recommend modifying their stack, then changing the policy directly in the IAM console may be your best option.
Having said that, given that resources like CloudFormation stacks are provided and managed by a third-party service like Databricks, I would strongly recommend reaching out to their support team for guidance on best practices for modifying the automatically provided CloudFormation stack, as third-party assistance is outside the scope of AWS Support. They should be able to provide you with the most up-to-date and safe methods for making the necessary changes.

 
Thanks
ACCEPTED SOLUTION

MoJaMa
Databricks Employee

Hi Abhishek,

100% agree that "the recommended approach would be to modify the CloudFormation stack and re-apply it" as stated by AWS Support.

Here is a template which you can consider reusing.
(Obviously you'll change the bucket, role names, etc.)

 

AWSTemplateFormatVersion: 2010-09-09
Resources:

  UnityCatalogBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: "{{ TEAM_NAME }}-unity-catalog-{{ ENV }}"

  UnityCatalogBucketRole:
    Type: AWS::IAM::Role
    Properties:
      RoleName: "{{ TEAM_NAME }}-unity-catalog-{{ ENV }}"
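      # Trust policy: lets the Databricks Unity Catalog master role and this role itself (self-assume) assume the role, restricted by the PrincipalArn and ExternalId conditions below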
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Principal:
              AWS:
                - arn:aws:iam::414351767826:role/unity-catalog-prod-UCMasterRole-14S5ZJVKOTYTL
                - !Sub arn:aws:iam::${AWS::AccountId}:root
            Action: sts:AssumeRole
            Condition:
              StringEquals:
                AWS:PrincipalArn:
                  - !Sub arn:aws:iam::414351767826:role/unity-catalog-prod-UCMasterRole-14S5ZJVKOTYTL
                  - !Sub arn:aws:iam::${AWS::AccountId}:role/{{ TEAM_NAME }}-unity-catalog-{{ ENV }}
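                # sts:ExternalId should be your Databricks account ID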
                sts:ExternalId: "01481bf9-fd6f-4318-b9f4-4f3d743ff240"
      ManagedPolicyArns:
        - !Ref UnityCatalogBucketRolePolicy

  UnityCatalogBucketRolePolicy:
    Type: AWS::IAM::ManagedPolicy
    Properties:
      ManagedPolicyName: "{{ TEAM_NAME }}-unity-catalog-{{ ENV }}"
      PolicyDocument:
        Version: '2012-10-17'
        Statement:
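          # S3 read/write access to the Unity Catalog bucket and its objects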
          - Effect: Allow
            Action:
                - s3:GetObject
                - s3:PutObject
                - s3:DeleteObject
                - s3:ListBucket
                - s3:GetBucketLocation
            Resource:
                - !Sub arn:aws:s3:::${UnityCatalogBucket}
                - !Sub arn:aws:s3:::${UnityCatalogBucket}/*
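          # Self-assume: the role also needs permission to assume itself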
          - Effect: Allow
            Action:
              - sts:AssumeRole
            Resource:
                - !Sub arn:aws:iam::${AWS::AccountId}:role/{{ TEAM_NAME }}-unity-catalog-{{ ENV }}
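
If you go with the approach AWS Support recommended (modify the stack template and re-apply it), re-applying typically looks something like the command below; the stack name and template file name are placeholders for whatever your stack is actually called, and --capabilities CAPABILITY_NAMED_IAM is needed because the template creates named IAM resources.

aws cloudformation deploy \
  --stack-name <your-unity-catalog-stack> \
  --template-file unity-catalog-template.yaml \
  --capabilities CAPABILITY_NAMED_IAM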

 

If you run into issues, I highly recommend creating a Databricks Support Ticket through the Help Center (https://help.databricks.com/s/), as recommended by AWS Support.

Hope this helps.


REPLIES

Sujitha
Databricks Employee

@abhishekdas Thank you for your patience. We have been checking internally with a few folks for the best approach! We will keep you posted soon. 


abhishekdas
New Contributor II

Thank you for the response @MoJaMa - we will try it out tomorrow and post an update here.
