Data Governance

External location in Unity Catalog permits access to S3

amitca71
Contributor II

Hi,

When I create an external location in Unity Catalog, I get write access to the underlying S3 bucket from an attached workspace even if I don't grant any permissions on it:

  1. I created external location xxxx.
  2. I didn't grant any permissions on it.
  3. On a workspace attached to the metastore, I ran:

df.write.parquet('s3://xxxx-data/amit/suppose_to_fail')

and the write succeeded.

When I removed the external location, the same write failed with an AWS permissions error, so I'm sure the access was being delegated by UC.

How do I prevent users from writing to S3 directly when using UC? (In the past I could control writes at the AWS IAM level; that isn't a full solution either, since I need to grant write access selectively.)
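For reference, this is the kind of selective control I'd expect to have over the location (a sketch run from a notebook; xxxx and the group name are placeholders, and I'm assuming the documented READ FILES / WRITE FILES privileges on external locations):

# Inspect current grants on the external location (I'd expect none here)
spark.sql("SHOW GRANTS ON EXTERNAL LOCATION `xxxx`").show()

# Grant read access selectively, without allowing direct writes
spark.sql("GRANT READ FILES ON EXTERNAL LOCATION `xxxx` TO `data_engineers`")
spark.sql("REVOKE WRITE FILES ON EXTERNAL LOCATION `xxxx` FROM `data_engineers`")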

Thanks,

Amit

5 REPLIES

Sivaprasad1
Valued Contributor II

@Amit Cahanovich: Could you please share your config details?

Which DBR version?

Does the cluster have an instance profile?

@Sivaprasad C S DBR 11.2 (includes Apache Spark 3.3.0, Scala 2.12)

Instance profile: none

Sivaprasad1
Valued Contributor II

What is the cluster mode?

Could you please run the command below and share the results?

databricks unity-catalog permissions get --external-location <externallocationname> --profile <databricksprofile>

https://docs.databricks.com/dev-tools/cli/unity-catalog-cli.html#unity-catalog-cli
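For example (the empty output below is my assumption of what the CLI returns for a location with no grants, mirroring the permissions API shape):

databricks unity-catalog permissions get --external-location xxxx --profile DEFAULT
{
  "privilege_assignments": []
}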

{
  "num_workers": 0,
  "cluster_name": "xxxxxx",
  "spark_version": "11.2.x-scala2.12",
  "spark_conf": {
    "spark.master": "local[*, 4]",
    "spark.databricks.cluster.profile": "singleNode",
    "spark.databricks.dataLineage.enabled": "true"
  },
  "aws_attributes": {
    "first_on_demand": 1,
    "availability": "SPOT_WITH_FALLBACK",
    "zone_id": "us-east-2a",
    "spot_bid_price_percent": 100,
    "ebs_volume_count": 0
  },
  "node_type_id": "i3.xlarge",
  "driver_node_type_id": "i3.xlarge",
  "ssh_public_keys": [],
  "custom_tags": {
    "ResourceClass": "SingleNode"
  },
  "spark_env_vars": {
    "DB_CLUSTER_NAME": "\"***_xxxx\"",
    "DD_SITE": "\"datadoghq.com\"",
    "DB_CLUSTER_ID": "\"***_xxxx\"",
    "DD_ENV": "staging",
    "PYSPARK_PYTHON": "/databricks/python3/bin/python3",
    "DD_API_KEY": "xxxxxxxx"
  },
  "autotermination_minutes": 120,
  "enable_elastic_disk": true,
  "cluster_source": "UI",
  "init_scripts": [
    {
      "dbfs": {
        "destination": "dbfs:/FileStore/utils/datadog-install-driver-only.sh"
      }
    }
  ],
  "single_user_name": "xxxx@***.***",
  "data_security_mode": "SINGLE_USER",
  "runtime_engine": "STANDARD",
  "cluster_id": "0915-152649-ox2wxwwz"
}

@Sivaprasad C S 

databricks unity-catalog external-locations get --name lakehouse-input --profile DEFAULT

{
  "name": "xxxx",
  "url": "s3://xxxx",
  "credential_name": "databricks_unity_catalog",
  "read_only": false,
  "comment": "xxxxx",
  "owner": "xxxx@***.xx",
  "metastore_id": "xxxxxxxx",
  "credential_id": "94ce13xxxxxxxxx2e3545e5",
  "created_at": 1663136630885,
  "created_by": "xxxx.xxxx@***.***",
  "updated_at": 1663136630885,
  "updated_by": "xxxx.xxxx@***.***"
}

Is it because I'm the owner of the credentials?
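If ownership is what's granting me implicit access, this is a sketch of what I'd try (assuming the ALTER ... OWNER TO syntax for external locations; the group names are placeholders):

# Hand the location to an admin-only group, since the owner retains full access
spark.sql("ALTER EXTERNAL LOCATION `xxxx` OWNER TO `metastore_admins`")

# Then grant access selectively to everyone else
spark.sql("GRANT READ FILES ON EXTERNAL LOCATION `xxxx` TO `analysts`")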
