Data Governance
Join discussions on data governance practices, compliance, and security within the Databricks Community. Exchange strategies and insights to ensure data integrity and regulatory compliance.

Forum Posts

by nikhilkumawat (New Contributor III)
  • 2384 Views
  • 2 replies
  • 1 kudos

Resolved! Not able to get a JSON response from the "/api/2.0/accounts/{account_id}/metastores" endpoint.

Hi, I am trying to get a list of all the metastores associated with an account ID. For this I am using the REST API below to get the data: accountId = json.loads(dbutils.notebook.entry_point.getDbutils().notebook().getContext().toJson())["tags"]["accountId"]...

Latest Reply
ruben1
New Contributor III
  • 1 kudos

Apparently, it works if I add the header `X-Databricks-Account-Console-API-Version` with value `2.0` to the call.
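For reference, a minimal sketch of that call with the header applied, assuming an AWS account host (Azure uses accounts.azuredatabricks.net), a Databricks notebook where `dbutils` is available, and a placeholder account-admin token:

```python
import json
import requests

# Account ID pulled from the notebook context, as in the original question.
account_id = json.loads(
    dbutils.notebook.entry_point.getDbutils().notebook().getContext().toJson()
)["tags"]["accountId"]

token = "<account-admin-token>"  # placeholder credential

resp = requests.get(
    f"https://accounts.cloud.databricks.com/api/2.0/accounts/{account_id}/metastores",
    headers={
        "Authorization": f"Bearer {token}",
        # The header from the accepted answer; without it the endpoint may not return JSON.
        "X-Databricks-Account-Console-API-Version": "2.0",
    },
)
print(resp.status_code, resp.json())
```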

1 More Replies
by Avvar2022 (Contributor)
  • 9387 Views
  • 4 replies
  • 1 kudos

Resolved! I am new to Databricks. Setting up Databricks Unity Catalog; in terms of best practice I have a few questions.

Is it best practice to keep the Unity Catalog metastore ADLS Gen2 storage separate from the ADLS Gen2 storage used for data? Since only one metastore can be created per region, will there be a separate metastore for PROD and NON-PROD (QA and DEV)? If yes, they need t...

Latest Reply
karthik_p
Esteemed Contributor
  • 1 kudos

@Ashok Zubrewar Coming to your 3rd question: if you are using any external tables, then non-UC ADLS Gen2 storage is mandatory; you cannot use the UC ADLS Gen2 storage, as it hosts the metadata and managed table data. There is no restriction in terms of your external buc...
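If it helps to see that pattern in code, here is a rough sketch of keeping external tables on a separate ADLS Gen2 account from the metastore's own storage; the storage account, credential, catalog, and table names below are all placeholders:

```python
# Managed tables live in the metastore's own ADLS Gen2 storage; external tables
# point at a separate, non-UC ADLS Gen2 account registered as an external location.
spark.sql("""
  CREATE EXTERNAL LOCATION IF NOT EXISTS ext_raw
  URL 'abfss://raw@nonucstorage.dfs.core.windows.net/'
  WITH (STORAGE CREDENTIAL my_storage_credential)
""")

spark.sql("""
  CREATE TABLE IF NOT EXISTS prod_catalog.bronze.events
  USING DELTA
  LOCATION 'abfss://raw@nonucstorage.dfs.core.windows.net/events'
""")
```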

3 More Replies
by Kayl669 (New Contributor III)
  • 1894 Views
  • 2 replies
  • 1 kudos

Reassurance sought about behaviour of Databricks account SCIM connector

In my org we've got workspaces with a mixture of SCIM-provisioned and non-SCIM groups. These are all 'workspace local' groups. My identity provider is AAD. I've created a new workspace and want users in this workspace to be provided access only via ac...

Latest Reply
Debayan
Databricks Employee
  • 1 kudos

Hi, please refer to https://learn.microsoft.com/en-us/azure/databricks/administration-guide/users-groups/users and https://learn.microsoft.com/en-us/azure/databricks/administration-guide/users-groups/groups. Also, please note that if you already have SCIM...
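As a quick sanity check on which groups exist at the account level (as opposed to workspace-local), something like the following sketch lists account groups through the account-level SCIM Groups endpoint; the account ID and AAD token below are placeholders, and an account admin credential is assumed:

```python
import requests

account_id = "<databricks-account-id>"   # placeholder
aad_token = "<azure-ad-access-token>"    # placeholder, for an account admin

resp = requests.get(
    f"https://accounts.azuredatabricks.net/api/2.0/accounts/{account_id}/scim/v2/Groups",
    headers={"Authorization": f"Bearer {aad_token}"},
)
# SCIM list responses return matching groups under "Resources".
for group in resp.json().get("Resources", []):
    print(group["displayName"])
```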

1 More Replies
by rammy (Contributor III)
  • 5998 Views
  • 4 replies
  • 15 kudos

How would I upload a file stream object to an S3 bucket using PySpark?

I am able to save data into S3 using PySpark, but I am not sure how to save a file stream object into an S3 bucket using PySpark. I could achieve this with the help of Python, but when Unity Catalog was enabled on Databricks it always ends up with an access ...

Latest Reply
rammy
Contributor III
  • 15 kudos

I got to know that there is a change required in Unity Catalog to make it work with Python, and I got a recommendation to use PySpark to store the file in S3. I do not see much information about storing a file stream object in an S3 bucket anywhere. Can an...
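In case it helps, one sketch of a workaround under Unity Catalog is to stage the stream on local disk and copy it to an S3 prefix that is covered by an external location you can write to; the bucket, path, and payload below are placeholders:

```python
import io

# Hypothetical in-memory file stream (e.g., returned by an upstream API call).
file_stream = io.BytesIO(b"example payload")

# Stage the stream on local disk first, since dbutils/Spark work with paths, not streams.
local_path = "/tmp/report.pdf"
with open(local_path, "wb") as f:
    f.write(file_stream.getbuffer())

# Copy to S3. With Unity Catalog, the target prefix should be covered by an
# external location you have write access on; otherwise the copy is denied.
dbutils.fs.cp(f"file:{local_path}", "s3://my-bucket/landing/report.pdf")
```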

3 More Replies