- 658 Views
- 5 replies
- 6 kudos
Hello all, I hope you are doing great! I want to synchronise metadata (e.g., description, comments, tags) across schemas under Unity Catalog (e.g., test.dev, test.uat). For example, under the schema test.dev, there is a sales table with multiple co...
Latest Reply
It's completely fine, and I do understand. Thank you for your time and effort here!
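One way the comment part of this could be approached is to read the source column comments from Unity Catalog's `information_schema` and replay them onto the target table as DDL. The sketch below is a minimal, hedged illustration: `comment_ddl` is a hypothetical helper (not a Databricks API), and the table/schema names are taken from the question.

```python
# Hedged sketch: build the ALTER TABLE statements needed to copy column
# comments onto a target table such as test.uat.sales. comment_ddl is a
# hypothetical helper written for this example.

def comment_ddl(target_table, column_comments):
    """Return ALTER TABLE statements applying (column, comment) pairs."""
    return [
        f"ALTER TABLE {target_table} ALTER COLUMN {name} COMMENT '{comment}'"
        for name, comment in column_comments
    ]

# On a Unity Catalog-enabled cluster, the source comments could be read from
# the catalog's information_schema and each statement run with spark.sql(...):
#
#   rows = spark.sql("""
#       SELECT column_name, comment
#       FROM test.information_schema.columns
#       WHERE table_schema = 'dev' AND table_name = 'sales'
#             AND comment IS NOT NULL
#   """).collect()
#   for stmt in comment_ddl("test.uat.sales",
#                           [(r.column_name, r.comment) for r in rows]):
#       spark.sql(stmt)

print(comment_ddl("test.uat.sales", [("region", "Sales region code")]))
```

Table-level descriptions and tags would need their own statements (e.g. `COMMENT ON TABLE ...`, `SET TAGS`); the loop above only covers column comments.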
- 435 Views
- 2 replies
- 1 kudos
We recently started using the Data Profiling / Lakehouse Monitoring feature from Databricks: https://learn.microsoft.com/en-us/azure/databricks/data-quality-monitoring/data-profiling/. Data Profiling uses serverless compute for running the profilin...
Latest Reply
Hi @szymon_dybczak, thanks for the quick reply. But it seems serverless budget policies cannot be applied to data profiling/monitoring jobs. https://learn.microsoft.com/en-us/azure/databricks/data-quality-monitoring/data-profiling/ Serverless budget po...
- 2388 Views
- 0 replies
- 0 kudos
Hello, Is it possible to utilize S3 tags when writing a DataFrame with PySpark? Or is the only option to write the DataFrame and then use boto3 to tag all the files? More information about S3 object tagging is here: Amazon S3 Object Tagging. Thank you.
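For the boto3 route the question mentions, a common pattern is to write the DataFrame first, then list every object under the output prefix and tag it. The sketch below is a hedged illustration, not a confirmed Spark feature: `to_tag_set` is a small hypothetical helper, and the bucket/prefix/tag names are placeholders.

```python
# Hedged sketch of the "write first, then tag with boto3" approach.
# to_tag_set is a hypothetical helper that builds the Tagging payload
# expected by S3's put_object_tagging call.

def to_tag_set(tags):
    """Convert a plain dict into the TagSet structure used by S3 tagging."""
    return {"TagSet": [{"Key": k, "Value": v} for k, v in tags.items()]}

# After something like df.write.parquet("s3://my-bucket/sales/"), each
# written file could be tagged (bucket/prefix names are placeholders):
#
#   import boto3
#   s3 = boto3.client("s3")
#   paginator = s3.get_paginator("list_objects_v2")
#   for page in paginator.paginate(Bucket="my-bucket", Prefix="sales/"):
#       for obj in page.get("Contents", []):
#           s3.put_object_tagging(
#               Bucket="my-bucket",
#               Key=obj["Key"],
#               Tagging=to_tag_set({"team": "analytics"}),
#           )

print(to_tag_set({"team": "analytics"}))
```

Note that `put_object_tagging` replaces the full tag set on each object, so existing tags would need to be merged in first if they must be preserved.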