Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Ela
by New Contributor III
  • 840 Views
  • 1 replies
  • 1 kudos

Checking for availability of dynamic data masking functionality in SQL.

I am looking for functionality similar to Snowflake's, which allows attaching a masking policy to an existing column. The documents I found relate to masking with encryption, but my use case is on an existing table. Solutions using views along with Dynamic Vie...

Latest Reply
sivankumar86
New Contributor II
  • 1 kudos

Unity Catalog provides a similar feature: https://docs.databricks.com/en/data-governance/unity-catalog/row-and-column-filters.html

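The Unity Catalog column-mask feature the reply links to can be sketched roughly as below. This is a minimal sketch with hypothetical table, column, and group names, assuming Unity Catalog is enabled; the exact requirements and syntax are in the linked docs.

```sql
-- Hypothetical mask function: members of `admins` see the raw value,
-- everyone else sees a redacted placeholder.
CREATE FUNCTION ssn_mask(ssn STRING)
  RETURN CASE
    WHEN is_account_group_member('admins') THEN ssn
    ELSE '***-**-****'
  END;

-- Attach the mask to a column on an EXISTING table, no view needed,
-- which is the Snowflake-like behaviour the question asks for.
ALTER TABLE users ALTER COLUMN ssn SET MASK ssn_mask;
```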
marcin-sg
by New Contributor III
  • 920 Views
  • 1 replies
  • 2 kudos

Create (account wide) groups without account admin permissions

The use case is quite simple: each environment's Databricks workspace (prod, test, dev) will be created with Terraform by a separate service principal (which, for isolation purposes, should not have account-wide admin permissions), but will belong to the...

Latest Reply
marcin-sg
New Contributor III
  • 2 kudos

Another thing would be to assign a workspace to a metastore without account admin permissions, for a similar reason.

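For context, the Terraform side of this might look like the sketch below, and it shows exactly where the limitation bites: creating account-wide groups currently requires the provider to authenticate against the account console (account-admin scope). All names and the provider alias here are hypothetical.

```hcl
# Assumes a provider aliased to the account console, e.g.
#   provider "databricks" { alias = "account", host = "...", account_id = "..." }
# Account-wide group creation needs account-level auth, which is the
# restriction this thread is asking to avoid.
resource "databricks_group" "data_engineers" {
  provider     = databricks.account
  display_name = "data-engineers"
}

# Assign the group to one workspace (var.workspace_id is hypothetical).
resource "databricks_mws_permission_assignment" "data_engineers" {
  provider     = databricks.account
  workspace_id = var.workspace_id
  principal_id = databricks_group.data_engineers.id
  permissions  = ["USER"]
}
```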
carlosancassani
by New Contributor III
  • 1368 Views
  • 3 replies
  • 5 kudos

Error: Credential size is more than configured size limit. As a result credential passthrough won't work for this notebook run.

I get this error when trying to execute parallel slave notebooks from a PySpark "master notebook". Note 1: I use the same class, functions, cluster, and credentials for another parallel-notebook use case in the same Databricks instance and it works fine. Note...

Latest Reply
Anonymous
Not applicable
  • 5 kudos

Hi @carlosancassani, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Th...

2 More Replies
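For context, the usual master/slave pattern fans out `dbutils.notebook.run` calls on separate threads, and each run carries its own credential token, which is where the size limit reported here can bite. The sketch below shows only that fan-out pattern; `run_notebook` is a stand-in for `dbutils.notebook.run`, and the paths and arguments are made up. It does not fix the credential-size error itself.

```python
from concurrent.futures import ThreadPoolExecutor

# Stand-in for dbutils.notebook.run(path, timeout, args); in a real
# workspace each call starts an ephemeral run of a slave notebook.
def run_notebook(path: str, timeout_seconds: int, arguments: dict) -> str:
    return f"ran {path} with {arguments}"

slave_paths = ["/Shared/slave_a", "/Shared/slave_b"]

# Fan the slave notebooks out on threads from the master notebook.
with ThreadPoolExecutor(max_workers=2) as pool:
    futures = {p: pool.submit(run_notebook, p, 600, {"env": "dev"})
               for p in slave_paths}
    results = {p: f.result() for p, f in futures.items()}
```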
lshar
by New Contributor III
  • 17069 Views
  • 7 replies
  • 5 kudos

Resolved! How do I pass arguments/variables from widgets to notebooks?

Hello, I am looking for a solution to this problem, which has been known for 7 years: https://community.databricks.com/s/question/0D53f00001HKHZfCAP/how-do-i-pass-argumentsvariables-to-notebooks What I need is to parametrize my notebooks using widget infor...

Latest Reply
lshar
New Contributor III
  • 5 kudos

@Hubert Dudek @Kaniz Fatma When I use dbutils.notebook.run(...), a new cluster is started, so I can run some other code, but I cannot use variables and functions as if I had just run them directly in the same notebook. Hence, my goal is not met....

6 More Replies
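The distinction in lshar's reply is the key point: `dbutils.notebook.run` executes the child in its own ephemeral context, so widget values must be forwarded explicitly as an arguments dict, whereas `%run` shares the caller's context. A minimal sketch of the explicit-forwarding pattern, with `get_widget` standing in for `dbutils.widgets.get` and all names hypothetical:

```python
# Stand-in for the widget plumbing: dbutils.notebook.run passes a dict,
# and the child notebook reads it back via dbutils.widgets.get.
def get_widget(arguments: dict, name: str, default: str = "") -> str:
    return arguments.get(name, default)

# Parent notebook: collect the widget values and forward them explicitly.
widget_values = {"run_date": "2024-01-01", "env": "dev"}
# dbutils.notebook.run("/Shared/child", 600, widget_values)  # real call

# Child notebook: every value must be re-read from its own widgets;
# variables and functions from the parent are NOT visible here.
run_date = get_widget(widget_values, "run_date")
env = get_widget(widget_values, "env")
region = get_widget(widget_values, "region", "eu-west")  # falls back to default
```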
Soma
by Valued Contributor
  • 2257 Views
  • 6 replies
  • 3 kudos

Resolved! Dynamically supplying partitions to autoloader

We have a streaming use case and we see a lot of time spent in listing from Azure. Is it possible to supply partitions to Auto Loader dynamically, on the fly?

Latest Reply
Anonymous
Not applicable
  • 3 kudos

@somanath Sankaran - Thank you for posting your solution. Would you be happy to mark your answer as best so that other members may find it more quickly?

5 More Replies
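One common pattern for this (a sketch under assumptions, not the thread's marked solution) is to compute a partition-scoped path outside the stream and hand Auto Loader a narrowed load path per run, so listing covers one folder instead of the whole container. The storage account and directory layout below are hypothetical.

```python
from datetime import date

# Hypothetical helper: build a partition-scoped path for one day.
def partition_path(base: str, d: date) -> str:
    return f"{base}/year={d.year}/month={d.month:02d}/day={d.day:02d}"

path = partition_path(
    "abfss://raw@myaccount.dfs.core.windows.net/events", date(2024, 1, 15)
)
# The narrowed path would then feed the stream, e.g.:
# spark.readStream.format("cloudFiles")
#      .option("cloudFiles.format", "json")
#      .load(path)
```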