- 3525 Views
- 4 replies
- 4 kudos
Hi Team, good morning. I would like to understand whether it is possible to determine the workload automatically through code (load data from a file to a table, determine the file size, a kind of benchmark we can check), based on which we can ...
Latest Reply
Hi @Arunsundar Muthumanickam, when you say workload, I believe you might be handling various volumes of data between the Dev and Prod environments. If you are using a Databricks cluster and do not have much idea of how the volumes might turn out in differ...
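The rest of the reply is truncated, but the idea of sizing the workload from the input files can be sketched directly. A minimal sketch, assuming a Databricks notebook (where dbutils is available) and a hypothetical landing path; the thresholds are illustrative only:

```python
# Use the total size of the input files as a rough proxy for the
# expected workload before loading them into a table.
def total_input_size_bytes(path: str) -> int:
    """Sum the sizes of all files directly under `path`."""
    return sum(f.size for f in dbutils.fs.ls(path) if not f.isDir())

size_gb = total_input_size_bytes("/mnt/raw/orders/") / (1024 ** 3)  # hypothetical path

# Illustrative buckets; calibrate against your own benchmarks.
if size_gb < 1:
    workload = "small"
elif size_gb < 50:
    workload = "medium"
else:
    workload = "large"

print(f"Input ~{size_gb:.1f} GB -> {workload} workload")
```

The resulting bucket could then drive a choice of cluster size or job configuration, for example when submitting the load as a parametrized job.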
3 More Replies
- 3613 Views
- 2 replies
- 2 kudos
I want to implement Auto Loader to ingest data into Delta Lake from 5 different source systems, and I have 100 different tables in each database. How do we dynamically address this using Auto Loader with the trigger-once option - full load, append, merge sen...
Latest Reply
You can create a generic notebook that is parametrized with the table name/source system and then simply trigger that notebook with different parameters (one run per table/source system). For parametrization you can use dbutils.widgets (https://docs...
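A minimal sketch of that parametrized-notebook pattern, assuming hypothetical widget names, paths, and a Parquet landing format:

```python
# Generic Auto Loader ingestion notebook, driven by two widgets.
dbutils.widgets.text("source_system", "")
dbutils.widgets.text("table_name", "")

source = dbutils.widgets.get("source_system")
table = dbutils.widgets.get("table_name")

# Incrementally discover new files for this source/table pair.
df = (spark.readStream.format("cloudFiles")
      .option("cloudFiles.format", "parquet")  # assumption: Parquet landing files
      .option("cloudFiles.schemaLocation", f"/mnt/_schemas/{source}/{table}")
      .load(f"/mnt/landing/{source}/{table}/"))

# Trigger-once style: process everything available, then stop.
(df.writeStream
   .option("checkpointLocation", f"/mnt/_checkpoints/{source}/{table}")
   .trigger(once=True)
   .toTable(f"{source}.{table}"))
```

A driver notebook or job can then loop over the list of source systems and tables and launch this notebook once per pair via dbutils.notebook.run, passing the widget values as parameters.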
1 More Replies
- 5999 Views
- 2 replies
- 6 kudos
I have separate column values defined in 13 different notebooks, and I want to merge them into one Databricks notebook and pass dynamic parameters, so everything can run in a single Databricks notebook.
Latest Reply
Hi @siddhesh Bhavar, you can use widgets with the %run command to achieve this (https://docs.databricks.com/notebooks/widgets.html#use-widgets-with-run):
%run /path/to/notebook $X="10" $Y="1"
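On the receiving side, the called notebook reads those values back through the widgets API. A minimal sketch, assuming the widget names X and Y from the %run example above:

```python
# Inside /path/to/notebook: read the values passed via %run.
x = dbutils.widgets.get("X")  # "10"
y = dbutils.widgets.get("Y")  # "1"

# Widget values arrive as strings; cast as needed.
print(int(x) + int(y))  # 11
```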
1 More Replies
- 16802 Views
- 3 replies
- 0 kudos
I've got a partitioned table that I want to add some data to. I want to use dynamic partitioning, but I get this error:
org.apache.spark.SparkException: Dynamic partition strict mode requires at least one static partition column. To turn this off ...
Latest Reply
I got it working. This was exactly what I needed. Thank you, @Peyman Mohajerian!
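The accepted answer is not quoted above, but the error itself comes from Hive's dynamic-partition strict mode, which requires at least one static partition column. A minimal sketch of the two usual resolutions, assuming a SparkSession with Hive support and hypothetical table names:

```python
# Option 1: relax Hive's strict mode so every partition column may be dynamic.
spark.conf.set("hive.exec.dynamic.partition", "true")
spark.conf.set("hive.exec.dynamic.partition.mode", "nonstrict")

# Option 2 (Spark-native partition overwrite): replace only the partitions
# present in the incoming DataFrame rather than the whole table.
spark.conf.set("spark.sql.sources.partitionOverwriteMode", "dynamic")

df = spark.table("my_db.my_staging_table")              # hypothetical source table
df.write.mode("overwrite").insertInto("my_db.my_partitioned_table")  # hypothetical target
```

Note that insertInto matches columns by position, so the DataFrame's trailing columns must line up with the target table's partition columns.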
2 More Replies