Hi @jitenjha11,

You have a couple of options to handle this scenario:

Batch Processing: Once the n text files arrive in the volume, you can read them in batches, process the required data, and then move the processed files to an archive directory...
@Daan,

You can maintain a default template that holds the common configuration. When creating a job-specific configuration, you can safely merge your job-specific dictionary with the base template using the | operator.
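A small sketch of that merge (the keys shown are made-up examples, not a required schema):

```python
# Common settings shared by every job.
default_template = {
    "timeout_seconds": 3600,
    "max_retries": 2,
    "tags": {"team": "data-eng"},
}

# Overrides and additions for one particular job.
job_specific = {
    "timeout_seconds": 7200,
    "schedule": "0 2 * * *",
}

# `|` (Python 3.9+) returns a new dict; on conflicting keys,
# the right-hand operand wins, so job-specific values override the template.
job_config = default_template | job_specific
```

One caveat: `|` is a shallow merge, so a nested dict like `tags` is replaced wholesale if both sides define it, not merged key by key.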
Hi @Daan,

Your requirement is to create jobs dynamically by iterating through a list of suppliers. This is definitely achievable using the Databricks SDK. I'd recommend providing the job parameters and definitions in JSON format, as it's more reliable...
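A sketch of building one job definition per supplier; the supplier names, notebook path, and task key are illustrative assumptions, and the dict shape loosely follows a Jobs API settings payload:

```python
import json

suppliers = ["acme", "globex", "initech"]  # example supplier list

def job_settings(supplier: str) -> dict:
    """Build a job definition for one supplier."""
    return {
        "name": f"load_{supplier}",
        "tasks": [
            {
                "task_key": "ingest",
                "notebook_task": {
                    # Hypothetical notebook path for illustration.
                    "notebook_path": "/Repos/etl/ingest_supplier",
                    "base_parameters": {"supplier": supplier},
                },
            }
        ],
    }

definitions = [job_settings(s) for s in suppliers]
print(json.dumps(definitions[0], indent=2))

# With the Databricks SDK installed, each definition could then be created,
# roughly along these lines:
#   from databricks.sdk import WorkspaceClient
#   w = WorkspaceClient()
#   for s in suppliers:
#       w.jobs.create(**job_settings(s))
```

Keeping the definitions as plain dicts/JSON makes them easy to review and diff before any job is actually created.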
Hi @Sahil0007,

The [REDACTED] value you're seeing is being retrieved from Key Vault. As a workaround, you can reverse the value twice to decode it and retrieve the original string. Alternatively, you can slice the string into two parts and concatenate...
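A sketch of both string tricks, using a plain placeholder string in place of the real secret (on Databricks the value would come from dbutils.secrets.get, and the redaction only applies when the literal secret is printed):

```python
secret = "s3cr3t-value"  # stand-in for dbutils.secrets.get(scope, key)

# Reversing twice is a no-op on the content itself, but printing the
# reversed string avoids the literal-match redaction; reverse it once
# more to recover the original.
reversed_once = secret[::-1]
recovered = reversed_once[::-1]

# Alternative: split the value and handle the halves separately.
half = len(secret) // 2
part1, part2 = secret[:half], secret[half:]
rejoined = part1 + part2
```

A word of caution: these tricks expose the secret in notebook output, so use them only for debugging and clear the output afterwards.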
Hi @Abhimanyu,

Yes, in Spark a '.' (dot) in a column name is interpreted as StructType field access, which is how nested objects are referenced. But you can definitely rename any columns that contain a '.' dynamically. Attached a few screenshots for your reference that...
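A minimal sketch of the rename mapping; the rename logic is plain Python, with the Spark application shown in comments (assumes a DataFrame named df):

```python
def sanitize_columns(columns: list[str]) -> list[str]:
    """Replace '.' with '_' so column names no longer collide with
    Spark's StructType field-access syntax."""
    return [c.replace(".", "_") for c in columns]

cols = ["order.id", "customer.name", "amount"]
print(sanitize_columns(cols))  # ['order_id', 'customer_name', 'amount']

# In PySpark the same mapping can be applied in one pass:
#   df = df.toDF(*sanitize_columns(df.columns))
# Or, if you only need to reference a dotted column, escape it with backticks:
#   df.select("`order.id`")
```

Renaming up front is usually the cleaner fix, since every later reference would otherwise need the backtick escaping.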