by
JensH
• New Contributor III
- 2666 Views
- 3 replies
- 2 kudos
Hi, I would like to use the new "Job as Task" feature but I'm having trouble passing values. Scenario: I have a workflow job which contains 2 tasks. Task_A (type "Notebook"): Read data from a table and, based on the contents, decide whether the workflow in ...
Latest Reply
I found the following information:
value is the value for this task value's key. This command must be able to represent the value internally in JSON format. The size of the JSON representation of the value cannot exceed 48 KiB. You can refer to https...
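The quoted 48 KiB limit can be checked up front before publishing a task value. A minimal sketch (the size figure comes from the quoted docs; the `dbutils.jobs.taskValues` calls shown in comments only exist on a Databricks cluster, and the key/task names are made-up examples):

```python
import json

MAX_TASK_VALUE_BYTES = 48 * 1024  # 48 KiB limit on the JSON representation

def fits_task_value_limit(value):
    """Return True if the value's JSON form stays within the task-value size limit."""
    return len(json.dumps(value).encode("utf-8")) <= MAX_TASK_VALUE_BYTES

# On a Databricks cluster, Task_A would then publish the value and a
# downstream task would read it (key and task names are hypothetical):
# dbutils.jobs.taskValues.set(key="decision", value={"run_downstream": True})
# dbutils.jobs.taskValues.get(taskKey="Task_A", key="decision")
```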
2 More Replies
- 405 Views
- 1 reply
- 0 kudos
Hi, when a Job is running, I would like to change the parameters with an API call. I know that I can set parameter values from the API when I start a job from the API, and that I can update the default values if the job isn't running, but I didn't find an API c...
Latest Reply
No, there is currently no option to change parameters while the job is running. From the UI you can modify them, but it won't affect the current run; the change applies only to the new job runs you trigger.
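In other words, the only point where parameter values can be supplied is when a run is triggered. A hedged sketch of the request body for the public Jobs API 2.1 run-now endpoint (the job ID and parameter names are placeholders; the actual POST is left as a comment since it needs a real workspace host and token):

```python
import json

def build_run_now_payload(job_id, job_parameters):
    """Body for POST /api/2.1/jobs/run-now -- the only place where
    parameter values for a specific run can be set."""
    return {"job_id": job_id, "job_parameters": job_parameters}

payload = build_run_now_payload(123, {"env": "prod"})
body = json.dumps(payload)
# With the requests library and a real workspace (host/token are placeholders):
# requests.post("https://<host>/api/2.1/jobs/run-now",
#               headers={"Authorization": "Bearer <token>"}, data=body)
```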
- 5713 Views
- 2 replies
- 3 kudos
What is the difference between Databricks Auto Loader and Delta Live Tables? Both seem to manage ETL for you, but I'm confused about where to use one vs. the other.
Latest Reply
You say "...__would__ be a piece..." and "...DLT __would__ pick up...". Is DLT built upon AL?
1 More Replies
- 8138 Views
- 11 replies
- 4 kudos
I have successfully passed the test after completion of the course with 95%. But I haven't received any badge from your side as promised. I have been provided with a certificate which looks fake by itself. I need to post my credentials on LinkedIn wi...
Latest Reply
I'm facing a similar issue. I have completed the training and the quiz successfully and am able to download a course completion certificate. The certificate doesn't have any ID and looks very generic and fake. I have signed up for the https://credentials.d...
10 More Replies
- 1034 Views
- 1 reply
- 0 kudos
Hi! I am pulling data from Blob storage into Databricks using Auto Loader. This process works well for almost 10 resources, but for a specific one I am getting this error: java.lang.NullPointerException. It looks like this issue occurs when I connect to th...
Latest Reply
@Maxi1693 - The value for schemaEvolutionMode should be a string. Could you please try changing the below from
.option("cloudFiles.schemaEvolutionMode", None)
to
.option("cloudFiles.schemaEvolutionMode", "none")
and let us know.
Refe...
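The fix hinges on the option value being the string "none" rather than Python's None. A small guard that makes the mistake fail fast and clearly (the mode names are the ones documented for Auto Loader; the helper itself is hypothetical):

```python
# Documented Auto Loader schema evolution modes -- all strings, never Python None
VALID_MODES = {"addNewColumns", "rescue", "failOnNewColumns", "none"}

def validate_schema_evolution_mode(mode):
    """Raise a clear error instead of a NullPointerException inside Spark."""
    if not isinstance(mode, str):
        raise TypeError("cloudFiles.schemaEvolutionMode must be a string such as 'none'")
    if mode not in VALID_MODES:
        raise ValueError(f"unknown schemaEvolutionMode: {mode}")
    return mode

# The stream definition would then use:
# .option("cloudFiles.schemaEvolutionMode", validate_schema_evolution_mode("none"))
```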
- 913 Views
- 5 replies
- 1 kudos
Latest Reply
Yes, I meant to set it to None. Is the issue specific to any particular cluster? Or do you see the issue with all the clusters in your workspace?
4 More Replies
- 589 Views
- 2 replies
- 1 kudos
Hi guys, I am new to Delta pipelines. I have created a pipeline, and now when I try to run it I get the error message "PERMISSION_DENIED: You are not authorized to create clusters. Please contact your administrator", even though I can crea...
Latest Reply
Thank you for responding, @Palash01, and thanks for pointing me in the right direction. To get around it, I had to be granted the "unrestricted cluster creation" permission.
1 More Replies
- 556 Views
- 3 replies
- 0 kudos
Do you know why the userIdentity is anonymous in AWS CloudTrail logs even though I have specified an instance profile?
Latest Reply
Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...
2 More Replies
- 2021 Views
- 5 replies
- 2 kudos
If this isn't the right spot to post this, please move it or refer me to the right area. I recently learned about "_metadata.file_name". It's not quite what I need. I'm creating a new table in Databricks and want to add a USR_File_Name column cont...
Latest Reply
Hi, Could you please elaborate more on the expectation here?
4 More Replies
- 242 Views
- 1 reply
- 0 kudos
Hi guys, how can I get the pricing of cluster types (standard_D*, standard_E*, standard_F*, etc.)? I'm doing a study to decrease the cost of my current cluster. Any ideas? Thank you.
Latest Reply
Hey, you can use the pricing calculator here: https://www.databricks.com/product/pricing/product-pricing/instance-types
- 1306 Views
- 4 replies
- 1 kudos
Hi, I'm trying to create a calendar dimension including a fiscal year with a fiscal start of April 1. I'm using the fiscalyear library and am setting the start to month 4, but it insists on setting April to month 7. Runtime 12.1. My code snippet is: start_...
Latest Reply
import fiscalyear
import datetime

def get_fiscal_date(year, month, day):
    fiscalyear.setup_fiscal_calendar(start_month=4)
    v_fiscal_month = fiscalyear.FiscalDateTime(year, month, day).fiscal_month  # to get the fiscal month
    v_fiscal_quarter = fiscalyear.FiscalDateTime(year, month, day).fiscal_quarter  # to get the fiscal quarter
    ...
3 More Replies
- 455 Views
- 2 replies
- 1 kudos
Hello, I am writing to bring to your attention an issue that we have encountered while working with Databricks and to seek your assistance in resolving it. When running a Workflow job with the task "Run Job" and clicking on "View YAML/JSON," we have ob...
Latest Reply
Hi @Kaniz, thank you for your fast response. However, the versioned JSON or YAML (via Databricks Asset Bundle) in the Job UI should also include the job_name, or we have to change it manually by replacing the job_id with the job_name. For this reason,...
1 More Replies
by
442027
• New Contributor II
- 443 Views
- 1 reply
- 0 kudos
The documentation here notes that the default Delta log retention interval is 30 days; however, when I create checkpoints in the Delta log to trigger the cleanup, historical records older than 30 days aren't removed; i.e. the current day's checkpoint is a...
Latest Reply
You need to set the table property, e.g.: ALTER TABLE <table_name> SET TBLPROPERTIES ('delta.checkpointRetentionDuration' = '30 days')
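A small sketch of composing that statement from Python before submitting it with spark.sql (the table name is a placeholder; the property name is the one given in the reply, and the spark.sql call itself only works on a cluster):

```python
def build_set_tblproperties(table, properties):
    """Render an ALTER TABLE ... SET TBLPROPERTIES statement from a dict."""
    props = ", ".join(f"'{k}' = '{v}'" for k, v in properties.items())
    return f"ALTER TABLE {table} SET TBLPROPERTIES ({props})"

stmt = build_set_tblproperties(
    "my_table", {"delta.checkpointRetentionDuration": "30 days"})
# On a cluster you would then run: spark.sql(stmt)
```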
by
Mrk
• New Contributor II
- 3753 Views
- 4 replies
- 3 kudos
Hi, when I create an identity column using the GENERATED ALWAYS AS IDENTITY statement and I try to INSERT or MERGE data into that table, I keep getting the following error message: Cannot write to 'table', not enough data columns; target table has x col...
Latest Reply
You can run the INSERT by passing the subset of columns you want to provide values for. For example, your insert statement would be something like: INSERT INTO target_table_with_identity_col (<list-of-col-names-without-the-identity-column>) SELECT <lis...
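A minimal sketch of building such a statement, naming every column except the identity column so Delta can generate it (all table and column names here are made-up examples, not from the original thread):

```python
def build_insert_without_identity(target, columns, identity_col, source):
    """Render an INSERT that omits the identity column from both lists."""
    cols = [c for c in columns if c != identity_col]
    col_list = ", ".join(cols)
    return (f"INSERT INTO {target} ({col_list}) "
            f"SELECT {col_list} FROM {source}")

sql = build_insert_without_identity(
    "target_table", ["id", "name", "amount"], "id", "staging_table")
# -> INSERT INTO target_table (name, amount) SELECT name, amount FROM staging_table
```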
3 More Replies
- 947 Views
- 3 replies
- 1 kudos
Hi. I am using Structured Streaming and Auto Loader to read JSON files, automated by a Workflow. I am having difficulty with the job failing as schema changes are detected but not retrying. Hopefully someone can point me in the right dir...
Latest Reply
Another point I have realised is that the task and the parent notebook (which calls the child notebook that runs the Auto Loader part) do not fail if the schema-change failure occurs during the Auto Loader process. It's the child notebook a...
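For context, Auto Loader intentionally stops the stream when it merges a new schema and relies on a restart to continue with the updated schema. A purely illustrative retry wrapper (all names are hypothetical; in a real job you would usually configure the task's retry policy instead of hand-rolling this):

```python
def run_with_restarts(start_stream, max_restarts=3):
    """Re-run a callable that may fail transiently, e.g. a stream that
    stops after a schema change. Re-raises once the budget is spent."""
    for attempt in range(max_restarts + 1):
        try:
            return start_stream()
        except Exception:
            if attempt == max_restarts:
                raise
```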
2 More Replies