Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
User sessions automatically time out after six hours of idle time. This is not configurable, as @Kunal Gaurav mentioned. Please raise a feature request if you have a requirement to configure this. Now, in Azure you could configure the AAD refresh token ...
I am using Terraform to configure a Databricks workspace, and while mounting 6 buckets I get a timeout if the mount takes longer than 20 minutes. Is it possible to change the timeout? Thanks, Horatiu
Hello everyone, I have several notebooks (around 10) and I want to run them in sequential order. At first I thought of using %run, but I have a variable that is used repeatedly in every notebook. So now I am thinking of passing that variable from one ma...
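A minimal sketch of one way to do this from a single driver notebook with dbutils.notebook.run (runnable only inside a Databricks notebook); the notebook paths and the "shared_date" parameter name are hypothetical placeholders:

```python
# Driver notebook: run the child notebooks sequentially and pass a shared value.
# The paths and the "shared_date" parameter name are hypothetical examples.
shared_date = "2024-01-01"

notebook_paths = [
    "/Repos/project/notebook_01",
    "/Repos/project/notebook_02",
    # ... remaining notebooks
]

for path in notebook_paths:
    # timeout_seconds=0 means no time limit; the child notebook reads the
    # value with dbutils.widgets.get("shared_date")
    result = dbutils.notebook.run(path, 0, {"shared_date": shared_date})
    print(f"{path} returned: {result}")
```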
Hi @pavan venkata, yes, as the documentation says, 0 means no timeout. It means that the notebook will take its sweet time to complete execution without throwing an error due to a time limit, whether the notebook takes 1 minute, 1 hour, 1 day, or more. H...
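A small sketch of the two behaviours, assuming a hypothetical notebook path:

```python
# Hypothetical long-running notebook path, purely for illustration.
child = "/Repos/project/long_running_notebook"

# timeout_seconds=0: no limit; the call returns whenever the notebook finishes.
dbutils.notebook.run(child, 0)

# timeout_seconds=3600: the run is aborted with an error if it is still
# executing after one hour.
dbutils.notebook.run(child, 3600)
```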
First, our process was triggered by Data Factory. Initially the connection was set up with token access, then with managed service identity. We proved the premature timeout was not caused by Data Factory by running the notebook directly. Secondly, we tried ...
We are creating a denormalized table based on a JSON ingestion, but a complex table is getting generated. When we try to flatten the JSON rows it takes more than 5 hours and the error message is a timeout error. Is there any way that we could resolv...
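For reference, a minimal PySpark sketch of flattening an array-of-structs JSON column with explode; the table name "raw_events" and its columns are assumptions, not the poster's actual schema:

```python
from pyspark.sql import functions as F

# Assumed source table with an id column and an array-of-structs column "items".
df = spark.table("raw_events")

flat = (
    df
    # one output row per array element
    .withColumn("item", F.explode("items"))
    # promote the nested struct fields to top-level columns
    .select("event_id", "item.*")
)

flat.write.mode("overwrite").saveAsTable("events_denormalized")
```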
Hey @Raviteja Paluri, hope all is well! Just wanted to check in to see if you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. Thanks!
This is part of the configuration of the task itself, so if no timeout is specified, it can theoretically run forever (e.g. a streaming use case). Please refer to the timeout section in the link below: https://docs.databricks.com/dev-tools/api/latest/jobs.html#ope...
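A sketch of setting that timeout when creating a job through the Jobs API; the workspace URL, token, cluster id, and notebook path below are placeholders:

```python
import requests

# Placeholders: replace with your workspace URL and a personal access token.
DATABRICKS_HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

payload = {
    "name": "nightly-etl",
    "existing_cluster_id": "<cluster-id>",
    "notebook_task": {"notebook_path": "/Repos/project/etl"},
    # Omitting timeout_seconds (or setting it to 0) means no timeout;
    # a positive value is the maximum run time in seconds.
    "timeout_seconds": 7200,
}

resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.0/jobs/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
resp.raise_for_status()
print(resp.json())  # contains the new job_id
```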
I'm trying to upload a 0.5 GB file for a school lab, and when I drag the file into DBFS it uploads for about 30 seconds and then I receive a downstream duration timeout error. What can I do to solve this issue?
Hi @Jason Schmit, your file might be too large to upload through the upload interface (see the docs). I would recommend splitting it into smaller files. You can also use the DBFS CLI or dbutils to upload your file.
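A quick sketch of both alternatives; all file paths below are hypothetical:

```python
# 1) From a notebook, once the file is on the driver's local disk
#    (e.g. downloaded there with %sh wget or curl):
dbutils.fs.cp("file:/tmp/lab_data.csv", "dbfs:/FileStore/lab_data.csv")

# 2) From your own machine, with the Databricks CLI installed and configured:
#    databricks fs cp ./lab_data.csv dbfs:/FileStore/lab_data.csv
```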