Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.
I would like to move the folder from my repo under /Workspace/Repos/ar... to the external Azure blob location. I tried dbutils.fs.mv(repo_path, az_path), but this gave me a file-not-found error. Also, I am not able to see workspace -> repo usin...
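One possible cause (an assumption, since the full error isn't shown): Repos and workspace files live on the driver's local filesystem rather than on DBFS, so `dbutils.fs` needs an explicit `file:` scheme on the source path. A minimal sketch; the repo folder and storage URL below are hypothetical, and the actual `dbutils.fs.cp` call is shown as a comment since it only runs inside Databricks:

```python
def as_local_uri(workspace_path: str) -> str:
    # Prefix a workspace/Repos path with the file: scheme so that
    # dbutils.fs resolves it on the driver's local filesystem
    # instead of looking for it on DBFS.
    return "file:" + workspace_path

src = as_local_uri("/Workspace/Repos/someuser/myrepo/myfolder")  # hypothetical path

# Inside a Databricks notebook you would then copy (rather than move) to blob storage:
# dbutils.fs.cp(src, "abfss://container@account.dfs.core.windows.net/target", recurse=True)
print(src)
```

Copying and then deleting the source is usually safer than `mv` here, since the repo contents are managed by Git.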
During my experimentation with the latest feature that allows including notebook output in a commit, I ran into a specific issue. While attempting to commit my recent changes, I encountered an error message stating "Error fetching Git status." Intere...
I've found that the restriction I've encountered isn't related to the file size within Repos, but rather the maximum file size that can be shown in the Azure Databricks UI. You can find this limitation documented at https://learn.microsoft.com/en-us/...
Hi, you can try checking https://docs.databricks.com/en/administration-guide/workspace/index.html. Please let us know if this helps.
Also, please tag @Debayan in your next response, which will notify me. Thank you!
I am trying to set up S3 as a structured streaming source. The bucket receives ~17K files/day, and the original load to the bucket was ~54K files. The bucket was first loaded 3 months ago, and we haven't read from it since. So let's say there...
Thanks, we were able to make things work by increasing the driver instance size so it has more memory for the initial load. After the initial load we scaled the instance down for subsequent runs. We're still testing; if we aren't able to make it work we'l...
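An alternative (or complement) to scaling the driver is to cap how many of the backlog files each micro-batch picks up, using the file source's `maxFilesPerTrigger` option. A sketch; the bucket path, format, and value are placeholders, and the Spark calls are shown as comments so the snippet stands on its own:

```python
# Illustrative option for working through a large initial backlog:
# limit how many files the file source lists into each micro-batch.
stream_options = {
    "maxFilesPerTrigger": "1000",  # placeholder value; tune to your driver size
}

# In a Databricks notebook this would be wired up roughly as:
# (spark.readStream
#       .format("json")                 # or csv/parquet, per your data
#       .options(**stream_options)
#       .schema(my_schema)              # hypothetical schema
#       .load("s3://my-bucket/path"))   # hypothetical bucket
print(stream_options)
```

This spreads the 54K-file initial load over many smaller batches instead of one driver-memory-heavy listing, at the cost of a longer catch-up time.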
I have a notebook that calls other notebooks with `dbutils.notebook.run` and executes them as a 'Notebook job'. But sometimes, when a notebook is taking a long time and the cluster is just waiting on, for instance, an API response, the subsequent comm...
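If the goal is to stop child notebooks from running strictly one-by-one, a common workaround is to launch the `dbutils.notebook.run` calls from a thread pool so several notebooks run concurrently on the same cluster. A minimal sketch with a stand-in runner; the notebook paths are hypothetical, and inside Databricks you would call `dbutils.notebook.run` where indicated:

```python
from concurrent.futures import ThreadPoolExecutor

def run_notebook(path: str, timeout_seconds: int = 3600) -> str:
    # In a Databricks notebook this body would be:
    #   return dbutils.notebook.run(path, timeout_seconds, {})
    # Stand-in so the sketch is runnable outside Databricks:
    return f"finished {path}"

# Hypothetical child notebooks that do not depend on each other.
paths = ["/Shared/etl_a", "/Shared/etl_b", "/Shared/etl_c"]

# Each call blocks only its own thread, so a notebook stuck waiting on an
# API response no longer delays the others.
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(run_notebook, paths))

print(results)
```

Note that concurrent runs still share the cluster's resources, so this mainly helps when the notebooks spend most of their time waiting rather than computing.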
Build foundational knowledge of generative AI, including large language models (LLMs), with 4 short videos.
Here is how it works:
Watch 4 short tutorial videos
Pass the knowledge test
Earn a badge for Generative AI Fundamentals you can share on your ...
Hi Databricks Team, I sat for the Databricks Certified Machine Learning Professional exam for the 2nd time (10 Sept 2023), but didn't pass again. I got 66.66% overall. I am a seasoned Databricks user, but this particular exam is quite an unorthodox one. Nevertheles...
Is there any way to create a Databricks job cluster through Databricks Connect? We have been using an all-purpose cluster so far; to reduce Databricks cost we are planning to move to a job cluster, but unfortunately I couldn't find a way to create a job cluster ...
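As far as I know, Databricks Connect only attaches to an existing cluster, so it cannot create a job cluster itself. Job clusters are instead declared in a job definition, for example via the Jobs API or the Jobs UI. A hedged sketch of a Jobs API 2.1 payload; the job name, cluster key, Spark version, node type, and notebook path below are all placeholders:

```json
{
  "name": "example-job",
  "job_clusters": [
    {
      "job_cluster_key": "etl_cluster",
      "new_cluster": {
        "spark_version": "13.3.x-scala2.12",
        "node_type_id": "Standard_DS3_v2",
        "num_workers": 2
      }
    }
  ],
  "tasks": [
    {
      "task_key": "run_notebook",
      "job_cluster_key": "etl_cluster",
      "notebook_task": { "notebook_path": "/Shared/example" }
    }
  ]
}
```

The cluster is created when the job starts and terminated when it finishes, which is where the cost saving over an always-on all-purpose cluster comes from.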
My Databricks Certified Data Engineer Associate exam timed out while I was uninstalling the antivirus; it took a while to set up my device, and by that time the window to start the exam had expired. So please reschedule my exam. Webassessor ID: anjaliisanvaga...
Dear Support Team, I apologize for the inconvenience caused. Unfortunately, due to unforeseen circumstances with my antivirus uninstallation, I was unable to start the exam on time. I kindly request you to reschedule my Databricks Certified Data Engi...
Hi team, my Databricks Certified Data Engineer Associate exam got suspended within 40 minutes. I had also shown my exam room to the proctor. My exam was suspended due to eye movement, but I was not moving my eyes away from the laptop screen. Some questions ar...
I'm trying to connect to Redshift with the redshift-connector module, but I get a 'Connection refused' error. The Redshift cluster and the EC2 instance are in the same VPC, and the security group of the Redshift cluster allows inbound traffic from the VPC's IP range. I can connect ...
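Before digging further into security groups, it can help to confirm basic TCP reachability from the EC2 host. A small sketch using only the standard library; Redshift's default port is 5439, and the cluster hostname below is a placeholder:

```python
import socket

def can_reach(host: str, port: int, timeout: float = 3.0) -> bool:
    # Attempt a plain TCP connection; 'Connection refused' and timeouts
    # both surface as OSError here.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical cluster endpoint; run this from the EC2 instance.
# print(can_reach("my-cluster.abc123.us-east-1.redshift.amazonaws.com", 5439))
```

If this returns False from the EC2 host, the problem is network-level (security group, port, or routing) rather than the connector; if it returns True, double-check that redshift-connector is pointed at exactly the same host and port, since a refused connection on a reachable host often means the wrong port.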
Hi all, I'm trying to merge two streaming tables with a left (outer) join, but it seems to return only the rows from the left table that have matching values in the right table, instead of also appending NULL values when there is no ...
I want to create an online chat bot that answers questions related to the immigration visa category that we work in. I don’t have programming experience myself. I would like to find someone in this community that could lead our product development.
In ADF, a variable called Datetime is entered and passed to a workflow called Parent, which is executed using a Web Activity. There is a Child workflow (a two-step workflow) inside the Parent workflow; the Datetime variable is available in the Parent workflow, but ...
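One common pattern (an assumption about the setup, since the post is truncated): have the ADF Web Activity POST to the Databricks Jobs `run-now` endpoint and forward the Datetime value as a notebook parameter. A sketch of the request body; the job ID and the timestamp value are placeholders:

```json
{
  "job_id": 123,
  "notebook_params": {
    "Datetime": "2024-01-01T00:00:00Z"
  }
}
```

Inside the parent notebook the value can be read with `dbutils.widgets.get("Datetime")`. Note that parameters passed to a parent are not automatically visible in notebooks it calls, so they usually have to be forwarded explicitly to the child (for example in the arguments map of `dbutils.notebook.run`, or as task parameters in the job definition).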
The first attached image runs fine when the cluster starts. The next image of the script does not finish when I include the -y parameter; if I include -y, I get 'failed: Script exit status is non-zero'. Can someone please help?
Hi, if I have understood your issue correctly: when you run it directly in a notebook it ends with an error, but when you run it inside a .sh script with -y it runs okay?