Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.
Hi All, I am curious to know the difference between a Spark cluster and a Databricks one. As per the info I have read, a Spark cluster creates the driver and workers when the application is submitted, whereas in Databricks we can create a cluster in advance in c...
Hi, as mentioned in the title, I'm getting this error when I try to use model serving, despite being on the premium plan. My trial account ends on 28th September 2023. Is there a way to use model serving immediately, or am I stuck until 28th September...
Hi there, I am facing several issues while trying to run SQL warehouse-starter on Azure Databricks. Please note I am new to this data world, Azure, and Databricks. While starting the SQL starter warehouse in the Databricks trial version, I am getting these ...
Hi Community, I have a question. The bronze layer always causes confusion for me. Someone mentioned, "File Format: Store data in Delta Lake format to leverage its performance, ACID transactions, and schema evolution capabilities" for bronze layers. Then, ...
Hi team, my Databricks Certified Data Engineer Associate exam got suspended within 20 minutes. It was suspended due to eye movement, without any warning. I was not moving my eyes away from the laptop screen. Some questions in the exam are quite long, so I...
@rajib_bahar_ptg funny, not funny, right?! I just posted that tip today in this post: https://community.databricks.com/t5/certifications/minimize-the-chances-of-your-exam-getting-suspended-tip-3/td-p/45712
Hi all! I have created a cluster policy, but when I try to use it while creating a DLT pipeline, it shows none. I have checked that I have all the necessary permissions to create cluster policies; still, the DLT UI shows none.
Hi all, I have 50+ .html and .py files, for which I have to create a separate notebook for each and every one of them. Manually creating a notebook in the UI and importing the .html/.py file is tedious and time-consuming. Is t...
Depending on your use case and requirements, one alternative would be to create a script that loops through your files and uploads them using the API. You can find more information about the API here: https://docs.databricks.com/api/workspace/workspa...
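A loop like the one suggested above could look as follows. This is a minimal sketch against the Workspace Import API (`POST /api/2.0/workspace/import`); the host, token, directory names, and target path are placeholders you would substitute with your own values.

```python
import base64
import os

# Placeholders: in practice read these from env vars or a Databricks CLI profile.
HOST = os.environ.get("DATABRICKS_HOST", "https://<workspace-url>")
TOKEN = os.environ.get("DATABRICKS_TOKEN", "<personal-access-token>")

def build_import_payload(local_path: str, target_dir: str) -> dict:
    """Build the JSON body for POST /api/2.0/workspace/import for one file."""
    name, ext = os.path.splitext(os.path.basename(local_path))
    with open(local_path, "rb") as f:
        content = base64.b64encode(f.read()).decode("ascii")
    payload = {
        "path": f"{target_dir}/{name}",  # notebook path in the workspace
        "content": content,
        "overwrite": False,
    }
    if ext == ".py":
        payload.update({"format": "SOURCE", "language": "PYTHON"})
    elif ext == ".html":
        payload["format"] = "HTML"
    else:
        payload["format"] = "AUTO"  # let the API infer the format
    return payload

def import_all(src_dir: str, target_dir: str) -> None:
    """Upload every .py/.html file in src_dir as a workspace notebook."""
    import requests  # third-party: pip install requests
    for fname in sorted(os.listdir(src_dir)):
        if not fname.endswith((".py", ".html")):
            continue
        body = build_import_payload(os.path.join(src_dir, fname), target_dir)
        resp = requests.post(
            f"{HOST}/api/2.0/workspace/import",
            headers={"Authorization": f"Bearer {TOKEN}"},
            json=body,
        )
        resp.raise_for_status()

# Usage (paths are hypothetical):
# import_all("./exports", "/Workspace/Users/you@example.com/imported")
```

The same loop could also be driven through the Databricks CLI (`databricks workspace import`) if you prefer not to call the REST endpoint directly.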
I would like to move a folder from my repo under /Workspace/Repos/ar... to an external Azure blob location. I tried dbutils.fs.mv(repo_path, az_path), but this gave me a file-not-found error. Also, I am not able to see workspace -> repo usin...
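One likely cause of the file-not-found error is that repo contents are workspace files rather than DBFS paths, so `dbutils.fs` needs the `file:/` scheme to read them from the driver's local filesystem. A minimal sketch, assuming a Databricks notebook (where `dbutils` exists) and that credentials for the abfss:// container are already configured; the repo and container paths below are placeholders:

```python
def as_driver_local_uri(workspace_path: str) -> str:
    """Prefix a /Workspace/... path with file: so dbutils.fs can read it
    from the driver's local filesystem instead of looking it up on DBFS."""
    if not workspace_path.startswith("/Workspace/"):
        raise ValueError("expected a path under /Workspace/")
    return "file:" + workspace_path

# Placeholder paths for illustration:
repo_path = "/Workspace/Repos/<user>/<repo>/folder"
az_path = "abfss://<container>@<account>.dfs.core.windows.net/target"

# In a notebook (copy, then delete, rather than mv across filesystems):
# dbutils.fs.cp(as_driver_local_uri(repo_path), az_path, recurse=True)
```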
During my experimentation with the latest feature that allows including notebook output in a commit, I ran into a specific issue. While attempting to commit my recent changes, I encountered an error message stating "Error fetching Git status." Intere...
I've found that the restriction I've encountered isn't related to the file size within Repos, but rather the maximum file size that can be shown in the Azure Databricks UI. You can find this limitation documented at https://learn.microsoft.com/en-us/...
Hi, you can try checking https://docs.databricks.com/en/administration-guide/workspace/index.html . Please let us know if this helps.
Also, please tag @Debayan with your next response, which will notify me. Thank you!
I am trying to setup s3 as a structured streaming source. The bucket receives ~17K files/day and the original load to the bucket was ~54K files. The bucket was first loaded 3 months ago and we haven't started reading from it since. So let's say there...
Thanks, we were able to make things work by increasing the driver instance size so it has more memory for the initial load. After the initial load we scaled the instance down for subsequent runs. We're still testing; if we aren't able to make it work we'l...
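An alternative to sizing up the driver is to rate-limit the initial backfill so each micro-batch only lists a bounded number of files, for example with Auto Loader's `cloudFiles.maxFilesPerTrigger` option. A sketch under those assumptions (file format, table name, and paths are placeholders, and the `spark` session comes from the notebook):

```python
def backfill_stream_options(schema_location: str,
                            max_files_per_trigger: int = 1000) -> dict:
    """Auto Loader options that cap how many backlog files each batch reads."""
    return {
        "cloudFiles.format": "json",  # assumption: JSON files in the bucket
        "cloudFiles.schemaLocation": schema_location,
        # Bounds per-batch work so the driver doesn't hold the whole backlog:
        "cloudFiles.maxFilesPerTrigger": str(max_files_per_trigger),
    }

def start_backfill(spark, source_path: str, checkpoint: str):
    """Drain a large S3 backlog in rate-limited batches, then stop."""
    opts = backfill_stream_options(checkpoint + "/schema")
    return (
        spark.readStream.format("cloudFiles")
        .options(**opts)
        .load(source_path)
        .writeStream.option("checkpointLocation", checkpoint)
        .trigger(availableNow=True)  # process existing files, then shut down
        .toTable("bronze.s3_events")  # hypothetical target table
    )

# In a notebook:
# start_backfill(spark, "s3://<bucket>/events/", "s3://<bucket>/_checkpoints/events")
```

With the backlog drained, the same stream can be rescheduled for the daily ~17K incremental files without changing the code.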
I have a notebook that calls other notebooks with `dbutils.notebook.run` and executes them as a 'Notebook job'. But sometimes, when a notebook is taking a long time and the cluster is just waiting for, for instance, an API response, the subsequent comm...
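Since `dbutils.notebook.run` blocks the calling cell, one common workaround is to launch the child notebooks concurrently from a thread pool on the driver. A minimal sketch; the runner function is injected so the helper can be exercised outside Databricks, and the notebook paths are placeholders:

```python
from concurrent.futures import ThreadPoolExecutor

def run_all(run_fn, notebooks, timeout_seconds=3600, max_workers=4):
    """Run child notebooks concurrently instead of sequentially.

    run_fn  -- dbutils.notebook.run inside a Databricks notebook
    notebooks -- list of (notebook_path, arguments_dict) pairs
    Returns {notebook_path: result_string}.
    """
    def one(path_and_args):
        path, args = path_and_args
        return path, run_fn(path, timeout_seconds, args)

    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return dict(pool.map(one, notebooks))

# In a notebook (paths are hypothetical):
# results = run_all(dbutils.notebook.run,
#                   [("./child_a", {}), ("./child_b", {"date": "2023-09-01"})])
```

For production pipelines, multi-task jobs with task dependencies achieve the same parallelism without managing threads yourself.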
Build foundational knowledge of generative AI, including large language models (LLMs), with 4 short videos.
Here is how it works:
- Watch 4 short tutorial videos
- Pass the knowledge test
- Earn a badge for Generative AI Fundamentals you can share on your ...
Hi Databricks Team, I sat for the Databricks Certified Machine Learning Professional exam for the 2nd time (10 Sept 2023), but didn't pass again. I got 66.66% overall. I am a seasoned Databricks user, but this particular exam is quite an unorthodox one. Nevertheles...
Is there any way to create a Databricks job cluster through Databricks Connect? We have been using all-purpose clusters so far; to reduce Databricks cost we are planning to move to job clusters, but unfortunately I couldn't find a way to create a job cluster ...
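Databricks Connect targets an existing interactive cluster, so job clusters are instead declared in a job definition and provisioned per run by the Jobs service, e.g. via `POST /api/2.1/jobs/create` with a `new_cluster` block. A sketch of such a payload; the runtime version, node type, and notebook path are placeholder values:

```python
def job_with_new_cluster(job_name: str, notebook_path: str) -> dict:
    """Payload for POST /api/2.1/jobs/create that provisions an ephemeral
    job cluster per run instead of reusing an all-purpose cluster."""
    return {
        "name": job_name,
        "tasks": [
            {
                "task_key": "main",
                "notebook_task": {"notebook_path": notebook_path},
                "new_cluster": {
                    "spark_version": "13.3.x-scala2.12",  # placeholder runtime
                    "node_type_id": "Standard_DS3_v2",    # placeholder node type
                    "num_workers": 2,
                },
            }
        ],
    }

# Submit with your HTTP client of choice, e.g.:
# requests.post(f"{host}/api/2.1/jobs/create",
#               headers={"Authorization": f"Bearer {token}"},
#               json=job_with_new_cluster("nightly-etl", "/Workspace/<path>/etl"))
```

For a one-off run without a persistent job definition, the same `new_cluster` block can go to `POST /api/2.1/jobs/runs/submit` instead.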