Certifications
Join dynamic discussions on Databricks certifications within the Community. Exchange insights, tips,...
Explore discussions on Databricks training programs and offerings within the Community. Get insights...
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and ...
Engage in discussions about the Databricks Free Trial within the Databricks Community. Share insight...
Hi Team, I am using Databricks Free Edition to run some jobs, but I am getting an error: "Public DBFS root is disabled. Access is denied on path: /FileStore/tables/". How can I get access to this location? Could anyone help me here?
My problem was resolved. I discovered that I was trying to access via Repos, but I changed to access via Catalog, and it worked well.
Hi, I purchased Databricks Academy Labs ($200) and am currently working through the Data Engineer Learning Plan. However, I cannot see any of the labs referenced in the tutorials. Thanks.
Howdy! Had this exact problem yesterday. Here is the discussion with a couple of solutions that collectively should get you there: https://community.databricks.com/t5/training-offerings/can-t-access-labs/td-p/125194
Dear @Cert-Team, my Databricks Data Engineer Professional certification exam was suspended. The first support person told me to wait; I showed my whole room and seating area, but in the end they said it was suspended. I raised a ticket in the portal. I have re...
I have a new subscription to the labs and am enrolled in a lab-enabled course (Data Ingestion with Lakeflow Connect). Nowhere can I find a link to access the lab material. The course looks identical to the free version. The screens in the "Accessing...
Thank you for the response! Here's a bit more to hopefully help out future folks. The following is not currently accurately described in the "Accessing Vocareum Labs in Your Training" lesson. When enrolled in a Lab course (typically indicated by a red ...
Hi @Cert-Team, I would appreciate your help with the support ticket below. Request ID: #00699279. I faced some interruptions and challenges while attempting my certification. My exam got suspended just because I leaned closer to the laptop screen to clearly r...
Hello @Bhargav14, thank you for filing a ticket with our support team; they will respond shortly. Please note that we cannot provide support or handle exam suspensions via the community. Thanks & Regards, @cert-ops
Hi Team, I want to take the Databricks Data Engineer Associate certification. Can I get a discount voucher? Thanks.
Hello @Akash_kumar! There are currently no ongoing events offering Certification vouchers. The recent Virtual Learning Festival, which offered a 50% discount voucher for Certifications, has concluded. These events are held quarterly in January, April...
I have a Databricks App I need to integrate with volumes using local Python os functions. I've set up a simple test:

def __init__(self, config: ObjectStoreConfig):
    self.config = config
    # Ensure our required paths are created
    ...
I am facing this issue too, I have added the volume under the app resource as a UC volume with read and write permissions, but pd.read_csv() is unable to find the file path. Please let me know what I can do
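A minimal sketch of the pattern being discussed: Unity Catalog volumes attached as app resources are exposed under a `/Volumes/<catalog>/<schema>/<volume>/` POSIX path, so plain `os` and `pandas` calls work once permissions are in place. The catalog, schema, volume, and file names below are hypothetical placeholders, and the error message reflects the usual causes mentioned in this thread (resource not attached, or missing read permission), not an official diagnostic.

```python
import os
import posixpath


def volume_file_path(catalog: str, schema: str, volume: str, *parts: str) -> str:
    """Build the POSIX path under which a Unity Catalog volume is exposed.
    All names passed in here are placeholders for illustration."""
    return posixpath.join("/Volumes", catalog, schema, volume, *parts)


def read_volume_csv(path: str):
    # Fail early with an actionable message instead of a bare
    # FileNotFoundError: in a Databricks App this usually means the volume
    # was not added as an app resource, or the app's service principal
    # lacks read permission on it.
    if not os.path.exists(path):
        raise FileNotFoundError(
            f"{path} is not visible to the app; check that the volume is "
            "attached as an app resource with read permission."
        )
    import pandas as pd  # assumes pandas is in the app's requirements
    return pd.read_csv(path)


print(volume_file_path("main", "default", "landing", "data.csv"))
# → /Volumes/main/default/landing/data.csv
```

The existence check is just a guard for a clearer error; the actual read is a plain `pd.read_csv()` on the `/Volumes/...` path, as the posts above attempt.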
Getting into the data space can feel overwhelming, with so many tools, terms, and technologies. But after years in... Expect failure. Design for it. Jobs will fail. The data will be late. Build systems that can recover gracefully, and continually monitor ...
Hi Team, I am trying to update catalog permission privileges using the Databricks CLI `grants update` command by passing a JSON file, but it is not updating the privileges. Please help with the grant update command usage. Command used: databricks grants update c...
If someone needs this in the future, like I did: the issue is with your JSON structure. The Databricks CLI uses "changes" with "add" instead of "privilege_assignments" with "privileges".

{
  "changes": [
    {
      "principal": "mailid",
      "add": ...
What are some best practices for optimizing Spark jobs in Databricks, especially when dealing with large datasets? Any tips or resources would be greatly appreciated! I'm trying to analyze data on restaurant menu prices, so insights would be especiall...
There are so many. Here are a few:
- look for data skew
- shuffle as little as possible
- avoid many small files
- use Spark, not only pure Python
- if using an autoscaling cluster, check that you don't lose a lot of time scaling up/down
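The first tip, checking for data skew, can be illustrated outside Spark with a small helper: if one join or group-by key holds most of the rows, the shuffle for that key lands on a single task. This is only a sketch of the idea; in Spark you would get the same counts with `df.groupBy("key").count()` and eyeball the distribution.

```python
from collections import Counter


def skew_ratio(keys):
    """Ratio of the largest key's row count to the mean count per key.
    Values much greater than 1 mean the shuffle on that key will
    concentrate work in a few tasks."""
    counts = Counter(keys)
    mean = sum(counts.values()) / len(counts)
    return max(counts.values()) / mean


# A skewed distribution: one key holds 90% of the rows.
keys = ["a"] * 90 + ["b"] * 5 + ["c"] * 5
print(round(skew_ratio(keys), 1))  # → 2.7
```

Common mitigations once skew is confirmed include salting the hot key or broadcasting the smaller side of the join.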
Hi there, just testing the new Databricks Free Edition. I was trying to play around with LLMs, but I'm not able to create serving endpoints with foundation model entities, interact with pay-per-token Foundation Model APIs, or use them in Databricks a...
I have the same problem. I am unable to create a Serving Endpoint with any of the foundation models in the Databricks Free edition at the moment. By using the above code snippet, the provisioning starts, then it either hangs for long hours without an...
Hi, we've just started using Databricks, so I'm a little naive about the file system, especially regarding Unity Catalog. The issue is that we're creating a logger and want to write the files based on a queue handler/listener pattern. The pattern...
When using the CLI you need to add the scheme: dbfs:/Volumes/... The rest should be fine referring to "/Volumes/..."; for more info see Manage files in volumes | Databricks Documentation. Hope this solves the issue!
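The convention in the answer above can be sketched as a tiny normalizer: code running inside a workspace uses the bare `/Volumes/...` path, while the Databricks CLI `fs` commands want the `dbfs:` scheme prefixed. The path below is a made-up example.

```python
def cli_path(path: str) -> str:
    """Prefix a /Volumes path with the dbfs: scheme expected by the
    Databricks CLI 'fs' commands; leave already-qualified paths alone."""
    if path.startswith("/Volumes/"):
        return "dbfs:" + path
    return path


print(cli_path("/Volumes/main/default/landing/data.csv"))
# → dbfs:/Volumes/main/default/landing/data.csv
```

So a copy would look like `databricks fs cp local.csv dbfs:/Volumes/main/default/landing/data.csv`, while a notebook in the workspace reads the same file at `/Volumes/main/default/landing/data.csv`.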
Hi all, I'm currently looking at upskilling in Databricks. I'd like to focus on becoming great at solving a variety of problems. I come from a background in the Alteryx community, which has something called Weekly Challenges: https://community.alt...
@Advika let me know if I can be of any help with building this. To get maximum engagement from the community, it would be great if there were badges (these show on the community profile) associated with completing these challenges. I.e. first 1, ...
Hi all, could someone clarify the intended usage of the variable-overrides.json file in Databricks Asset Bundles? Let me give some context. Let's say my repository layout looks like this:

databricks/
├── notebooks/
│   └── notebook.ipynb
├── resources/
...
It does, thanks for the response. I also continued playing around with it and found a way using the variable-overrides.json file. I'll leave it here just in case anyone is interested. Repository layout:

databricks/
├── notebooks/
│   └── notebook.ipynb
...