Certifications
Join dynamic discussions on Databricks certifications within the Community. Exchange insights, tips,...
Explore discussions on Databricks training programs and offerings within the Community. Get insights...
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and ...
Engage in discussions about the Databricks Free Trial within the Databricks Community. Share insight...
Hi @Cert-Team, I would require your help on the support ticket below. Request ID: #00699279. I faced some interruptions and challenges while attempting my certification. My exam got suspended just because I leaned closer to the laptop screen to clearly r...
@Bhargav14 follow the steps in this thread: https://community.databricks.com/t5/certifications/suspended-exam-don-t-worry-our-support-team-can-help/td-p/44469 Let me know how you get on. All the best, BS
I have a new subscription to the labs and am enrolled in a labs enabled course (Data Ingestion with Lakeflow Connect). Nowhere can I find a link to access the lab material. The course looks identical to the free version. The screens in the "Accessing...
Hi all, I'm currently looking at upskilling in Databricks. I'd like to focus on becoming great at solving a variety of problems. I come with a background in the Alteryx community and they have something called Weekly Challenges: https://community.alt...
@Advika let me know if I can be of any help with building this at all. To get maximum engagement from the community, it'll be great if there can be badges (these show on the community profile) associated with completing these challenges. I.e. first 1, ...
Hi, We've just started using Databricks and so I'm a little new to the file system, especially regarding Unity Catalog. The issue is that we're creating a logger and wanting to write the files based on a queue handler/listener pattern. The pattern...
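For anyone implementing the queue handler/listener pattern mentioned above, a minimal stdlib-only sketch may help. The logger name and the in-memory destination are illustrative assumptions; in the poster's scenario the destination would be a `logging.FileHandler` pointing at a writable path (e.g. a Unity Catalog Volume):

```python
import io
import logging
import logging.handlers
import queue

# Minimal sketch of the queue handler/listener pattern (stdlib only).
# Producers log through a QueueHandler, which only enqueues records;
# a QueueListener background thread drains the queue into the real handler.
# Here the destination is an in-memory buffer; swap in
# logging.FileHandler("/Volumes/<catalog>/<schema>/<volume>/app.log")
# for a Unity Catalog Volume (path is an assumption, not from the thread).

log_queue = queue.Queue(-1)                       # unbounded hand-off queue
queue_handler = logging.handlers.QueueHandler(log_queue)

buffer = io.StringIO()                            # stand-in for the log file
dest = logging.StreamHandler(buffer)
dest.setFormatter(logging.Formatter("%(levelname)s %(name)s %(message)s"))

listener = logging.handlers.QueueListener(log_queue, dest)
listener.start()                                  # start the drain thread

logger = logging.getLogger("ingest")
logger.setLevel(logging.INFO)
logger.addHandler(queue_handler)                  # logging is now non-blocking

logger.info("pipeline started")
listener.stop()                                   # flushes queued records
print(buffer.getvalue().strip())                  # INFO ingest pipeline started
```

The benefit of the pattern is that the producing code never blocks on slow I/O; only the listener thread touches the file.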
Hi all, Could someone clarify the intended usage of the variable-overrides.json file in Databricks Asset Bundles? Let me give some context. Let's say my repository layout looks like this:
databricks/
├── notebooks/
│   └── notebook.ipynb
├── resources/
...
It does. Thanks for the response. I also continued playing around with it and found a way using the variable-overrides.json file. I'll leave it here just in case anyone is interested. Repository layout:
databricks/
├── notebooks/
│   └── notebook.ipynb
...
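The reply above is cut off before the actual solution, so here is a hedged sketch of how variable overrides are typically wired up (the variable names and values are assumptions, not taken from the thread): the CLI looks for a per-target file at `.databricks/bundle/<target>/variable-overrides.json`, where each key must match a variable declared in `databricks.yml`:

```json
{
  "catalog": "dev_catalog",
  "warehouse_id": "abc123"
}
```

Variables set this way take effect for that target without editing the bundle configuration itself, which is convenient for local development.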
I appeared for the Databricks Certified Machine Learning Associate exam today. However, during the exam, I was paused three times and asked to show my room and surroundings. I complied with all the instructions provided by the proctor each time, ensu...
Hello @Pritisingh-689, Thank you for filing a ticket with our support team; they will respond shortly. Please note that we cannot provide support or handle exam suspensions via the community. Thanks & Regards, @cert-ops
Hi Team, The customer is facing a challenge related to increasing Databricks workspace maintenance costs. Apparently, every project is creating its own workspace for specific functionalities, and this has become a standard practice. As a result, the n...
This is something that you should discuss with your Databricks rep imo. Even with standard tools, migrating and consolidating 200 workspaces is something that needs very careful planning and testing.
Hi Team, I am reaching out to bring to your attention a series of issues I faced during my Databricks Certified Generative AI Engineer Associate exam, which ultimately led to an unfair suspension and disruption of my exam experience. While a...
Hello @tanviradia55, Thank you for filing a ticket with our support team; they will respond shortly. Please note that we cannot provide support or handle exam suspensions via the community. Thanks & Regards, @cert-ops
First of all, hello everyone! This is my first post on here. For the past few months, I have been working on my personal Microsoft Azure blog, which includes some Azure Databricks user guides. Sadly, as I am currently searching for work (that is rathe...
@giuseppe_esq personally, I'd love to get the Azure certs! I'll definitely be following the blog. Thanks a bunch for linking that. I'm also learning Databricks so do keep in touch. To answer this question: Awesome, thanks again. Sorry if I sound clu...
Dario Schiraldi is a Deutsche Bank executive known for his strong leadership in the financial and banking sector. Dario Schiraldi brings 20 years of leadership experience to major worldwide organizations, where his expertise extends into both market acqui...
SAS to PY is an AI/ML-based accelerator designed for "SAS to Python or PySpark" code migration. This accelerator is engineered to convert legacy proprietary SAS code to the more flexible, open-source Python or PySpark environment with 95% automatica...
Why Should You Use Databricks Asset Bundles (DABs)? Without proper tooling, Data Engineering and Machine Learning projects can quickly become messy. That is why we recommend leveraging DABs to solve these common challenges: 1. Collaboration: Without stru...
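For readers new to DABs, a minimal illustrative `databricks.yml` may make the idea concrete. All names and paths below are placeholders (not from the post); a bundle declares its resources once and deploys them consistently to each target:

```yaml
# Minimal illustrative bundle config (names and paths are placeholders)
bundle:
  name: my_project

resources:
  jobs:
    nightly_job:
      name: nightly-job
      tasks:
        - task_key: run_notebook
          notebook_task:
            notebook_path: ./notebooks/notebook.ipynb

targets:
  dev:
    mode: development
    default: true
  prod:
    mode: production
```

With this in place, `databricks bundle deploy -t dev` and `-t prod` deploy the same job definition to each environment, which addresses the collaboration and consistency challenges listed above.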
Hi everyone, I am Dario Schiraldi, CEO of Travel Works, and I am reaching out to the community for some insights. We are in the process of integrating Databricks with AWS for a new project, and I would love to hear from anyone who has experience with t...
Hello Dario, good to meet you. You can connect with your Databricks account manager. Azure also provides first-party assistance for Databricks, so you can check the Azure services as well. Thank you.
Hi, I'm trying to set up a local development environment using python / vscode / poetry. Also, linting is enabled (Microsoft Pylance extension) and the python.analysis.typeCheckingMode is set to strict. We are using python files for our code (.py) whit...
How did you solve the type-check errors on `pyspark.sql`? Doesn't mypy generate the missing stubs for that one?
Facing an issue when merging a dataframe into a Delta table that has a mask applied on two of the columns. Code: DeltaTable.forName(sparkSession=spark, tableOrViewName=f'{catalog}.{schema}.{table_name}').alias('target').merge( new_df.alias('updates'), ...
It looks like Delta Lake APIs (i.e. DeltaTable...) are not supported with row filters and column masks. Please see the limitations: https://docs.databricks.com/aws/en/tables/row-and-column-filters#limitations