Certifications
Join dynamic discussions on Databricks certifications within the Community. Exchange insights, tips,...
Explore discussions on Databricks training programs and offerings within the Community. Get insights...
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and ...
Engage in discussions about the Databricks Free Trial within the Databricks Community. Share insight...
First of all, hello everyone! This is my first post on here. For the past few months, I have been working on my personal Microsoft Azure blog, which includes some Azure Databricks user guides. Sadly, as I am currently searching for work (that is rathe...
@giuseppe_esq I built out a local solution based on my advice above using the Databricks Python SDK & ChatGPT (of course). I can confirm that I have been able to upload files from my local storage straight to my Free Edition Databricks environment....
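For anyone who wants to try the same pattern, here is a minimal sketch of that kind of upload with the Databricks SDK for Python, assuming a Unity Catalog volume as the destination and environment-based authentication; the catalog, schema, volume, and file names below are placeholders, not the ones from the original post.

```python
# Minimal sketch: upload a local file into a Unity Catalog volume with the
# Databricks SDK for Python. Catalog/schema/volume names and the local path are
# placeholders; credentials are assumed to come from DATABRICKS_HOST/DATABRICKS_TOKEN
# environment variables or a configured profile in ~/.databrickscfg.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up credentials from the environment or a config profile

local_path = "data/sales.csv"                             # hypothetical local file
volume_path = "/Volumes/main/default/landing/sales.csv"   # hypothetical volume path

with open(local_path, "rb") as f:
    w.files.upload(volume_path, f, overwrite=True)

print(f"Uploaded {local_path} to {volume_path}")
```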
Hi Team, I am reaching out to bring to your attention a series of issues I faced during my Databricks Certified Generative AI Engineer Associate exam, which ultimately led to an unfair suspension and disruption of my exam experience. While a...
Hi @tanviradia55, sorry to hear about your exam troubles. Could you please take a look at this thread? It will hopefully help you!
SAS to PY is an AI/ML-based Accelerator designed for "SAS to Python or PySpark" code migration. This Accelerator is engineered to convert legacy proprietary SAS code to the more flexible, open-source Python or PySpark environment with 95% automatica...
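To make the kind of translation concrete, here is a small hand-written illustration (not output from the accelerator) of how a simple SAS DATA step might map onto PySpark; the table and column names are made up.

```python
# Hand-written illustration of a SAS-to-PySpark translation (not produced by the
# accelerator described above). Table and column names are hypothetical.
#
# Original SAS DATA step:
#   data work.high_value;
#       set sales.orders;
#       where amount > 1000;
#       discounted = amount * 0.9;
#   run;
#
# A PySpark equivalent:
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.getOrCreate()

orders = spark.table("sales.orders")
high_value = (
    orders
    .where(F.col("amount") > 1000)
    .withColumn("discounted", F.col("amount") * 0.9)
)
high_value.createOrReplaceTempView("high_value")
```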
Why Should You Use Databricks Asset Bundles (DABs)? Without proper tooling, Data Engineering and Machine Learning projects can quickly become messy. That is why we recommend leveraging DABs to solve these common challenges: 1. Collaboration: Without stru...
Hi everyone, I am Dario Schiraldi, CEO of Travel Works, and I am reaching out to the community for some insights. We are in the process of integrating Databricks with AWS for a new project, and I would love to hear from anyone who has experience with t...
Hello Dario, good to meet you. You can connect with your Databricks account manager. Azure also provides first-party assistance for Databricks, so you could check Azure services as well. Thank you.
Hi all, I'm currently looking at upskilling in Databricks. I'd like to focus on becoming great at solving a variety of problems. I come with a background in the Alteryx community and they have something called Weekly Challenges: https://community.alt...
Hi Team, The customer is facing a challenge related to increasing Databricks workspace maintenance costs. Apparently, every project is creating its own workspace for specific functionalities, and this has become a standard practice. As a result, the n...
Hi, I'm trying to set up a local development environment using Python / VS Code / Poetry. Also, linting is enabled (Microsoft Pylance extension) and python.analysis.typeCheckingMode is set to strict. We are using Python files for our code (.py) whit...
How did you solve the type-check errors on `pyspark.sql`? mypy doesn't generate the missing stubs for that one?
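Not an authoritative answer, but one pattern that often keeps strict checking quiet is making sure `pyspark` itself is installed in the same Poetry virtual environment the editor uses (recent PySpark releases ship inline type annotations), and then annotating your own functions with the `pyspark.sql` types. A rough sketch, with made-up table and column names:

```python
# Sketch of explicitly typed PySpark code that Pylance/mypy in strict mode can check,
# assuming pyspark is installed in the virtual environment the editor resolves
# (recent PySpark releases ship inline type annotations). Table and column names
# are hypothetical.
from pyspark.sql import DataFrame, SparkSession
import pyspark.sql.functions as F


def active_customers(spark: SparkSession, table_name: str) -> DataFrame:
    """Return only the active rows from the given table."""
    df: DataFrame = spark.table(table_name)
    return df.where(F.col("status") == "active")


if __name__ == "__main__":
    spark = SparkSession.builder.getOrCreate()
    active_customers(spark, "main.default.customers").show()
```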
Facing an issue when merging a DataFrame into a Delta table that has a mask applied on two of the columns. Code: DeltaTable.forName(sparkSession=spark, tableOrViewName=f'{catalog}.{schema}.{table_name}').alias('target').merge( new_df.alias('updates'), ...
It looks like Delta Lake APIs (i.e. DeltaTable...) are not supported with row filters and column masks. Please see the limitations: https://docs.databricks.com/aws/en/tables/row-and-column-filters#limitations
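If the limitation applies only to the Python DeltaTable API (worth confirming against the linked docs and your runtime version), one workaround to try is expressing the merge as Spark SQL instead. This is only a hedged sketch: the catalog, schema, table, and join column names are placeholders, and whether SQL MERGE is allowed on a masked table depends on the documented limitations.

```python
# Hedged sketch: run MERGE as Spark SQL rather than through the DeltaTable API,
# in case only the Python Delta Lake API is restricted for tables with column masks.
# `spark` is the notebook's SparkSession and `new_df` stands in for the DataFrame
# from the original post; the names below are placeholders.
catalog, schema, table_name = "main", "default", "customers"  # hypothetical

new_df.createOrReplaceTempView("updates")

spark.sql(f"""
    MERGE INTO {catalog}.{schema}.{table_name} AS target
    USING updates
    ON target.id = updates.id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```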
Hi Team, My Databricks Certified Data Engineer Associate exam got suspended on July 7th and it is still in an in-progress state. I was continuously in front of the camera, but was facing some network fluctuations and checked the network on my phone, and suddenly the alert app...
Hi @MaheshKumbhar. That sounds really frustrating. For what it's worth, I've looked through the terms and conditions: https://www.databricks.com/learn/certification/terms-and-conditions I think checking your phone during an exam probably isn't the grea...
Hi Everyone, I'm Namrata Hinduja from Switzerland, and I feel exam security is crucial to maintaining the integrity and value of Databricks Certification. While some exam requirements, like removing phones, not reading aloud, or staying on-screen, may f...
Hi @NamrataHindujaS. This is undoubtedly a great point. This standard is held by Databricks: https://www.databricks.com/learn/certification/faq All the best, BS
I have purchased Databricks via Google Cloud but my order is still pending. I emailed support and they mentioned that it is pending because "You have not purchased any of the support contracts (business, enhanced, or production support) to move the c...
Hi @amartinez4, based on your requirement, why don't you sign up for the new Databricks Free Edition? https://www.databricks.com/blog/introducing-databricks-free-edition If you'd rather work with the trial version, can you let us know if you've had an...
Hi Team, I am using Databricks Free Edition to run some jobs in the environment, but I am getting an error like: Public DBFS root is disabled. Access is denied on path: /FileStore/tables/ So how can I get access to this location? Could anyone help me here?
Hi @ViratKumar1061, has your problem been resolved? If not, could you provide an update so we can look at next steps for a solution. All the best, BS
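For the /FileStore error above: since the classic DBFS root is disabled in that environment, a commonly suggested alternative is to upload the data to a Unity Catalog volume (via the Catalog UI or the SDK upload shown earlier in this list) and read it from the volume path instead. A hedged sketch, with placeholder catalog, schema, volume, and file names:

```python
# Hedged sketch: read a file from a Unity Catalog volume instead of the disabled
# /FileStore DBFS path. `spark` is the notebook's SparkSession; the volume path
# below is a placeholder, and the file is assumed to have been uploaded to the
# volume beforehand.
volume_path = "/Volumes/main/default/tables/my_data.csv"  # hypothetical volume path

df = spark.read.csv(volume_path, header=True, inferSchema=True)
df.display()
```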
Hi, I am able to log in through partner-databricks-academy but not able to see any option for accessing the lab. Please guide.
Thank you @Advika. I could see the lab after enrolling in an e-learning course. However, the lab was confined to that course. I am currently doing the "Data Ingestion with Lakeflow Connect" course, which mentions a lab, but I don't see a way to access it. Ca...
Joining a regional user group is a great way to connect with data and AI professionals near you. These groups bring together practitioners to learn, network, and share real-world experiences — all within your local context. To join a group: Select y...
Well up for this. I'll find my local UG.