Certifications
Join dynamic discussions on Databricks certifications within the Community. Exchange insights, tips,...
I have created a web app in Databricks. The rule for the hackathon says a link to a working demo shall be provided for the judges. As far as I know, a Databricks web app or any other Databricks artifact cannot be made publicly accessible. Also we can no...
Hi @DataBeli, From what I understand, in the Free Edition, Databricks Apps can only be shared with authenticated Databricks users. As public or anonymous access isn’t supported, you can try adding the judges’ email addresses to your workspace and the...
How can I optimize Databricks to handle large datasets without running into memory or performance problems?
Hey! Great question — I’ve run into this issue quite a few times while working with large datasets in Databricks, and out-of-memory errors can be a real headache. One of the biggest things that helps is making sure your cluster configuration matches ...
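Building on the cluster-configuration point above, here is a minimal sketch of Spark settings that are commonly tuned when jobs hit memory or shuffle problems on large datasets. The values are illustrative starting points only, not recommendations for any specific workload; measure before and after changing them.

```python
# Illustrative Spark settings for large-dataset workloads (assumed starting
# points, not tuned recommendations -- benchmark on your own data).
LARGE_DATASET_CONF = {
    "spark.sql.adaptive.enabled": "true",                     # Adaptive Query Execution
    "spark.sql.adaptive.coalescePartitions.enabled": "true",  # let AQE right-size shuffle partitions
    "spark.sql.shuffle.partitions": "400",                    # raise from the 200 default for big shuffles
    "spark.sql.autoBroadcastJoinThreshold": str(64 * 1024 * 1024),  # broadcast small dims (64 MiB)
}

def apply_conf(spark, conf=LARGE_DATASET_CONF):
    """Apply the settings to an existing SparkSession, e.g. at the top of a notebook."""
    for key, value in conf.items():
        spark.conf.set(key, value)
```

Beyond configuration, the usual advice also applies: avoid `.collect()` on large DataFrames, filter and aggregate before pulling data to the driver, and prefer Delta tables with partitioning or liquid clustering suited to your query patterns.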
Hi, I've constructed an AWS Lambda function which is used to auto-rotate my Service Principal secret in the Databricks account. Authentication is set up with OAuth2; the API call for the token generation is successful, but when executing the API call to...
Your error message, "Invalid service principal id," typically indicates a mismatch or formatting problem with the service principal's unique identifier in your API request. Although you checked the client_id, this value is not always the one needed f...
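To make the identifier distinction concrete, here is a hypothetical helper. A Databricks service principal carries two identifiers: the `application_id` (the UUID used as the OAuth `client_id`) and a numeric internal `id`. The endpoint shape below is an assumption based on the public Accounts API documentation; verify it against your account's API version before relying on it.

```python
# Hypothetical helper: builds the secrets endpoint for a service principal.
# Key point -- the path segment is the numeric internal id of the service
# principal, NOT the UUID application_id used as the OAuth client_id.
def secrets_endpoint(account_id: str, internal_sp_id: str) -> str:
    return (
        f"https://accounts.cloud.databricks.com/api/2.0/accounts/{account_id}"
        f"/servicePrincipals/{internal_sp_id}/credentials/secrets"
    )
```

If your Lambda only has the `client_id`, you can look up the matching internal id by listing service principals for the account first.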
I'm currently using databricks-connect through VS Code on MacOS. However, this seems to install (and re-install upon deletion) an IPython startup script which initializes a SparkSession. This is fine as far as it goes, except that this script is *glo...
Databricks Connect on MacOS (and some other platforms) adds a file to the global IPython startup folder, which causes every new IPython session—including those outside the Databricks environment—to attempt loading this SparkSession initialization. Th...
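As a first diagnostic step, you can list what actually lives in the global IPython startup folder. The directory layout below is the standard IPython convention; exactly which file databricks-connect drops there is an assumption, so inspect the folder contents yourself before moving or deleting anything (a dedicated profile created with `ipython profile create` is one way to isolate the script).

```python
# Sketch: list scripts in an IPython profile's startup folder so the
# databricks-connect one can be identified and, if desired, moved into a
# dedicated profile instead of the global default.
from pathlib import Path

def startup_scripts(profile: str = "profile_default") -> list:
    startup_dir = Path.home() / ".ipython" / profile / "startup"
    if not startup_dir.is_dir():
        return []  # profile or startup folder does not exist
    return sorted(str(p) for p in startup_dir.glob("*.py"))
```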
Hi, apologies, as I am trying to use pytest for the first time. I know this question has been raised before, and I went through previous answers, but the issue still exists. I am following Databricks and other articles using pytest. My structure is simple as -tests--co...
Your issue with ModuleNotFoundError: No module named 'test_tran' when running pytest from a notebook is likely caused by how Python sets the module import paths and the current working directory inside Databricks notebooks (or similar environments). ...
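A common fix is a small `conftest.py` in the tests folder that puts that folder on `sys.path` before pytest collects tests, so sibling modules import regardless of the notebook's working directory. This is a sketch; the module name `test_tran` is taken from the question, and your exact layout may differ.

```python
# conftest.py (placed in the tests folder) -- pytest imports this file
# automatically before collecting tests, so `import test_tran`-style imports
# resolve even when the notebook's working directory is elsewhere.
import sys
from pathlib import Path

TESTS_DIR = str(Path(__file__).resolve().parent)

if TESTS_DIR not in sys.path:
    # Prepend so imports resolve to this folder first.
    sys.path.insert(0, TESTS_DIR)
```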
I am trying to create a workspace using AWS CloudFormation, but the stack fails with the following error:"The resource CreateWorkspace is in a CREATE_FAILED state. This Custom::CreateWorkspace resource is in a CREATE_FAILED state. Received response s...
When a CloudFormation stack fails with “The resource CreateWorkspace is in a CREATE_FAILED state” for a Custom::CreateWorkspace resource, it typically means the Lambda or service backing the custom resource returned a FAILED signal to CloudFormation ...
I'm working with Databricks Asset Bundles and need to define constants at the bundle level based on the target environment. These constants will be used inside Databricks notebooks. For example, I want a constant gold_catalog to take different values ...
There is currently no explicit, built-in mechanism in Databricks Asset Bundles (as of 2024) for directly defining global, environment-targeted constants at the bundle level that can be seamlessly accessed inside notebooks without using job or task pa...
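One common workaround is to expose the bundle target to the notebook (for example, as a job parameter set to `${bundle.target}` in `databricks.yml`) and resolve environment-specific constants in code. The catalog names below are hypothetical placeholders.

```python
# Hypothetical mapping from bundle target to environment-specific constants.
# The target string would arrive via a job/task parameter such as
# ${bundle.target} defined in databricks.yml.
GOLD_CATALOG_BY_TARGET = {
    "dev": "dev_gold",
    "staging": "staging_gold",
    "prod": "prod_gold",
}

def resolve_gold_catalog(target: str) -> str:
    """Return the gold catalog for a bundle target, failing loudly on typos."""
    try:
        return GOLD_CATALOG_BY_TARGET[target]
    except KeyError:
        raise ValueError(f"unknown bundle target: {target!r}")
```

Keeping the mapping in one module means the per-environment values live in code review rather than being scattered across job definitions.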
Hi Databricks Team, I'm currently participating in the Databricks Free Edition Hackathon, but I'm running into an issue where I can no longer run my SQL Warehouse or Serverless Cluster. Each time I try to start the Serverless Starter Warehouse or execu...
Unfortunately, there’s not much that can be done if you’ve hit the daily limits. Wishing you the best of luck moving forward. Cheers, Louis.
I have started two courses on https://customer-academy.databricks.com/learn/courses/, for example: https://customer-academy.databricks.com/learn/courses/2469/get-started-with-databricks-for-data-engineering/lessons/38169/demo-creating-and-working-with...
Hello @dndeng! Databricks no longer provides notebooks or other downloadable materials with Academy courses. This change helps prevent outdated content from circulating and ensures compatibility within the lab environment. You can access the lab mate...
Hi, we have a Scala JAR built with sbt which is used in Databricks jobs to readStream data from Kafka... We are enhancing the from_avro function like below... def deserializeAvro( topic: String, client: CachedSchemaRegistryClient, sc: SparkConte...
Thanks for the update, Louis... As we are planning to sync all our notebooks from Scala to PySpark, we are in the process of converting the code. I think adding the additional dependency of ABRiS or Adobe’s spark-avro with Schema Registry support will tak...
So, if you purchase an Academy Labs subscription for a year, are all the courses that are marked as included free of charge? When searching on the Academy Labs page, they are marked as "Included", but there's a price of $75 (for example) on each.
I have enrolled in https://customer-academy.databricks.com/learn/courses/2469/get-started-with-databricks-for-data-engineering/lessons/38169/demo-creating-and-working-with-a-delta-table, which includes a lab demo. Will I have access to the demo mate...
We recently started using the Data Profiling / Lakehouse Monitoring feature from Databricks: https://learn.microsoft.com/en-us/azure/databricks/data-quality-monitoring/data-profiling/. Data Profiling is using serverless compute for running the profilin...
Hi @szymon_dybczak, thanks for the quick reply. But it seems serverless budget policies cannot be applied to data profiling/monitoring jobs: https://learn.microsoft.com/en-us/azure/databricks/data-quality-monitoring/data-profiling/ Serverless budget po...
Hi Team, I am deploying to the Databricks workspace using GitHub and DAB. I have noticed that during deployment, the wheel file name is being converted to all lowercase letters (e.g., pyTestReportv2.whl becomes pytestreportv2.whl). This issue does not...
Hi @Nisha_Tech, it seems like a Git issue rather than Databricks or DAB. There is a Git configuration parameter that decides the upper/lower case of the file names deployed. Please refer here: https://github.com/desktop/desktop/issues/2672#issuecomme...
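Assuming the lowercasing really is a Git case-sensitivity artifact: on case-insensitive filesystems (the macOS and Windows defaults, often combined with `core.ignorecase=true`), Git may not record a rename that only changes letter case, but `git mv` forces it to be tracked. A sketch, with the file names taken from the question:

```python
# Sketch: build the `git mv` invocation that records a case-only rename,
# which plain filesystem renames may not register on case-insensitive disks.
import subprocess

def case_rename_cmd(old: str, new: str) -> list:
    """Return the git command that tracks a case-only file rename."""
    return ["git", "mv", old, new]

cmd = case_rename_cmd("pytestreportv2.whl", "pyTestReportv2.whl")
# subprocess.run(cmd, check=True)   # uncomment and run from inside the repo
```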
Dear Databricks Community Team, I hope you are doing well. I wanted to inform you that I have successfully completed all the required learning path modules between October 8th and October 31st as part of the Databricks certification discount program. I...
Hello @LahariVelpula18! Please refer to the Virtual Learning Festival – October 2025 Update for details regarding voucher distribution. It includes the latest information and next steps.
Hi Team, I completed a self-paced training course during the Databricks Virtual Learning Festival (October 2025) via the Databricks Academy. I understand that a 50% certification voucher is provided upon completion, but I haven’t received it yet. #Dat...
It's the end of the first week of November and I still have not received it, even though it was mentioned that I would get the voucher in early November.