Community Discussions
Engage in vibrant discussions covering diverse learning topics within the Databricks Community. Explore learning resources, share insights, and collaborate with peers to enhance your skills in data engineering, machine learning, and more.

Browse the Community

Certifications

Join dynamic discussions on Databricks certifications within the Community. Exchange insights, tips,...

852 Posts

Training offerings

Explore discussions on Databricks training programs and offerings within the Community. Get insights...

192 Posts

Get Started Discussions

Start your journey with Databricks by joining discussions on getting started guides, tutorials, and ...

2679 Posts

Activity in Community Discussions

DataBeli
by > New Contributor
  • 52 Views
  • 2 replies
  • 0 kudos

Not able to create public link for databricks web app

I have created a web app in Databricks. The rule for the hackathon says the link to a working demo shall be provided for the judges. The Databricks web app, or any other Databricks artifact, cannot be made publicly accessible as per my knowledge. Also we can no...

Latest Reply
SP_6721
Honored Contributor
  • 0 kudos

Hi @DataBeli, From what I understand, in the Free Edition, Databricks Apps can only be shared with authenticated Databricks users. As public or anonymous access isn’t supported, you can try adding the judges’ email addresses to your workspace and the...

1 More Replies
Suheb
by > New Contributor II
  • 27 Views
  • 4 replies
  • 1 kudos

When working with large data sets in Databricks, what are best practices to avoid out-of-memory errors?

How can I optimize Databricks to handle large datasets without running into memory or performance problems?

Latest Reply
tarunnagar
New Contributor III
  • 1 kudos

Hey! Great question — I’ve run into this issue quite a few times while working with large datasets in Databricks, and out-of-memory errors can be a real headache. One of the biggest things that helps is making sure your cluster configuration matches ...

3 More Replies
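Alongside cluster sizing, one concrete lever for avoiding out-of-memory errors is keeping shuffle partitions near a sensible size. Below is a toy helper sketching the common ~128 MiB-per-partition rule of thumb; the function name and heuristic are illustrative, not a Databricks or Spark API.

```python
# Toy helper (not a Spark/Databricks API): suggest a shuffle partition
# count from an estimated shuffle size, assuming a ~128 MiB target
# partition, a common Spark rule of thumb.

TARGET_PARTITION_BYTES = 128 * 1024 * 1024  # ~128 MiB per partition

def suggest_shuffle_partitions(shuffle_bytes: int, minimum: int = 1) -> int:
    """Return a partition count that keeps partitions near the target size."""
    # -(-a // b) is ceiling division for non-negative integers.
    return max(minimum, -(-shuffle_bytes // TARGET_PARTITION_BYTES))

# A 10 GiB shuffle maps to 80 partitions of ~128 MiB each; the result
# could then be applied via spark.conf.set("spark.sql.shuffle.partitions", n).
print(suggest_shuffle_partitions(10 * 1024**3))  # 80
```

The same idea applies to `repartition()` before wide operations: fewer, huge partitions are what typically trigger executor OOMs.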
zibi
by > New Contributor
  • 69 Views
  • 2 replies
  • 0 kudos

API call fails to initiate create Service Principal secret

Hi, I've constructed an AWS Lambda function which is used to auto-rotate my Service Principal secret in the Databricks account. Authentication is set up with OAuth2; the API call for the token generation is successful, but when executing the API call to...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

Your error message, "Invalid service principal id," typically indicates a mismatch or formatting problem with the service principal's unique identifier in your API request. Although you checked the client_id, this value is not always the one needed f...

1 More Replies
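To make the distinction concrete, here is a hedged sketch of how the account-level secrets URL is typically assembled. The endpoint shape and host are assumptions to verify against the current Databricks Account API docs; the key point from the reply is that the path segment is the service principal's numeric internal id, not its client_id UUID.

```python
# Sketch (assumption: the account-level OAuth secrets endpoint shape
# below matches the Databricks Account API; verify against current docs).
# The path takes the service principal's *numeric* internal id, not the
# client_id / application_id UUID used for OAuth token requests.

def sp_secret_endpoint(account_host: str, account_id: str, sp_numeric_id: int) -> str:
    """Build the URL for creating a service principal OAuth secret."""
    return (
        f"{account_host}/api/2.0/accounts/{account_id}"
        f"/servicePrincipals/{sp_numeric_id}/credentials/secrets"
    )

url = sp_secret_endpoint(
    "https://accounts.cloud.databricks.com",  # AWS account console host
    "11111111-2222-3333-4444-555555555555",   # hypothetical account id
    123456,                                    # numeric id, NOT the client_id
)
print(url)
```

If the Lambda passes the client_id UUID where the numeric id belongs, "Invalid service principal id" is exactly the kind of error you would expect.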
jay-cunningham
by > New Contributor
  • 3190 Views
  • 1 reply
  • 0 kudos

Is there a way to prevent databricks-connect from installing a global IPython Spark startup script?

I'm currently using databricks-connect through VS Code on MacOS. However, this seems to install (and re-install upon deletion) an IPython startup script which initializes a SparkSession. This is fine as far as it goes, except that this script is *glo...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

Databricks Connect on MacOS (and some other platforms) adds a file to the global IPython startup folder, which causes every new IPython session—including those outside the Databricks environment—to attempt loading this SparkSession initialization. Th...

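The reply above refers to IPython's global startup folder. A small sketch for inspecting it follows; the path is IPython's documented default profile location, but the exact name of the databricks-connect script is version-dependent, so this just lists everything rather than hard-coding a file name.

```python
# Sketch: list IPython global startup scripts, one of which
# databricks-connect may have installed. ~/.ipython/profile_default/startup
# is IPython's standard per-profile startup directory.
from pathlib import Path

startup_dir = Path.home() / ".ipython" / "profile_default" / "startup"
scripts = sorted(startup_dir.glob("*.py")) if startup_dir.exists() else []
for script in scripts:
    print(script.name)
```

One workaround is running non-Databricks work under a separate IPython profile (`ipython --profile=plain`), since startup scripts are per-profile rather than truly global.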
ramisinghl01
by > New Contributor
  • 4003 Views
  • 1 reply
  • 0 kudos

PYTEST: Module not found error

Hi, Apologies, as I am trying to use Pytest for the first time. I know this question has been raised before, and I went through previous answers, but the issue still exists. I am following Databricks and other articles using pytest. My structure is simple as -tests--co...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

Your issue with ModuleNotFoundError: No module named 'test_tran' when running pytest from a notebook is likely caused by how Python sets the module import paths and the current working directory inside Databricks notebooks (or similar environments). ...

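The root cause described above (the module's directory not being on `sys.path`) can be reproduced and fixed in a few lines. The file name `test_tran.py` comes from the error message in the question; everything else here is a self-contained toy, and in a real repo the idiomatic home for the path fix is a `conftest.py` at the repo root, which pytest imports before collecting tests.

```python
# Minimal reproduction of the fix: make the directory containing the
# module importable before pytest (or a notebook) tries to import it.
import sys
import tempfile
from pathlib import Path

workdir = Path(tempfile.mkdtemp())
(workdir / "test_tran.py").write_text("def transform(x):\n    return x + 1\n")

# Without this line, `import test_tran` raises ModuleNotFoundError,
# because the notebook's working directory is not on sys.path.
sys.path.insert(0, str(workdir))

import test_tran
print(test_tran.transform(41))  # 42
```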
Vanamaajay
by > New Contributor
  • 3508 Views
  • 1 reply
  • 0 kudos

CloudFormation Stack Failure: Custom::CreateWorkspace in CREATE_FAILED State

I am trying to create a workspace using AWS CloudFormation, but the stack fails with the following error: "The resource CreateWorkspace is in a CREATE_FAILED state. This Custom::CreateWorkspace resource is in a CREATE_FAILED state. Received response s...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

When a CloudFormation stack fails with “The resource CreateWorkspace is in a CREATE_FAILED state” for a Custom::CreateWorkspace resource, it typically means the Lambda or service backing the custom resource returned a FAILED signal to CloudFormation ...

akshaym0056
by > New Contributor
  • 3218 Views
  • 1 reply
  • 0 kudos

How to Define Constants at Bundle Level in Databricks Asset Bundles for Use in Notebooks?

I'm working with Databricks Asset Bundles and need to define constants at the bundle level based on the target environment. These constants will be used inside Databricks notebooks.For example, I want a constant gold_catalog to take different values ...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

There is currently no explicit, built-in mechanism in Databricks Asset Bundles (as of 2024) for directly defining global, environment-targeted constants at the bundle level that can be seamlessly accessed inside notebooks without using job or task pa...

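The usual workaround the reply alludes to is bundle variables with per-target overrides, forwarded to notebooks as job parameters. A hedged fragment follows; the target names and catalog values are hypothetical (only `gold_catalog` comes from the question), and the exact syntax should be checked against the current Asset Bundles docs.

```yaml
# databricks.yml (fragment): a bundle-level variable with per-target values
variables:
  gold_catalog:
    description: Catalog name for the gold layer
    default: dev_gold

targets:
  dev:
    variables:
      gold_catalog: dev_gold
  prod:
    variables:
      gold_catalog: prod_gold

# In a job task, forward the value to the notebook as a parameter:
#   base_parameters:
#     gold_catalog: ${var.gold_catalog}
# and read it inside the notebook with dbutils.widgets.get("gold_catalog").
```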
Aran_
by > New Contributor
  • 70 Views
  • 1 reply
  • 1 kudos

Hitting Free Tier Daily Limit During Hackathon - Unable to Continue Work!

Hi Databricks Team, I’m currently participating in the Databricks Free Edition Hackathon, but I’m running into an issue where I can no longer run my SQL Warehouse or Serverless cluster. Each time I try to start the Serverless Starter Warehouse or execu...

Latest Reply
Louis_Frolio
Databricks Employee
  • 1 kudos

Unfortunately, there’s not much that can be done if you’ve hit the daily limits. Wishing you the best of luck moving forward. Cheers, Louis.

dndeng
by > New Contributor II
  • 38 Views
  • 1 reply
  • 0 kudos

Downloading the Training Material / Slides

I have started two courses on https://customer-academy.databricks.com/learn/courses/, for example: https://customer-academy.databricks.com/learn/courses/2469/get-started-with-databricks-for-data-engineering/lessons/38169/demo-creating-and-working-with...

Latest Reply
Advika
Databricks Employee
  • 0 kudos

Hello @dndeng! Databricks no longer provides notebooks or other downloadable materials with Academy courses. This change helps prevent outdated content from circulating and ensures compatibility within the lab environment. You can access the lab mate...

Naveenkumar1811
by > New Contributor
  • 105 Views
  • 2 replies
  • 0 kudos

Compilation Failing with Scala SBT build to be used in Databricks

Hi, We have a Scala JAR built with sbt which is used in Databricks jobs to readStream data from Kafka... We are enhancing the from_avro function like below... def deserializeAvro(    topic: String,    client: CachedSchemaRegistryClient,    sc: SparkConte...

Latest Reply
Naveenkumar1811
New Contributor
  • 0 kudos

Thanks for the update, Louis. As we are planning to sync all our notebooks from Scala to PySpark, we are in the process of converting the code. I think adding the additional dependency of ABRiS or Adobe’s spark-avro with Schema Registry support will tak...

1 More Replies
override
by > New Contributor
  • 91 Views
  • 2 replies
  • 0 kudos

Academy Labs - "Included" courses free of charge?

So, if you purchase an Academy Labs subscription for a year, are all the courses that are marked as "Included" free of charge? When searching on the Academy Labs page, they are marked as "Included", but there's a price of $75 (for example) on each.

Latest Reply
dndeng
New Contributor II
  • 0 kudos

I have enrolled into https://customer-academy.databricks.com/learn/courses/2469/get-started-with-databricks-for-data-engineering/lessons/38169/demo-creating-and-working-with-a-delta-table, which includes a lab demo. Will I have access to the demo mate...

1 More Replies
Charuvil
by > New Contributor III
  • 44 Views
  • 2 replies
  • 0 kudos

How to tag/ cost track Databricks Data Profiling?

We recently started using the Data Profiling / Lakehouse Monitoring feature from Databricks: https://learn.microsoft.com/en-us/azure/databricks/data-quality-monitoring/data-profiling/. Data Profiling uses serverless compute for running the profilin...

Latest Reply
Charuvil
New Contributor III
  • 0 kudos

Hi @szymon_dybczak, Thanks for the quick reply. But it seems serverless budget policies cannot be applied to data profiling/monitoring jobs (https://learn.microsoft.com/en-us/azure/databricks/data-quality-monitoring/data-profiling/). Serverless budget po...

1 More Replies
Nisha_Tech
by > New Contributor II
  • 48 Views
  • 1 reply
  • 1 kudos

Wheel File name is changed after using Databricks Asset Bundle Deployment on Github Actions

Hi Team, I am deploying to the Databricks workspace using GitHub and DAB. I have noticed that during deployment, the wheel file name is converted to all lowercase letters (e.g., pyTestReportv2.whl becomes pytestreportv2.whl). This issue does not...

Latest Reply
Charuvil
New Contributor III
  • 1 kudos

Hi @Nisha_Tech, It seems like a Git issue rather than Databricks or DAB. There is a Git configuration parameter that decides the upper/lower case of the deployed file names. Please refer here: https://github.com/desktop/desktop/issues/2672#issuecomme...

LahariVelpula18
by > New Contributor
  • 40 Views
  • 1 reply
  • 0 kudos

Subject: Not Received Discount Coupon After Completing Learning Path

Dear Databricks Community Team, I hope you are doing well. I wanted to inform you that I have successfully completed all the required learning path modules between October 8th and October 31st as part of the Databricks certification discount program. I...

Latest Reply
Advika
Databricks Employee
  • 0 kudos

Hello @LahariVelpula18! Please refer to the Virtual Learning Festival – October 2025 Update for details regarding voucher distribution. It includes the latest information and next steps.

saurav_sinha16
by > New Contributor II
  • 89 Views
  • 3 replies
  • 2 kudos

Completed Self-Paced Training – Haven’t Received 50% Certification Voucher in Oct end 2025

Hi Team, I completed a self-paced training course during the Databricks Virtual Learning Festival (October 2025) via the Databricks Academy. I understand that a 50% certification voucher is provided upon completion, but I haven’t received it yet. #Dat...

Latest Reply
saurav_sinha16
New Contributor II
  • 2 kudos

It's the end of the first week of November and I still haven't received the voucher, even though it was mentioned that I would get it in early November.

2 More Replies