Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

unj1m
by New Contributor III
  • 9012 Views
  • 4 replies
  • 0 kudos

Resolved! What version of Python is used for the 16.1 runtime

I'm trying to create a Spark UDF for a registered model and getting: Exception: Python versions in the Spark Connect client and server are different. To execute user-defined functions, client and server should have the same minor Python version. Pleas...

Latest Reply
AndriusVitkausk
New Contributor III
  • 0 kudos

Does this mean that:
1. A new dbx runtime comes out
2. Serverless compute automatically switches to the new runtime + new Python version
3. Any external environments that use serverless, i.e. local VSCode / CICD environments, also need to upgrade their pyt...

3 More Replies
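The exception above boils down to a minor-version check between the Spark Connect client and the server. A minimal sketch of that check in plain Python (the version strings are illustrative; DBR 16.1 ships Python 3.12 per its release notes, so a local 3.11 client cannot register UDFs against it):

```python
import sys

def minor_version(version: str) -> tuple:
    """Return the (major, minor) part of a version string like '3.11.4'."""
    major, minor = version.split(".")[:2]
    return (int(major), int(minor))

def compatible(client: str, server: str) -> bool:
    """Spark Connect requires client and server to share the same minor Python version."""
    return minor_version(client) == minor_version(server)

local = f"{sys.version_info.major}.{sys.version_info.minor}"  # your client's version
print(compatible("3.12.3", "3.12.8"))  # True  - same minor version, UDFs work
print(compatible("3.11.9", "3.12.8"))  # False - the exception in the post
```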
nikhil_2212
by New Contributor II
  • 742 Views
  • 1 reply
  • 0 kudos

Lakehouse monitoring metrics tables not created automatically

Hello, I have an external table created in a Databricks Unity Catalog workspace and am trying to "Create a monitor" for it from the Quality tab. While creating it, the dashboard gets created; however, the two metrics tables "profile" & "drift" a...

Latest Reply
Advika
Databricks Employee
  • 0 kudos

Hello @nikhil_2212! It looks like this post duplicates the one you recently posted. A response has already been provided to the Original post. I recommend continuing the discussion in that thread to keep the conversation focused and organised.

VijayP
by New Contributor
  • 696 Views
  • 1 reply
  • 0 kudos

Stream processing large number of JSON files and handling exception

The application writes several small JSON files, and the expected volumes of these files are high (estimate: 1 million during the peak season in an hourly window). As per the current design, these files are streamed through Spark Streaming and we use autolo...

Latest Reply
cgrant
Databricks Employee
  • 0 kudos

We have customers that read millions of files per hour+ using Databricks Auto Loader. For high-volume use cases, we recommend enabling file notification mode, which, instead of continuously performing list operations on the filesystem, uses cloud nat...

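The file notification mode mentioned in the reply is enabled with a single Auto Loader option. A sketch, assuming a Databricks notebook where `spark` is already defined; the paths are placeholders:

```python
# Auto Loader with file notification mode: instead of listing the input
# directory on every micro-batch, it subscribes to cloud-native notification
# events (e.g. S3 -> SNS/SQS), which scales much better for millions of files.
df = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.useNotifications", "true")   # switch from directory listing
    .option("cloudFiles.schemaLocation", "/Volumes/main/default/checkpoints/schema")
    .load("/Volumes/main/default/landing/")
)
```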
Pooviond
by New Contributor
  • 1034 Views
  • 1 reply
  • 0 kudos

Urgent: Need Authentication Reset for Databricks Workspace Access

I am unable to access my Databricks workspace because it is still redirecting to Microsoft Entra ID (Azure AD) authentication, even after I have removed the Azure AD enterprise application and changed the AWS IAM Identity Center settings. Issue Detail...

Latest Reply
Advika
Databricks Employee
  • 0 kudos

Hello @Pooviond! Please submit a ticket with the Databricks Support team for assistance in resolving this issue.

mrstevegross
by Contributor III
  • 4231 Views
  • 4 replies
  • 1 kudos

Resolved! How best to measure the time-spent-waiting-for-an-instance?

I'm exploring using an instance pool. Can someone clarify for me which job event log tells me the time-spent-waiting-for-an-instance? I've found 2 candidates:
1. The delta between "waitingForCluster" and "started" on the "run events" log, accessible v...

Latest Reply
julieAnderson
New Contributor II
  • 1 kudos

 System Logs or Event Timings

3 More Replies
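For candidate 1 in the question, the wait is just the delta between the two event timestamps. A small sketch with hypothetical ISO-8601 timestamps in the shape of the run-events log:

```python
from datetime import datetime

def wait_seconds(events: dict) -> float:
    """Time between the 'waitingForCluster' and 'started' events (ISO-8601 strings)."""
    waiting = datetime.fromisoformat(events["waitingForCluster"])
    started = datetime.fromisoformat(events["started"])
    return (started - waiting).total_seconds()

run_events = {  # hypothetical timestamps in the shape of the run-events log
    "waitingForCluster": "2025-03-12T17:30:26+00:00",
    "started": "2025-03-12T17:33:10+00:00",
}
print(wait_seconds(run_events))  # 164.0
```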
Forssen
by New Contributor II
  • 1409 Views
  • 2 replies
  • 1 kudos

Resolved! When is it time to change from ETL in notebooks to whl/py?

Hi! I would like some input/tips from the community regarding when it is time to go from a working solution in notebooks to something more "stable", like whl/py files. What are the pros/cons of notebooks compared to whl/py? The way I structured things...

Latest Reply
Isi
Honored Contributor III
  • 1 kudos

Hey @Forssen, my advice: using .py files and .whl packages is generally more secure and scalable, especially when working in a team. One of the key advantages is that code reviews and version control are much more efficient with .py files, as changes ...

1 More Replies
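One concrete pro of the .py route from the reply above, sketched: logic extracted from notebook cells into plain functions can be imported by both a notebook and a wheel entry point, and unit-tested outside Databricks. All names below are made up for illustration:

```python
# etl/transforms.py - a sketch of moving notebook logic into an importable module.
# Plain functions are unit-testable and reusable from a notebook (via `import`)
# or from a packaged .whl job entry point.

def normalise_country(code: str) -> str:
    """Example transformation extracted from a notebook cell."""
    return code.strip().upper()

def run(rows: list) -> list:
    """Entry point a wheel task could call; rows stand in for a DataFrame here."""
    return [{**r, "country": normalise_country(r["country"])} for r in rows]

print(run([{"id": 1, "country": " se "}]))  # [{'id': 1, 'country': 'SE'}]
```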
hbs59
by New Contributor III
  • 8811 Views
  • 7 replies
  • 2 kudos

Resolved! Move multiple notebooks at the same time (programmatically)

If I want to move multiple (hundreds of) notebooks at the same time from one folder to another, what is the best way to do that, other than going to each individual notebook and clicking "Move"? Is there a way to programmatically move notebooks? Like ...

Latest Reply
Walter_C
Databricks Employee
  • 2 kudos

You can use the export and import API calls to export this notebook to your local machine and then import it into the new workspace.
Export: https://docs.databricks.com/api/workspace/workspace/export
Import: https://docs.databricks.com/api/works...

6 More Replies
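A rough sketch of scripting the move with the Workspace API endpoints linked above. The host and token are placeholders, and the network helper is only outlined; the path-mapping step is the part worth getting right:

```python
import json
import urllib.request

HOST = "https://<workspace-url>"    # placeholder
TOKEN = "<personal-access-token>"   # placeholder

def target_path(src: str, src_root: str, dst_root: str) -> str:
    """Map a notebook path under src_root to the same relative path under dst_root."""
    return dst_root.rstrip("/") + "/" + src[len(src_root):].lstrip("/")

def workspace_call(endpoint: str, payload: dict) -> dict:
    """Minimal JSON POST helper for /api/2.0/workspace endpoints.
    Note: per the linked docs, list/export are GET endpoints - adapt accordingly."""
    req = urllib.request.Request(
        f"{HOST}/api/2.0/workspace/{endpoint}",
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Outline: list notebooks under the source folder, export each one in SOURCE
# format, import it at target_path(...), then delete the original.
print(target_path("/Old/etl/daily", "/Old", "/Archive"))  # /Archive/etl/daily
```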
LasseL
by New Contributor III
  • 3304 Views
  • 1 reply
  • 0 kudos

Resolved! Deduplication with rocksdb, should old state files be deleted manually (to manage storage size)?

Hi, I have the following streaming setup: I want to remove duplicates in streaming.
1) deduplication strategy is defined by two fields: extraction_timestamp and hash (row-wise hash)
2) watermark strategy: extraction_timestamp with "10 seconds" interval
--> R...

Latest Reply
LasseL
New Contributor III
  • 0 kudos

Found the solution: https://kb.databricks.com/streaming/how-to-efficiently-manage-state-store-files-in-apache-spark-streaming-applications <-- these two parameters.

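For readers landing here, the setup described in the question looks roughly like this in PySpark (column names taken from the post; `df` is assumed to be the streaming source DataFrame). The two state-retention parameters themselves are in the KB article linked in the reply:

```python
# Sketch of watermark-based streaming deduplication on the two fields
# from the post; state older than the watermark is eventually dropped.
deduped = (
    df.withWatermark("extraction_timestamp", "10 seconds")
      .dropDuplicates(["extraction_timestamp", "hash"])
)
# On Spark 3.5+, dropDuplicatesWithinWatermark(["hash"]) is an alternative
# that bounds state size by the watermark without keying on the timestamp.
```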
IGRACH
by New Contributor III
  • 1641 Views
  • 6 replies
  • 2 kudos

Disable exiting current cell when moving around with keyboard arrows

Is there any way to disable exiting the current cell when I move the cursor around with arrows? When I press the up or down arrow it will exit the current cell and go to another cell. Can that functionality be disabled so that when I hold the up or down arrow key, c...

Latest Reply
IGRACH
New Contributor III
  • 2 kudos

Is there any place where I can put this as a request?

5 More Replies
cony2025
by New Contributor II
  • 5024 Views
  • 4 replies
  • 0 kudos

Use Python notebook to read data from Databricks

I'm very new to Databricks. I hope this is the right place to ask this question. I want to use PySpark in a notebook to read data from a Databricks database with the below code. databricks_host = "adb-xxxx.azuredatabricks.net" http_path = "/sql/1.0/w...

Latest Reply
dna1
New Contributor II
  • 0 kudos

I would try changing the query to something like the following; it should return the column names in the table, so you can see if the JDBC call is actually returning the data correctly: SELECT * FROM wphub_poc.gold.v_d_building LIMIT 10

3 More Replies
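If the goal is just to run SQL against a warehouse from a notebook or local Python (the `http_path` in the post suggests a SQL warehouse), the `databricks-sql-connector` package is a common route. A sketch with placeholder credentials, not runnable without a live workspace:

```python
# pip install databricks-sql-connector
# Host, http_path, and token below are placeholders from the post.
from databricks import sql

with sql.connect(
    server_hostname="adb-xxxx.azuredatabricks.net",
    http_path="/sql/1.0/warehouses/<warehouse-id>",
    access_token="<personal-access-token>",
) as conn:
    with conn.cursor() as cursor:
        cursor.execute("SELECT * FROM wphub_poc.gold.v_d_building LIMIT 10")
        for row in cursor.fetchall():
            print(row)
```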
robbdunlap
by New Contributor III
  • 5547 Views
  • 8 replies
  • 22 kudos

Turn Off Auto-reveal of Navigation Sidebar

I work with the navigation sidebar closed and use the stacked hamburgers symbol in the upper left to reveal it when I want. Now, if you mouse over the left edge of the browser window too slowly it will auto-reveal the navigation sidebar. I do not wan...

Latest Reply
Advika
Databricks Employee
  • 22 kudos

I've checked with the team, and there's no way to turn this off. However, they are making adjustments to improve the experience, and a fix to refine the sidebar functionality is on the way.

7 More Replies
rgcabusas
by New Contributor II
  • 3592 Views
  • 3 replies
  • 0 kudos

Virtual Learning Festival Enrollment

Hi everyone, I tried to enroll in the Virtual Learning Festival: 9 April - 30 April, but upon clicking the Customers & Prospects link for LEARNING PATHWAY 1: ASSOCIATE DATA ENGINEERING I got an error (refer to the attached image). Thank you in advance for the hel...

Latest Reply
rgcabusas
New Contributor II
  • 0 kudos

Hi @Advika, refer to the attached image. I thought it was attached to my question. I'm really new here.

2 More Replies
Master_DataBric
by New Contributor II
  • 575 Views
  • 1 reply
  • 0 kudos

Expectation in DLT using multiple columns

Is it possible to define an expectation in a DLT pipeline using multiple columns? For example, my source has two fields: Division and Material_Number. For division 20, the material number starts with 5; for division 30, it starts with 9. Can we have this ...

Latest Reply
ashraf1395
Honored Contributor
  • 0 kudos

Hi @Master_DataBric, yes, it's possible. Here are the doc links:
- https://docs.databricks.com/aws/en/dlt/expectations?language=Python
- https://docs.databricks.com/aws/en/dlt/expectations?language=SQL

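To make the linked docs concrete: an expectation takes a single boolean SQL expression, so a multi-column rule is just one predicate referencing both columns. Below, the rule from the question as the SQL string you would pass to `@dlt.expect`, plus a plain-Python equivalent to sanity-check the logic (the exact column handling is illustrative):

```python
# The rule from the question as one SQL predicate, usable in
# @dlt.expect("division_material_rule", RULE_SQL) or a SQL
# `CONSTRAINT division_material_rule EXPECT (...)` clause:
RULE_SQL = (
    "(Division <> 20 OR Material_Number LIKE '5%') AND "
    "(Division <> 30 OR Material_Number LIKE '9%')"
)

def rule(division: int, material_number: str) -> bool:
    """Plain-Python equivalent of RULE_SQL for sanity-checking."""
    return (division != 20 or material_number.startswith("5")) and \
           (division != 30 or material_number.startswith("9"))

print(rule(20, "5001"))  # True  - division 20, material starts with 5
print(rule(20, "9001"))  # False - violates the division-20 rule
print(rule(40, "1234"))  # True  - no rule for other divisions
```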
TravisBrowne
by New Contributor II
  • 7125 Views
  • 2 replies
  • 1 kudos

POC Comparison: Databricks vs AWS EMR

Hello, I need some assistance with a comparison between Databricks and AWS EMR. We've been evaluating the Databricks Data Intelligence Platform for a client and found it to be significantly more expensive than AWS EMR. I understand the challenge in ma...

Latest Reply
sandeepmankikar
Contributor
  • 1 kudos

Databricks is highly optimized for Delta, which leverages columnar storage, indexing, and caching for better performance. Instead of directly processing CSV files, convert them to Delta first, then perform aggregations and joins, and see if this helps.

1 More Replies
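The advice above in sketch form, assuming a Databricks notebook where `spark` is defined; the input path and table name are placeholders:

```python
# One-time conversion: read the raw CSV, write it out as a Delta table, then
# run the heavy aggregations/joins against the Delta copy instead of the CSVs.
(spark.read
      .option("header", "true")
      .csv("s3://<bucket>/raw/events/")          # placeholder input path
      .write.format("delta")
      .mode("overwrite")
      .saveAsTable("main.analytics.events"))     # placeholder table name

result = spark.sql(
    "SELECT region, COUNT(*) AS n FROM main.analytics.events GROUP BY region"
)
```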
