Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

vijaypodili
by New Contributor III
  • 353 Views
  • 3 replies
  • 2 kudos

Resolved! Azure Databricks learning tutorials: ADB+SQL, ADB+PySpark, ADB+Python

Please suggest the best learning tutorials for Azure Databricks in combination with PySpark, Python, and SQL. Are there any web-based learning tutorials from Databricks? Please recommend the best one, from scratch to advanced.

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 2 kudos

Hi @vijaypodili, I can recommend the Data Engineering Learning Path on Databricks Academy: https://customer-academy.databricks.com/ On Udemy, there's an excellent course that covers all the important aspects of working with Databricks on a daily basis: Data...

2 More Replies
EllieFarrell
by New Contributor II
  • 267 Views
  • 2 replies
  • 4 kudos

Resolved! Unexpected Script Execution Differences on databricks.com vs Mobile-Triggered Runtimes

I’m noticing some unusual inconsistencies in how scripts execute on databricks.com compared to when the same workflow is triggered through a mobile-based API. On Databricks, the script runs perfectly when executed directly inside a cluster notebook. ...

Latest Reply
bianca_unifeye
Contributor
  • 4 kudos

Hi Ellie, what you're seeing is actually quite common: the same script can behave slightly differently when run interactively in a notebook on a cluster vs. run as a job / via API trigger (or from a mobile wrapper hitting that API). It's usually not ...

1 More Replies
tts
by New Contributor III
  • 3614 Views
  • 9 replies
  • 0 kudos

Resolved! Programmatic selection of the environment version for serverless compute for notebooks

Hello, I have a case where I am executing notebooks from an external system using the Databricks API /api/2.2/jobs/runs/submit. This has always been non-problematic with job compute, but due to the quite recent serverless for notebooks support being i...

Latest Reply
toby_chu
New Contributor II
  • 0 kudos

Not so sure about the general case, but in eu-west-3 we could specify the serverless environment version using DAB, via the `environments` block and `spec` params: resources: jobs: pipeline: name: "[${bundle.target}]pipeline" webhook_not...
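For reference, a minimal sketch of what the API-side equivalent might look like, assuming the Jobs 2.2 runs/submit payload accepts an `environments` list whose `spec` carries the serverless environment version and that the notebook task references it via `environment_key` (field names are taken from the thread and the DAB snippet above; verify them against the Jobs API reference for your workspace):

```python
# Hedged sketch: submit a one-time run on serverless compute and pin the environment
# version via the `environments` block. `spec.client` and `environment_key` are
# assumptions; the notebook path and version value are illustrative.
import os
import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-123456789.azuredatabricks.net
token = os.environ["DATABRICKS_TOKEN"]

payload = {
    "run_name": "serverless-notebook-run",
    "environments": [
        {"environment_key": "default", "spec": {"client": "2"}}  # assumed version field
    ],
    "tasks": [
        {
            "task_key": "run_notebook",
            "environment_key": "default",  # assumed: ties the task to the environment above
            "notebook_task": {"notebook_path": "/Workspace/Users/me/my_notebook"},
        }
    ],
}

resp = requests.post(
    f"{host}/api/2.2/jobs/runs/submit",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # response includes the run_id of the submitted run
```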

8 More Replies
Phani1
by Databricks MVP
  • 1821 Views
  • 2 replies
  • 0 kudos

Informatica jobs from Databricks

Hi Team, how can we call Informatica jobs from Databricks? Could you please advise on this? Regards, Phanindra

Latest Reply
Raman_Unifeye
Contributor III
  • 0 kudos

Unsure how the above answer helps here, @Phani1. The only way I can think of is to call Informatica jobs (specifically Informatica Cloud Data Integration (CDI) mappings or tasks) from Databricks by leveraging REST APIs. Direct API: trigger an Informatic...
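As a rough illustration of that REST-based approach, a hedged sketch of triggering an Informatica CDI task from a Databricks notebook; the base URL, endpoint path, session handling, and payload fields below are hypothetical placeholders and need to be checked against the Informatica Intelligent Cloud Services (IICS) REST API documentation for your environment:

```python
# Hedged sketch: call an Informatica CDI task from Databricks over REST.
# All endpoint details and field names here are hypothetical placeholders.
import requests

IICS_BASE_URL = "https://<your-pod>.informaticacloud.com"  # hypothetical placeholder
SESSION_ID = "<session-id-from-login-call>"                 # obtained via the IICS login API

resp = requests.post(
    f"{IICS_BASE_URL}/api/v2/job",                          # hypothetical job-trigger endpoint
    headers={"icSessionId": SESSION_ID, "Content-Type": "application/json"},
    json={"taskName": "my_cdi_mapping_task", "taskType": "MTT"},  # hypothetical identifiers
    timeout=60,
)
resp.raise_for_status()
print(resp.json())  # typically returns a run/job id you can poll for status
```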

1 More Replies
anabel0
by New Contributor II
  • 237 Views
  • 2 replies
  • 0 kudos

Databricks Java SDK retrieving job task values

Greetings, I have a Job that consists of notebook tasks running Python code. Some of the tasks set task values using dbutils.jobs.taskValues.set(key=key, value=value), as described here. How do I retrieve those task values using the Databricks Java SDK v0.69.0...

Latest Reply
stbjelcevic
Databricks Employee
  • 0 kudos

Unfortunately you can’t read dbutils.jobs.taskValues directly via the Java SDK. But you could return a JSON payload via dbutils.notebook.exit and read it with getRunOutput. Databricks exposes the notebook “exit” result through Jobs GetRunOutput, then...
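A minimal sketch of the notebook side of this workaround, assuming the values to pass back are JSON-serializable; the keys are illustrative and `dbutils` is only available inside a Databricks notebook:

```python
# Hedged sketch: return values through the notebook exit result instead of task values,
# so an external caller (e.g. the Java SDK's get-run-output call) can read them.
import json

result = {"row_count": 42, "status": "ok"}  # illustrative values

# Task values remain available to downstream tasks in the same job run...
dbutils.jobs.taskValues.set(key="status", value=result["status"])

# ...while the JSON string returned here is surfaced in the run's notebook output,
# which the Jobs get-run-output API exposes to SDK clients.
dbutils.notebook.exit(json.dumps(result))
```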

1 More Replies
AntonisCh
by New Contributor II
  • 317 Views
  • 5 replies
  • 6 kudos

Resolved! Synchronising metadata (e.g., tags) across schemas under Unity Catalog (Azure)

Hello all, I hope you are doing great! I want to synchronise metadata (e.g., description, comments, tags) across schemas under Unity Catalog (e.g., test.dev, test.uat). For example, under the schema test.dev, there is a sales table with multiple co...
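For context on what such a sync could look like, a hedged sketch that copies column tags from one schema to another by reading Unity Catalog's information_schema and replaying them as SET TAGS statements; the catalog, schema, table, and column names are illustrative and assume matching tables already exist in both schemas:

```python
# Hedged sketch: replicate column tags from test.dev.sales to test.uat.sales.
src_catalog, src_schema, tgt_schema, table = "test", "dev", "uat", "sales"

# Read the tags currently set on the source table's columns.
tags = spark.sql(f"""
    SELECT column_name, tag_name, tag_value
    FROM {src_catalog}.information_schema.column_tags
    WHERE schema_name = '{src_schema}' AND table_name = '{table}'
""").collect()

# Re-apply each tag on the corresponding column in the target schema.
for row in tags:
    spark.sql(
        f"ALTER TABLE {src_catalog}.{tgt_schema}.{table} "
        f"ALTER COLUMN {row.column_name} SET TAGS ('{row.tag_name}' = '{row.tag_value}')"
    )
```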

Latest Reply
AntonisCh
New Contributor II
  • 6 kudos

It's completely fine, and I do understand. Thank you for your time and effort here! 

4 More Replies
Ritesh-Dhumne
by New Contributor III
  • 242 Views
  • 3 replies
  • 1 kudos

Resolved! Databricks Scenarios

I’m a data engineer with some experience in Databricks. I’m looking for real-life scenarios that are commonly encountered by data engineers. Could you also provide details on how to implement these scenarios?

Latest Reply
Raman_Unifeye
Contributor III
  • 1 kudos

Generic topic. Here are a few recent articles to help you with this: https://community.databricks.com/t5/get-started-guides/getting-started-with-databricks-build-a-simple-lakehouse/tac-p/139492#M29 https://community.databricks.com/t5/announcements/big-book-o...

2 More Replies
ShaneCorn
by Contributor
  • 174 Views
  • 2 replies
  • 1 kudos

What are the best ways to implement transcription in podcast apps?

I am starting this discussion for everyone who can answer my query.

Latest Reply
nayan_wylde
Esteemed Contributor
  • 1 kudos

1. Use Speech-to-Text Models via MLflow: integrate open-source models like OpenAI Whisper, Hugging Face Wav2Vec2, or the AssemblyAI API. Log the model in MLflow for versioning and reproducibility. Deploy it as a Databricks Model Serving endpoint for real-time t...
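A minimal sketch of that first suggestion, assuming MLflow 2.x and the Hugging Face transformers library (plus its audio dependencies) are installed on the cluster; the model name and audio path are illustrative:

```python
# Hedged sketch: wrap an open-source speech-to-text model (Whisper via transformers)
# and log it to MLflow so it can be versioned and later served.
import mlflow
from transformers import pipeline

asr = pipeline("automatic-speech-recognition", model="openai/whisper-small")

# Quick local check: transcribe a short sample clip (illustrative path).
print(asr("/dbfs/tmp/podcast_clip.wav")["text"])

# Log the pipeline to MLflow; it can then be registered and deployed behind a
# Databricks Model Serving endpoint for real-time transcription.
with mlflow.start_run():
    mlflow.transformers.log_model(
        transformers_model=asr,
        artifact_path="whisper_asr",
    )
```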

1 More Replies
Dorothy80Galvin
by New Contributor II
  • 2826 Views
  • 4 replies
  • 1 kudos

How can I Resolve QB Desktop Update Error 15225?

I'm encountering QB Desktop update error 15225. What could be causing this issue, and how can I resolve it? It's disrupting my workflow, and I need a quick fix.

Latest Reply
jamessmith11
New Contributor II
  • 1 kudos

If you're seeing Update Error 15225, don’t worry — it’s usually fixable. First, check that your internet connection is stable and make sure your computer’s date and time are correct. Then, open Internet Options and verify that SSL settings are turned...

3 More Replies
bianca_unifeye
by Contributor
  • 186 Views
  • 1 reply
  • 2 kudos

Databricks One Lake

Microsoft Ignite always brings exciting updates, but the real question is: what do these announcements actually mean for the business, not just for technology teams? That's exactly what this article is about. I'm breaking down the new Databricks–OneLa...

Latest Reply
Raman_Unifeye
Contributor III
  • 2 kudos

Great article @bianca_unifeye! The move is certainly going to build and unify the governance bridge between Azure Databricks and OneLake.

surajitDE
by New Contributor III
  • 210 Views
  • 1 reply
  • 0 kudos

Resolved! Databricks Dashboard Issue: No Mouse-Based Navigation When Dashboard Tabs Exceed the Top Ribbon

When dashboards have many pages, the top tab bar overflows and can't be navigated using the mouse. Only the left and right keyboard arrows work, which is slow, inconvenient, and not user friendly. Expected: the ability to scroll tabs with the mouse, e.g....

Latest Reply
Advika
Community Manager
  • 0 kudos

Hello @surajitDE! You can use the horizontal scroll bar to navigate through the dashboard pages. If you're on a trackpad, you can simply scroll horizontally. If you're using a mouse, you can: hold Shift and scroll with the mouse wheel (easiest), or dra...

esistfred
by New Contributor III
  • 3272 Views
  • 4 replies
  • 6 kudos

Resolved! How to use variable-overrides.json for environment-specific configuration in Asset Bundles?

Hi all, could someone clarify the intended usage of the variable-overrides.json file in Databricks Asset Bundles? Let me give some context. Let's say my repository layout looks like this: databricks/ ├── notebooks/ │ └── notebook.ipynb ├── resources/ ...

Latest Reply
esistfred
New Contributor III
  • 6 kudos

It does. Thanks for the response. I also continued playing around with it and found a way using the variable-overrides.json file. I'll leave it here just in case anyone is interested: Repository layout: databricks/ ├── notebooks/ │ └── notebook.ipynb ...

3 More Replies
chris0991
by New Contributor III
  • 2348 Views
  • 4 replies
  • 1 kudos

Best practices for optimizing Spark jobs

What are some best practices for optimizing Spark jobs in Databricks, especially when dealing with large datasets? Any tips or resources would be greatly appreciated! I'm trying to analyze data on restaurant menu prices, so insights would be especiall...

Latest Reply
Coffee77
Contributor III
  • 1 kudos

In addition to the cool comments above, try to use clusters with VMs enabled for disk caching as well. This caches data at the Parquet file level in VM local storage, acting as a great complement to Spark caching.
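For illustration, a short sketch of that tip using the documented `spark.databricks.io.cache.enabled` setting; the table name is illustrative and `spark` is the notebook's built-in session:

```python
# Enable the Databricks disk cache (effective on instance types with local SSDs).
# This caches remote Parquet data on local disk and complements, rather than
# replaces, DataFrame .cache()/.persist().
spark.conf.set("spark.databricks.io.cache.enabled", "true")

df = spark.read.table("restaurant.menu_prices")  # illustrative table
df.filter("city = 'London'").groupBy("item").avg("price").show()
# Repeated scans of the same files are now served from the local disk cache.
```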

3 More Replies
bianca_unifeye
by Contributor
  • 189 Views
  • 0 replies
  • 1 kudos

Agent Bricks Webinar

Our Databricks x Unifeye Meetup community just hit 150 members!  A huge milestone, especially considering we’ve consistently had 50+ people joining every webinar. The momentum is real, and the audience keeps growing! This week, we’re taking it one s...

CookDataSol
by New Contributor II
  • 287 Views
  • 2 replies
  • 1 kudos

Resolved! SQL cell v spark.sql in notebooks

I am fairly new to Databricks, and indeed Python, so apologies if this has been answered elsewhere but I've been unable to find it. I have been mainly working in notebooks as opposed to the SQL editor, but coding in SQL where possible using SQL cells ...
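For readers comparing the two styles discussed in this thread, a hedged illustration: a SQL cell and spark.sql() run the same query through the same engine, with spark.sql() returning a DataFrame you can keep working with in Python (the table name is illustrative):

```python
# spark.sql() returns a DataFrame, so the result flows straight into Python code.
monthly = spark.sql("""
    SELECT month, SUM(amount) AS total
    FROM my_catalog.my_schema.sales   -- illustrative table
    GROUP BY month
""")
monthly.show()

# The equivalent SQL cell (in its own notebook cell):
# %sql
# SELECT month, SUM(amount) AS total
# FROM my_catalog.my_schema.sales
# GROUP BY month
```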

Latest Reply
CookDataSol
New Contributor II
  • 1 kudos

Thanks Louis, really good explanation and helpful examples!

1 More Replies
