Knowledge Sharing Hub
Dive into a collaborative space where members like YOU can exchange knowledge, tips, and best practices. Join the conversation today and unlock a wealth of collective wisdom to enhance your experience and drive success.

Forum Posts

SumitSingh
by Contributor
  • 3279 Views
  • 7 replies
  • 9 kudos

From Associate to Professional: My Learning Plan to ace all Databricks Data Engineer Certifications

In today’s data-driven world, the role of a data engineer is critical in designing and maintaining the infrastructure that allows for the efficient collection, storage, and analysis of large volumes of data. Databricks certifications hold significan...

Latest Reply
sandeepmankikar
New Contributor III
  • 9 kudos

As an additional tip for those working towards both the Associate and Professional certifications, I recommend avoiding a long gap between the two exams to maintain your momentum. If possible, try to schedule them back-to-back with just a few days in...

6 More Replies
Ajay-Pandey
by Esteemed Contributor III
  • 1033 Views
  • 0 replies
  • 0 kudos

Understanding Databricks Workspace IP Access List

What is a Databricks Workspace IP Access List? The Databricks Workspace IP Access List is a security feature that allows administrators to control access to the Databricks workspace by specifying which IP addresses or IP ranges are allowed or denied a...
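Such a rule is created through the Databricks IP Access Lists REST API (`POST /api/2.0/ip-access-lists`). The sketch below builds the request body that endpoint expects; the label and CIDR ranges are placeholders, and validation here uses the standard-library `ipaddress` module rather than anything Databricks-specific.

```python
import ipaddress
import json

def build_ip_access_list(label, list_type, cidrs):
    """Build the JSON body for POST /api/2.0/ip-access-lists.

    list_type is "ALLOW" or "BLOCK"; each entry in cidrs is validated
    as an IPv4/IPv6 address or network before being included.
    """
    if list_type not in ("ALLOW", "BLOCK"):
        raise ValueError("list_type must be ALLOW or BLOCK")
    # ipaddress.ip_network raises ValueError for malformed ranges
    validated = [str(ipaddress.ip_network(c)) for c in cidrs]
    return {"label": label, "list_type": list_type, "ip_addresses": validated}

# Placeholder office ranges; POST this body to /api/2.0/ip-access-lists
payload = build_ip_access_list("office-vpn", "ALLOW",
                               ["203.0.113.0/24", "198.51.100.7/32"])
print(json.dumps(payload))
```

Note that the list only takes effect once the workspace-level setting that enforces IP access lists is enabled by an admin.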

Knowledge Sharing Hub
Databricks
Databricks Workspace
IP Access List
Security
jirukulapati
by New Contributor III
  • 1331 Views
  • 5 replies
  • 1 kudos

Databricks App Availability

Hi there, I recently came across this post about Databricks Apps that says it is available for public preview: https://www.databricks.com/blog/introducing-databricks-apps However, when I go to Previews in the workspace, I don't see an option to enable it, i...

Latest Reply
jirukulapati
New Contributor III
  • 1 kudos

Hi there, just following up: does anyone know when Apps will be supported in Southeast Asia?

4 More Replies
DavidOBrien
by New Contributor
  • 6499 Views
  • 3 replies
  • 0 kudos

Editing value of widget parameter within notebook code

I have a notebook with a text widget where I want to be able to edit the value of the widget within the notebook and then reference it in SQL code. For example, assuming there is a text widget named Var1 that has input value "Hello", I would want to ...

Latest Reply
anardinelli
Databricks Employee
  • 0 kudos

Hi @DavidOBrien, how are you? You can try the following approach:

```python
# Get the current value of the widget
current_value = dbutils.widgets.get("widget_name")

# Append the new value to the current value
new_value = current_value + "appended_value"

# Se...
```
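Since `dbutils` only exists inside a Databricks runtime, the read-append-recreate cycle can be illustrated with a minimal in-memory stand-in; the class below and the widget values are illustrative only. Note that `dbutils.widgets` has no `set` method, so the usual pattern for overwriting a value is `remove` followed by `text` with the new default.

```python
class _Widgets:
    """Minimal stand-in for dbutils.widgets, for illustration only."""
    def __init__(self):
        self._values = {}
    def text(self, name, default_value, label=None):
        # Create a text widget with a default value
        self._values[name] = default_value
    def get(self, name):
        return self._values[name]
    def remove(self, name):
        del self._values[name]

widgets = _Widgets()

# In a notebook this would be dbutils.widgets.text("Var1", "Hello")
widgets.text("Var1", "Hello")

# Read, modify, then re-create the widget with the new value
current_value = widgets.get("Var1")
new_value = current_value + " World"
widgets.remove("Var1")
widgets.text("Var1", new_value)

print(widgets.get("Var1"))  # Hello World
```

In SQL cells the updated value can then be referenced the usual way, e.g. via the `:Var1` parameter syntax.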

2 More Replies
Sourav-Kundu
by Contributor
  • 648 Views
  • 1 reply
  • 4 kudos

Orchestrate Databricks jobs with Apache Airflow

You can orchestrate Databricks jobs with Apache Airflow. The Databricks provider implements the operators below:
  • DatabricksCreateJobsOperator: create a new Databricks job or reset an existing job
  • DatabricksRunNowOperator: runs an existing Spark job run...
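Under the hood, `DatabricksRunNowOperator` calls the Jobs API `run-now` endpoint (`POST /api/2.1/jobs/run-now`). The sketch below builds the request body that call sends; the job ID and notebook parameters are placeholders.

```python
import json

def build_run_now_request(job_id, notebook_params=None):
    """Build the JSON body for POST /api/2.1/jobs/run-now."""
    body = {"job_id": job_id}
    if notebook_params:
        # Key/value pairs passed to the notebook task of the run
        body["notebook_params"] = notebook_params
    return body

# Placeholder job ID and parameters, as an Airflow task would supply them
req = build_run_now_request(1234, {"date": "2024-01-01"})
print(json.dumps(req))
```

In a DAG you would instead pass `job_id` and `notebook_params` straight to `DatabricksRunNowOperator`, which handles authentication and polling for you.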

Latest Reply
Advika_
Databricks Employee
  • 4 kudos

Good one @Sourav-Kundu! Your clear explanations of the operators really simplify job management, plus the resource link you included makes it easy for everyone to dive deeper.

Sourav-Kundu
by Contributor
  • 578 Views
  • 1 reply
  • 2 kudos

Use Retrieval-augmented generation (RAG) to boost performance of LLM applications

Retrieval-augmented generation (RAG) is a method that boosts the performance of large language model (LLM) applications by utilizing tailored data. It achieves this by fetching pertinent data or documents related to a specific query or task and presen...
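The retrieve-then-generate flow can be sketched with a toy keyword retriever; the corpus, query, and prompt template below are all illustrative, and a real system would use vector embeddings and an actual LLM call instead.

```python
def retrieve(query, corpus, k=2):
    """Rank documents by word overlap with the query (toy retriever)."""
    q = set(query.lower().split())
    scored = sorted(corpus,
                    key=lambda d: len(q & set(d.lower().split())),
                    reverse=True)
    return scored[:k]

def build_prompt(query, docs):
    """Present the retrieved documents to the model alongside the query."""
    context = "\n".join(f"- {d}" for d in docs)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

# Tiny illustrative corpus
corpus = [
    "Delta Lake stores tables as Parquet files with a transaction log.",
    "Unity Catalog governs access to tables, volumes, and models.",
    "RAG fetches relevant documents and adds them to the LLM prompt.",
]

prompt = build_prompt("How does RAG use documents?",
                      retrieve("RAG documents prompt", corpus))
print(prompt)
```

The point of the pattern is that the model answers from retrieved, up-to-date context rather than only from its training data.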

Latest Reply
Advika_
Databricks Employee
  • 2 kudos

Thanks for sharing such valuable insight, @Sourav-Kundu. Your breakdown of how RAG enhances LLMs is spot on: clear and concise!

Sourav-Kundu
by Contributor
  • 1110 Views
  • 1 reply
  • 2 kudos

You can use Low Shuffle Merge to optimize the merge process in Delta Lake

Low Shuffle Merge in Databricks is a feature that optimizes the way data is merged when using Delta Lake, reducing the amount of data shuffled between nodes.
  • Traditional merges can involve heavy data shuffling, as data is redistributed across the cl...

Latest Reply
Advika_
Databricks Employee
  • 2 kudos

Great post, @Sourav-Kundu. The benefits you've outlined, especially regarding faster execution and cost efficiency, are valuable for anyone working with large-scale data processing. Thanks for sharing!

Sourav-Kundu
by Contributor
  • 545 Views
  • 0 replies
  • 1 kudos

Databricks Asset Bundles package and deploy resources like notebooks and workflows as a single unit.

Databricks Asset Bundles help implement software engineering best practices like version control, testing, and CI/CD for data and AI projects.
1. They allow you to define resources such as jobs and notebooks as source files, making project structure, t...
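A bundle is declared in a `databricks.yml` file at the project root. The fragment below is a minimal sketch of that layout; the bundle name, job, notebook path, and workspace host are placeholders, so treat it as an illustration rather than a ready-to-deploy bundle.

```yaml
bundle:
  name: my_project            # placeholder bundle name

resources:
  jobs:
    nightly_etl:              # job defined as source, versioned with the code
      name: nightly-etl
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./notebooks/etl.py

targets:
  dev:
    default: true
    workspace:
      host: https://example.cloud.databricks.com   # placeholder workspace URL
```

With this in place, `databricks bundle validate` and `databricks bundle deploy -t dev` take the project from source files to deployed resources, which is what makes bundles fit naturally into CI/CD.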

Sourav-Kundu
by Contributor
  • 1419 Views
  • 0 replies
  • 1 kudos

How to recover Dropped Tables in Databricks

Have you ever accidentally dropped a table in Databricks, or had someone else mistakenly drop it? Databricks offers a useful feature that allows you to view dropped tables and recover them if needed.
1. You need to first execute SHOW TABLES DROPPED
2. T...
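The two steps above can be sketched in SQL; the syntax follows the Unity Catalog commands for managed tables, while the catalog, schema, and table names are placeholders. Recovery is only possible within the retention window after the drop.

```sql
-- Step 1: list recently dropped tables in a schema (placeholder names)
SHOW TABLES DROPPED IN my_catalog.my_schema;

-- Step 2: restore one of them within the retention window
UNDROP TABLE my_catalog.my_schema.my_table;
```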

nafikazi
by New Contributor III
  • 1688 Views
  • 4 replies
  • 2 kudos

Resolved! Want to learn LakeFlow Pipelines in community edition.

Hello Everyone. I want to explore LakeFlow Pipelines in the community version but don’t have access to Azure or AWS. I had a bad experience with Azure, where I was charged $85 while just trying to learn. Is there a less expensive, step-by-step learni...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 2 kudos

Hi @nafikazi, sorry, this is not possible in Community Edition. Your only option is to have an AWS or Azure account.

3 More Replies
Ajay-Pandey
by Esteemed Contributor III
  • 3630 Views
  • 6 replies
  • 7 kudos

🚀 Databricks Custom Apps! 🚀

Whether you're a data scientist or a sales executive, Databricks is making it easier than ever to build, host, and share secure data applications. With our platform, you can now run any Python code on serverless compute, share it with non-technical c...

Latest Reply
PiotrU
Contributor II
  • 7 kudos

Can we somehow play with hosting, and expose this app outside?

5 More Replies
Thusharr
by New Contributor II
  • 1428 Views
  • 5 replies
  • 1 kudos

Writing append blob files to a Unity Catalog volume

The workspace is assigned to Unity Catalog, and all access to ADLS Gen2 is now handled via Unity Catalog only, meaning no SPN, no connection string, no access keys, etc. I have to create append blob files in a volume. Is this possible in a works...

Latest Reply
Witold
Honored Contributor
  • 1 kudos

Now I got your point. No, you can't create Append Blob files directly in Volumes, as this is native Azure functionality. A volume is basically just an abstraction over native storage. You will still need to use libraries like azure-storage-blob wi...

4 More Replies
