Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

lance-gliser
by New Contributor
  • 4523 Views
  • 9 replies
  • 0 kudos

Databricks apps - Volumes and Workspace - FileNotFound issues

I have a Databricks App I need to integrate with volumes using local Python os functions. I've set up a simple test:  def __init__(self, config: ObjectStoreConfig): self.config = config # Ensure our required paths are created ...

Latest Reply
SarahA
New Contributor II
  • 0 kudos

I would also like to see some responses to this problem.

8 More Replies
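A frequent cause of FileNotFoundError in this scenario is that the /Volumes FUSE path available on classic compute is not necessarily mounted inside the Databricks Apps runtime, and the app's service principal also needs READ VOLUME/WRITE VOLUME grants. Where the mount does exist, plain os calls work; a minimal sketch (the helper and all paths are hypothetical, not the poster's actual code):

```python
import os

def ensure_dirs(base: str, subdirs: list[str]) -> list[str]:
    """Create required directories under a root path, tolerating re-runs."""
    created = []
    for sub in subdirs:
        path = os.path.join(base, sub)
        # exist_ok=True avoids FileExistsError when the app restarts
        os.makedirs(path, exist_ok=True)
        created.append(path)
    return created

# In an app, base would be a UC volume path such as
# "/Volumes/<catalog>/<schema>/<volume>". If that raises FileNotFoundError,
# the FUSE mount is likely absent in the Apps runtime; the Files API in
# databricks-sdk (WorkspaceClient().files.upload / .download) is the usual
# fallback for reading and writing volume files without a local mount.
```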
Anonym40
by New Contributor III
  • 71 Views
  • 2 replies
  • 3 kudos

Resolved! Ingesting data from APIs

Hi, I need to ingest some data available at an API endpoint. I was thinking of this option: 1. make the API call from a Notebook and save the data to ADLS; 2. use Auto Loader to load the data from the ADLS location. But then I have some doubts, like whether I can directly write ...

Latest Reply
Raman_Unifeye
Contributor III
  • 3 kudos

@Anonym40 - it's generally a good idea to decouple the direct API calls from the rest of your data pipeline. By staging the data to ADLS, you insulate your downstream processes from upstream failures and gain restartability and easier maintenance in your end-to-end flow....

1 More Replies
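The staged pattern suggested above can be sketched as a small landing-zone writer plus an incremental Auto Loader read. Everything here (the URL, the directories, the function name) is hypothetical, and the commented-out read assumes an ADLS landing container:

```python
import datetime as dt
import json
import os
import urllib.request

def stage_api_response(url: str, landing_dir: str) -> str:
    """Fetch one API payload and land it as a timestamped JSON file."""
    with urllib.request.urlopen(url) as resp:
        payload = json.load(resp)
    # unique, sortable file name so Auto Loader sees each pull as a new file
    name = dt.datetime.now(dt.timezone.utc).strftime("%Y%m%d_%H%M%S_%f") + ".json"
    path = os.path.join(landing_dir, name)
    with open(path, "w") as f:
        json.dump(payload, f)
    return path

# Auto Loader then picks up the landing zone incrementally (sketch):
# df = (spark.readStream.format("cloudFiles")
#         .option("cloudFiles.format", "json")
#         .option("cloudFiles.schemaLocation", checkpoint_dir)
#         .load("abfss://landing@<account>.dfs.core.windows.net/api/"))
```

Staging this way also gives you a raw copy to replay if a downstream transformation needs to be rerun.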
ymmmm
by New Contributor
  • 122 Views
  • 6 replies
  • 1 kudos

Account reset and loss of access to paid Databricks Academy Labs subscription

Hello, I am facing an issue with my Databricks Academy account. During a normal sign-in using my usual email address, I was asked to re-enter my first and last name, as if my account was being created again. After that, my account appeared to be reset,...

Latest Reply
ymmmm
New Contributor
  • 1 kudos

Thank you for your support and your help.

5 More Replies
Hubert-Dudek
by Databricks MVP
  • 70 Views
  • 1 reply
  • 2 kudos

Databricks Advent Calendar 2025 #18

Automatic file retention in Auto Loader is one of my favourite new features of 2025: it can automatically move ingested cloud files to cold storage or simply delete them.

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 2 kudos

Thanks for sharing @Hubert-Dudek! That's a really great feature. It simplified the data maintenance process a lot at one of my clients.

Prathy
by New Contributor
  • 92 Views
  • 2 replies
  • 3 kudos

AWS & Databricks Registration Issue

I have created both an AWS and a Databricks account, but I cannot move past the Configure and Launch section in AWS Marketplace.

Latest Reply
Advika
Community Manager
  • 3 kudos

Hello @Prathy! Also, please check out this video: https://www.youtube.com/watch?v=uzjHI0DNbbs. Refer to the deck linked in the video's description (https://drive.google.com/file/d/1ovZd...) and check slide no. 16, titled "Linking AWS to your Databricks ...

1 More Replies
greengil
by New Contributor II
  • 420 Views
  • 10 replies
  • 2 kudos

Create function issue

Hello - I am following some online code to create a function as follows: CREATE OR REPLACE FUNCTION my_catalog.my_schema.insert_data_function(col1_value STRING, col2_value INT) RETURNS BOOLEAN COMMENT 'Inserts dat...

Latest Reply
iyashk-DB
Databricks Employee
  • 2 kudos

In UC, functions must be read-only; they cannot modify state (no INSERT, DELETE, MERGE, CREATE, VACUUM, etc.). So I tried creating a PROCEDURE and calling it instead, and I was able to insert data into the table successfully. Unity Catalog tools are really jus...

9 More Replies
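Spelled out with the hypothetical names from the original post, the procedure route looks roughly like this. SQL procedure support on Databricks is relatively new, so treat the DDL as a sketch and check the current CREATE PROCEDURE reference before using it:

```python
# DDL kept as a string so it can be submitted with spark.sql() from a notebook.
CREATE_PROC = """
CREATE OR REPLACE PROCEDURE my_catalog.my_schema.insert_data_procedure(
    col1_value STRING,
    col2_value INT)
LANGUAGE SQL
AS BEGIN
  INSERT INTO my_catalog.my_schema.my_table VALUES (col1_value, col2_value);
END
"""

# On a cluster:
# spark.sql(CREATE_PROC)
# spark.sql("CALL my_catalog.my_schema.insert_data_procedure('a', 1)")
```

Unlike a UC SQL function, a procedure is allowed to run DML, which is why the CALL succeeds where the function version fails.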
Hubert-Dudek
by Databricks MVP
  • 73 Views
  • 0 replies
  • 0 kudos

Databricks Advent Calendar 2025 #16

For many data engineers who love PySpark, the most significant improvement of 2025 was the addition of MERGE to the DataFrame API, so the Delta library or SQL is no longer needed to perform a MERGE. P.S. I still prefer SQL MERGE inside spark.sql().

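For comparison, here are both styles side by side. The table and column names are made up, and the mergeInto chain is paraphrased from the Spark 4.0 API, so double-check it against the current PySpark documentation:

```python
def merge_sql(target: str, source_view: str, key: str) -> str:
    """The spark.sql() MERGE style the post prefers, parameterized for reuse."""
    return (
        f"MERGE INTO {target} AS t "
        f"USING {source_view} AS s "
        f"ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET * "
        f"WHEN NOT MATCHED THEN INSERT *"
    )

# DataFrame API equivalent, new in Spark 4.0 (sketch; alias names in the
# merge condition follow the examples in the PySpark docs):
# from pyspark.sql import functions as F
# (updates_df.mergeInto("main.silver.users", F.expr("source.id = target.id"))
#     .whenMatched().updateAll()
#     .whenNotMatched().insertAll()
#     .merge())
```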
DataYoga
by New Contributor
  • 5035 Views
  • 4 replies
  • 0 kudos

Informatica ETLs

I'm delving into the challenges of ETL transformations, particularly moving from traditional platforms like Informatica to Databricks. Given the complexity of legacy ETLs, I'm curious about the approaches others have taken to integrate these with Dat...

Latest Reply
zalmane
New Contributor
  • 0 kudos

We ended up using the tool from datayoga.io, which converts these in a multi-stage approach: it converts to an intermediate representation; then, from there, it gets optimized (a lot of the Informatica actions can be optimized out or compacted) and fin...

3 More Replies
Hubert-Dudek
by Databricks MVP
  • 74 Views
  • 0 replies
  • 2 kudos

Databricks Advent Calendar 2025 #15

The new Lakebase experience is a game-changer for transactional databases. That functionality is fantastic, and autoscaling to zero makes it really cost-effective. Do you need to deploy to prod? Just branch the production database to the release branch, an...

Peter_Theil
by New Contributor
  • 183 Views
  • 3 replies
  • 3 kudos

Databricks partner journey for small firms

Hello, We are a team of 5 (DEs/Architects) exploring the idea of starting a small consulting company focused on Databricks as an SI partner, and wanted to learn from others who have gone through the partnership journey. I would love to understand how t...

Latest Reply
Louis_Frolio
Databricks Employee
  • 3 kudos

If I’m being completely honest, I haven’t seen any. As you can imagine, partner organizations tend to keep things pretty close to the vest for a variety of reasons. That said, once a new partner is officially enrolled, they are granted access to an e...

2 More Replies
Hubert-Dudek
by Databricks MVP
  • 93 Views
  • 0 replies
  • 0 kudos

Databricks Advent Calendar 2025 #14

Ingestion from SharePoint is now available directly in PySpark. Just define a connection and use spark.read or, even better, spark.readStream with Auto Loader. Just specify the file type and the options for that file type (PDF, CSV, Excel, etc.).
