Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

ymmmm
by New Contributor III
  • 439 Views
  • 6 replies
  • 1 kudos

Resolved! Account reset and loss of access to paid Databricks Academy Labs subscription

Hello, I am facing an issue with my Databricks Academy account. During a normal sign-in using my usual email address, I was asked to re-enter my first and last name, as if my account was being created again. After that, my account appeared to be reset,...

Latest Reply
ymmmm
New Contributor III
  • 1 kudos

Thank you for your support and your help.

Prathy
by New Contributor
  • 339 Views
  • 2 replies
  • 3 kudos

Resolved! AWS & Databricks Registration Issue

I have created both AWS and Databricks accounts, but I cannot move on to the next steps in the AWS Marketplace (the Configure and Launch section).

Latest Reply
Advika
Community Manager
  • 3 kudos

Hello @Prathy! Also, please check out this video: https://www.youtube.com/watch?v=uzjHI0DNbbs. Refer to the deck linked in the video’s description (https://drive.google.com/file/d/1ovZd...) and check slide no. 16, titled “Linking AWS to your Databricks ...

greengil
by New Contributor III
  • 776 Views
  • 10 replies
  • 2 kudos

Create function issue

Hello - I am following some online code to create a function as follows:

CREATE OR REPLACE FUNCTION my_catalog.my_schema.insert_data_function(col1_value STRING, col2_value INT)
RETURNS BOOLEAN
COMMENT 'Inserts dat...

Latest Reply
iyashk-DB
Databricks Employee
  • 2 kudos

In UC, functions must be read-only; they cannot modify state (no INSERT, DELETE, MERGE, CREATE, VACUUM, etc.). So I tried creating a PROCEDURE and calling it instead, and I was able to insert data into the table successfully. Unity Catalog tools are really jus...
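A rough sketch of that procedure approach (all table, schema, and procedure names here are illustrative, and the exact CREATE PROCEDURE syntax depends on your runtime version, so treat this as an outline rather than verified syntax):

```sql
-- Illustrative only: catalog, schema, procedure, and table names are made up.
CREATE OR REPLACE PROCEDURE my_catalog.my_schema.insert_data_proc(
  col1_value STRING,
  col2_value INT
)
LANGUAGE SQL
AS BEGIN
  INSERT INTO my_catalog.my_schema.my_table (col1, col2)
  VALUES (col1_value, col2_value);
END;

-- Unlike a UC function, the procedure may modify state when called:
CALL my_catalog.my_schema.insert_data_proc('example', 42);
```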

DataYoga
by New Contributor
  • 5268 Views
  • 4 replies
  • 0 kudos

Informatica ETLs

I'm delving into the challenges of ETL transformations, particularly moving from traditional platforms like Informatica to Databricks. Given the complexity of legacy ETLs, I'm curious about the approaches others have taken to integrate these with Dat...

Latest Reply
zalmane
New Contributor II
  • 0 kudos

We ended up using the tool from datayoga.io, which converts these in a multi-stage approach: it converts to an intermediate representation, and from there it gets optimized (a lot of the Informatica actions can be optimized out or compacted) and fin...

Peter_Theil
by New Contributor II
  • 781 Views
  • 3 replies
  • 3 kudos

Resolved! Databricks partner journey for small firms

Hello, we are a team of 5 (DEs/architects) exploring the idea of starting a small consulting company focused on Databricks as an SI partner, and we wanted to learn from others who have gone through the partnership journey. I would love to understand how t...

Latest Reply
Louis_Frolio
Databricks Employee
  • 3 kudos

If I’m being completely honest, I haven’t seen any. As you can imagine, partner organizations tend to keep things pretty close to the vest for a variety of reasons. That said, once a new partner is officially enrolled, they are granted access to an e...

tarunnagar
by Contributor
  • 613 Views
  • 4 replies
  • 0 kudos

Best Development Strategies for Building Reusable Data Engineering Components in Databricks

I’m looking to gather insights from data engineers, architects, and developers who have experience building scalable pipelines in Databricks. Specifically, I want to understand how to design, implement, and manage reusable data engineering components...

Latest Reply
Davidwilliamkt
New Contributor II
  • 0 kudos

The best strategy is to build modular, parameterized, Delta-optimized functions and package them into reusable Python modules, while keeping Databricks notebooks only for orchestration. This creates consistent, scalable, and easily maintainable data ...
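A minimal sketch of the pattern described above, using a pure-Python stand-in so it runs anywhere (all names are hypothetical; in practice the function would operate on Spark DataFrames and live in a module your notebooks import):

```python
# Hypothetical sketch of the "reusable, parameterized module" pattern:
# transform logic lives in an importable Python module; a notebook would
# only supply the configuration and wire the steps together.
from dataclasses import dataclass


@dataclass(frozen=True)
class DedupConfig:
    key: str       # column whose values identify a record
    order_by: str  # column whose highest value wins


def deduplicate(rows: list[dict], cfg: DedupConfig) -> list[dict]:
    """Keep, per cfg.key, the row with the largest cfg.order_by value."""
    latest: dict = {}
    for row in rows:
        k = row[cfg.key]
        if k not in latest or row[cfg.order_by] > latest[k][cfg.order_by]:
            latest[k] = row
    return list(latest.values())


# "Notebook" side: orchestration only -- pass parameters, call the module.
rows = [
    {"id": 1, "ts": 1, "v": "old"},
    {"id": 1, "ts": 2, "v": "new"},
    {"id": 2, "ts": 1, "v": "only"},
]
print(deduplicate(rows, DedupConfig(key="id", order_by="ts")))
# [{'id': 1, 'ts': 2, 'v': 'new'}, {'id': 2, 'ts': 1, 'v': 'only'}]
```

Keeping the config as an explicit object makes each component testable in isolation, which is what makes the components reusable across pipelines.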

Poorva21
by New Contributor III
  • 771 Views
  • 2 replies
  • 3 kudos

How realistic is truly end-to-end LLMOps on Databricks?

Databricks is positioning the platform as a full stack for LLM development — from data ingestion → feature/embedding pipelines → fine-tuning (Mosaic AI) → evaluation → deployment (Model Serving) → monitoring (Lakehouse Monitoring).I’m curious about r...

Latest Reply
Poorva21
New Contributor III
  • 3 kudos

Thank You @Gecofer for taking the time to share such a clear, experience-backed breakdown of where Databricks shines and where real-world LLM Ops architectures still need supporting components. Your explanation was incredibly practical and resonates ...

DBXDeveloper111
by New Contributor III
  • 448 Views
  • 3 replies
  • 3 kudos

Resolved! Software engineering in Databricks

I'm a software engineer and a bit new to Databricks. My goal is to create a model serving endpoint that interfaces with several ML models. Traditionally this would look like: API --> Service --> Data. Now using Databricks, my understanding is that it w...

Latest Reply
Hubert-Dudek
Databricks MVP
  • 3 kudos

Just register the model and then deploy a serving endpoint to serve it.

bianca_unifeye
by Contributor
  • 183 Views
  • 1 reply
  • 1 kudos

Webinar: From PoC to Production: Delivering with Confidence with Databricks

Our final webinar of December is here and we are closing the year with a powerhouse session! As many organisations still get stuck in the PoC phase, we’re bringing clarity, structure, and real delivery practices to help teams move from promi...

Latest Reply
Advika
Community Manager
  • 1 kudos

Appreciate you sharing this with the community, @bianca_unifeye!

tarunnagar
by Contributor
  • 640 Views
  • 3 replies
  • 2 kudos

Using Databricks for Real-Time App Data

I’m exploring how to handle real-time data for an application and I keep seeing Databricks recommended as a strong option — especially with its support for streaming pipelines, Delta Live Tables, and integrations with various event sources. That said...

Latest Reply
Suheb
Contributor
  • 2 kudos

Databricks is very effective for real-time app data because it supports streaming data processing using Apache Spark and Delta Lake. It helps handle large data volumes, provides low-latency analytics, and makes it easier to build scalable event-drive...

bharathjs
by New Contributor II
  • 12094 Views
  • 7 replies
  • 2 kudos

Alter table to add/update multiple column comments

I was wondering if there's a way to alter table and add/update comments for multiple columns at once using SQL or API calls. For instance: ALTER TABLE <table_name> CHANGE COLUMN <col1> COMMENT '<comment1>', CHANGE COLUMN <col2> COMMENT '<comment2>'; ...

Latest Reply
dxwell
New Contributor II
  • 2 kudos

The correct SQL syntax for this is:

ALTER TABLE your_table_name ALTER COLUMN
  col1 COMMENT 'comment1',
  col2 COMMENT 'comment2',
  col3 COMMENT 'comment3';

tarunnagar
by Contributor
  • 449 Views
  • 3 replies
  • 0 kudos

How to Connect Databricks with Web and Mobile Apps

Hi everyone, I’m exploring ways to leverage Databricks for building data-driven web and mobile applications and wanted to get some insights from this community. Databricks is great for processing large datasets, running analytics, and building machine...

Latest Reply
Hubert-Dudek
Databricks MVP
  • 0 kudos

Check out Databricks Apps - you pass Databricks resources to the app and then use the databricks-sdk to interact with them.

boitumelodikoko
by Valued Contributor
  • 705 Views
  • 1 reply
  • 3 kudos

Data Engineering Lessons

Getting into the data space can feel overwhelming, with so many tools, terms, and technologies. But after years in... Expect failure. Design for it. Jobs will fail. The data will be late. Build systems that can recover gracefully, and continually monitor ...

Latest Reply
Gecofer
Contributor II
  • 3 kudos

Hi @boitumelodikoko, a few more principles I always share with people entering the data space: Observability is non-negotiable. If you can’t see what your pipelines are doing, you can’t fix what breaks. Good logging, metrics, and alerts save countless ho...
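As an illustration of that observability point, here is a small sketch (all names hypothetical) of wrapping pipeline steps so every run logs its name, duration, outcome, and record count:

```python
# Hypothetical sketch: a tiny decorator that adds basic observability
# (step name, duration, success/failure, record count) to pipeline steps.
import logging
import time
from functools import wraps

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")


def observed(step_name):
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.monotonic()
            try:
                result = fn(*args, **kwargs)
            except Exception:
                # Log failures before re-raising so the run is traceable.
                log.exception("step=%s status=failed", step_name)
                raise
            elapsed = time.monotonic() - start
            count = len(result) if hasattr(result, "__len__") else "n/a"
            log.info("step=%s status=ok duration=%.3fs records=%s",
                     step_name, elapsed, count)
            return result
        return wrapper
    return decorator


@observed("filter_valid")
def filter_valid(rows):
    """Drop records without an id."""
    return [r for r in rows if r.get("id") is not None]


cleaned = filter_valid([{"id": 1}, {"id": None}, {"id": 2}])
print(cleaned)  # [{'id': 1}, {'id': 2}]
```

In a real pipeline the log lines would feed whatever metrics/alerting stack you use; the point is that every step emits something you can monitor.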

n1399
by New Contributor II
  • 1226 Views
  • 2 replies
  • 0 kudos

On Demand Pool Configuration & Policy definition

I'm using a job cluster and created compute policies for library management, and now I'm trying to use pools in Databricks. I'm getting an error like this: Cluster validation error: Validation failed for azure_attributes.spot_bid_max_price from pool, the ...

Latest Reply
Poorva21
New Contributor III
  • 0 kudos

This error occurs because instance pools require a concrete spot bid max price value, even if the cluster policy marks it as unlimited. Set an explicit value (e.g., 100) directly in the instance pool configuration, or switch the pool to on-demand nod...
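For reference, a sketch of an instance-pool payload with an explicit bid price (field names follow the Instance Pools API as I understand it; the pool name, node type, and values are illustrative, not a verified configuration):

```json
{
  "instance_pool_name": "jobs-pool",
  "node_type_id": "Standard_DS3_v2",
  "min_idle_instances": 0,
  "azure_attributes": {
    "availability": "SPOT_AZURE",
    "spot_bid_max_price": 100
  }
}
```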

mrstevegross
by Contributor III
  • 6048 Views
  • 2 replies
  • 2 kudos

How to resolve "cannot import name 'Iterable' from 'collections'" error?

I'm running a DBR/Spark job using a container. I've set docker_image.url to `docker.io/databricksruntime/standard:13.3-LTS`, as well as the Spark env var `DATABRICKS_RUNTIME_VERSION=13.3`. At runtime, however, I'm encountering this error: ImportError...

Latest Reply
Poorva21
New Contributor III
  • 2 kudos

Go to Compute → Your Cluster / Job Compute, change the Databricks Runtime to Databricks Runtime 13.3 LTS, and re-run your job with the same container.
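Worth noting why this error appears at all: Python 3.10 removed the long-deprecated aliases such as Iterable from the top-level collections module, so this ImportError typically means the code ran on a newer Python than it was written for. Importing from collections.abc works on both old and new runtimes (the flatten helper below is just an illustration):

```python
# The abstract container classes live in collections.abc; the top-level
# collections aliases were removed in Python 3.10, which is what raises
# "cannot import name 'Iterable' from 'collections'" on newer runtimes.
from collections.abc import Iterable  # portable across Python 3 versions


def flatten(items):
    """Recursively flatten nested iterables, leaving strings intact."""
    for item in items:
        if isinstance(item, Iterable) and not isinstance(item, (str, bytes)):
            yield from flatten(item)
        else:
            yield item


print(list(flatten([1, [2, [3, 4]], "ab"])))  # [1, 2, 3, 4, 'ab']
```

If the failing import is inside a third-party library rather than your own code, upgrading that library (or pinning the container's Python version to match the runtime) is the equivalent fix.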
