Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

rcostanza
by New Contributor III
  • 303 Views
  • 4 replies
  • 2 kudos

Trying to reduce latency on DLT pipelines with Autoloader and derived tables

What I'm trying to achieve: ingest files into bronze tables with Autoloader, then produce Kafka messages for each file ingested using a DLT sink. The issue: the latency between a file being ingested and its message being produced gets exponentially higher the more tables ar...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 2 kudos

Hi, I think it is a delay of the Autoloader, as it doesn't yet know about the ingested files. It has nothing to do with state; it is just the Autoloader keeping a list of processed files. The Autoloader scans the directory every minute, usually a...

3 More Replies
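For anyone landing on this thread, a minimal sketch of an Auto Loader ingestion table inside a DLT pipeline. The table name, source path, and file format are placeholder assumptions, and file-notification mode is shown only as one common way to cut directory-listing latency; it is not the specific fix discussed above.

# Illustrative sketch only; table name, path, and format are hypothetical.
import dlt
from pyspark.sql import functions as F

@dlt.table(name="bronze_events")  # placeholder bronze table
def bronze_events():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")            # placeholder file format
        # File notification mode avoids full directory listings on every
        # micro-batch, which is one common way to reduce ingestion latency.
        .option("cloudFiles.useNotifications", "true")
        .load("/Volumes/raw/events")                     # placeholder source path
        .withColumn("ingested_at", F.current_timestamp())
    )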
frunzy
by New Contributor
  • 200 Views
  • 2 replies
  • 2 kudos

how to import sample notebook to azure databricks workspace

In the second onboarding video, the Quickstart Notebook is shown. I found that notebook here: https://www.databricks.com/notebooks/gcp-qs-notebook.html I wanted to import it into my Azure Databricks workspace to play with it. However, selecti...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 2 kudos

I reported this as a bug:

1 More Replies
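If someone else hits the same import problem, one generic workaround (not the fix discussed in this thread) is to push the downloaded notebook source through the Workspace import REST API. The host, token, file name, and target path below are hypothetical placeholders.

# Rough sketch: import a local notebook source file via the Workspace API.
# Host, token, and paths are hypothetical placeholders.
import base64, requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
token = "dapiXXXXXXXX"                                        # placeholder PAT

with open("quickstart.py", "rb") as f:                        # local notebook source
    content = base64.b64encode(f.read()).decode("utf-8")

resp = requests.post(
    f"{host}/api/2.0/workspace/import",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "path": "/Users/me@example.com/quickstart",  # placeholder target path
        "format": "SOURCE",
        "language": "PYTHON",
        "content": content,
        "overwrite": True,
    },
)
resp.raise_for_status()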
thethirtyfour
by New Contributor III
  • 5869 Views
  • 3 replies
  • 3 kudos

Resolved! Configure Databricks in VSCode through WSL

Hi, I am having a hard time configuring my Databricks workspace when working in VSCode via WSL. When following the steps to set up Databricks authentication, I am receiving the following error at Step 5 of "Step 4: Set up Databricks authentication"...

Latest Reply
RaulMoraM
New Contributor III
  • 3 kudos

What worked for me was NOT opening the browser via the pop-up (which triggered the 3-legged OAuth flow error), but clicking the link provided by the CLI (or copying and pasting the link into the browser).

2 More Replies
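A quick, hedged way to confirm from WSL that a configured authentication profile actually works before pointing the VS Code extension at it (assumes the databricks-sdk package is installed and a profile named DEFAULT exists; both are assumptions, not details from the thread):

# Minimal auth check; the profile name is a placeholder assumption.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient(profile="DEFAULT")  # reads ~/.databrickscfg / cached OAuth login
print(w.current_user.me().user_name)    # prints the authenticated user if auth works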
Lakshmipriya_N
by New Contributor II
  • 114 Views
  • 1 reply
  • 1 kudos

Request to Extend Partner Tech Summit Lab Access

Hi Team, I would appreciate it if my Partner Tech Summit lab access could be extended, as two of the assigned labs were inaccessible. Could you please advise whom I should contact for this? Thank you. Regards, Lakshmipriya

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @Lakshmipriya_N, create a support ticket and wait for a reply: Contact Us

shrutigupta12
by New Contributor II
  • 5248 Views
  • 11 replies
  • 2 kudos

Resolved! DataBricks Certification Exam Got Suspended. Require Immediate support

Hello @Cert-Team @Certificate Team, Request Id# 00432042. I encountered a pathetic experience while attempting my Databricks Certified Data Engineer Professional certification exam. This is a completely unethical process to harass the examinee and lose...

Latest Reply
TechInspired
New Contributor II
  • 2 kudos

Hi @Cert-Team, I had a similar issue. My exam got suspended too. I had already completed my exam when it got suspended, so you can either evaluate it and provide the results or help me reschedule the exam. I have raised a request - #00750846; it's been mor...

10 More Replies
RaviG
by New Contributor II
  • 187 Views
  • 1 reply
  • 1 kudos

Resolved! How to install whl from volume for databricks_cluster_policy via terraform.

I would expect

resource "databricks_cluster_policy" "cluster_policy" {
  name = var.policy_name
  libraries {
    Volumes {
      whl = "/Volumes/bronze/config/python.wheel-1.0.3-9-py3-none-any.whl"
    }
  }
}

to work, but Terraform doesn't recognize "volum...

Latest Reply
PurpleViolin
New Contributor II
  • 1 kudos

This worked:

resource "databricks_cluster_policy" "cluster_policy" {
  name = var.policy_name
  libraries {
    whl = "/Volumes/bronze/config/python.wheel-1.0.3-9-py3-none-any.whl"
  }
}

MisterT
by New Contributor
  • 477 Views
  • 1 reply
  • 0 kudos

Cannot get tracing to work on genai app deployed on databricks

Hi, I have a Gradio app that is deployed on Databricks. The app is coming from this example provided by Databricks. The app works fine, but when I want to add tracing I cannot get it to work. I keep getting the error mlflow.exceptions.MlflowException...

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

Hi @MisterT, in our docs it is mentioned that we use MLflow 3 (a major upgrade) with GenAI monitoring enabled. Each agent endpoint is assigned an MLflow experiment, and agent traces are logged from the endpoint to that experiment in real time. Internally, an MLF...

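For others debugging tracing in an app deployed on Databricks, a minimal generic MLflow tracing sketch (the experiment path is a placeholder, and this is the standard MLflow tracing pattern, not the specific fix for the app in this thread):

# Generic tracing sketch; the experiment path is a hypothetical placeholder.
import mlflow

mlflow.set_tracking_uri("databricks")
mlflow.set_experiment("/Users/me@example.com/genai-app-traces")  # placeholder

@mlflow.trace  # records a trace for each call to this function
def answer(question: str) -> str:
    # ... call your model / agent here ...
    return f"echo: {question}"

answer("Does tracing work?")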
Andreyai
by New Contributor II
  • 431 Views
  • 3 replies
  • 1 kudos

AI Query prompt token and completion token counts

Hi, I would like to know how I can get the completion token and prompt token counts when using ai_query. Thanks

Latest Reply
Khaja_Zaffer
Contributor III
  • 1 kudos

Hello @Andreyai, good day! For ai_query, we have documentation from Databricks: https://docs.databricks.com/aws/en/sql/language-manual/functions/ai_query I am 100% sure you will get better insights from the documentation. But I have something for...

2 More Replies
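For context on the function being discussed, a basic ai_query call issued from a notebook. The endpoint name is a placeholder, and this sketch only shows the call itself; it does not show how to retrieve token counts (see the documentation linked above for that question).

# Basic ai_query call from PySpark SQL; the endpoint name is a placeholder.
df = spark.sql("""
    SELECT ai_query(
        'my-serving-endpoint',                    -- placeholder endpoint name
        'Summarize the benefits of Delta Lake.'
    ) AS response
""")
df.show(truncate=False)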
agilecoach360
by New Contributor II
  • 142 Views
  • 1 reply
  • 1 kudos

2025 Data + AI World Tour Atlanta

Attending "How to Build Intelligent Agents" at the Databricks Data+AI World Tour 2025. #Databricks #Data+AI #DatabricksWorldTour

Latest Reply
Advika
Databricks Employee
  • 1 kudos

Great to hear, @agilecoach360! Please share your learnings and experience from the event with the Community, it would be really valuable for everyone. Looking forward to your insights.

drag7ter
by Contributor
  • 2443 Views
  • 12 replies
  • 2 kudos

Parameters in dashboards data section passing via asset bundles

New functionality allows deploying dashboards with asset bundles. Here is an example:

# This is the contents of the resulting baby_gender_by_county.dashboard.yml file.
resources:
  dashboards:
    baby_gender_by_county:
      display_name: "Baby gen...

Latest Reply
Karola_de_Groot
New Contributor III
  • 2 kudos

I did, however, just find out that parameterization is possible. I don't know yet how to incorporate it into an asset bundle deploy, but at least I have a first step. You can use SELECT * FROM IDENTIFIER(:catalog || '.' || :schema || '.' || :table) Or hardc...

11 More Replies
matte_kapa_bul
by New Contributor II
  • 292 Views
  • 2 replies
  • 1 kudos

I made an AI assistant for Databricks docs, let me know what you think!

Hello members of the Databricks community! I built this Ask AI chatbot/widget where I gave a custom LLM access to some of Databricks' docs to help answer technical questions for people using Databricks. I tried it on a couple of questions that resembl...

Latest Reply
WiliamRosa
Contributor
  • 1 kudos

Hi @matte_kapa_bul, how are you doing? First of all, congratulations on the initiative. I've tried to do something similar myself, and it's very useful for locating documentation, but it doesn't end up being very effective in solving some issues repo...

1 More Replies
masab019
by New Contributor II
  • 230 Views
  • 3 replies
  • 0 kudos

Got No such file or directory error while serving the endpoint

Hello everyone! I'm using Databricks for my MLOps learning and following the tutorial, and I got an error while serving the endpoint; I need help with this. Problem overview: I have created a basic LightGBM model and logged it in Unity Catalog. T...

Latest Reply
masab019
New Contributor II
  • 0 kudos

To give you a quick recap, I’ve consolidated the code into a single file for clarity. Using the Iris dataset as an example, I first create a basic model with the scikit-learn flavor. Then, I create a PyFunc wrapper from the registered model and, fina...

2 More Replies
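To make the recap above concrete, a hedged sketch of the log-then-wrap pattern described in the reply, using scikit-learn and Unity Catalog model names that are placeholders (this is a generic MLflow pyfunc pattern, not the exact code from the thread):

# Sketch: log a base sklearn model to UC, then wrap it in a pyfunc model.
# Catalog/schema/model names are hypothetical placeholders.
import mlflow
import mlflow.pyfunc
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

mlflow.set_registry_uri("databricks-uc")

X, y = load_iris(return_X_y=True)
base_model = LogisticRegression(max_iter=200).fit(X, y)

with mlflow.start_run():
    base_info = mlflow.sklearn.log_model(
        base_model,
        artifact_path="model",
        registered_model_name="main.default.iris_base",      # placeholder UC name
    )

class Wrapper(mlflow.pyfunc.PythonModel):
    def load_context(self, context):
        # Load the base model from the artifacts bundled at logging time.
        self.model = mlflow.sklearn.load_model(context.artifacts["base"])

    def predict(self, context, model_input):
        return self.model.predict(model_input)

with mlflow.start_run():
    mlflow.pyfunc.log_model(
        artifact_path="wrapped_model",
        python_model=Wrapper(),
        artifacts={"base": base_info.model_uri},              # bundle base model files
        registered_model_name="main.default.iris_wrapped",    # placeholder UC name
    )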
fiverrpromotion
by New Contributor
  • 1253 Views
  • 1 reply
  • 1 kudos

Resolved! Addressing Memory Constraints in Scaling XGBoost and LGBM: A Comprehensive Approach for High-Volume

Scaling XGBoost and LightGBM models to handle exceptionally large datasets—those comprising billions to tens of billions of rows—presents a formidable computational challenge, particularly when constrained by the limitations of in-memory processing o...

Latest Reply
jamesl
Databricks Employee
  • 1 kudos

Hi @fiverrpromotion, As you mention, scaling XGBoost and LightGBM for massive datasets has its challenges, especially when trying to preserve critical training capabilities such as early stopping and handling of sparse features / high-cardinality cat...

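One distributed option often suggested for data that won't fit in a single node's memory is the PySpark estimator that ships with XGBoost. A hedged sketch, assuming xgboost>=1.7 is installed and with table, feature, and label names as placeholders:

# Sketch: distributed XGBoost training on a Spark DataFrame.
# Assumes xgboost>=1.7; table and column names are hypothetical placeholders.
from pyspark.ml.feature import VectorAssembler
from xgboost.spark import SparkXGBClassifier

df = spark.table("main.default.training_data")   # placeholder table
features = ["f1", "f2", "f3"]                     # placeholder feature columns

assembled = VectorAssembler(inputCols=features, outputCol="features").transform(df)

clf = SparkXGBClassifier(
    features_col="features",
    label_col="label",       # placeholder label column
    num_workers=4,           # spreads training across executors
)
model = clf.fit(assembled)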
JavierS
by New Contributor
  • 1208 Views
  • 1 reply
  • 0 kudos

Resolved! Problem with ray train and Databricks Notebook (Strange dbutils error)

Hi everyone, I'm running some code to train a multimodal Hugging Face model with SFTTrainer and TorchTrainer to use all GPU workers. When trying to execute trainer.fit(), it gives me a dbutils serialization error, even though I am not using dbutils directly in...

Get Started Discussions
AIR
Databricks
DeepLearning
Distributed
ray
Latest Reply
sarahbhord
Databricks Employee
  • 0 kudos

JavierS -  The dbutils serialization error occurs in your code because dbutils is only available on the Databricks driver node and cannot be pickled or transferred to Spark or Ray worker nodes. This error can appear even if your code doesn't directly...

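A hedged sketch of the pattern the reply describes: resolve anything that needs dbutils on the driver, then pass only plain, picklable values into the Ray train loop. The secret scope/key, config values, and training details are placeholders, not code from the thread.

# Sketch: keep dbutils on the driver; pass plain values to Ray workers.
# Secret scope/key and config values are hypothetical placeholders.
from ray.train import ScalingConfig
from ray.train.torch import TorchTrainer

# Resolved on the driver only; never referenced inside the train loop.
hf_token = dbutils.secrets.get(scope="ml", key="hf_token")  # placeholder scope/key

def train_loop_per_worker(config):
    # Only plain, picklable values arrive here (no dbutils references).
    token = config["hf_token"]
    # ... build the model / SFTTrainer using `token` ...

trainer = TorchTrainer(
    train_loop_per_worker,
    train_loop_config={"hf_token": hf_token},
    scaling_config=ScalingConfig(num_workers=2, use_gpu=True),
)
result = trainer.fit()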
JaydeepKhatri
by New Contributor II
  • 642 Views
  • 2 replies
  • 1 kudos

Resolved! Using merge Schema with spark.read.csv for inconsistent schemas

The problem: A common data engineering challenge is reading a directory of CSV files where the schemas are inconsistent. For example, some files might have columns in a different order, or be missing certain columns altogether. The standard behavior o...

Latest Reply
Louis_Frolio
Databricks Employee
  • 1 kudos

Hey @JaydeepKhatri, here are some helpful points to consider. Is this an officially supported, enhanced feature of the Databricks CSV reader? Based on internal research, this appears to be an undocumented "feature" of Spark running on Databricks. ...

1 More Replies
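A portable alternative to relying on undocumented reader behavior is to read each schema variant separately and union by column name, letting Spark fill missing columns with nulls. A minimal sketch; the paths and header option are placeholder assumptions:

# Sketch: combine CSV batches with inconsistent columns in a portable way.
# Paths are hypothetical placeholders.
df_a = spark.read.option("header", "true").csv("/Volumes/raw/csv/batch_a/")
df_b = spark.read.option("header", "true").csv("/Volumes/raw/csv/batch_b/")

# unionByName aligns columns by name and fills missing ones with nulls.
combined = df_a.unionByName(df_b, allowMissingColumns=True)
combined.printSchema()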

Join Us as a Local Community Builder!

Passionate about hosting events and connecting people? Help us grow a vibrant local community—sign up today to get started!

Sign Up Now