Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

cbhoga
by New Contributor II
  • 262 Views
  • 2 replies
  • 3 kudos

Resolved! Delta sharing with Celonis

Is there any way, or are there plans, for Databricks to use Delta Sharing to provide data access to Celonis?

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 3 kudos

Hi @cbhoga, Delta Sharing is an open protocol for secure data sharing. Databricks already supports it natively, so you can publish data using Delta Sharing. However, whether Celonis can directly consume that shared data depends on whether Celonis sup...
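For reference, a minimal sketch of publishing a table over Delta Sharing from the Databricks side; the share, recipient, and table names below are hypothetical, and whether Celonis can consume the share still depends on a Delta Sharing connector on their end.

# Minimal Delta Sharing publish sketch (hypothetical object names),
# run in a Databricks notebook with Unity Catalog enabled.

# Create a share and add an existing table to it.
spark.sql("CREATE SHARE IF NOT EXISTS process_mining_share")
spark.sql("""
    ALTER SHARE process_mining_share
    ADD TABLE main.operations.event_log
""")

# Create an open (token-based) recipient and grant it access to the share.
spark.sql("CREATE RECIPIENT IF NOT EXISTS celonis_recipient")
spark.sql("GRANT SELECT ON SHARE process_mining_share TO RECIPIENT celonis_recipient")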

1 More Replies
ChristianRRL
by Valued Contributor III
  • 380 Views
  • 3 replies
  • 4 kudos

Performance Comparison: spark.read vs. Autoloader

Hi there, I would appreciate some help comparing the runtime performance of two approaches to performing ELT in Databricks: spark.read vs. Autoloader. We already have a process in place to extract highly nested JSON data into a landing path, and fro...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 4 kudos

Hi @ChristianRRL, for that kind of ingestion scenario Autoloader is the winner. It will scale much better than the batch approach, especially if we are talking about a large number of files. If you configure Autoloader with file notification mode it can sca...
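As an illustration, a minimal Auto Loader sketch with file notification mode enabled; the landing path, schema location, checkpoint location, and target table are hypothetical.

# Minimal Auto Loader sketch (hypothetical paths), run in a Databricks notebook.
df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.useNotifications", "true")   # file notification mode
    .option("cloudFiles.schemaLocation", "/Volumes/raw/landing/_schemas/events")
    .load("/Volumes/raw/landing/events/")
)

(
    df.writeStream
    .option("checkpointLocation", "/Volumes/raw/landing/_checkpoints/events")
    .trigger(availableNow=True)                      # batch-like incremental run
    .toTable("bronze.events_raw")
)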

2 More Replies
ChristianRRL
by Valued Contributor III
  • 317 Views
  • 1 reply
  • 2 kudos

Resolved! AutoLoader Ingestion Best Practice

Hi there, I would appreciate some input on AutoLoader best practice. I've read that some people recommend that the latest data should be loaded in its rawest form into a raw delta table (i.e. highly nested json-like schema) and from that data the app...

Latest Reply
BS_THE_ANALYST
Esteemed Contributor III
  • 2 kudos

I think the key thing with holding the raw data in a table, and not transforming that table, is that you have more flexibility at your disposal. There's a great resource available via Databricks Docs for best practices in the Lakehouse. I'd highly re...
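To make the "keep raw, transform later" idea concrete, a small hypothetical sketch: land the nested JSON as-is into a bronze table, then project only the fields the application needs into a silver table. Table names and columns are illustrative only.

from pyspark.sql import functions as F

# Bronze: keep the nested JSON exactly as it arrived (hypothetical source table).
raw_df = spark.read.table("bronze.device_payloads_raw")

# Silver: flatten only the fields the downstream application needs.
silver_df = raw_df.select(
    F.col("payload.device.id").alias("device_id"),
    F.col("payload.reading.value").alias("reading_value"),
    F.col("ingest_ts"),
)

silver_df.write.mode("overwrite").saveAsTable("silver.device_readings")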

ChristianRRL
by Valued Contributor III
  • 587 Views
  • 2 replies
  • 4 kudos

Resolved! What is `read_files`?

Bit of a silly question, but wondering if someone can help me better understand what `read_files` is? (read_files table-valued function | Databricks on AWS) There are at least 3 ways to pull raw JSON data into a Spark dataframe: df = spark.read... df = spark...

Latest Reply
BS_THE_ANALYST
Esteemed Contributor III
  • 4 kudos

Also, @ChristianRRL, with a slight adjustment to the syntax, it does indeed behave like Autoloader: https://docs.databricks.com/aws/en/ingestion/cloud-object-storage/auto-loader/patterns?language=SQL I'd also advise looking at the different options th...
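For comparison, a small sketch of the three styles mentioned above against a hypothetical JSON landing path; read_files is the SQL table-valued function, and the streaming Auto Loader read is the incremental variant.

# 1) Classic batch read.
batch_df = spark.read.format("json").load("/Volumes/raw/landing/events/")

# 2) read_files table-valued function, callable from spark.sql.
tvf_df = spark.sql("""
    SELECT * FROM read_files(
        '/Volumes/raw/landing/events/',
        format => 'json'
    )
""")

# 3) Auto Loader streaming read (incremental file discovery).
stream_df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/Volumes/raw/landing/_schemas/events")
    .load("/Volumes/raw/landing/events/")
)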

1 More Replies
Maria_fed
by New Contributor III
  • 5369 Views
  • 8 replies
  • 0 kudos

Need help migrating company customer and partner academy accounts to work properly

Hi, originally I accidentally made a Customer Academy account with my company, which is a Databricks partner. Then I made an account using my personal email and listed my company email as the partner email for the Partner Academy account. That account ...

Latest Reply
Vaishali2
New Contributor II
  • 0 kudos

Need help to merge my customer portal ID with my partner mail ID; my case number is 00754330.

7 More Replies
rcostanza
by New Contributor III
  • 413 Views
  • 4 replies
  • 2 kudos

Trying to reduce latency on DLT pipelines with Autoloader and derived tables

What I'm trying to achieve: ingest files into bronze tables with Autoloader, then produce Kafka messages for each file ingested using a DLT sink. The issue: the latency between a file being ingested and a message being produced gets exponentially higher the more tables ar...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 2 kudos

Hi, I think it is a delay in Autoloader, as it doesn't yet know about the ingested files. It has nothing to do with the state, as it is just Autoloader keeping a list of processed files. Autoloader scans the directory every minute, usually a...
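If the periodic directory-listing scan is the bottleneck, one hedged option is to switch the DLT bronze tables to file notification mode so new files are discovered from queue events instead of repeated listings; the table name and path below are hypothetical.

import dlt

@dlt.table(name="bronze_orders")
def bronze_orders():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.useNotifications", "true")  # event-driven file discovery
        .load("/Volumes/raw/landing/orders/")
    )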

3 More Replies
frunzy
by New Contributor
  • 304 Views
  • 2 replies
  • 2 kudos

Resolved! how to import sample notebook to azure databricks workspace

In the second onboarding video, the Quickstart Notebook is shown. I found that notebook here: https://www.databricks.com/notebooks/gcp-qs-notebook.html I wanted to import it into my workspace in my Azure Databricks account to play with it. However, selecti...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 2 kudos

I reported this as a bug:

1 More Replies
thethirtyfour
by New Contributor III
  • 6028 Views
  • 3 replies
  • 3 kudos

Resolved! Configure Databricks in VSCode through WSL

Hi, I am having a hard time configuring my Databricks workspace when working in VSCode via WSL. When following the steps to set up Databricks authentication, I am receiving the following error on step 5 of "Step 4: Set up Databricks authentication"....

Latest Reply
RaulMoraM
New Contributor III
  • 3 kudos

What worked for me was NOT opening the browser via the pop-up (which generated the 3-legged OAuth flow error), but clicking the link provided by the CLI (or copy-pasting the link into the browser).

2 More Replies
Lakshmipriya_N
by New Contributor II
  • 174 Views
  • 1 reply
  • 1 kudos

Resolved! Request to Extend Partner Tech Summit Lab Access

Hi Team, I would appreciate it if my Partner Tech Summit lab access could be extended, as two of the assigned labs were inaccessible. Could you please advise whom I should contact for this? Thank you. Regards, Lakshmipriya

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @Lakshmipriya_N, create a support ticket and wait for a reply: Contact Us

shrutigupta12
by New Contributor II
  • 5475 Views
  • 11 replies
  • 2 kudos

Resolved! Databricks Certification Exam Got Suspended. Require Immediate Support

Hello @Cert-Team @Certificate Team, Request Id# 00432042. I encountered a pathetic experience while attempting my Databricks Certified Data Engineer Professional certification exam. This is a completely unethical process that harasses the examinee and lose...

Latest Reply
TechInspired
New Contributor II
  • 2 kudos

Hi @Cert-Team, I had a similar issue. My exam got suspended too. I had already completed my exam when it got suspended, so you can either evaluate it and provide the results or help me reschedule the exam. I have raised a request - #00750846; it's been mor...

10 More Replies
RaviG
by New Contributor II
  • 255 Views
  • 1 reply
  • 1 kudos

Resolved! How to install whl from volume for databricks_cluster_policy via terraform.

I would expect resource "databricks_cluster_policy" "cluster_policy" { name = var.policy_name libraries { Volumes { whl = "/Volumes/bronze/config/python.wheel-1.0.3-9-py3-none-any.whl" } } } to work, but Terraform doesn't recognize "volum...

Latest Reply
PurpleViolin
New Contributor III
  • 1 kudos

This worked:
resource "databricks_cluster_policy" "cluster_policy" {
  name = var.policy_name
  libraries {
    whl = "/Volumes/bronze/config/python.wheel-1.0.3-9-py3-none-any.whl"
  }
}

MisterT
by New Contributor
  • 595 Views
  • 1 reply
  • 0 kudos

Cannot get tracing to work on genai app deployed on databricks

Hi, I have a Gradio app that is deployed on Databricks. The app comes from this example provided by Databricks. The app works fine, but when I want to add tracing I cannot get it to work. I keep getting the error mlflow.exceptions.MlflowException...

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

Hi @MisterT, in our docs it is mentioned that we use MLflow 3 (a major upgrade) with GenAI monitoring enabled. Each agent endpoint is assigned an MLflow experiment, and agent traces from the endpoint are logged to that experiment in real time. Internally an MLF...
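As a rough illustration of pointing traces at a workspace experiment from app code (not necessarily the exact fix for the error above), a sketch using MLflow's tracing decorator; the experiment path and the traced function are hypothetical.

import mlflow

# Point MLflow at the Databricks workspace and pick an experiment to hold traces.
mlflow.set_tracking_uri("databricks")
mlflow.set_experiment("/Shared/genai-app-traces")  # hypothetical experiment path

@mlflow.trace  # records a trace for each call
def answer(question: str) -> str:
    # ... call your model or agent here ...
    return "stub answer"

answer("What is Delta Sharing?")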

Andreyai
by New Contributor II
  • 568 Views
  • 3 replies
  • 1 kudos

AI Query Prompt Token and Completion Token

Hi, I would like to know how I can get the completion token and prompt token counts when using ai_query. Thanks!

Latest Reply
Khaja_Zaffer
Contributor III
  • 1 kudos

Hello @Andreyai, good day! For ai_query, we have documentation from Databricks: https://docs.databricks.com/aws/en/sql/language-manual/functions/ai_query I am 100% sure you will get better insights from the documentation. But I have something for...
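As a starting point, a minimal sketch of calling ai_query from a notebook; the serving endpoint name is a placeholder, and how to surface prompt/completion token counts is covered by the documentation linked above rather than shown here.

# Minimal ai_query sketch (placeholder endpoint name).
result_df = spark.sql("""
    SELECT ai_query(
        'databricks-meta-llama-3-1-8b-instruct',
        'Summarize the benefits of Delta Lake in one sentence.'
    ) AS answer
""")
result_df.show(truncate=False)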

2 More Replies
agilecoach360
by New Contributor II
  • 233 Views
  • 1 reply
  • 1 kudos

2025 Data + AI World Tour Atlanta

Attending "How to Build Intelligent Agents" at Databricks Data+AI World Tour 2025. #Databricks #Data+AI #DatabricksWorldTour

Latest Reply
Advika
Databricks Employee
  • 1 kudos

Great to hear, @agilecoach360! Please share your learnings and experience from the event with the Community; it would be really valuable for everyone. Looking forward to your insights.

drag7ter
by Contributor
  • 2681 Views
  • 12 replies
  • 5 kudos

Passing parameters in a dashboard's data section via asset bundles

A new functionality allows deploying dashboards with asset bundles. Here is an example: # This is the contents of the resulting baby_gender_by_county.dashboard.yml file. resources: dashboards: baby_gender_by_county: display_name: "Baby gen...

Latest Reply
Karola_de_Groot
New Contributor III
  • 5 kudos

I did, however, just find out that parameterization is possible. I don't know yet how to incorporate it into an asset bundle deploy, but at least I have a first step. You can use SELECT * FROM IDENTIFIER(:catalog || '.' || :schema || '.' || :table) Or hardc...
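For anyone testing the IDENTIFIER approach outside of a dashboard, a small sketch using named parameter markers from Python, mirroring the query in the reply above; the catalog/schema/table values are placeholders.

# IDENTIFIER with named parameter markers (placeholder values).
df = spark.sql(
    "SELECT * FROM IDENTIFIER(:catalog || '.' || :schema || '.' || :table)",
    args={"catalog": "main", "schema": "reporting", "table": "baby_gender_by_county"},
)
df.show()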

11 More Replies
