Discussions
Engage in dynamic conversations covering diverse topics within the Databricks Community. Explore discussions on data engineering, machine learning, and more. Join the conversation and expand your knowledge base with insights from experts and peers.

Browse the Community

Community Discussions

Engage in vibrant discussions covering diverse learning topics within the Databricks Community. Expl...

4559 Posts

Activity in Discussions

juliechen
by > New Contributor II
  • 36 Views
  • 2 replies
  • 0 kudos

Databricks Genie: Power Users vs. Broad Org Access

Hi everyone, I’m curious if anyone has successfully implemented Databricks Genie (chat/agent) for production use. Currently, we’ve enabled a few Genie instances for power users who are comfortable working with data outside of the data team. However, we...

Latest Reply
juliechen
New Contributor II
  • 0 kudos

Hi @Emma, the recommendation was from Genie code as attached below. To me, this is not a curation issue; it comes down to 1. the maturity of the LLM and 2. user prompt quality. I observed several times that when users asked questions from a confusing perspective, Genie could decode...

1 More Replies
sharath007
by > New Contributor II
  • 42 Views
  • 5 replies
  • 0 kudos

SQL Warehouse fails to start — RESOURCE_EXHAUSTED Error

I'm unable to start my SQL warehouse (Serverless) due to a RESOURCE_EXHAUSTED error. Error message: Clusters are failing to launch. Cluster launch will be retried. Request to create a cluster failed with an exception: RESOURCE_EXHAUSTED: Cannot create...

Latest Reply
Ashwin_DSA
Databricks Employee
  • 0 kudos

Hi @sharath007, Just checked internally. This specific RESOURCE_EXHAUSTED: Cannot create the resource, please try again later message for a serverless SQL warehouse normally indicates that the backing serverless compute pool has run out of capacity (...

4 More Replies
Simranpreet
by > New Contributor
  • 226 Views
  • 9 replies
  • 3 kudos

Permission Inquiry for Databricks Course Content

We are an education-based company currently developing a course on Databricks, which we plan to publish on platforms such as YouTube and Udemy for educational purposes. We would like to confirm whether any formal permission is required from Databricks...

Latest Reply
abhi_dabhi
Databricks Partner
  • 3 kudos

Hi @Simranpreet, no formal permission is required to create and sell/publish an independent educational course about Databricks, but there are specific rules you must follow regarding trademarks, logos, and official Databricks course materials. Here's ...

8 More Replies
AnonymousK
by > New Contributor
  • 23 Views
  • 0 replies
  • 0 kudos

Why do you want to migrate from Azure Synapse Analytics or Azure Data Factory to Databricks?

It's a simple answer bro. According to our analysis, Databricks pipelines and notebooks run the matching process approximately 40% faster than Synapse Analytics. If you really want to optimise your pipelines and perform cost optimisation in your team, please migrate...

vg33
by > New Contributor
  • 106 Views
  • 1 reply
  • 0 kudos

workspace config

I am on a Premium AWS trial workspace (dbc-30503d28-2210). I have two issues:
- Personal access tokens are grayed out and I cannot generate them
- My cluster cannot make outbound HTTP requests to external APIs (getting NameResolutionError when calling api...

Latest Reply
emma_s
Databricks Employee
  • 0 kudos

Hi, on the personal access tokens being greyed out: you need to enable this in the workspace settings (you'll need to make sure you're a workspace admin first). It's in the advanced section under workspace settings. Are you validating that you're a workspa...

LokeshChikuru
by > Databricks Partner
  • 156 Views
  • 4 replies
  • 1 kudos

Databricks integrating with ServiceNow via Lakeflow Connect for data ingestion

We are integrating Databricks with ServiceNow via Lakeflow Connect for data ingestion and are looking for guidance on enforcing integration-user-based data access.

Observed behaviour: U2M OAuth authentication succeeds when ServiceNow access is granted to the works...

Latest Reply
LokeshChikuru
Databricks Partner
  • 1 kudos

Hi @emma_s, I’ve reviewed the setup and wanted to clarify the behavior I’m seeing with the ServiceNow connector and U2M OAuth. The ServiceNow connection was created successfully using a U2M OAuth integration user, and that integration user has admin pe...

3 More Replies
lrm_data
by > New Contributor
  • 101 Views
  • 2 replies
  • 2 kudos

Resolved! **Lakeflow Connect SQL Server — Snapshots Firing Outside Configured Full Refresh Window?**

Has anyone else seen full refresh snapshots trigger outside of their configured refresh window in Lakeflow Connect?

Here's our situation:
- We have a full refresh window configured to restrict snapshot operations to off-hours
- On at least one occasion,...

Latest Reply
Sumit_7
Honored Contributor III
  • 2 kudos

@lrm_data It is very unlikely for the refresh to be triggered outside the configured window. Though I would still suggest checking the Configured Window and the Auto Full Refresh policy once to be sure. If it still persists, then you may raise a support...

1 More Replies
lrm_data
by > New Contributor
  • 212 Views
  • 3 replies
  • 0 kudos

Lakeflow Connect - SQL Server - Issues restarting after failure

Has anyone else run into a situation where a breaking schema change on a SQL Server source table leaves their Lakeflow Connect pipeline in a state it can't recover from, even after destroying and recreating the pipeline?

Here's what happened to us:
- ...

Latest Reply
lrm_data
New Contributor
  • 0 kudos

Hello @emma_s and @abhi_dabhi, thank you so much! I had destroyed the bundle that included the schema, ingestion pipeline, and gateway. However, I did not clear out SQL Server CDC, so that may have been the issue. I plan to leave the current gateway stopped a...

2 More Replies
SahilRana3097
by > Visitor
  • 31 Views
  • 1 reply
  • 0 kudos

Databricks not able to create cluster with Amazon free trial version

Error: Cannot launch the cluster because the user specified an invalid argument.
Instance ID: failed-2d901c0f-d88d-499a-a
Internal error message: The VM launch request to AWS failed, please check your configuration. [details] InvalidParameterCombinati...

Latest Reply
DivyaandData
Databricks Employee
  • 0 kudos

The error is coming from AWS, not Databricks: your AWS account is restricted to Free Tier–eligible instance types, but the node type you picked in Databricks maps to an EC2 instance that is not Free Tier–eligible, so AWS rejects the launch request wi...

TX-Aggie-00
by > Databricks Partner
  • 119 Views
  • 2 replies
  • 0 kudos

Sharepoint Connector Site Limitation

Hey all! We are trying out the Beta connector for SharePoint and found that the connector will not work at the root-level site. Is there a reason for this limitation? It is unfortunately a hard blocker for us to use the native connector. MUST_START...

Latest Reply
saravjeet
Databricks Partner
  • 0 kudos

How have you made the connection? The reason I am asking is that we have two separate tenants (SharePoint in one tenant and Databricks set up in a different tenant); at the moment we are using a logic app to bring the data into the platform...

1 More Replies
AdrianLobacz
by > Databricks Partner
  • 27 Views
  • 1 reply
  • 0 kudos

FileNotFoundError: [Errno 2] No such file or directory: '../00_configuration/prd/main_configuration.

Maybe someone has encountered this problem before? I’m running parallel loading for 10 objects using pool.map. Nine of them complete successfully, but one fails when trying to read a configuration file. The problem occurs occasionally and doesn’t foll...

Latest Reply
balajij8
Contributor
  • 0 kudos

@AdrianLobacz You can read the configuration once and pass the object into your function instead of reading the same file multiple times. This eliminates the I/O overhead and avoids hitting the FUSE layer. When the code triggers parallel processes, they...
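A minimal sketch of the read-once pattern described above (the file path, worker function, and object names are hypothetical): the driver parses the configuration a single time, then hands the in-memory dict to every task, so no worker ever touches the file system. Shown with threads for brevity; with multiprocessing.Pool the config dict is pickled to each worker, which likewise avoids re-reading the file.

```python
import json
from concurrent.futures import ThreadPoolExecutor

def load_configuration(path: str) -> dict:
    """Read the configuration file exactly once, in the driver."""
    with open(path) as f:
        return json.load(f)

def load_object(args):
    """Worker: receives the already-parsed config, never re-reads the file."""
    object_name, config = args
    # ... use config[object_name] to drive the actual load ...
    return object_name, config.get(object_name, {})

def run_parallel_load(config: dict, objects: list):
    # Pair each object with the shared, in-memory config dict.
    with ThreadPoolExecutor(max_workers=10) as pool:
        return list(pool.map(load_object, [(obj, config) for obj in objects]))
```

The key design choice is that open()/json.load() appear only in the driver path, so a flaky FUSE mount can no longer fail one worker out of ten at random.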

abhay2611
by > New Contributor
  • 34 Views
  • 1 reply
  • 0 kudos

Databricks Exam got suspended due to a power cut

Hello Team, during the exam there was a power failure for 5 minutes, due to which the exam got suspended. I have created a support ticket as well but have not received any response yet. Please resolve this on high priority; this is not fair with the time, m...

Latest Reply
abhay2611
New Contributor
  • 0 kudos

Please @Cert-Team consider this request.

sanju_shree066
by > New Contributor
  • 29 Views
  • 0 replies
  • 0 kudos

Databricks learning festival coupon

Hi team, I've completed a learning path by taking part in the Databricks Learning Festival which was conducted recently, but I've not received my discount voucher yet.

seefoods
by > Valued Contributor
  • 33 Views
  • 1 reply
  • 0 kudos

databricks autoloader source files

Hello, how can I handle this error when using Auto Loader with spark.readStream? (com.databricks.sql.cloudfiles.errors.CloudFilesException) [CF_EMPTY_DIR_FOR_SCHEMA_INFERENCE] Cannot infer schema when the input path `/Volumes/default/landing/source/bund...

Latest Reply
Ashwin_DSA
Databricks Employee
  • 0 kudos

Hi @seefoods, the error message seems to indicate there are no files in the source path. You can define the schema yourself and pass it to schema(...) so Auto Loader doesn’t need to infer anything; then, as soon as files arrive, the stream will...
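A minimal sketch of the explicit-schema approach: the column names and the JSON source format below are assumptions for illustration; only the cloudFiles options and the empty-directory behavior come from the thread. With a supplied schema, Auto Loader has nothing to infer, so the stream can start against an empty path.

```python
# Hypothetical columns; replace with the real layout of your source files.
schema_ddl = "event_id STRING, event_ts TIMESTAMP, payload STRING"

def build_autoloader_stream(spark, source_path: str):
    """Return an Auto Loader stream that starts even on an empty directory,
    because the schema is supplied instead of inferred."""
    return (
        spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", "json")  # assumed source format
        .schema(schema_ddl)                   # explicit schema: no inference
        .load(source_path)
    )
```

schema(...) accepts a DDL string like the one above, so the schema can live in plain config rather than in code.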

john26
by > Visitor
  • 54 Views
  • 1 reply
  • 2 kudos

2026 update: Postman free plan limits and alternatives?

Hey, I’ve been rethinking my API tooling lately and realized I’ve mostly stayed with Postman out of habit. One thing that stood out again is the free plan limitations. They’re not new, but they make collaboration a bit annoying for small teams unless y...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 2 kudos

Hi @john26, I use Bruno quite often these days and it has become the go-to Postman replacement for engineering-heavy workflows, specifically because collections live as plain files in your repo. For someone working across Databricks, ADF, and Azure servic...
