Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

D4F
by Visitor
  • 12 Views
  • 1 replies
  • 0 kudos

Issue Genie API - different responses in UI and via API

Hi community, I created an agent with a Genie tool, a wrapper around a GenieAgent connected to my Genie space (GENIE_SPACE_ID) that sends user questions and returns Genie’s textual response. I noticed I get 2 different responses when I post a questio...

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Hey @D4F, what you’re seeing is normal behavior—and the good news is there are very real, very practical ways to make your Genie-based agent more consistent without resorting to a giant, brittle prompt. Let’s dig in. First, why the UI and API can r...

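When comparing UI and API answers, it helps to hit the REST endpoint directly with the same question. A minimal sketch of building such a request is below; the endpoint path follows the Genie Conversation API as I understand it, but treat the exact URL and payload shape as assumptions to verify against the current docs.

```python
# Sketch: build (but don't send) a request that starts a Genie conversation.
# The path and payload shape are assumptions based on the Genie Conversation
# API; verify them against your workspace's API documentation.
import json

def build_genie_request(host: str, space_id: str, question: str) -> dict:
    """Return the URL and JSON payload for starting a Genie conversation."""
    return {
        "url": f"{host}/api/2.0/genie/spaces/{space_id}/start-conversation",
        "payload": json.dumps({"content": question}),
    }

req = build_genie_request("https://example.cloud.databricks.com",
                          "GENIE_SPACE_ID", "What were sales last month?")
```

Posting the same question several times via this route makes it easy to see whether the variance comes from the API path or from the agent wrapper.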
gsouza
by New Contributor II
  • 3070 Views
  • 4 replies
  • 3 kudos

Databricks asset bundle occasionally duplicating jobs

Since last year, we have adopted Databricks Asset Bundles for deploying our workflows to the production and staging environments. The tool has proven to be quite effective, and we currently use Azure DevOps Pipelines to automate bundle deployment, tr...

Latest Reply
cmantilla
Visitor
  • 3 kudos

This is a recurring issue for my org as well.

3 More Replies
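A quick way to spot duplicates after a bundle deploy is to list jobs and flag repeated names. The sketch below assumes job objects shaped like the Jobs API 2.1 list response (each with `settings.name`); the sample data is hypothetical.

```python
# Sketch: detect duplicate job names in a Jobs API 2.1 style listing.
from collections import Counter

def find_duplicate_job_names(jobs: list[dict]) -> list[str]:
    """Return job names that appear more than once in the listing."""
    names = [j["settings"]["name"] for j in jobs]
    return [name for name, n in Counter(names).items() if n > 1]

# Hypothetical listing after a deploy that duplicated one job:
jobs = [
    {"job_id": 1, "settings": {"name": "[dev] ingest"}},
    {"job_id": 2, "settings": {"name": "[dev] ingest"}},
    {"job_id": 3, "settings": {"name": "[dev] transform"}},
]
```

Running this after each pipeline deployment gives an early signal that the bundle has lost track of an existing job and re-created it.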
Jarno
by Visitor
  • 34 Views
  • 1 replies
  • 0 kudos

Dangerous implicit type conversions on 17.3 LTS.

Starting with DBR 17 running Spark 4.0, spark.sql.ansi.enabled is set to true by default. With the flag enabled, strings are implicitly converted to numbers in a very dangerous manner. Consider: SELECT 123='123'; SELECT 123='123X'; The first one is succe...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 0 kudos

Under ANSI rules, INT + STRING resolves to BIGINT (Long), which is why it fails: https://spark.apache.org/docs/latest/sql-ref-ansi-compliance.html. There are some examples where it works, like 1Y or 1L. Regarding 4.0.1, can you double-check ansi.enabled ...

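The behavior in question can be illustrated in plain Python: under ANSI mode the string side of `INT = STRING` is cast to a number, and an unparseable string raises an error rather than quietly yielding NULL. This is only an illustration of the rule, not Spark’s implementation.

```python
# Illustration: ANSI-mode INT = STRING comparison. The string is cast to a
# numeric type; an unparseable string raises instead of returning NULL.
def ansi_compare(num: int, s: str) -> bool:
    try:
        return num == int(s)
    except ValueError:
        raise ValueError(f"[CAST_INVALID_INPUT] cannot cast '{s}' to BIGINT")

ansi_compare(123, "123")     # True, like SELECT 123 = '123'
# ansi_compare(123, "123X")  # raises, like SELECT 123 = '123X' under ANSI mode
```

With `spark.sql.ansi.enabled` set to false, the second comparison would instead evaluate to NULL, which is the legacy behavior the thread contrasts against.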
Michael_Appiah
by Contributor II
  • 3289 Views
  • 4 replies
  • 0 kudos

Delta Tables: Time-To-Live

I have seen somewhere (might have been in a Databricks Tech Talk) a Delta Table feature which allows specifying the "expiration date" of data stored in Delta Tables. Once rows surpass their time-to-live, they are automatically deleted or archived. Do...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 0 kudos

Yes, that feature was announced at the Data + AI Summit - really cool.

3 More Replies
kyeongmin_baek
by New Contributor II
  • 114 Views
  • 6 replies
  • 1 kudos

AWS_INSUFFICIENT_INSTANCE_CAPACITY_FAILURE when starting SQL Server Ingestion pipeline

Dear Community, I’m seeing a compute error when running a Databricks ingestion pipeline (Lakeflow managed ingestion) on AWS. Cloud: AWS. Region: ap-northeast-2. Source: SQL Server ingestion pipeline. When I start the ingestion pipeline, it fails with the f...

Latest Reply
emma_s
Databricks Employee
  • 1 kudos

Hi, I'm afraid you cannot edit compute instance type settings for SQL Server ingestion pipelines via the Databricks UI. Such changes can only be made via API.

5 More Replies
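Since the reply says the instance type can only be changed via API, here is a sketch of building such an update against the Pipelines API. The request shape (the `clusters` array with `node_type_id`) is an assumption to verify against the Pipelines API reference, and the IDs are placeholders.

```python
# Sketch: build (but don't send) a Pipelines API update that pins the
# cluster node type, e.g. to avoid a capacity-constrained instance type.
# Payload shape is an assumption; check the Pipelines API reference.
import json

def build_pipeline_update(pipeline_id: str, node_type_id: str) -> dict:
    return {
        "url": f"/api/2.0/pipelines/{pipeline_id}",
        "method": "PUT",
        "body": json.dumps({
            "clusters": [{"label": "default", "node_type_id": node_type_id}],
        }),
    }

req = build_pipeline_update("1234-abcd", "m5.xlarge")
```

Switching to an instance family with better availability in ap-northeast-2 is often enough to clear AWS_INSUFFICIENT_INSTANCE_CAPACITY_FAILURE.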
Fatimah-Tariq
by New Contributor III
  • 17 Views
  • 0 replies
  • 0 kudos

Writing to Foreign catalog

I have a running notebook job where I do some processing and write the tables to a foreign catalog. It has been running successfully for about a year. The job is scheduled and runs on a job cluster with DBR 16.2. Recently, I had to add new noteb...

EAnthemNHC1
by New Contributor III
  • 277 Views
  • 3 replies
  • 0 kudos

Time Travel Error when selecting from materialized view (Azure Databricks)

Hey - running into an error this morning that was brought to my attention via failed refreshes from PowerBI. We have a materialized view that, when queried with the standard pattern of 'select col1 from {schema}.table_name', returns an error of 'Cann...

Latest Reply
cookiebaker
New Contributor III
  • 0 kudos

Since last Monday, December 8th, we’re experiencing this same issue: Cannot time travel Delta table to version 158. Available versions: [70, 4]. However, this is just when doing a simple SELECT * statement from a gold view (without any version specifie...

2 More Replies
ManojkMohan
by Honored Contributor II
  • 326 Views
  • 6 replies
  • 4 kudos

Resolved! Accessing Databricks data in Salesforce via zero copy

I have uploaded clickstream data as shown below. Do I have to mandatorily share via Delta Sharing for values to be exposed in Salesforce? At the Salesforce end I have confirmed that I have a working connector where I am able to see sample data, but u...

Latest Reply
Rash_Databrick
  • 4 kudos

HI Team ,Please help me my task is to connect Databrick and salesforce data cloud with zero copy . where we need  databricks data in Salesforce data cloud , also just to mention my databricks workspace +  ADLS stoarge is on private end point. any hel...

5 More Replies
dikla
by New Contributor
  • 81 Views
  • 3 replies
  • 1 kudos

Resolved! Issues Creating Genie Space via API Join Specs Are Not Persisted

Hi, I’m experimenting with the new API to create a Genie Space. I’m able to successfully create the space, but the join definitions are not created, even though I’m passing a join_specs object in the same format returned by GET /spaces/{id} for an exis...

Latest Reply
dikla
New Contributor
  • 1 kudos

@Raman_Unifeye Thanks for the detailed explanation — that really helps clarify why my join specs weren’t being persisted. Do you know if support for persisting join_specs, sql_snippets, and measures via the API is planned for an upcoming...

2 More Replies
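Until the API persists those sections, a create-then-verify check catches silent drops early. The helper below compares the payload you sent against the GET response and reports which optional sections went missing; the field names are taken from the thread, and the sample payloads are hypothetical.

```python
# Sketch: after creating a Genie space, diff the sent payload against the
# GET response to see which optional spec sections were silently dropped.
# Field names (join_specs, sql_snippets, measures) come from the thread.
def missing_spec_keys(sent: dict, persisted: dict,
                      keys=("join_specs", "sql_snippets", "measures")) -> list:
    return [k for k in keys if sent.get(k) and not persisted.get(k)]

# Hypothetical payloads:
sent = {"title": "Sales",
        "join_specs": [{"left": "orders", "right": "customers"}]}
persisted = {"title": "Sales", "join_specs": []}
```

An empty result means everything round-tripped; anything listed needs to be re-added in the UI for now.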
dvd_lg_bricks
by New Contributor
  • 126 Views
  • 6 replies
  • 3 kudos

Questions About Workers and Executors Configuration in Databricks

Hi everyone, sorry, I’m new here. I’m considering migrating to Databricks, but I need to clarify a few things first. When I define and launch an application, I see that I can specify the number of workers, and then later configure the number of execut...

Latest Reply
dvd_lg_bricks
New Contributor
  • 3 kudos

I mean: while we’re at it, @szymon_dybczak or @Raman_Unifeye, is there a place where all available Databricks configuration parameters are documented? I have some pipelines that rely on special settings, such as changing the serializer, enabling Apac...

5 More Replies
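Custom Spark settings like the ones mentioned (serializer, Arrow) go into the cluster’s `spark_conf` map, set in the UI under Advanced options or in the cluster/job spec. A minimal sketch is below; the Spark config keys are real Spark settings, while the node type is a placeholder.

```python
# Sketch: a cluster spec carrying custom Spark configs via spark_conf.
# The node_type_id value is a placeholder; the two Spark keys are standard
# Spark configuration properties.
cluster_spec = {
    "num_workers": 4,
    "node_type_id": "Standard_DS3_v2",
    "spark_conf": {
        # switch the serializer to Kryo
        "spark.serializer": "org.apache.spark.serializer.KryoSerializer",
        # enable Arrow-based transfer between Spark and pandas
        "spark.sql.execution.arrow.pyspark.enabled": "true",
    },
}
```

The same `spark_conf` block works in a Databricks Asset Bundle job cluster definition, so pipeline-specific settings can live in version control.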
Richard3
by New Contributor
  • 149 Views
  • 4 replies
  • 3 kudos

IDENTIFIER in SQL Views not supported?

Dear community, we are phasing out the dollar param `${catalog_name}` because it has been deprecated since runtime 15.2. We use this parameter in many queries, and it should now be replaced by the IDENTIFIER clause. In the query below, where we retrieve data...

Latest Reply
mnorland
Valued Contributor
  • 3 kudos

There are two options you may want to consider:
  • Switch to using SQL UDTFs from views in certain cases
  • For each session, dynamically recreate the view using CREATE VIEW via EXECUTE IMMEDIATE or via Python string templating:

3 More Replies
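The string-templating option can be sketched as follows: render a fully qualified CREATE VIEW statement per session instead of relying on IDENTIFIER inside the view body. The catalog/schema/view names here are hypothetical, and the identifier check is a minimal guard, not a complete sanitizer.

```python
# Sketch: per-session templating of CREATE VIEW as an alternative to
# IDENTIFIER() inside a view definition. Names below are examples.
def render_create_view(catalog: str, schema: str, view: str,
                       select_sql: str) -> str:
    # Minimal guard against injection via identifier parts.
    for part in (catalog, schema, view):
        if not part.replace("_", "").isalnum():
            raise ValueError(f"unsafe identifier: {part}")
    return (f"CREATE OR REPLACE VIEW `{catalog}`.`{schema}`.`{view}` AS "
            f"{select_sql}")

sql = render_create_view("dev_catalog", "silver", "v_orders",
                         "SELECT * FROM dev_catalog.bronze.orders")
```

The rendered string can then be passed to `spark.sql(sql)` at the start of each session, so the view always points at the session’s catalog.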
prashant151
by New Contributor II
  • 87 Views
  • 1 replies
  • 1 kudos

Using Init Script to execute Python notebook at all-purpose cluster level

Hi, we have setup.py in our Databricks workspace. This script is executed in other transformation scripts using %run /Workspace/Common/setup.py, which consumes a lot of time. This setup.py internally calls other utility notebooks using %run: %run /Workspace/Co...

Latest Reply
Raman_Unifeye
Contributor III
  • 1 kudos

@prashant151 - Unlike legacy (pre-UC) clusters, you cannot directly run a Databricks notebook (like setup.py) from a cluster init script, because init scripts only support shell commands — not %run or notebook execution.You will need to refactor your...

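One common refactor is to move the shared helpers out of %run-chained notebooks into a plain Python module (or a wheel) and import it, which is cheap compared to re-executing notebooks. A minimal sketch, where the path and the module/function names are hypothetical:

```python
# Sketch: replace chained %run calls with ordinary imports. Files under a
# workspace folder become importable once that folder is on sys.path.
import sys

def import_common(path: str = "/Workspace/Common"):
    """Make shared modules under `path` importable; `path` is an example."""
    if path not in sys.path:
        sys.path.insert(0, path)

# Then, in a notebook:
# import_common()
# from setup_utils import init_catalog   # hypothetical module and function
```

Packaging the helpers as a wheel installed as a cluster library goes one step further and removes even the sys.path step from each notebook.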
venkatesh557
by New Contributor
  • 50 Views
  • 1 replies
  • 0 kudos

Is there a supported method to register a custom PySpark DataSource so that it becomes visible in th

Built a custom connector using the PySpark DataSource API (DataSource V2). The connector works programmatically, but it does not appear in the Databricks Ingestion UI (Add Data → Connectors) like the Salesforce connector.Is there a supported method t...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @venkatesh557 ,Unfortunately, the answer is no - there isn’t a supported way for you to “register” an arbitrary PySpark DataSource V2 so that it appears as a tile in the Databricks Add data → Connectors (Ingestion) UI right now

tak0519
by New Contributor II
  • 288 Views
  • 6 replies
  • 6 kudos

Resolved! How can I pass parameters from DABs to something(like notebooks)?

I'm implementing DABs, Jobs, and Notebooks. For configuration management, I set parameters in databricks.yml, but I can't get the parameters in the notebook after executing a job successfully. What I implemented and steps to the issue: Created "dev-catalog" on WEB U...

Latest Reply
Taka-Yayoi
Databricks Employee
  • 6 kudos

Hi @tak0519  I think I found the issue! Don't worry - your DABs configuration looks correct. The problem is actually about how you're verifying the results, not the configuration itself. What's happening In your last comment, you mentioned: "Manuall...

5 More Replies
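The usual pattern is: a job parameter declared in databricks.yml surfaces in the notebook as a widget, read with dbutils.widgets.get. The YAML fragment and names below are placeholders, and the tiny stand-in class only exists so the retrieval pattern can be shown outside Databricks.

```python
# Sketch: a DAB job parameter reaching a notebook. Hypothetical YAML:
#
#   resources:
#     jobs:
#       my_job:
#         parameters:
#           - name: catalog
#             default: dev-catalog
#
# Notebook side: dbutils.widgets.get("catalog"). Minimal stand-in for
# dbutils.widgets so the pattern runs outside a Databricks notebook:
class _Widgets:
    def __init__(self, values):
        self._values = values

    def get(self, name):
        return self._values[name]

widgets = _Widgets({"catalog": "dev-catalog"})
catalog = widgets.get("catalog")
```

Inside a real job run, the same `get("catalog")` call returns whatever value the job (or a manual override) passed, which is why verifying via the job run, not an interactive run, matters here.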
