Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

kulasangar
by New Contributor II
  • 1587 Views
  • 3 replies
  • 1 kudos

ERROR: Could not find a version that satisfies the requirement

I'm using Databricks Workflows along with an asset bundle to run my pipeline, and recently we moved our pipeline from development to acceptance. We are also using JFrog as our package artifactory. Even though I do see the release candidate version under t...

Latest Reply
mess
New Contributor II
  • 1 kudos

I do have an init script installed at the cluster level as well, but I am still getting the same error. Are there any other solutions? I have not made any changes to my asset bundle; it was running well the other day, then all of a sudden I got this error.

  • 1 kudos
2 More Replies
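When pip suddenly stops resolving a version that exists in JFrog, it is worth confirming which index the cluster is actually hitting. A minimal sketch of composing the install command for a private index; the repository URL and package name below are illustrative placeholders, not taken from this thread:

```python
# Sketch: compose the pip flags a cluster init script or asset-bundle
# environment would need so pip resolves packages from a private JFrog
# index. The URL and package name are hypothetical placeholders.

def pip_install_command(package: str, version: str, index_url: str) -> str:
    """Build a pip install command pinned to a private index.

    --index-url makes the private repository the only source, which
    surfaces resolution errors early; --extra-index-url would instead
    keep PyPI as a fallback.
    """
    return (
        f"pip install --index-url {index_url} "
        f"{package}=={version}"
    )

cmd = pip_install_command(
    "my-internal-lib",   # hypothetical package name
    "1.2.0rc1",          # release-candidate pin
    "https://example.jfrog.io/artifactory/api/pypi/pypi-local/simple",
)
print(cmd)
```

If the release candidate still cannot be found, running this exact command in a notebook `%sh` cell usually shows whether the index is reachable from the cluster at all.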
AbhiBange
by New Contributor II
  • 374 Views
  • 2 replies
  • 0 kudos

Resolved! Unable to create a new catalog in Databricks Free Edition

Unable to create a new catalog in Databricks Free Edition.

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @AbhiBange, I had the same issue using the UI approach. Open a notebook and use SQL to create the catalog instead. It will work: CREATE CATALOG your_catalog_name

  • 0 kudos
1 More Replies
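For anyone hitting the same UI failure, the SQL fallback in the accepted answer can be wrapped in a small helper. This sketch only builds the statement (in a notebook you would pass it to spark.sql), and the catalog name is a placeholder:

```python
def create_catalog_sql(name: str, if_not_exists: bool = True) -> str:
    """Build the CREATE CATALOG statement to run from a notebook cell,
    e.g. spark.sql(create_catalog_sql("my_catalog")), when the UI
    'Create catalog' flow fails.

    IF NOT EXISTS makes the call idempotent, so re-running the cell
    does not raise if the catalog was already created.
    """
    clause = "IF NOT EXISTS " if if_not_exists else ""
    return f"CREATE CATALOG {clause}{name}"

stmt = create_catalog_sql("your_catalog_name")
print(stmt)
```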
Tahseen0354
by Valued Contributor
  • 6912 Views
  • 3 replies
  • 4 kudos

Resolved! How do I track Databricks cluster users?

Hi, is there a way to find out/monitor which users have used my cluster, for how long, and how many times, in an Azure Databricks workspace?

Latest Reply
Ashwin_21
New Contributor II
  • 4 kudos

Do you have a query we can use to get, from the system tables, the details of who used the cluster within this window?

  • 4 kudos
2 More Replies
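One possible starting point, assuming audit system tables are enabled in the workspace: filter system.access.audit for cluster events in a time window. The service/action names and column shapes should be verified against your own workspace's audit schema; this sketch only builds the query text:

```python
def cluster_usage_query(cluster_id: str, start: str, end: str) -> str:
    """Sketch a query over the audit system table for who touched a
    given cluster in a window. Assumes system.access.audit is enabled;
    adjust the service/action filters to what your workspace records.
    The cluster_id and dates below are placeholders."""
    return f"""
        SELECT user_identity.email,
               action_name,
               event_time
        FROM system.access.audit
        WHERE service_name = 'clusters'
          AND request_params.cluster_id = '{cluster_id}'
          AND event_time BETWEEN '{start}' AND '{end}'
        ORDER BY event_time
    """

q = cluster_usage_query("0101-123456-abcde123", "2024-01-01", "2024-01-31")
print(q)
```

Grouping the result by email with COUNT(*) and min/max event_time would give the "how many times and how long" part of the original question.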
Charansai
by New Contributor III
  • 282 Views
  • 1 replies
  • 1 kudos

Resolved! Workspace Base Environment not applied when promoting DAB‑deployed notebooks to QA

Hi Team, I'm trying to understand how Workspace Base Environments interact with serverless compute when using Databricks Asset Bundles (DAB). According to the documentation: Workspace Base Environments are supported only for serverless Python, Python wh...

Latest Reply
emma_s
Databricks Employee
  • 1 kudos

Hi, as you've correctly identified, workspace base environments aren't currently supported by DABs, as they're a relatively new feature. They are meant more to give workspace users a quick base environment than to be used to deploy notebooks as jo...

  • 1 kudos
agent007
by Databricks Partner
  • 231 Views
  • 1 replies
  • 1 kudos

Resolved! Lakeflow Ingestion Pipeline – Unable to Generate Event Log Table

For the DLT pipeline, we are successfully generating an event log table using the event_log configuration in the pipeline YAML. However, for the Salesforce pipeline, which is an ingestion pipeline (via Lakeflow Connector), we are unable to create or a...

Latest Reply
bianca_unifeye
Databricks MVP
  • 1 kudos

Please read https://docs.databricks.com/aws/en/ldp/monitor-event-logs. Lakeflow ingestion pipelines do generate event logs, but they behave differently than classic Spark Declarative Pipelines. ✔ Lakeflow Connect pipelines include event logs as part o...

  • 1 kudos
tushar_bansal
by Contributor
  • 4252 Views
  • 22 replies
  • 18 kudos

Resolved! Copy text from the integrated Web terminal

How do I copy text from the integrated web terminal? The selection goes away as soon as I lift my finger from the mouse.

Latest Reply
tushar_bansal
Contributor
  • 18 kudos

An update here. I raised a ticket and found out this is because tmux mode is enabled by default in the web terminal. You can disable tmux mode by adding `export DISABLE_TMUX=true` to the ~/.bashrc of the compute. When I asked them about the defaul...

  • 18 kudos
21 More Replies
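The fix above can be scripted, for example from a cluster init script. A sketch that writes to a temporary file so it runs anywhere; on a real cluster the target would be the compute's ~/.bashrc:

```python
import os
import tempfile

# The line the thread's fix appends to ~/.bashrc to turn off tmux in
# the integrated web terminal (so mouse selection/copy works normally).
DISABLE_LINE = "export DISABLE_TMUX=true"

def append_disable_tmux(bashrc_path: str) -> None:
    """Append the DISABLE_TMUX export to the given bashrc file.
    Opened in append mode so existing content is preserved."""
    with open(bashrc_path, "a") as f:
        f.write(f"\n{DISABLE_LINE}\n")

# Demo against a throwaway file; on a cluster you would use the real
# ~/.bashrc path instead.
demo = os.path.join(tempfile.mkdtemp(), "bashrc")
open(demo, "w").close()
append_disable_tmux(demo)
print(open(demo).read())
```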
yit337
by Contributor
  • 297 Views
  • 2 replies
  • 2 kudos

Resolved! Service principal Personal Access Token permissions

Hello, I'm using a PAT generated from a Service Principal to access Databricks from other tools. Now I've extended the permissions of the Service Principal. Should I re-generate the PAT? Or is the PAT used only for authentication, and authorisation is d...

Latest Reply
yit337
Contributor
  • 2 kudos

Thanks for the great answer, @anshu_roy. Where can I check the token permissions?

  • 2 kudos
1 More Replies
ameet9257
by Contributor
  • 259 Views
  • 2 replies
  • 2 kudos

Resolved! Automate Unity Catalog access management for schemas, tables, clusters, and jobs

Hi Team, I'd like to automate the process of providing access to users and groups across:
  • Schemas
  • Tables
  • Jobs
  • Workflows
  • Clusters
Currently, this is a manual, day-to-day process. Our goal is to implement an optimized, reliable solution. Proposed approach (con...

Latest Reply
saurabh18cs
Honored Contributor III
  • 2 kudos

Hi @ameet9257, see if this post can help you: https://www.linkedin.com/posts/ishashankshekhar_databricks-unitycatalog-datagovernance-activity-7334515730937229312-yVPg?utm_source=share&utm_medium=member_desktop&rcm=ACoAAAlFpRcB2e5Ub33KKrsPHNgFLuA5WVk...

  • 2 kudos
1 More Replies
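A common shape for this kind of automation is generating the Unity Catalog GRANT statements and executing them from a scheduled job (via spark.sql or the SQL statements API). A minimal sketch; the group, catalog, schema, and table names are placeholders:

```python
def grant_statements(group: str, catalog: str, schema: str, table: str):
    """Build the standard UC grant chain for read access to one table:
    the group needs USE CATALOG and USE SCHEMA on the parents before
    SELECT on the table itself is effective."""
    return [
        f"GRANT USE CATALOG ON CATALOG {catalog} TO `{group}`",
        f"GRANT USE SCHEMA ON SCHEMA {catalog}.{schema} TO `{group}`",
        f"GRANT SELECT ON TABLE {catalog}.{schema}.{table} TO `{group}`",
    ]

stmts = grant_statements("data_readers", "main", "sales", "orders")
for stmt in stmts:
    print(stmt)
```

Driving this from a config file (group-to-object mappings) turns the day-to-day manual grants into a reviewable, repeatable job.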
hobrob_ex
by New Contributor III
  • 505 Views
  • 3 replies
  • 2 kudos

Resolved! How to set query_tags in system.query.history

I've been trying to see if there's a way to have metadata included in query logging in a similar way to query banding in Teradata, and I've come across the field query_tags in system.query.history. However, I can't find any reference to it in the Dat...

Latest Reply
hobrob_ex
New Contributor III
  • 2 kudos

Hey Anudeep, good day to you too! That's great news; setting the tags via SQL is just what I need! Out of interest, was that already in the pipeline, or was it implemented as a response to my query?

  • 2 kudos
2 More Replies
ripa1
by New Contributor II
  • 795 Views
  • 5 replies
  • 4 kudos

Has anyone got this up and working? Federating Snowflake-managed Iceberg tables into Azure Databricks

I'm federating Snowflake-managed Iceberg tables into Azure Databricks Unity Catalog to query the same data from both platforms without copying it. I am getting a weird error message when querying the table from Databricks, and I have tried to put it all nicely in...

Data Engineering
azure
Iceberg
snowflake
unity-catalog
Latest Reply
ripa1
New Contributor II
  • 4 kudos

Thanks Hubert. I did check the Iceberg metadata location and Databricks can list the files, but the issue is that Snowflake’s Iceberg metadata.json contains paths like abfss://…@<acct>.blob.core.windows.net/..., and on UC Serverless Databricks then t...

  • 4 kudos
4 More Replies
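If the mismatch really is just the blob endpoint appearing where abfss expects the dfs endpoint, the rewrite the thread circles around can be sketched as a plain host substitution (an assumption; verify it covers all the paths in your metadata):

```python
def to_dfs_endpoint(uri: str) -> str:
    """Rewrite an abfss:// URI that points at the blob endpoint so it
    uses the dfs endpoint the abfss scheme expects. Sketch only:
    assumes the hostname suffix is the only thing that must change."""
    return uri.replace(".blob.core.windows.net", ".dfs.core.windows.net")

# Example with a placeholder account/container, shaped like the paths
# the reply describes in Snowflake's metadata.json.
src = "abfss://container@myacct.blob.core.windows.net/iceberg/tbl/metadata.json"
print(to_dfs_endpoint(src))
```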
abhitechkr
by New Contributor II
  • 197 Views
  • 1 replies
  • 0 kudos

Resolved! Implementation of Document Renderer Helper Class.

Hi! I am new to Databricks and trying to learn. Need help with the DocumentRenderer Helper Class implementation.

Latest Reply
sarahbhord
Databricks Employee
  • 0 kudos

Hello @abhitechkr - I believe this is a duplicate question. Please see response here. Thanks!

  • 0 kudos
nithin_1991
by New Contributor III
  • 697 Views
  • 6 replies
  • 2 kudos

Resolved! Not able to set up an external location via Catalog

Hi All, I'm setting up an external location via Unity Catalog and I'm running into an issue when creating the location. Context:
  • ADLS Gen2 account with Hierarchical Namespace is enabled.
  • A Databricks Access Connector has been created and assigned Contri...

Latest Reply
saurabh18cs
Honored Contributor III
  • 2 kudos

Hi @nithin_1991, can you make sure the Storage Blob Data Contributor role is granted to the managed identity (the access connector)? Then create a storage credential first for this connector, and then create the external location using that credential.

  • 2 kudos
5 More Replies
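Once the Storage Blob Data Contributor role and the storage credential are in place, the last step is the external location itself. A sketch that only builds the statement; the names and URL are placeholders, and the credential is assumed to already exist (created e.g. in Catalog Explorer for the access connector):

```python
def external_location_sql(name: str, url: str, credential: str) -> str:
    """Build the CREATE EXTERNAL LOCATION statement referencing an
    existing storage credential. Run it from a notebook or SQL editor
    with CREATE EXTERNAL LOCATION privileges on the metastore."""
    return (
        f"CREATE EXTERNAL LOCATION {name} "
        f"URL '{url}' "
        f"WITH (STORAGE CREDENTIAL {credential})"
    )

stmt = external_location_sql(
    "raw_landing",                                          # placeholder name
    "abfss://raw@mystorageacct.dfs.core.windows.net/landing",  # placeholder URL
    "dbac_credential",                                      # existing credential
)
print(stmt)
```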
shoubhit
by Databricks Partner
  • 2412 Views
  • 2 replies
  • 1 kudos

Merge Databricks customer and partner accounts

I accidentally created my customer academy account with Databricks, but I should have created it with the partner one, as my company has a partnership with Databricks. I need to take a certification exam urgently; please help me merge these two accounts, as I am a...

Latest Reply
dvdcap
Databricks Partner
  • 1 kudos

I've seen that a partner code can be added in the profile. How can it be provided to me by the entity that is the Databricks partner?

  • 1 kudos
1 More Replies
agent007
by Databricks Partner
  • 170 Views
  • 1 replies
  • 1 kudos

Resolved! Finding Inserted OR Appended Rows in DLT Event Logs

In the event logs, is there any parameter for finding the inserted record count, e.g. num_inserted_rows? I found these parameters:
  1. num_updated_rows
  2. num_deleted_rows
  3. num_output_rows
but not num_inserted_rows. I could not make the logic of inser...

Latest Reply
Sidhant07
Databricks Employee
  • 1 kudos

Hi,  Unfortunately, the current DLT event log schema does not provide a built-in way to distinguish between inserted and updated rows in the num_upserted_rows metric. There is NO num_inserted_rows parameter in DLT event logs. The num_upserted_rows me...

  • 1 kudos
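If an approximation is acceptable, one rough derivation from the metrics that do exist is inserts ≈ outputs minus updates. This rests on an assumption the thread does not confirm, namely that num_output_rows counts every row written, updates included, so treat it as a sketch to validate against your own pipeline:

```python
def approx_inserted_rows(num_output_rows: int, num_updated_rows: int) -> int:
    """Rough approximation only. Assumption (verify for your pipeline):
    num_output_rows counts all rows written in the update, including
    rewritten/updated rows, so subtracting updates leaves inserts.
    There is no exact built-in insert count in the event log."""
    return num_output_rows - num_updated_rows

print(approx_inserted_rows(120, 45))  # → 75
```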
Andolina
by New Contributor III
  • 469 Views
  • 4 replies
  • 2 kudos

Resolved! Identity column for streaming target tables using declarative pipeline

Hi All, I have a few bronze streaming tables that I want to load into silver target tables based on some joins. I am doing this by creating a streaming table and using a view to retrieve fields from each bronze table. After that I am using AUTO-CD...

Latest Reply
aleksandra_ch
Databricks Employee
  • 2 kudos

@Andolina, potential risks of identity columns are:
  • Concurrent writes are not allowed. That is, if multiple flows write to a table with generated identity columns, they will either run sequentially or fail on conflict.
  • There is a slight overhead wh...

  • 2 kudos
3 More Replies
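For reference, the Delta identity-column DDL the thread discusses looks like the statement below; table and column names are placeholders, and per the caveat in the reply, only a single flow should write to such a table:

```python
def silver_table_ddl(table: str) -> str:
    """Build a CREATE TABLE statement with a Delta identity column.
    GENERATED ALWAYS AS IDENTITY has Delta assign the surrogate key;
    the other columns are hypothetical examples of a silver schema."""
    return (
        f"CREATE TABLE {table} (\n"
        "  id BIGINT GENERATED ALWAYS AS IDENTITY,\n"
        "  customer_key STRING,\n"
        "  loaded_at TIMESTAMP\n"
        ") USING DELTA"
    )

ddl = silver_table_ddl("silver.customers")
print(ddl)
```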