Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

ashraf1395
by Honored Contributor
  • 1660 Views
  • 2 replies
  • 1 kudos

Resolved! Not readable format of event_logs traceback in dlt pipeline

This is my DLT pipeline event_log. Why is it not in a readable format, and how can I correct it? This is my pipeline code:

import logging

logger = logging.getLogger(__name__)
logger.info("Error")
raise Exception("Error is error")  # raising a bare string is a TypeError in Python 3; raise an Exception instance

Latest Reply
jorperort
Contributor
  • 1 kudos

Hi @ashraf1395, I'm working with Delta Live Tables (DLT) and the event_log table. I would like to know if it is possible to access the event handler that DLT uses to write custom logs and send them to this table when events are published. If this is n...
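
For reference, a minimal sketch of reading a published event log from a notebook, assuming the pipeline's event log has been written to a Unity Catalog table under the hypothetical name my_catalog.my_schema.event_log (spark is predefined in Databricks notebooks):

# Hypothetical table name; point this at wherever your pipeline publishes its event log.
errors = spark.sql("""
    SELECT timestamp, event_type, level, message
    FROM my_catalog.my_schema.event_log
    WHERE level = 'ERROR'
    ORDER BY timestamp DESC
""")
errors.show(truncate=False)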

1 More Replies
susanne
by Contributor
  • 948 Views
  • 2 replies
  • 1 kudos

How to write event_log destination into DLT Settings JSON via Asset Bundles

Hi all, I would like to publish the event_log of my DLT pipeline to a specific schema in Unity Catalog. Following this article (https://docs.databricks.com/gcp/en/dlt/observability#query-the-event-log), this can be done by writing this into the DLT's set...

Latest Reply
ashraf1395
Honored Contributor
  • 1 kudos

Hi @susanne, indeed, I tried to create it using DABs as well. This feature is not available via DABs I guess; maybe they will add it once event_logs moves to GA from Public Preview. The Databricks API will be a good alternative, but if you try it using...
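
In the meantime, a hedged and untested sketch of that API route (host, token, and pipeline ID are placeholders; the event_log field is the one documented under the pipelines Create API, so treat this as a starting point only):

import requests

HOST = "https://<workspace-host>"    # placeholder
TOKEN = "<personal-access-token>"    # placeholder
PIPELINE_ID = "<pipeline-id>"        # placeholder
headers = {"Authorization": f"Bearer {TOKEN}"}

# The edit endpoint replaces the whole spec, so fetch the current one first
# and only add the event_log block to it.
spec = requests.get(f"{HOST}/api/2.0/pipelines/{PIPELINE_ID}", headers=headers).json()["spec"]
spec["event_log"] = {"catalog": "my_catalog", "schema": "my_schema", "name": "event_log"}
requests.put(f"{HOST}/api/2.0/pipelines/{PIPELINE_ID}", headers=headers, json=spec).raise_for_status()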

1 More Replies
21f3001806
by New Contributor III
  • 801 Views
  • 1 reply
  • 1 kudos

Resolved! Creating event_log for DLT pipeline using DABs

Hi there, I have a DLT pipeline and I recently came to know about the event_log feature. I want to deploy my DLT pipeline along with the event_log using Databricks Asset Bundles, but I am not able to find any resources for it. If anyone has tried it, yo...

Latest Reply
ashraf1395
Honored Contributor
  • 1 kudos

Hi there @21f3001806, I guess you are talking about this: https://docs.databricks.com/api/workspace/pipelines/create#event_log. It's still in Public Preview. I tried creating it through the UI or by changing the pipeline settings, and it worked. But when I imp...

DBStudent
by New Contributor II
  • 755 Views
  • 3 replies
  • 0 kudos

Data migration from S3 to Databricks

I currently have an S3 bucket with around 80 tables, each of which has Hive-style partition columns:

S3RootFolder/Table1Name/Year=2024/Month=12/Day=1/xxx.parquet
S3RootFolder/Table1Name/Year=2024/Month=12/Day=2/xxx.parquet
S3RootFolder/Table2Name/Year...

Latest Reply
Brahmareddy
Esteemed Contributor
  • 0 kudos

Hi DBStudent, you're absolutely right: coalescing during write won't help with the initial bottleneck, since the issue is really on the read side, where Spark has to list and open 110k small files. If you can't pre-compact them before reading, then o...
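
To make the read-side point concrete, a minimal sketch of the one-time load for a single table (paths are from the question; the repartition count is a placeholder to tune, and the cluster needs read access to the bucket):

# Spark discovers the Year/Month/Day partition columns from the Hive-style paths.
df = spark.read.parquet("s3://S3RootFolder/Table1Name")

# Repartition before writing so 110k tiny inputs don't become tiny outputs.
(df.repartition(64)
   .write
   .format("delta")
   .mode("overwrite")
   .saveAsTable("my_catalog.my_schema.table1"))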

2 More Replies
Volker
by Contributor
  • 2095 Views
  • 1 reply
  • 0 kudos

From Partitioning to Liquid Clustering

We had some Delta tables that were previously partitioned on year, month, day, and hour. This resulted in quite small partitions, and we have now switched to liquid clustering. We followed these steps: remove partitioning by doing REPLACE; ALTER TABLE --- CLU...

Latest Reply
Isi
Honored Contributor III
  • 0 kudos

Hey @Volker, first of all, I'd recommend considering Auto Liquid Clustering, as it can simplify the process of defining clustering keys. You can read more about it in the Databricks documentation (it's currently in Public Preview, but you can probably...
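
A minimal sketch of both options, with a placeholder table name (CLUSTER BY AUTO and OPTIMIZE ... FULL require recent runtimes, so check availability in your workspace first):

# Explicit clustering keys, replacing the old year/month/day/hour partitioning:
spark.sql("ALTER TABLE my_catalog.my_schema.events CLUSTER BY (year, month)")

# Or let Databricks choose the keys (automatic liquid clustering, Public Preview):
spark.sql("ALTER TABLE my_catalog.my_schema.events CLUSTER BY AUTO")

# Rewrite existing files so historical data is clustered as well:
spark.sql("OPTIMIZE my_catalog.my_schema.events FULL")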

PavloR
by New Contributor III
  • 2877 Views
  • 10 replies
  • 6 kudos

ISOLATION_STARTUP_FAILURE for Serverless

Hi, I'm getting an error in my job which is using a serverless cluster: `[ISOLATION_STARTUP_FAILURE] Failed to start isolated execution environment. Please contact Databricks support. SQLSTATE: XXKSS`. It was working fine, and then this error appeared on...

Latest Reply
dileeshm
New Contributor II
  • 6 kudos

@Alberto_Umana Looks like the issue resurfaced. We started seeing the "[ISOLATION_STARTUP_FAILURE] Failed to start isolated execution environment. Please contact Databricks support. SQLSTATE: XXKSS" error with serverless compute last night. Co...

9 More Replies
tingwei
by New Contributor II
  • 7008 Views
  • 5 replies
  • 5 kudos

ISOLATION_STARTUP_FAILURE

Hi, I'm getting an error in my data pipeline: [ISOLATION_STARTUP_FAILURE] Failed to start isolated execution environment. Please contact Databricks support. SQLSTATE: XXKSS. It was working fine and suddenly it keeps failing. Please advise.

Latest Reply
Jean-Boutros
New Contributor II
  • 5 kudos

I am running on serverless, and only yesterday I started seeing this error. Any thoughts? [ISOLATION_STARTUP_FAILURE] Failed to start isolated execution environment. Please contact Databricks support. SQLSTATE: XXKSS

4 More Replies
jeffn
by New Contributor
  • 668 Views
  • 1 reply
  • 0 kudos

Unable to create Cluster via REST API

I have tried all 3 of the example payloads listed here to create a compute cluster: https://docs.databricks.com/api/azure/workspace/clusters/create. All return the same error: "Invalid JSON given in the body of the request - expected a map". Other Compute ...

Latest Reply
Panda
Valued Contributor
  • 0 kudos

@jeffn - I was able to successfully create a cluster using the same Azure documentation, both via the REST API and the Databricks SDK. Hopefully it helps.

from databricks.sdk import WorkspaceClient
from databricks.sdk.service.compute import CreateClu...
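
Expanding the truncated snippet into a runnable sketch (cluster name, node type, and Spark version are placeholders; node_type_id here is Azure-flavoured, and auth is picked up from the environment or .databrickscfg):

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
created = w.clusters.create(
    cluster_name="api-test-cluster",
    spark_version="15.4.x-scala2.12",   # list valid versions with w.clusters.spark_versions()
    node_type_id="Standard_DS3_v2",     # Azure node type; differs on AWS/GCP
    num_workers=1,
    autotermination_minutes=30,
).result()                              # blocks until the cluster is RUNNING
print(created.cluster_id)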

sriramnedunuri
by New Contributor III
  • 796 Views
  • 1 reply
  • 0 kudos

Resolved! regexp_replace replaces pattern only once

select regexp_replace('asdfhsdf&&1&&asdfasdf&&2&&asdf', '&&[0-100]&&', '') output

Here it replaces the first pattern but not the 2nd pattern. O/p:
asdfhsdfasdfasdf&&2&&asdf

Even if we use position it's not working; something like both won't work: select ...

Latest Reply
sriramnedunuri
New Contributor III
  • 0 kudos

Changing to [0-9] works; this can be closed.
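
For context: [0-100] is a character class (the range 0-1 plus two literal 0s), not a numeric range, which is why &&1&& matched and &&2&& did not. A quick check, using + so multi-digit markers are covered too:

spark.sql("""
    SELECT regexp_replace(
        'asdfhsdf&&1&&asdfasdf&&2&&asdf',
        '&&[0-9]+&&',
        '') AS output
""").show(truncate=False)   # asdfhsdfasdfasdfasdf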

dkxxx-rc
by Contributor
  • 5635 Views
  • 3 replies
  • 2 kudos

Resolved! CREATE TEMP TABLE

The Databricks assistant tells me (sometimes) that `CREATE TEMP TABLE` is a valid SQL operation. And other sources (e.g., https://www.freecodecamp.org/news/sql-temp-table-how-to-create-a-temporary-sql-table/) say the same. But in actual practice, thi...

Latest Reply
dkxxx-rc
Contributor
  • 2 kudos

In addition to accepting KaranamS's answer, I will note a longer and useful discussion, with caveats, at https://community.databricks.com/t5/data-engineering/how-do-temp-views-actually-work/m-p/20137/highlight/true#M13584.
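
For anyone else hitting this, a minimal sketch of the construct that does work in Databricks SQL, a session-scoped temporary view (the samples catalog used here ships with many workspaces):

spark.sql("""
    CREATE OR REPLACE TEMPORARY VIEW recent_orders AS
    SELECT * FROM samples.tpch.orders LIMIT 100
""")
spark.sql("SELECT count(*) AS n FROM recent_orders").show()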

2 More Replies
Upendra_Dwivedi
by Contributor
  • 1471 Views
  • 5 replies
  • 0 kudos

Remote SQL Server Instance Connection using JDBC

Hi all, I am connecting to a remote SQL Server instance using the JDBC driver; I have enabled TCP/IP and set up the firewall rule. When I query the instance I get this error: (com.microsoft.sqlserver.jdbc.SQLServerException) The TCP/IP connection to...

Latest Reply
turagittech
Contributor
  • 0 kudos

If you want to access a local SQL server, you'll need a Private Link to access the server. If it's on your own local machine, that's likely not possible. Creating a VPN to your machine is a unique problem, and you would be better off using a VM or a ...
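
Once a network path to the server exists, a hedged sketch of the read itself (host, database, table, and secret names are placeholders; the sqlserver format is the connector built into recent Databricks runtimes, and dbutils is available in notebooks):

df = (spark.read
      .format("sqlserver")
      .option("host", "myserver.example.com")
      .option("port", "1433")
      .option("database", "mydb")
      .option("dbtable", "dbo.mytable")
      .option("user", "my_user")
      .option("password", dbutils.secrets.get("my-scope", "sqlserver-password"))
      .load())
df.printSchema()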

4 More Replies
sanq
by New Contributor II
  • 6798 Views
  • 3 replies
  • 7 kudos

What formatter is used to format SQL cells in Databricks?

Databricks launched the Black formatter, which formats Python cells. I can also see SQL cells getting formatted, but I'm not sure which formatter is being used for SQL cell formatting. No clarity is given in the docs.

Latest Reply
mitch_DE
New Contributor II
  • 7 kudos

The formatter is mentioned here: Develop code in Databricks notebooks - Azure Databricks | Microsoft Learn. It is this npm package: @gethue/sql-formatter - npm

2 More Replies
BobCat62
by New Contributor III
  • 2605 Views
  • 8 replies
  • 3 kudos

Resolved! How to copy notebooks from local to the target folder via asset bundles

Hi all, I am able to deploy Databricks assets to the target workspace. Jobs and workflows can also be created successfully. But I have a special requirement: I need to copy the notebooks to the target folder in the Databricks workspace. Example: on local I have...

Latest Reply
kmodelew
New Contributor III
  • 3 kudos

What are the permissions on this Databricks directory? Can someone delete this directory or any file? In a shared workspace everyone can delete bundle files or the bundle directory, even if in databricks.yml I granted permissions only to admins ('CAN MANA...

7 More Replies
TomHauf
by New Contributor II
  • 697 Views
  • 1 reply
  • 1 kudos

Sending my weather data to a clients cloud storage

Hi, one of our clients is asking to switch from our API feed to having weather data delivered automatically to their cloud storage. What steps do I need to take on my end? Do I need to join Databricks to do so? Thanks, Tom

Latest Reply
XP
Databricks Employee
  • 1 kudos

Hey @TomHauf, while it may not be essential in your case, you should at least consider using Databricks to facilitate loading data into your customer's cloud storage. Databricks gives you a few options to make sharing with third parties simple, as per ...

Long_Tran
by New Contributor
  • 3617 Views
  • 2 replies
  • 0 kudos

Can job 'run_as' be assigned to users/principals who actually run it?

Can a job's 'run_as' be assigned to the users/principals who actually run it, instead of always a fixed creator/user/principal? When a job is run, I would like to see in the job setting "run_as" the name of the actual user/principal who runs it. Currently, "run...

Latest Reply
701153
New Contributor II
  • 0 kudos

Yeah, the functionality is odd. You can't change the Run As user to anyone but yourself. But you can run it using the Run As setting previously used. This sort of makes sense if the workflow is created to be run as a service principal with specific p...

1 More Replies
