Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Volker
by Contributor
  • 2351 Views
  • 1 reply
  • 0 kudos

From Partitioning to Liquid Clustering

We had some delta tables that were previously partitioned on year, month, day, and hour. This resulted in quite small partitions, and we have now switched to liquid clustering. We followed these steps: remove partitioning by doing REPLACE; ALTER TABLE --- CLU...

Latest Reply
Isi
Honored Contributor III
  • 0 kudos

Hey @Volker, first of all, I'd recommend considering Auto Liquid Clustering, as it can simplify the process of defining clustering keys. You can read more about it in the Databricks documentation (it's currently in Public Preview, but you can probably...
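The conversion the thread describes can be sketched in Databricks SQL. This is a sketch only: the table and column names are illustrative, and Auto Liquid Clustering is still in Public Preview:

```sql
-- Illustrative table/column names. A table cannot be partitioned and
-- liquid-clustered at the same time, so rewrite it without partitioning:
CREATE OR REPLACE TABLE events
CLUSTER BY (event_date, event_hour)
AS SELECT * FROM events;

-- Or let Databricks choose the keys (Auto Liquid Clustering, Public Preview):
-- ALTER TABLE events CLUSTER BY AUTO;

-- Recluster the data that already exists:
OPTIMIZE events FULL;
```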

PavloR
by New Contributor III
  • 3284 Views
  • 10 replies
  • 6 kudos

ISOLATION_STARTUP_FAILURE for Serverless

Hi, I'm getting an error in my job which is using a Serverless cluster: `[ISOLATION_STARTUP_FAILURE] Failed to start isolated execution environment. Please contact Databricks support. SQLSTATE: XXKSS`. It was working fine and then this error appeared on...

Latest Reply
dileeshm
New Contributor II
  • 6 kudos

@Alberto_Umana Looks like the issue resurfaced again. We started seeing the `[ISOLATION_STARTUP_FAILURE] Failed to start isolated execution environment. Please contact Databricks support. SQLSTATE: XXKSS` error with serverless compute from last night. Co...

9 More Replies
tingwei
by New Contributor II
  • 7287 Views
  • 5 replies
  • 5 kudos

ISOLATION_STARTUP_FAILURE

Hi, I'm getting an error in my data pipeline: [ISOLATION_STARTUP_FAILURE] Failed to start isolated execution environment. Please contact Databricks support. SQLSTATE: XXKSS. It was working fine and suddenly it keeps failing. Please advise.

Latest Reply
Jean-Boutros
New Contributor II
  • 5 kudos

I am running on serverless and only yesterday I started seeing this error. Any thoughts? [ISOLATION_STARTUP_FAILURE] Failed to start isolated execution environment. Please contact Databricks support. SQLSTATE: XXKSS

4 More Replies
jeffn
by New Contributor
  • 766 Views
  • 1 reply
  • 0 kudos

Unable to create Cluster via REST API

I have tried all 3 of the example payloads listed here to create a Compute Cluster: https://docs.databricks.com/api/azure/workspace/clusters/create. All return the same error: Invalid JSON given in the body of the request - expected a map. Other Compute ...

Latest Reply
Panda
Valued Contributor
  • 0 kudos

@jeffn - I was able to successfully create a cluster using the same Azure documentation, both via the REST API and the Databricks SDK. Hopefully it helps: from databricks.sdk import WorkspaceClient; from databricks.sdk.service.compute import CreateClu...
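For anyone hitting the "expected a map" response when calling the REST endpoint directly: the request body must be a single JSON object, not a JSON array or a double-encoded string. A minimal sketch against the Clusters 2.1 API; the host, token, cluster name, and node type are placeholders, and the actual request is left commented out because it needs a live workspace:

```python
import json
import urllib.request

# Placeholders -- substitute your workspace URL and a real token.
HOST = "https://adb-1234567890123456.7.azuredatabricks.net"
TOKEN = "dapi-REDACTED"

# The body must be one JSON object (a "map"); sending a list or a
# JSON-encoded string triggers "Invalid JSON ... expected a map".
payload = {
    "cluster_name": "api-test",
    "spark_version": "15.4.x-scala2.12",
    "node_type_id": "Standard_D4ads_v5",
    "num_workers": 1,
}

req = urllib.request.Request(
    f"{HOST}/api/2.1/clusters/create",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req)  # disabled here; requires a real workspace
```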

sriramnedunuri
by New Contributor III
  • 886 Views
  • 1 reply
  • 0 kudos

Resolved! Regexp_replaces pattern only once

select regexp_replace('asdfhsdf&&1&&asdfasdf&&2&&asdf','&&[0-100]&&','') - here it replaces the first pattern but not the 2nd pattern. Output: asdfhsdfasdfasdf&&2&&asdf. Even if we use position it's not working; something like the following won't work either: select ...

Latest Reply
sriramnedunuri
New Contributor III
  • 0 kudos

Changing the pattern to [0-9] works; this can be closed.
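The root cause is worth spelling out: `[0-100]` is a character class (the range `0-1` plus two redundant `0`s), not the numeric range 0 to 100, so it matches only a single `0` or `1`. Spark's `regexp_replace` uses Java regex, whose character-class semantics Python's `re` mirrors, so the behavior can be reproduced locally:

```python
import re

s = "asdfhsdf&&1&&asdfasdf&&2&&asdf"

# "[0-100]" matches one char that is '0' or '1', so "&&2&&" survives:
print(re.sub(r"&&[0-100]&&", "", s))  # asdfhsdfasdfasdf&&2&&asdf

# "[0-9]+" matches one or more digit characters, so both tokens go:
print(re.sub(r"&&[0-9]+&&", "", s))   # asdfhsdfasdfasdfasdf
```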

dkxxx-rc
by Contributor
  • 7854 Views
  • 3 replies
  • 2 kudos

Resolved! CREATE TEMP TABLE

The Databricks assistant tells me (sometimes) that `CREATE TEMP TABLE` is a valid SQL operation. And other sources (e.g., https://www.freecodecamp.org/news/sql-temp-table-how-to-create-a-temporary-sql-table/) say the same. But in actual practice, thi...

Latest Reply
dkxxx-rc
Contributor
  • 2 kudos

In addition to accepting KaranamS's answer, I will note a longer and useful discussion, with caveats, at https://community.databricks.com/t5/data-engineering/how-do-temp-views-actually-work/m-p/20137/highlight/true#M13584.
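For reference, Databricks SQL has no `CREATE TEMP TABLE`; the usual session-scoped substitute is a temporary view. A sketch with illustrative table and column names:

```sql
-- Session-scoped; the view disappears when the session ends.
CREATE OR REPLACE TEMPORARY VIEW recent_orders AS
SELECT * FROM orders
WHERE order_date >= current_date() - INTERVAL 7 DAYS;

SELECT COUNT(*) FROM recent_orders;
```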

2 More Replies
Upendra_Dwivedi
by Contributor
  • 1700 Views
  • 5 replies
  • 0 kudos

Remote SQL Server Instance Connection using JDBC

Hi all, I am connecting to a remote SQL Server instance using the JDBC driver. I have enabled TCP/IP and set up the firewall rule. When I am querying the instance I am getting this error: (com.microsoft.sqlserver.jdbc.SQLServerException) The TCP/IP connection to...

Latest Reply
turagittech
Contributor
  • 0 kudos

If you want to access a local SQL server, you'll need a Private Link to access the server. If it's on your own local machine, that's likely not possible. Creating a VPN to your machine is a unique problem, and you would be better off using a VM or a ...
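When the network path is in place, a read usually comes down to the JDBC URL and credentials. A sketch with placeholder host, database, table, and secret names; the Spark call itself is commented out because it only runs on a Databricks cluster:

```python
# Placeholder host and database; port 1433 must be reachable from the
# cluster's subnet for the connection to succeed.
jdbc_url = (
    "jdbc:sqlserver://sqlserver.example.com:1433;"
    "databaseName=mydb;encrypt=true;trustServerCertificate=false"
)

# On Databricks (PySpark), the read would look roughly like:
# df = (spark.read.format("jdbc")
#       .option("url", jdbc_url)
#       .option("dbtable", "dbo.customers")
#       .option("user", "app_user")
#       .option("password", dbutils.secrets.get("scope", "sql-password"))
#       .load())
```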

4 More Replies
sanq
by New Contributor II
  • 7115 Views
  • 3 replies
  • 7 kudos

what formatter is used to format SQL cell in databricks

Databricks launched the Black formatter, which formats Python cells. I can also see SQL cells getting formatted, but I'm not sure which formatter is used for SQL. The docs give no clarity on this.

Latest Reply
mitch_DE
New Contributor II
  • 7 kudos

The formatter is mentioned here: Develop code in Databricks notebooks - Azure Databricks | Microsoft Learn. It is this npm package: @gethue/sql-formatter - npm

2 More Replies
BobCat62
by New Contributor III
  • 3127 Views
  • 8 replies
  • 3 kudos

Resolved! How to copy notebooks from local to the target folder via asset bundles

Hi all, I am able to deploy Databricks assets to the target workspace. Jobs and workflows can also be created successfully. But I have a special requirement: I need to copy the notebooks to the target folder in the Databricks workspace. Example: locally I have...

Latest Reply
kmodelew
New Contributor III
  • 3 kudos

What are the permissions on this Databricks directory? Can someone delete this directory or any file? On a Shared workspace everyone can delete bundle files or the bundle directory, even if in databricks.yml I granted permissions only to admins ('CAN MANA...

7 More Replies
TomHauf
by New Contributor II
  • 766 Views
  • 1 reply
  • 1 kudos

Sending my weather data to a clients cloud storage

Hi, One of our clients is asking to switch from our API feed to have weather data delivered automatically to their Cloud Storage.  What steps do I need to take from my end?  Do I need to join Databricks to do so? Thanks. Tom

  • 766 Views
  • 1 replies
  • 1 kudos
Latest Reply
XP
Databricks Employee
  • 1 kudos

Hey @TomHauf, while it may not be essential in your case, you should at least consider using Databricks to facilitate loading data into your customer's cloud storage. Databricks gives you a few options to make sharing with third parties simple as per ...

Long_Tran
by New Contributor
  • 3919 Views
  • 2 replies
  • 0 kudos

Can job 'run_as' be assigned to users/principals who actually run it?

Can a job's 'run_as' be assigned to the users/principals who actually run it, instead of always a fixed creator/user/principal? When a job is run, I would like to see in the job setting "run_as" the name of the actual user/principal who runs it. Currently, "run...

Latest Reply
701153
New Contributor II
  • 0 kudos

Yeah, the functionality is odd. You can't change the Run As user to anyone but yourself. But you can run it using the Run As setting previously used. This sort of makes sense if the workflow is created to be run as a service principal with specific p...

1 More Replies
JJ_LVS1
by New Contributor III
  • 1996 Views
  • 1 reply
  • 0 kudos

CLOUD_PROVIDER_RESOURCE_STOCKOUT (Azure)

Hey all, has anyone run into this 'out of stock' error on certain types of clusters? We've spent months building on Standard_D8ads_v5 (delta cache) and this morning a sea of red because there are none available. I can't even spin up a small interactive...

Latest Reply
Prabakar
Databricks Employee
  • 0 kudos

Hi JJ, the CLOUD_PROVIDER_RESOURCE_STOCKOUT error code indicates that the cloud provider is out of physical capacity underneath the virtual machines. The failure was caused by the cloud provider, and I would recommend reaching out to the respective c...

valjas
by New Contributor III
  • 3858 Views
  • 3 replies
  • 0 kudos

Warehouse Name in System Tables

Hello. I am creating a table to monitor the usage of All-purpose Compute and SQL Warehouses. From the tables in the 'system' catalog, I can get cluster_name and cluster_id. However, only warehouse_id is available, not the warehouse name. Is there a way to g...

Latest Reply
aranjan99
Contributor
  • 0 kudos

I have enabled the compute system schema, but I don't see this table. What am I missing?
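Assuming the compute system schema is enabled for the metastore (an account admin has to enable it) and the warehouses system table, currently in Public Preview, is available in your workspace, a name lookup might look like this sketch:

```sql
-- Sketch; requires the "compute" system schema to be enabled.
SELECT
  u.usage_date,
  w.warehouse_name,
  SUM(u.usage_quantity) AS dbus
FROM system.billing.usage AS u
JOIN system.compute.warehouses AS w
  ON u.usage_metadata.warehouse_id = w.warehouse_id
GROUP BY u.usage_date, w.warehouse_name;
```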

2 More Replies
cmathieu
by New Contributor III
  • 1444 Views
  • 4 replies
  • 0 kudos

DAB - All projects files deployed

I have an issue with DAB where all the project files, starting from root ., get deployed to the /files folder in the bundle. I would prefer to deploy certain util notebooks, but not all the files of the project. I'm able to not deploy any ...

Latest Reply
ashraf1395
Honored Contributor
  • 0 kudos

@cmathieu, it will support deployment of the whole directory, not selected files on their own.
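That said, the bundle's `sync` settings in databricks.yml accept include/exclude glob lists, which may help trim what lands under /files; the paths below are illustrative:

```yaml
# databricks.yml (fragment) -- keep only the util notebooks in the
# deployed /files tree and leave the rest of the repo out.
sync:
  include:
    - notebooks/utils/*
  exclude:
    - src/**
    - tests/**
```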

3 More Replies
DylanStout
by Contributor
  • 1045 Views
  • 2 replies
  • 0 kudos

Resolved! Error while reading file from Cloud Storage

The code we are executing: df = spark.read.format("parquet").load("/mnt/g/drb/HN/") followed by df.write.mode('overwrite').saveAsTable("bronze.HN"). The error it throws: org.apache.spark.SparkException: Job aborted due to stage failure: Task 44 in stage 642.0 faile...

Latest Reply
DylanStout
Contributor
  • 0 kudos

Disabling the vectorized Parquet reader resolved it (at the cost of slower scans): spark.conf.set("spark.sql.parquet.enableVectorizedReader", "false")

1 More Replies
