Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

jdlogos
by New Contributor II
  • 171 Views
  • 1 reply
  • 1 kudos

apply_changes_from_snapshot with expectations

Hi, Question: Are expectations supposed to function in conjunction with create_streaming_table() and apply_changes_from_snapshot? Our team is investigating Delta Live Tables and we have a working prototype using Autoloader to ingest some files from a m...

Latest Reply
Stefan-Koch
Valued Contributor II
  • 1 kudos

Hi @jdlogos Were you able to find a solution? I'm working on the same issue and would love to hear if you found one.
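The DLT Python API does expose expectation parameters on create_streaming_table() itself, so expectations are declared on the target table rather than on the apply call. A minimal sketch, runnable only inside a Delta Live Tables pipeline on Databricks; the table name customers_snapshot and the columns customer_id/email are illustrative:

```python
import dlt

# Expectations attach to the target streaming table, not to the apply call.
dlt.create_streaming_table(
    name="customers",
    expect_all_or_drop={
        "valid_id": "customer_id IS NOT NULL",
        "valid_email": "email LIKE '%@%'",
    },
)

dlt.apply_changes_from_snapshot(
    target="customers",
    source="customers_snapshot",  # a snapshot table/view in the same pipeline
    keys=["customer_id"],
    stored_as_scd_type=1,
)
```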

JJ_LVS1
by New Contributor III
  • 4 Views
  • 0 replies
  • 0 kudos

CLOUD_PROVIDER_RESOURCE_STOCKOUT (Azure)

Hey all, has anyone run into this 'out of stock' error on certain types of clusters? We've spent months building on Standard_D8ads_v5 (delta cache), and this morning it's a sea of red because there are none available. I can't even spin up a small interactive...

pt16
by New Contributor
  • 74 Views
  • 2 replies
  • 0 kudos

Enable automatic identity management in Azure Databricks

We have Databricks account admin access but are not able to see the option in the Databricks admin console to enable automatic identity management. We wanted to enable it from the Previews page and followed the steps below: 1. As an account admin, log in to the accou...

Latest Reply
pt16
New Contributor
  • 0 kudos

After raising a Databricks ticket, today I am able to see the Automatic Identity Management public preview option.

1 More Reply
mrstevegross
by Contributor
  • 124 Views
  • 2 replies
  • 0 kudos

Attempt to use a custom container with an instance pool fails

I am trying to run a job with (1) custom containers and (2) an instance pool. Here's the setup: The custom container is just the DBR-provided `databricksruntime/standard:12.2-LTS`. The instance pool is defined via the UI (see screenshot, below). At ...

Latest Reply
mrstevegross
Contributor
  • 0 kudos

> To preload container services in a pool, you must do it via the Databricks API, since this option is not available through the UI.

I'm not trying to "preload" it; I just want my cluster to (1) use a container and (2) use my pool. I'm aware that t...
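For reference, a hedged sketch of what the combined request looks like: a Jobs/Clusters API `new_cluster` payload that names both an instance pool and a custom container. The pool id is a placeholder, and whether the pool accepts container-backed clusters may depend on how the pool itself was created:

```python
# Hypothetical payload for the Clusters/Jobs API; pool id is a placeholder.
new_cluster = {
    "spark_version": "12.2.x-scala2.12",
    "instance_pool_id": "pool-1234-abcd",  # placeholder pool id
    "num_workers": 2,
    "docker_image": {
        "url": "databricksruntime/standard:12.2-LTS",
        # "basic_auth": {...}  # only needed for private registries
    },
}
```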

1 More Reply
verargulla
by New Contributor III
  • 13167 Views
  • 5 replies
  • 4 kudos

Azure Databricks: Error Creating Cluster

We have provisioned a new workspace in Azure using our own VNet. Upon creating the first cluster, I encounter this error: Control Plane Request Failure: Failed to get instance bootstrap steps from the Databricks Control Plane. Please check that instan...

Latest Reply
Mohamednazeer
New Contributor III
  • 4 kudos

We are also facing the same issue.

4 More Replies
ayushmangal72
by New Contributor
  • 168 Views
  • 2 replies
  • 1 kudos

Resolved! Revert cluster DBR version to last DBR

Hi Team, we updated our clusters' DBR version and later learned that some of our jobs had started failing. We now want to revert to the previous DBR version, but we forgot which DBR version the jobs were running fine on. Is there any way ...

Latest Reply
ayushmangal72
New Contributor
  • 1 kudos

Thank you for your reply. I also found another solution: I checked the event logs, and the old DBR version was mentioned there.
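One way to script that lookup, as a hedged sketch using the Databricks SDK for Python: EDITED cluster events carry the previous cluster spec, including its spark_version. The cluster id is a placeholder, and this needs workspace authentication, so take it as a sketch rather than a verified recipe:

```python
# pip install databricks-sdk; authenticates via the usual env vars/profile.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.compute import EventType

w = WorkspaceClient()
for e in w.clusters.events(cluster_id="0123-456789-abcdefgh",  # placeholder
                           event_types=[EventType.EDITED]):
    # previous_attributes holds the cluster spec before the edit
    prev = e.details.previous_attributes
    if prev is not None:
        print(e.timestamp, prev.spark_version)
```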

1 More Reply
Kayla
by Valued Contributor II
  • 3766 Views
  • 5 replies
  • 0 kudos

Errors When Using R on Unity Catalog Clusters

We are running into errors when running workflows with multiple jobs using the same notebook with different parameters. They are reading from tables we still have in hive_metastore; there are no Unity Catalog tables or functionality referenced anywhere. We'...

Latest Reply
Anwarchubb
New Contributor II
  • 0 kudos

R-enabled clusters only support single-user access mode, so please check the permissions at your group level.

4 More Replies
balu_9309
by Visitor
  • 152 Views
  • 2 replies
  • 0 kudos

databricks job runs connect with powerbi

Hi, I have Databricks job runs. How do I connect them to a Power BI app, or save the run output to blob storage or a Delta table?

Latest Reply
chexa_Wee
New Contributor
  • 0 kudos

You can connect Databricks to Power BI using the "Get Data" option. To do this, you need to provide the necessary cluster details and then connect to the Delta tables in Databricks. This allows Power BI to access and analyze data stored in Databricks...

1 More Reply
stef2
by New Contributor III
  • 10044 Views
  • 14 replies
  • 5 kudos

Resolved! 2023-03-22 10:29:23 | Error 403 | https://customer-academy.databricks.com/

I would like to know why I am getting this error when trying to earn badges for Lakehouse Fundamentals. I can't access the quiz page. Can you please help with this?

Latest Reply
dkn_data
New Contributor II
  • 5 kudos

Log in with your Gmail account at customer-academy.databricks.com, search for the Lakehouse short course, and enroll for free.

13 More Replies
slimbnsalah
by New Contributor
  • 250 Views
  • 1 reply
  • 0 kudos

Use Salesforce Lakeflow Connector with a Salesforce Connected App

Hello, I'm trying to use the new Salesforce Lakeflow connector to ingest data into my Databricks account. However, I only see the option to connect as a normal user, whereas I want to use a Salesforce Connected App, just as described here: Run fede...

Latest Reply
Ajay-Pandey
Esteemed Contributor III
  • 0 kudos

@slimbnsalah Please select the connection type Salesforce Data Cloud; then you will be asked for the details.

ashap551
by New Contributor II
  • 136 Views
  • 1 reply
  • 0 kudos

Streaming vs Batch with Continuous Trigger

I'm not sure what the concrete advantage is for me to create a streaming table vs a static one. In my case, I designed a table with a job that extracts the latest files from an S3 location and then appends them to a Delta table. I set the job...

Latest Reply
Ajay-Pandey
Esteemed Contributor III
  • 0 kudos

@ashap551 You're essentially implementing a well-optimized micro-batching process, and functionally, it's very similar to what readStream() with Autoloader would do. However, there are some advantages to using Autoloader and a proper streaming table ...
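For comparison, the Auto Loader equivalent of that job can run on the same schedule with trigger(availableNow=True), giving batch-style scheduling while keeping streaming bookkeeping (exactly-once file tracking via the checkpoint). A hedged sketch that only runs on a Databricks cluster; the S3 paths and table name are placeholders:

```python
# Placeholder paths; requires a Databricks cluster with Auto Loader (cloudFiles).
(spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .load("s3://my-bucket/landing/")
    .writeStream
    .option("checkpointLocation", "s3://my-bucket/_checkpoints/landing")
    .trigger(availableNow=True)  # process all pending files, then stop, like a batch job
    .toTable("bronze.landing"))
```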

cmathieu
by New Contributor II
  • 173 Views
  • 1 reply
  • 0 kudos

DAB - All projects files deployed

I have an issue with DAB where all the project files, starting from root (.), get deployed to the /files folder in the bundle. I would prefer to deploy certain util notebooks, but not all the files of the project. I'm able to not deploy any ...

Latest Reply
ashraf1395
Honored Contributor
  • 0 kudos

Hi there @cmathieu, you can use the sync paths mapping to specify the files you want deployed to the files folder instead of all of them, rather than the include or exclude settings. paths works better for me; I can use it to deploy any files on my local machine to the workspace, even ou...
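A minimal sketch of what that looks like in databricks.yml; the folder names are illustrative:

```yaml
# Hypothetical fragment of databricks.yml
sync:
  paths:
    - ./utils            # deploy only these folders...
    - ./notebooks/jobs   # ...instead of everything under the bundle root
```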

Nik21
by New Contributor II
  • 106 Views
  • 2 replies
  • 1 kudos

warning message when secrets are added in cluster

When I try to add secrets in the cluster config, Databricks shows a warning that secrets should not be hardcoded. Although it does not block saving them, it shows the warning even when they are not hardcoded.

Latest Reply
chexa_Wee
New Contributor
  • 1 kudos

Hi Nik21, you can use Azure Key Vault to store your secrets and then reference them from Databricks.
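Once an Azure Key Vault-backed secret scope exists, code and cluster configs can reference the secrets without hardcoding values. A hedged sketch that only runs on Databricks; the scope and key names are hypothetical:

```python
# Hypothetical scope/key names; dbutils is only available on Databricks.
storage_key = dbutils.secrets.get(scope="my-kv-scope", key="storage-account-key")

# In a cluster's Spark config, the same secret can be referenced inline:
#   spark.hadoop.fs.azure.account.key.<account>.dfs.core.windows.net {{secrets/my-kv-scope/storage-account-key}}
```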

1 More Reply
BobCat62
by New Contributor II
  • 142 Views
  • 1 reply
  • 0 kudos

How to copy notebooks from local to the target folder via asset bundles

Hi all, I am able to deploy Databricks assets to the target workspace, and jobs and workflows can also be created successfully. But I have a special requirement: I need to copy the notebooks to the target folder in the Databricks workspace. Example: on local I have...

Latest Reply
ashraf1395
Honored Contributor
  • 0 kudos

Hi @BobCat62, we meet again, I hope you are doing great. You can deploy your notebooks to your workspace, even ones outside your databricks.yml (bundle root path), using the sync paths mapping. Though by default all these resources go to your spec...

