Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

HoussemBL
by New Contributor III
  • 348 Views
  • 3 replies
  • 0 kudos

External tables in DLT pipelines

Hello community, I have implemented a DLT pipeline. In the "Destination" setting of the pipeline I have specified a Unity Catalog with a target schema of type external referring to an S3 destination. My DLT pipeline works well. Yet, I noticed that all str...

Latest Reply
Sushil_saini
  • 0 kudos

This won't work. The best approach is to create a DLT sink to write to the external Delta table. The pipeline should only be one step: read the table and append a flow using the data sink. It works fine.
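A minimal sketch of the approach this reply describes, assuming the DLT Python sink API (`dlt.create_sink` plus `@dlt.append_flow`); the sink name, source table, and S3 path are placeholders, and this only runs inside a DLT pipeline:

```python
import dlt

# Define a sink pointing at the external Delta table location (path is a placeholder).
dlt.create_sink(
    name="external_delta_sink",
    format="delta",
    options={"path": "s3://my-bucket/path/to/external_table"},
)

# Single-step flow: read the source table and append it to the sink.
@dlt.append_flow(name="to_external", target="external_delta_sink")
def to_external():
    return spark.readStream.table("catalog.schema.source_table")
```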

2 More Replies
BobCat62
by New Contributor III
  • 165 Views
  • 2 replies
  • 1 kudos

How to copy notebooks from local to the target folder via asset bundles

Hi all, I am able to deploy Databricks assets to the target workspace. Jobs and workflows can also be created successfully. But I have a special requirement: I need to copy the notebooks to the target folder in the Databricks workspace. Example: on local I have...

Latest Reply
BobCat62
New Contributor III
  • 1 kudos

Hello @ashraf1395, nice to hear from you, and thank you for your hints. With your idea I could reach half of my aim. You can see here the folder structure in my VS Code, and here is part of my `databricks.yml` file: targets:  dev:    # The default tar...
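The reply's excerpt cuts off mid-fragment, but a `databricks.yml` `targets` block of the general shape being discussed might look like the following sketch; the host and `root_path` values are placeholders, and `root_path` is what controls where the bundle's files land in the workspace:

```yaml
targets:
  dev:
    # The default target used by `databricks bundle deploy` when none is given.
    default: true
    workspace:
      host: https://adb-1234567890123456.7.azuredatabricks.net
      root_path: /Workspace/Users/someone@example.com/.bundle/my_bundle/dev
```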

1 More Reply
a_user12
by New Contributor III
  • 32 Views
  • 1 reply
  • 0 kudos

databricks bundle deploy: exit code 0 even if an error occurs

We have a CI/CD pipeline where we run: databricks bundle deploy [...] The code works fine; however, if we misconfigure it, we see in the output an error message such as: Deploying resources... Updating deployment state... Warning: Detected unresolved va...

Data Engineering
asset bundle
Latest Reply
a_user12
New Contributor III
  • 0 kudos

You can close it: it was a CI/CD issue.

gsouza
by Visitor
  • 24 Views
  • 0 replies
  • 0 kudos

Databricks asset bundle occasionally duplicating jobs

Since last year, we have adopted Databricks Asset Bundles for deploying our workflows to the production and staging environments. The tool has proven to be quite effective, and we currently use Azure DevOps Pipelines to automate bundle deployment, tr...

(screenshot attachment: gsouza_0-1743021507944.png)
IGRACH
by New Contributor II
  • 19 Views
  • 0 replies
  • 0 kudos

Unable to delete a table

When I try to delete a table, I get this error: [ErrorClass=INVALID_STATE] TABLE catalog.schema.table_name cannot be deleted because it is being shared via Delta Sharing. I have checked on the internet about it, but could not find any info about ...

Rajt1
by Visitor
  • 3 Views
  • 0 replies
  • 0 kudos

Job, Task, Stage Creation

I am running the code below: df = spark.read.json('xyz.json') followed by df.count. I want to understand the actual workings of Spark: how many jobs and stages will be created? I would like a detailed but accessible explanation of how it works.
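For context, a sketch of what the posted code does in PySpark (file path as in the post; the job/stage notes describe typical Spark behavior, which is worth confirming in the Spark UI for your run):

```python
df = spark.read.json("xyz.json")  # typically triggers one job: Spark reads a sample
                                  # of the file to infer the schema (none if a schema
                                  # is supplied explicitly)
df.count                          # no parentheses: this only references the method
                                  # object and triggers nothing
n = df.count()                    # the action: usually one job with two stages
                                  # (per-partition partial counts, then a final reduce)
```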

mrstevegross
by Contributor
  • 159 Views
  • 3 replies
  • 0 kudos

Attempt to use a custom container with an instance pool fails

I am trying to run a job with (1) custom containers, and (2) via an instance pool. Here's the setup: the custom container is just the DBR-provided `databricksruntime/standard:12.2-LTS`; the instance pool is defined via the UI (see screenshot, below). At ...

(screenshot attachment: mrstevegross_0-1742914043598.png)
Latest Reply
mrstevegross
Contributor
  • 0 kudos

I think I have solved this. I added a URL for `preloaded_docker_image` to my instance pool, and the job worked correctly.This suggests that the DBR docs for preloaded_docker_image are incomplete; they should clarify that a user must add an entry in o...
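For reference, an instance pool definition along the lines the reply describes might look like this JSON fragment (field names as in the Instance Pools REST API, where the setting is the list `preloaded_docker_images`; the pool name and node type are placeholders):

```json
{
  "instance_pool_name": "docker-pool",
  "node_type_id": "i3.xlarge",
  "preloaded_docker_images": [
    { "url": "databricksruntime/standard:12.2-LTS" }
  ]
}
```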

2 More Replies
matanper
by New Contributor III
  • 4247 Views
  • 6 replies
  • 1 kudos

Custom docker image fails to initialize

I'm trying to use a custom Docker image for my job. This is my Dockerfile: FROM databricksruntime/standard:12.2-LTS COPY . . RUN /databricks/python3/bin/pip install -U pip RUN /databricks/python3/bin/pip install -r requirements.txt USER root. My job ...

Latest Reply
mrstevegross
Contributor
  • 1 kudos

Did y'all ever figure this out? I'm running into a similar issue.

5 More Replies
p_romm
by New Contributor III
  • 26 Views
  • 0 replies
  • 0 kudos

INVALID_HANDLE.SESSION_NOT_FOUND

We run several workflows and tasks in parallel using serverless compute. In many different places in the code we started to get errors as below. It looks like when one task fails, every other task running at the same moment fails as well. After retry on on...

cmathieu
by New Contributor II
  • 199 Views
  • 2 replies
  • 0 kudos

DAB - All projects files deployed

I have an issue with DAB where all the project files, starting from root ., get deployed to the /files folder in the bundle. I would prefer being able to deploy certain util notebooks, but not all the files of the project. I'm able to not deploy any ...

Latest Reply
ashraf1395
Honored Contributor
  • 0 kudos

Hi there @cmathieu, you can use sync paths to specify the files you want to deploy to the files folder, instead of deploying everything via the include or exclude options. paths works better for me; I can use it to deploy any files in my local to workspace even ou...
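A sketch of what that could look like in `databricks.yml`, assuming the bundle `sync.paths` mapping (which replaces the default behavior of syncing the whole bundle root); the folder name is a placeholder:

```yaml
sync:
  paths:
    - ./utils   # only the listed folders are uploaded to the bundle's files directory
```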

1 More Reply
badari_narayan
by New Contributor II
  • 107 Views
  • 1 reply
  • 0 kudos

Having an issue assigning databricks_current_metastore with terraform provider

I am trying to assign my databricks_current_metastore in Terraform and I get the following error back as output: Error: cannot read current metastore: cannot get client current metastore: invalid Databricks Workspace configuration with data.databric...

Latest Reply
Panda
Valued Contributor
  • 0 kudos

@badari_narayan Based on the above Terraform code, you are trying to use the databricks.accounts provider to read the current workspace metastore, which is incorrect: the databricks_current_metastore data source is a workspace-level resource and must b...
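A sketch of the fix this reply points at, assuming a workspace-level provider alias is added alongside the account-level one; the host URL and alias name are placeholders:

```hcl
# Workspace-level provider: the data source must run against this,
# not the account-level provider.
provider "databricks" {
  alias = "workspace"
  host  = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
}

data "databricks_current_metastore" "this" {
  provider = databricks.workspace
}
```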

jdlogos
by New Contributor II
  • 185 Views
  • 2 replies
  • 1 kudos

apply_changes_from_snapshot with expectations

Hi, question: are expectations supposed to function in conjunction with create_streaming_table() and apply_changes_from_snapshot? Our team is investigating Delta Live Tables, and we have a working prototype using Autoloader to ingest some files from a m...

Latest Reply
jdlogos
New Contributor II
  • 1 kudos

Hi Stefan-Koch, we reached out to our account rep and were instructed to create an Azure support ticket, since we do not yet have a paid support plan. We are hoping to negotiate for paid support. However, I do not believe the documentation surrounding...

1 More Reply
johschmidt42
by New Contributor II
  • 344 Views
  • 2 replies
  • 0 kudos

Autoloader cloudFiles.maxFilesPerTrigger ignored with .trigger(availableNow=True)?

Hi, I'm using the Auto Loader feature to read streaming data from Delta Lake files and process them in a batch. The trigger is set to availableNow to include all new data from the checkpoint offset, but I limit the number of delta files for the batch ...

Latest Reply
p_romm
New Contributor III
  • 0 kudos

The docs list the option as "cloudFiles.maxFilesPerTrigger": https://docs.databricks.com/aws/en/ingestion/cloud-object-storage/auto-loader/options
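A sketch of using the prefixed option name the reply points to, for an Auto Loader stream drained with availableNow; the source/checkpoint paths, format, and target table are placeholders, and this only runs on a Spark cluster with Auto Loader available:

```python
# Cap files per micro-batch while still draining all available data
# with the availableNow trigger.
df = (spark.readStream
      .format("cloudFiles")
      .option("cloudFiles.format", "json")
      .option("cloudFiles.maxFilesPerTrigger", 100)  # note the cloudFiles. prefix
      .load("s3://bucket/input/"))

(df.writeStream
   .option("checkpointLocation", "s3://bucket/_checkpoint/")
   .trigger(availableNow=True)
   .toTable("catalog.schema.target"))
```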

1 More Reply
