Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

NarenderKumar
by New Contributor III
  • 6719 Views
  • 3 replies
  • 1 kudos

Resolved! Unable to read data from ADLS using databricks serverless sql pool

I have a Databricks workspace and an Azure Data Lake Storage account. Both are present in the same VNet. Unity Catalog is enabled in the workspace. I have created some tables in Unity Catalog. I am able to query the data from the tables when I use the a...

Latest Reply
saiV06
New Contributor III

I'm having the same issue and tried to follow the document shared above, but I'm not quite sure what I'm missing, as I can't make it work. Can someone please help me here? TIA.

2 More Replies
jar
by New Contributor III
  • 2351 Views
  • 4 replies
  • 0 kudos

Data contract implementation best practices

Hi all. We've written some .yml files for our data products in a UC-enabled workspace (dev and prod). We've constructed a directory identical to the one containing the scripts which ultimately create these products and put them there, initially for g...

Latest Reply
VZLA
Databricks Employee

Thank you for your follow-up question. Yes, if it helps, this would be a good starting point/demo:

import yaml
import pytest

# Load the data contract
with open('data_contract.yml', 'r') as file:
    data_contract = yaml.safe_load(file)

# Example da...
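A self-contained variant of that starting point is sketched below. The contract fields, file layout, and validation rules are assumptions for illustration (the original demo also used pytest; plain assertions work the same way inside a test function):

```python
import yaml

# Hypothetical minimal data contract; the structure is an assumption,
# adapt it to the .yml files your team actually writes.
CONTRACT_YML = """
name: customer_orders
owner: data-engineering
columns:
  - name: order_id
    type: bigint
    nullable: false
  - name: order_ts
    type: timestamp
    nullable: false
"""

def validate_contract(contract: dict) -> list:
    """Return a list of violations; an empty list means the contract passes."""
    errors = []
    for field in ("name", "owner", "columns"):
        if field not in contract:
            errors.append("missing top-level field: " + field)
    for col in contract.get("columns", []):
        if "name" not in col or "type" not in col:
            errors.append("column missing name/type: " + str(col))
    return errors

contract = yaml.safe_load(CONTRACT_YML)
print(validate_contract(contract))  # -> []
```

Wrapping `validate_contract` in a pytest test (one test per product) makes contract checks part of CI before deployment.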

3 More Replies
minhhung0507
by Contributor
  • 830 Views
  • 5 replies
  • 1 kudos

Resolved! Delta Log Files in GCS Not Deleting Automatically Despite Configuration

Hello Databricks Community, I am experiencing an issue with Delta Lake where the _delta_log files are not being deleted automatically in the GCS bucket, even though I have set the table properties to enable this behavior. Here is the configuration I used:...
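For reference, Delta log retention is governed by table properties like the ones below, and old log entries are only removed when a new checkpoint is written, not the moment the interval elapses (a hedged sketch; the table name is illustrative):

```sql
-- Retain Delta transaction log history for 7 days; cleanup happens at checkpoint time
ALTER TABLE my_catalog.my_schema.my_table SET TBLPROPERTIES (
  'delta.logRetentionDuration' = 'interval 7 days',
  'delta.deletedFileRetentionDuration' = 'interval 7 days'
);
```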

Latest Reply
VZLA
Databricks Employee

Glad it helps, and agreed on monitoring this behaviour closely. Should you need further assistance, please don't hesitate to reach out.

4 More Replies
Boopathiram
by New Contributor
  • 437 Views
  • 1 reply
  • 0 kudos

Not able to create external location in unity catalog

You do not have the CREATE EXTERNAL LOCATION privilege for this credential. Contact your metastore administrator to grant you the privilege to this credential.  -- My user ID has access to create external locations, but I am still getting the sam...

Latest Reply
Walter_C
Databricks Employee

If you go to the specific storage credential you are trying to use to create this external location, under Permissions, does it actually show that you have All Privileges or the CREATE EXTERNAL LOCATION permission?

sslyle
by New Contributor III
  • 5822 Views
  • 8 replies
  • 5 kudos

Resolved! Combining multiple Academy profiles

I have this profile @gmail.com; my personal professional profile. I also have a @mycompany.com profile. How do I combine both so I can leave my current job for a better life without losing the accolades I've accumulated under my @mycompany.com login, giv...

Latest Reply
SparkSeeker
New Contributor II

I have the same issue. I would like to merge my @hotmail.com profile with my @MyCompany profile. Can't seem to find that option on my own. Could someone assist me please?

7 More Replies
mkEngineer
by New Contributor III
  • 330 Views
  • 2 replies
  • 0 kudos

Implement SCD Type 2 in Bronze Layer of DLT Pipeline with Structured Streaming

Hi everyone, I am implementing SCD Type 2 in the Bronze layer of a Delta Live Tables (DLT) pipeline using Structured Streaming. I am curious about the necessity of having a table or view before loading data into the Bronze table. Without this, it seems...

Latest Reply
Alberto_Umana
Databricks Employee

Optimizing SCD Type 2: Ensure that the column used for sequencing is a sortable data type. Handle out-of-sequence records by specifying a column in the source data that represents the proper ordering of the source data. Use the track_history_except_co...
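As a sketch of the advice above, an SCD Type 2 flow in a DLT pipeline might look like this (a fragment that only runs inside a DLT pipeline; table and column names are assumptions):

```python
import dlt

dlt.create_streaming_table("customers_scd2")

dlt.apply_changes(
    target="customers_scd2",
    source="customers_bronze",          # illustrative upstream view/table
    keys=["customer_id"],
    sequence_by="event_ts",             # sortable column that orders source records
    stored_as_scd_type=2,
    # changes to these columns do not open a new history version
    track_history_except_column_list=["_ingest_file"],
)
```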

1 More Replies
Isa1
by New Contributor II
  • 149 Views
  • 1 reply
  • 0 kudos

Serverless compute for file notification mode

I am creating a table that ingests data from aws s3 using the 'file notification mode'. With a single user cluster, it works. I would like to use Serverless compute, but I get an error about authentication. Is it possible to do this, or are there alt...

Latest Reply
Alberto_Umana
Databricks Employee

Hi @Isa1, using Serverless compute with Auto Loader in file notification mode can indeed present authentication challenges. Based on the context provided, here are some insights and alternatives: Authentication Issues with Serverless Compute: Server...

elkaganeva
by New Contributor
  • 203 Views
  • 1 reply
  • 0 kudos

Unity Catalog with Structured Streaming

Hi, our project uses Spark Structured Streaming Scala notebooks to process files stored in an S3 bucket, with the jobs running in Single User access mode. For one of the jobs, we need to use a file arrival trigger. To enable this, the S3 location must ...

Latest Reply
Alberto_Umana
Databricks Employee

@elkaganeva, When you register an S3 bucket as an external location in Unity Catalog, you can directly access Delta tables stored in that bucket using the spark.readStream and spark.writeStream methods. The metadata for the Delta tables is managed th...
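As a sketch of that pattern (runs only in a Databricks notebook or job where `spark` is defined; table names and the checkpoint path are illustrative):

```python
# Read a UC-managed Delta table as a stream and write to another UC table.
stream = spark.readStream.table("main.bronze.events")

(stream.writeStream
    .option("checkpointLocation", "/Volumes/main/bronze/checkpoints/events")
    .trigger(availableNow=True)        # process available files, then stop
    .toTable("main.silver.events"))
```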

ConfusedZebra
by New Contributor II
  • 698 Views
  • 3 replies
  • 0 kudos

[Databricks Asset Bundles] Changes are not showing when deploying for a second time

Hi all, I've followed this guide https://docs.databricks.com/en/dev-tools/bundles/work-tasks.html and managed to deploy a notebook using DABs, but I then changed the cluster settings and ran the deploy line again and it didn't change the cluster. I dele...

Latest Reply
ConfusedZebra
New Contributor II

Apologies if I'm running these in the wrong place, but it doesn't seem to find databricks bundle clean or databricks bundle build; it shows:

Usage:  databricks bundle [command]

Available Commands:
  deploy      Deploy bundle
  deployment  Deployment re...
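For reference, the CLI does not ship `clean` or `build` subcommands; the usual bundle loop looks like this (commands per the Databricks CLI bundle command group; the job key is illustrative):

```shell
databricks bundle validate      # check the bundle configuration
databricks bundle deploy        # deploy; re-deploys resources whose config changed
databricks bundle run my_job    # run a deployed job or pipeline
databricks bundle destroy       # tear down previously deployed resources
```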

2 More Replies
Fatimah-Tariq
by New Contributor III
  • 432 Views
  • 2 replies
  • 1 kudos

Resolved! Need an advice of someone with practical experience in DLT

Hi, I'm facing this scenario in my DLT pipeline: in my silver layer I'm doing some filtering to prevent my test data from going to the silver schema, and then at the end I'm using apply_changes to create the tables, and I'm using the sequence_by clause within ...

Latest Reply
Alberto_Umana
Databricks Employee

To address the issue of outdated records moving forward to the silver schema in your Delta Live Tables (DLT) pipeline, you can consider the following approach. Modify the filtering logic: instead of filtering out the test records before the apply_chan...
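For reference, an apply_changes call keyed on sequence_by takes this general shape in a DLT pipeline (a fragment only; where the filtering sits relative to apply_changes is the judgment call discussed above, and all names are assumptions):

```python
import dlt

dlt.create_streaming_table("silver_events")

dlt.apply_changes(
    target="silver_events",
    source="bronze_events_clean",   # illustrative upstream view
    keys=["event_id"],
    sequence_by="updated_at",       # latest record per key wins; ordering must be reliable
)
```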

1 More Replies
rimaissa
by New Contributor III
  • 740 Views
  • 2 replies
  • 1 kudos

Autoloader file notification mode error using UC

We have a DLT pipeline we've created that is using autoloader file notification mode. The pipeline ran fine before moving it to UC. Now that we're using UC, we are getting an AWS permissions issue when the autoloader file notification mode is set to ...

Latest Reply
cgrant
Databricks Employee

Please make sure you are using Dedicated (single-user) clusters when authenticating to the file notification service, i.e. to SQS via instance profile authentication. This will likely change in the future, so stay posted.

1 More Replies
taschi
by New Contributor III
  • 11976 Views
  • 6 replies
  • 6 kudos

Resolved! How can I trigger the execution of a specific step within a Databricks Workflow job?

I'm investigating methods to test a Job starting from a particular step. For instance, if I've made modifications midway through a 50+ step Job, is there a way to test the Job without running the steps that precede the one with the modification?

Latest Reply
alan-nousot
New Contributor II

Really interested in this feature. I'd love to be able to programmatically orchestrate tasks with more granularity.

5 More Replies
sreedata
by New Contributor III
  • 3232 Views
  • 4 replies
  • 5 kudos

Resolved! Databricks -->Workflows-->Job Runs

In Databricks --> Workflows --> Job Runs we have a column "Run As". Where does this value come from? We are getting a user ID here but need to change it to a generic account. Any help would be appreciated. Thanks!

Latest Reply
Leon_K
New Contributor II

I'm surprised there's no option to set "Run as" to something like a system user. Why all this complication with a Service Principal? Where can I report this? @DataBricks

3 More Replies
Kayla
by Valued Contributor II
  • 161 Views
  • 1 reply
  • 0 kudos

Version Control For Alerts, Queries

Is there any inbuilt option for version control for Databricks SQL Queries and Alerts? Tried moving the files into a repo and Git did not recognize the file types.

Latest Reply
Walter_C
Databricks Employee

Currently, Databricks does not have an inbuilt option for version control specifically for SQL Queries and Alerts.

Sanjeev
by New Contributor II
  • 178 Views
  • 1 reply
  • 0 kudos

Sending customized mail with databricks notebook with images

How can I send a customized message from within a Databricks notebook? SQL alerts are not helping.

Latest Reply
Walter_C
Databricks Employee

You can refer to KB https://kb.databricks.com/en_US/notebooks/send-email  
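One common approach along the lines of that KB article is to build the message with Python's standard-library email module and send it through an SMTP host you can reach from the workspace (a sketch; the addresses are placeholders and the actual SMTP send step is environment-specific):

```python
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText
from email.mime.image import MIMEImage

def build_report_email(sender, recipient, subject, html_body, image_bytes=None):
    """Build a multipart HTML email; send it later with smtplib.SMTP.send_message."""
    msg = MIMEMultipart("related")
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject
    msg.attach(MIMEText(html_body, "html"))
    if image_bytes is not None:
        img = MIMEImage(image_bytes, _subtype="png")
        img.add_header("Content-ID", "<chart>")  # referenced as cid:chart in the HTML body
        msg.attach(img)
    return msg

msg = build_report_email(
    "jobs@example.com", "team@example.com", "Daily report",
    "<p>Rows loaded today: 1234 <img src='cid:chart'></p>",
)
print(msg["Subject"])  # -> Daily report
```

Passing the PNG bytes of a chart (e.g. saved from matplotlib) as `image_bytes` embeds it inline in the HTML.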

