Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

karolinalbinsso
by New Contributor II
  • 2801 Views
  • 2 replies
  • 3 kudos

Resolved! How to access the job scheduling date from within the notebook?

I have created a job that contains a notebook that reads a file from Azure Storage. The file name contains the date when the file was transferred to the storage. A new file arrives every Monday, and the read job is scheduled to run every Monday. I...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 3 kudos

Hi, I guess the files are in the same directory structure, so you can use the cloud files Auto Loader. It will incrementally read only new files: https://docs.microsoft.com/en-us/azure/databricks/spark/latest/structured-streaming/auto-loader. So it will ...

1 More Replies
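The Auto Loader suggestion above can be sketched as follows. This is a minimal sketch, not code from the thread: the source path, schema location, and file format are hypothetical placeholders, and `start_stream` only shows the shape of the call (it needs a live SparkSession to run).

```python
# Minimal Auto Loader sketch (PySpark). All paths are hypothetical.
SOURCE_PATH = "/mnt/landing/weekly/"          # directory where the Monday files arrive
SCHEMA_PATH = "/mnt/landing/_schemas/weekly"  # where Auto Loader tracks the inferred schema

def read_options(fmt: str = "csv") -> dict:
    """Build the cloudFiles options for an incremental read."""
    return {
        "cloudFiles.format": fmt,
        "cloudFiles.schemaLocation": SCHEMA_PATH,
    }

def start_stream(spark):
    """Incrementally pick up only files that have not been processed yet.
    The transfer date embedded in the file name can be recovered downstream
    from the input file path if needed."""
    return (
        spark.readStream.format("cloudFiles")
        .options(**read_options())
        .load(SOURCE_PATH)
    )
```

Because Auto Loader checkpoints which files it has seen, rerunning the job each Monday picks up only that week's new file.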
vichus1995
by New Contributor
  • 6029 Views
  • 2 replies
  • 0 kudos

Mounted Azure Storage shows mount.err inside folder while reading from Azure Databricks

I'm using an Azure Databricks notebook to read an Excel file from a folder inside a mounted Azure Blob Storage container. The mounted Excel location is: "/mnt/2023-project/dashboard/ext/Marks.xlsx". 2023-project is the mount point and dashboard is the name o...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @vichus1995​ Hope all is well! Just wanted to check in if you were able to resolve your issue. Would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

1 More Replies
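For reading an Excel file from a mount with pandas (which Spark itself cannot read natively), a sketch follows. It assumes the mount is healthy — a `mount.err` file appearing inside the folder often indicates the mount itself failed and may need to be unmounted and remounted. The key detail is that driver-local file APIs such as pandas see DBFS mounts under the `/dbfs` prefix; only the `Marks.xlsx` path comes from the thread.

```python
# Sketch: reading a mounted Excel file with pandas on the driver.
MOUNT_PATH = "/mnt/2023-project/dashboard/ext/Marks.xlsx"  # path from the thread

def local_path(dbfs_path: str) -> str:
    """Translate a DBFS path into the driver-local /dbfs view,
    which local file APIs like pandas can open directly."""
    return "/dbfs" + dbfs_path

def read_marks():
    import pandas as pd  # reading .xlsx also requires the openpyxl package
    return pd.read_excel(local_path(MOUNT_PATH))
```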
Arty
by New Contributor II
  • 5724 Views
  • 5 replies
  • 6 kudos

Resolved! How to make Autoloader delete files after a successful load

Hi All, Can you please advise how I can arrange deletion of a loaded file from Azure Storage upon its successful load via Auto Loader? As I understand it, the Spark Structured Streaming "cleanSource" option is unavailable for Auto Loader, so I'm trying to find the best way to ...

Latest Reply
Anonymous
Not applicable
  • 6 kudos

Hi @Artem Sachuk​, Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers ...

4 More Replies
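One possible pattern for the question above — not a built-in `cleanSource` replacement, just a sketch — is a `foreachBatch` handler that writes the micro-batch and then deletes the source files that batch came from. It assumes the stream selected `input_file_name()` into a `source_file` column; the table name and the `keep_latest` safety option are hypothetical.

```python
# Sketch: delete source files only after their micro-batch commits.
def files_to_delete(paths, keep_latest: bool = False) -> list:
    """Pure helper: which source paths to remove after a successful batch.
    Optionally keep the lexicographically last file (assuming date-stamped
    names) as a safety margin."""
    ordered = sorted(set(paths))
    return ordered[:-1] if keep_latest and ordered else ordered

def on_batch(batch_df, batch_id, dbutils, target_table="events"):
    """foreachBatch handler: persist the batch, then remove its inputs.
    Assumes the stream carries the originating path as 'source_file'."""
    batch_df.write.mode("append").saveAsTable(target_table)
    paths = [r["source_file"]
             for r in batch_df.select("source_file").distinct().collect()]
    for p in files_to_delete(paths):
        dbutils.fs.rm(p)
```

Deleting inside `foreachBatch` keeps the removal tied to a successfully written batch, though a failure between the write and the delete can still leave files behind, so the logic should tolerate re-seeing a file.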
prasadvaze
by Valued Contributor II
  • 2942 Views
  • 2 replies
  • 2 kudos

Resolved! Delta Sharing (Databricks-to-Databricks) between Azure regions issue

We have 2 Unity metastores in 2 regions (useast2 contains data and westeurope contains clusters) and enabled Delta Sharing between them. We use Azure Storage firewall / VNet whitelisting to allow a secure connection to storage from the compute cluste...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

@prasad vaze​: Delta Sharing between Unity metastores in different regions can be achieved in several ways, depending on your specific requirements and constraints. One common approach is to use Azure Private Link to establish a private connection be...

1 More Replies
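The networking side (Private Link, storage firewall rules) is configured in Azure, but the sharing objects themselves are created with SQL on the provider metastore. A hedged sketch of the Databricks-to-Databricks setup follows; the share, recipient, and table names are placeholders, and the sharing identifier must come from the consumer metastore (left elided here).

```python
# Provider-side Delta Sharing setup, expressed as SQL run via spark.sql.
SHARE = "useast2_share"                        # hypothetical
RECIPIENT = "westeurope_recipient"             # hypothetical
SHARING_ID = "<consumer-metastore-sharing-identifier>"  # obtained from the consumer side

def provider_statements() -> list:
    """SQL to create a share, add a table, and grant it to a recipient."""
    return [
        f"CREATE SHARE IF NOT EXISTS {SHARE}",
        f"ALTER SHARE {SHARE} ADD TABLE catalog.schema.my_table",
        f"CREATE RECIPIENT IF NOT EXISTS {RECIPIENT} USING ID '{SHARING_ID}'",
        f"GRANT SELECT ON SHARE {SHARE} TO RECIPIENT {RECIPIENT}",
    ]

def run(spark):
    for stmt in provider_statements():
        spark.sql(stmt)
```

With the firewall in place, the consumer cluster's access path to the provider's storage account is what typically needs whitelisting, which is where Private Link comes in.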
saikrishna3390
by New Contributor II
  • 6411 Views
  • 2 replies
  • 2 kudos

How do I configure a managed identity for a Databricks cluster and access Azure Storage using Spark config?

A partner wants to use an ADF managed identity to connect to my Databricks cluster, connect to my Azure Storage, and copy the data from my Azure Storage to their Azure Storage.

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @SAI PUSALA​, Thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedback w...

1 More Replies
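The thread asks about an ADF managed identity; on the cluster side, the commonly documented route for Spark-config access to ADLS Gen2 is OAuth with a service principal, sketched below as an alternative. The storage account name and tenant ID are placeholders, and in practice the client secret would come from a secret scope rather than a plain argument.

```python
# Hedged sketch: Spark configs for OAuth access to ADLS Gen2 via a
# service principal. All names are hypothetical placeholders.
STORAGE_ACCOUNT = "mystorageacct"
TENANT_ID = "<tenant-id>"

def abfs_oauth_conf(client_id: str, client_secret: str) -> dict:
    """Per-account ABFS OAuth settings keyed by the storage endpoint."""
    sa = STORAGE_ACCOUNT
    return {
        f"fs.azure.account.auth.type.{sa}.dfs.core.windows.net": "OAuth",
        f"fs.azure.account.oauth.provider.type.{sa}.dfs.core.windows.net":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        f"fs.azure.account.oauth2.client.id.{sa}.dfs.core.windows.net": client_id,
        f"fs.azure.account.oauth2.client.secret.{sa}.dfs.core.windows.net": client_secret,
        f"fs.azure.account.oauth2.client.endpoint.{sa}.dfs.core.windows.net":
            f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/token",
    }

def apply(spark, client_id: str, client_secret: str):
    for key, value in abfs_oauth_conf(client_id, client_secret).items():
        spark.conf.set(key, value)
```

The principal (or the partner's managed identity) still needs an RBAC role such as Storage Blob Data Reader/Contributor on the account for the copy to succeed.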
Mado
by Valued Contributor II
  • 2547 Views
  • 2 replies
  • 2 kudos

Should I create Azure storage & Metastore in the same region?

I am going to create a metastore following the documentation. Regarding the storage account, I don't understand whether it should be in the same region as the metastore. From the documentation: You can create no more than one metastore per region. It is recommended...

Latest Reply
Mado
Valued Contributor II
  • 2 kudos

Thanks. But the question is about the region for Storage Account & Metastore.

1 More Replies
KVNARK
by Honored Contributor II
  • 2842 Views
  • 1 reply
  • 6 kudos

Resolved! Grant access permissions for a specific container, and for a specific folder within a container, in Azure Blob Storage

Hi, regarding permissions for Azure Storage: we have created the storage account (Blob Storage), and within the account we are going to create many containers; each container will contain multiple folders and files. We want to grant permis...

Latest Reply
Ajay-Pandey
Esteemed Contributor III
  • 6 kudos

Hi @KVNARK​, You can use a service principal in Azure Active Directory to grant specific access to that app, and use the app's credentials to create a new mount point. That will let you give users permissions on a specific part of the storage.

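The mount-point approach from the reply can be sketched like this: mount a single container, or a folder within it, using service-principal credentials, so that the mount only exposes that scope. Everything here (names, the folder path, the mount point) is a hypothetical placeholder.

```python
# Sketch: scoped mount of one folder in one container via a service principal.
def mount_configs(client_id: str, client_secret: str, tenant_id: str) -> dict:
    """OAuth extra_configs for an ABFS mount."""
    return {
        "fs.azure.account.auth.type": "OAuth",
        "fs.azure.account.oauth.provider.type":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id": client_id,
        "fs.azure.account.oauth2.client.secret": client_secret,
        "fs.azure.account.oauth2.client.endpoint":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

def mount_folder(dbutils, container: str, account: str, folder: str,
                 mount_point: str, configs: dict):
    """Mount only `folder` of `container`, so users see just that scope."""
    dbutils.fs.mount(
        source=f"abfss://{container}@{account}.dfs.core.windows.net/{folder}",
        mount_point=mount_point,
        extra_configs=configs,
    )
```

Per-folder granularity beyond this generally comes from granting the service principal itself a narrow role (or using ADLS Gen2 ACLs on the directory), since a mount inherits whatever the principal can reach.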
SM
by New Contributor III
  • 4269 Views
  • 3 replies
  • 10 kudos

How to use Azure Data Lake as a storage location to store Delta Live Tables?

I am trying to write data into Azure Data Lake. I am reading files from Azure Blob Storage; however, when I try to create the Delta Live Table in Azure Data Lake I get the following error: shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.contrac...

Latest Reply
RThornton
New Contributor III
  • 10 kudos

@Kaniz Fatma​ I don't think you quite understand the question. I'm running into the same problem. When creating a Delta Live Table pipeline to write to Azure Data Lake Storage (abfss://etc...) as the Storage Location, the pipeline fails with the erro...

2 More Replies
a2_ish
by New Contributor II
  • 1343 Views
  • 2 replies
  • 2 kudos

How to write the delta files for a managed table? How can I define the sink?

I have tried the code below to write data into a delta table and save the delta files to a sink. I tried using Azure Storage as the sink but I get a "not enough access" error. I can confirm that I have enough access to Azure Storage, and I can run the below...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Ankit Kumar​, Does @Hubert Dudek​'s response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks!

1 More Replies
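The distinction behind the question above — managed table versus an explicitly defined sink — can be sketched as follows. With a managed table there is no path to define: the metastore chooses where the delta files live. An external sink is an explicit URI, and an access error there usually means the cluster's credentials (not the user's own Azure role) lack rights on the storage account. The table and path names below are hypothetical.

```python
# Sketch: managed table (no sink path) vs. external delta sink (explicit path).
EXTERNAL_PATH = "abfss://data@myacct.dfs.core.windows.net/events"  # hypothetical

def is_external_sink(target: str) -> bool:
    """Heuristic: URI-style targets (abfss://, s3://, ...) are external sinks;
    dotted names are metastore tables."""
    return "://" in target

def write_managed(df, table: str = "main.default.events"):
    # Managed: Databricks places the delta files under the metastore's
    # managed storage location.
    df.write.format("delta").mode("append").saveAsTable(table)

def write_external(df, path: str = EXTERNAL_PATH):
    # External: you define the sink; the cluster must hold credentials
    # (service principal, credential passthrough, etc.) for the account.
    df.write.format("delta").mode("append").save(path)
```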