Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Bas1
by New Contributor III
  • 12682 Views
  • 16 replies
  • 20 kudos

Resolved! network security for DBFS storage account

In Azure Databricks the DBFS storage account is open to all networks. Changing that to use a private endpoint or minimizing access to selected networks is not allowed. Is there any way to add network security to this storage account? Alternatively, is...

Latest Reply
Odee79
New Contributor II
  • 20 kudos

How can we secure the storage account in the managed resource group that holds DBFS with restricted network access, given that our Azure storage account policy blocks access from all networks?

15 More Replies
jllo
by New Contributor III
  • 5141 Views
  • 6 replies
  • 3 kudos

Azure Storage Account inside Databricks cannot enable soft-delete.

Hello, when deploying any Databricks workspace in Azure, the storage account inside the Databricks managed resource group does not allow any changes to be applied, including enabling soft-delete. Is there a way to enable it? Best regards, Jon

Latest Reply
Debayan
Databricks Employee
  • 3 kudos

Hi, the default storage account within the default (managed) resource group cannot be altered.

5 More Replies
ChrisS
by New Contributor III
  • 7069 Views
  • 2 replies
  • 1 kudos

Trying to mount Azure Data Lake Storage Gen 2 to Azure Databricks

I have validated all my credentials many, many times and I am still getting the following error (at the very end). ChatGPT said to basically recheck everything, and I did. The one thing I hadn't done was grant the permissions, which I have since done...

Latest Reply
User16752245772
Contributor
  • 1 kudos

Hi @Chris Sarrico​, could you please specify the container name before the storage account name, so the source looks like this: source = "abfss://<container-name>@<storage-account-name>.dfs.core.windows.net/" See https://learn.microsoft.com/en-u...
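For reference, a minimal sketch of what that mount can look like end to end, assuming a service principal whose credentials sit in a hypothetical secret scope called my-scope; the mount point and placeholder names are not taken from the original thread:

# OAuth settings for ADLS Gen2; <tenant-id> and the other placeholders must be replaced.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": dbutils.secrets.get("my-scope", "client-id"),
    "fs.azure.account.oauth2.client.secret": dbutils.secrets.get("my-scope", "client-secret"),
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

# Container name comes before the storage account name, exactly as in the reply above.
dbutils.fs.mount(
    source="abfss://<container-name>@<storage-account-name>.dfs.core.windows.net/",
    mount_point="/mnt/my-data",
    extra_configs=configs,
)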

1 More Replies
vanessafvg
by New Contributor III
  • 2104 Views
  • 1 reply
  • 3 kudos

Extracting data from excel in datalake storage using openpyxl

I am trying to extract some data into Databricks but tripping all over openpyxl; newish user of Databricks.
from openpyxl import load_workbook
directory_id = "hidden"
scope = "hidden"
client_id = "hidden"
service_credential_key = "hidden"
container_name = "hidden"
s...
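A hedged sketch of one way to get openpyxl reading a workbook that lives in ADLS: copy it to the driver's local disk first, since openpyxl only understands local paths. The mount point, file path, and column handling below are assumptions, not the poster's actual setup:

from openpyxl import load_workbook
import pandas as pd

# Copy the workbook from the (assumed) mounted container down to the driver's local disk.
dbutils.fs.cp("/mnt/my-container/reports/input.xlsx", "file:/tmp/input.xlsx")

wb = load_workbook("/tmp/input.xlsx", read_only=True, data_only=True)
ws = wb.active  # or wb["Sheet1"] for a specific sheet

# First row as header, remaining rows as data.
rows = list(ws.iter_rows(values_only=True))
pdf = pd.DataFrame(rows[1:], columns=rows[0])
print(pdf.head())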

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Vanessa Van Gelder​ Great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question. Thanks.

sintsan
by New Contributor II
  • 2397 Views
  • 3 replies
  • 0 kudos

Azure Databricks DBFS Root, Storage Account Networking

For an Azure Databricks workspace with VNet injection, we would like to change the networking on the default managed Azure Databricks storage account (dbstorage) from "Enabled from all networks" to "Enabled from selected virtual networks and IP addresses". Can this...

Latest Reply
karthik_p
Esteemed Contributor
  • 0 kudos

@Sander Sintjorissen​, the root storage bucket usually contains the directories listed in this article: https://learn.microsoft.com/en-us/azure/databricks/dbfs/root-locations. To store audit-related logs you can create another storage account and add that. Hope this ...

2 More Replies
a2_ish
by New Contributor II
  • 2363 Views
  • 1 reply
  • 0 kudos

Where are delta lake files stored by given path?

I have the code below, which works for the path below but fails when path = an Azure storage account path. I have enough access to write to and update the storage account. I would like to know what I am doing wrong and, for the path below that works, how can I phys...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Ankit Kumar​: The error message you received indicates that the user does not have sufficient permission to access the Azure Blob Storage account. You mentioned that you have enough access to write to and update the storage account, but it's possible t...
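As a point of reference, a minimal sketch of writing a Delta table to an abfss:// path once a service principal has the Storage Blob Data Contributor role on the account; the storage account, container, secret scope, and tenant values are placeholders, not details from this thread:

account = "<storage-account-name>"

# Session-scoped OAuth configuration for this one storage account.
spark.conf.set(f"fs.azure.account.auth.type.{account}.dfs.core.windows.net", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{account}.dfs.core.windows.net",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{account}.dfs.core.windows.net",
               dbutils.secrets.get("my-scope", "client-id"))
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{account}.dfs.core.windows.net",
               dbutils.secrets.get("my-scope", "client-secret"))
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{account}.dfs.core.windows.net",
               "https://login.microsoftonline.com/<tenant-id>/oauth2/token")

# The Delta table's parquet files and _delta_log directory are stored physically under this path.
df = spark.range(10)  # placeholder DataFrame
path = f"abfss://<container>@{account}.dfs.core.windows.net/tables/my_table"
df.write.format("delta").mode("overwrite").save(path)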

bchaubey
by Contributor II
  • 3909 Views
  • 1 reply
  • 0 kudos

unable to connect with Azure Storage with Scala

Hi Team, I am unable to connect to the storage account with Scala in Databricks; I am getting the error below. AbfsRestOperationException: Status code: -1 error code: null error message: Cannot resolve hostname: ptazsg5gfcivcrstrlrs.dfs.core.windows.net Caused by: Un...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Bhagwan Chaubey​: The error message suggests that the hostname for your Azure Storage account could not be resolved. This could happen if there is a network issue, or if the hostname is incorrect. Here are some steps you can try to resolve the issue:...
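A quick way to check the DNS part of that advice from a notebook; the account name below is a placeholder for the poster's actual account:

import socket

host = "<storage-account-name>.dfs.core.windows.net"  # the .dfs endpoint; .blob otherwise
try:
    # If this prints an IP address, DNS from the cluster is fine and the problem lies elsewhere.
    print(socket.gethostbyname(host))
except socket.gaierror as err:
    print(f"DNS resolution failed for {host}: {err}")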

alexlod
by New Contributor III
  • 6476 Views
  • 2 replies
  • 3 kudos

Getting error "User is not an owner of Account" when creating a storage credential in Azure Databricks

I'm using Azure Databricks. I've followed this guide to create an Azure Storage Account and an Access Connector for Azure Databricks. I've given the `Storage Blob Data Contributor` role to the Access Connector in the Storage Account. When I go to the ...

[Attachment: Screenshot 2023-02-06 at 5.26.49 PM]
Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Alex Loddengaard​ Hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us...

1 More Replies
Retko
by Contributor
  • 6372 Views
  • 1 reply
  • 1 kudos

Error when using SAS token to connect to azure Storage Account: Unable to load SAS token provider class: java.lang.IllegalArgumentException

Hi, I am trying to connect to the Storage Account using a SAS token, and receive this error: Unable to load SAS token provider class: java.lang.IllegalArgumentException (more in the picture). I couldn't find anything on the web for this error. I also ...

[Attachment: image.png]
Latest Reply
Anonymous
Not applicable
  • 1 kudos

@Retko Okter​: It seems that there is an issue with the SAS token provider class. This error can occur when the SAS token is not correctly formatted or is invalid. Here are some steps you can try to resolve the issue: Verify that the SAS token is corre...
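For reference, a minimal sketch of the session-level SAS configuration this error usually points at, assuming the ABFS driver's fixed SAS token provider is available on the cluster; the account, container, and secret names are placeholders:

account = "<storage-account-name>"
container = "<container-name>"

spark.conf.set(f"fs.azure.account.auth.type.{account}.dfs.core.windows.net", "SAS")
spark.conf.set(f"fs.azure.sas.token.provider.type.{account}.dfs.core.windows.net",
               "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")
spark.conf.set(f"fs.azure.sas.fixed.token.{account}.dfs.core.windows.net",
               dbutils.secrets.get("my-scope", "sas-token"))  # the token string, without a leading '?'

# If the token and provider class are set correctly, a simple listing should work.
display(dbutils.fs.ls(f"abfss://{container}@{account}.dfs.core.windows.net/"))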

Mado
by Valued Contributor II
  • 2773 Views
  • 2 replies
  • 2 kudos

Should I create Azure storage & Metastore in the same region?

I am going to create a metastore following the documentation. Regarding the storage account, I don't understand whether it should be in the same region as the metastore. From the documentation: You can create no more than one metastore per region. It is recommended...

[Attachment: image]
Latest Reply
Mado
Valued Contributor II
  • 2 kudos

Thanks. But the question is about the region for Storage Account & Metastore.

1 More Replies
Rahul_Samant
by Contributor
  • 6015 Views
  • 8 replies
  • 1 kudos

Mounting File Share in init script of cluster

We have a flow where we have to process a chunk of files from a file share. Currently we move the files to a storage account first and then, after processing, move the files back to the file share again. This is adding to the execution time for moving files bac...

Latest Reply
Samirshaikh
New Contributor II
  • 1 kudos

Hi @Rahul Samant, is this issue solved? Please help, we are also facing the same issue.

7 More Replies
venkad
by Contributor
  • 1318 Views
  • 0 replies
  • 4 kudos

Default location for Schema/Database in Unity

Hello Bricksters, we organize the delta lake across multiple storage accounts: one storage account per data domain and one container per database. This helps us isolate resources and cost at the business-domain level. Earlier, when a schema/database...
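If the goal is a per-schema default location under Unity Catalog, a hedged sketch of what that can look like; the catalog, schema, container, and account names are placeholders, and it assumes an external location already covers the target path:

# New managed tables in this schema default to the schema's own container.
spark.sql("""
    CREATE SCHEMA IF NOT EXISTS my_catalog.sales
    MANAGED LOCATION 'abfss://sales@<storage-account-name>.dfs.core.windows.net/'
""")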

fsm
by New Contributor II
  • 7388 Views
  • 4 replies
  • 2 kudos

Resolved! Implementation of a stable Spark Structured Streaming Application

Hi folks, I have an issue. It's not critical but it's annoying. We have implemented a Spark Structured Streaming application. This application is triggered via Azure Data Factory (every 8 minutes). OK, this setup sounds a little bit weird and it's no...

Latest Reply
brickster_2018
Databricks Employee
  • 2 kudos

@Markus Freischlad​ Looks like the Spark driver was stuck. It would be good to capture a thread dump of the Spark driver to understand which operation is stuck.

3 More Replies
MohitAnchlia
by New Contributor II
  • 1091 Views
  • 0 replies
  • 1 kudos

Change AWS storage setting and account

I am seeing a super weird behaviour in Databricks. We initially configured the following:
1. Account X in Account Console -> AWS Account arn:aws:iam::X:role/databricks-s3
2. We set up databricks-s3 as the S3 bucket in Account Console -> AWS Storage
3. W...

rba76
by New Contributor
  • 19289 Views
  • 2 replies
  • 0 kudos

Python spark.read.text Path does not exist

Dear all, I want to read files with Python from a storage account. I followed these instructions: https://docs.microsoft.com/en-us/azure/azure-databricks/store-secrets-azure-key-vault. This is my Python code: dbutils.fs.mount(source = "wasbs://contain...

Latest Reply
PRADEEPCHEEKATL
New Contributor II
  • 0 kudos

@rba76​ Make sure the helloworld.txt file exists in the container1 folder. I'm able to view the text file using the same commands, as follows:
Mount Blob Storage:
dbutils.fs.mount(
    source = "wasbs://sampledata@azure.blob.core.windows.net/Azure",
    mount_po...
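A minimal, hedged sketch of the complete mount-and-read pattern that reply is describing; the secret scope, key name, and paths are placeholders, with the container and file names only carried over from the original question:

# Mount the blob container using an account key held in a (hypothetical) secret scope.
dbutils.fs.mount(
    source="wasbs://container1@<storage-account-name>.blob.core.windows.net",
    mount_point="/mnt/container1",
    extra_configs={
        "fs.azure.account.key.<storage-account-name>.blob.core.windows.net":
            dbutils.secrets.get("my-scope", "storage-account-key")
    },
)

# spark.read.text raises "Path does not exist" if the file is missing at this location.
df = spark.read.text("/mnt/container1/helloworld.txt")
df.show(truncate=False)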

1 More Replies