Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

jllo
by New Contributor III
  • 4899 Views
  • 6 replies
  • 3 kudos

Cannot enable soft-delete on the Azure Storage Account inside the Databricks managed resource group

Hello, when deploying any Databricks workspace in Azure, no changes can be applied to the storage account inside the Databricks managed resource group, including enabling soft-delete. Is there a way to enable it? Best regards, Jon

Latest Reply
Debayan
Databricks Employee
  • 3 kudos

Hi, the default storage account within the default (managed) resource group cannot be altered.

5 More Replies
alexlod
by New Contributor III
  • 6212 Views
  • 2 replies
  • 3 kudos

Getting error "User is not an owner of Account" when creating a storage credential in Azure Databricks

I'm using Azure Databricks. I've followed this guide to create an Azure Storage Account and an Access Connector for Azure Databricks. I've given the `Storage Blob Data Contributor` role to the Access Connector in the Storage Account. When I go to the ...

(Screenshot attached: Screenshot 2023-02-06 at 5.26.49 PM)
Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Alex Loddengaard, hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us...

1 More Replies
shamly
by New Contributor III
  • 4148 Views
  • 4 replies
  • 4 kudos

Urgent - Use Python Variable in shell command in databricks notebook

I am trying to read a CSV and do an activity from an Azure storage account using a Databricks shell script. I want to add this shell script into my larger Python code for other sources as well. I have created widgets for the file path in Python. I have created...

Latest Reply
SS2
Valued Contributor
  • 4 kudos

You can mount the storage account, then set an environment-level variable and do the operation that you want (see the sketch below).

3 More Replies
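A minimal sketch of the environment-variable approach from the reply above, assuming the file path comes from a notebook widget; the widget, variable, and shell command names are placeholders:

import os
import subprocess

# Read the file path from a notebook widget (dbutils is the Databricks notebook utility;
# the widget name "file_path" is an assumption)
file_path = dbutils.widgets.get("file_path")

# Expose the Python value to shell commands as an environment variable
os.environ["FILE_PATH"] = file_path

# Run the shell command from Python so it can read $FILE_PATH; a later %sh cell
# should also see the variable, since it inherits the driver's environment
subprocess.run(["bash", "-c", 'head -n 5 "$FILE_PATH"'], check=True)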
pkgltn
by New Contributor III
  • 957 Views
  • 0 replies
  • 0 kudos

Mounting an Azure Storage Account path on Databricks

Hi, I have a Databricks instance and I mounted the Azure Storage Account. When I run the following command, the output is ExecutionError: An error occurred while calling o1168.ls.: shaded.databricks.org.apache.hadoop.fs.azure.AzureException: java.util...

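For reference, a minimal service-principal (OAuth) mount sketch in the style of the Azure Databricks ABFS documentation; the container, storage account, tenant, secret scope, and mount point names are all placeholders:

# OAuth configs for mounting an ADLS Gen2 container (placeholder names throughout)
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": dbutils.secrets.get("my-scope", "client-id"),
    "fs.azure.account.oauth2.client.secret": dbutils.secrets.get("my-scope", "client-secret"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
    mount_point="/mnt/<mount-name>",
    extra_configs=configs,
)

# A failing ls on the mount point usually points at the service principal's
# role assignment or at the configs above
display(dbutils.fs.ls("/mnt/<mount-name>"))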
ebg
by New Contributor III
  • 3570 Views
  • 3 replies
  • 10 kudos

I have an Azure storage account, and I need to list the containers on that storage account using Scala from a Databricks notebook

I tried to run azure-cli on Databricks (I am using credential passthrough and my account needs MFA) and ran the following command: az storage container list --account-name "account_name" --auth-mode login --query "[].name" --output tsv. However, it outputs...

Latest Reply
Anonymous
Not applicable
  • 10 kudos

Hi @elias bou ghosn, hope all is well! Just wanted to check in if you were able to resolve your issue and whether you would be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Th...

2 More Replies
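One alternative to shelling out to azure-cli is the Azure Storage SDK. The question asks for Scala, but the sketch below is in Python and uses a service principal rather than credential passthrough; the secret scope and key names are assumptions:

from azure.identity import ClientSecretCredential
from azure.storage.blob import BlobServiceClient

# Service principal credentials pulled from a Databricks secret scope (names are assumptions)
credential = ClientSecretCredential(
    tenant_id=dbutils.secrets.get("my-scope", "tenant-id"),
    client_id=dbutils.secrets.get("my-scope", "client-id"),
    client_secret=dbutils.secrets.get("my-scope", "client-secret"),
)

service = BlobServiceClient(
    account_url="https://<account-name>.blob.core.windows.net",
    credential=credential,
)

# list_containers() yields ContainerProperties objects; print each container's name
for container in service.list_containers():
    print(container.name)

The service principal needs an appropriate data-plane role on the storage account (for example, one of the Storage Blob Data roles) for the listing to succeed.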
baatchus
by New Contributor III
  • 4009 Views
  • 4 replies
  • 0 kudos

Resolved! Parameterize the Azure storage account name in the Spark cluster config in Databricks

Wondering if it is possible to parameterize the Azure storage account name part in the Spark cluster config in Databricks? I have a working example where the values reference secret scopes: spark.hadoop.fs.azure.account.oauth2.client.id.<azurestorageacc...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Fantastic! Thanks for letting us know!

3 More Replies
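Because keys in the cluster-level Spark config are literal strings, a common workaround is to set the per-account ABFS options at session level from a notebook, where the storage account name can come from a widget or variable. A sketch with placeholder widget, scope, and key names:

# Storage account name supplied at run time (widget name is an assumption)
storage_account = dbutils.widgets.get("storage_account")
suffix = f"{storage_account}.dfs.core.windows.net"

spark.conf.set(f"fs.azure.account.auth.type.{suffix}", "OAuth")
spark.conf.set(
    f"fs.azure.account.oauth.provider.type.{suffix}",
    "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
)
spark.conf.set(
    f"fs.azure.account.oauth2.client.id.{suffix}",
    dbutils.secrets.get("my-scope", "client-id"),
)
spark.conf.set(
    f"fs.azure.account.oauth2.client.secret.{suffix}",
    dbutils.secrets.get("my-scope", "client-secret"),
)
spark.conf.set(
    f"fs.azure.account.oauth2.client.endpoint.{suffix}",
    "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
)

At the cluster level, config values (but not key names) can still reference secrets with the {{secrets/<scope>/<key>}} syntax, as in the original example.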
GoldenTuna
by New Contributor II
  • 3919 Views
  • 5 replies
  • 2 kudos

Resolved! Mounting an Azure Storage Account in a cluster init script?

We are trying to configure our environment so that when our cluster starts up, it checks to see if we have mounted our Azure storage account container and, if it is not mounted, mounts it. We can do this fine in a notebook, however we have no luck doing this through an in...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

@David Kruetzkamp​ - Would you be happy to mark whichever answer helped the most as best? That will help other members find the solution more quickly.

4 More Replies
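On the check-before-mount part: dbutils is generally not available from a cluster init script, so this pattern is usually run from a notebook or job at startup instead. A sketch with placeholder names:

MOUNT_POINT = "/mnt/<mount-name>"  # placeholder
SOURCE = "abfss://<container>@<storage-account>.dfs.core.windows.net/"  # placeholder
configs = {}  # fill with OAuth configs like those in the earlier mount sketch

# Mount only if the mount point is not already present
already_mounted = any(m.mountPoint == MOUNT_POINT for m in dbutils.fs.mounts())

if not already_mounted:
    dbutils.fs.mount(
        source=SOURCE,
        mount_point=MOUNT_POINT,
        extra_configs=configs,
    )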
tarente
by New Contributor III
  • 3333 Views
  • 3 replies
  • 3 kudos

Partitioned parquet table (folder) with different structure

Hi, we have a parquet table (folder) in an Azure Storage Account. The table is partitioned by column PeriodId (represents a day in the format YYYYMMDD) and has data from 20181001 until 20211121 (yesterday). We have a new development that adds a new column ...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 3 kudos

I think the problem is in the overwrite: when you overwrite, it overwrites all folders. The solution is to mix append with dynamic overwrite, so it only overwrites the folders which have data and doesn't affect old partitions (see the sketch below): spark.conf.set("spark.sql.sources.pa...

2 More Replies
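The reply above is cut off; the setting it points at is most likely Spark's dynamic partition overwrite mode. A sketch of that pattern, with a placeholder path, where df contains only the new PeriodId partitions:

# With dynamic partition overwrite, only the partitions present in the incoming
# DataFrame are replaced; older partitions are left untouched
spark.conf.set("spark.sql.sources.partitionOverwriteMode", "dynamic")

(df.write
    .mode("overwrite")
    .partitionBy("PeriodId")
    .parquet("abfss://<container>@<storage-account>.dfs.core.windows.net/path/to/table"))  # placeholder path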