Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Yatoom
by New Contributor II
  • 2012 Views
  • 2 replies
  • 2 kudos

Disable access to mount point for client code

We are building a platform where we automatically execute Databricks jobs using Python packages delivered by our end-users. We want to create a mount point so that we can deliver the cluster's driver logs to external storage. However, we don't wan...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 2 kudos

Check with cloud providers

1 More Replies
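For the log-delivery half of the question above, driver logs can be routed to a mount through the cluster spec's cluster_log_conf. A minimal sketch, where the mount point /mnt/driver-logs and the node/runtime values are assumptions, not the asker's setup:

    # Fragment of a Clusters/Jobs API cluster spec. All values are placeholders
    # except cluster_log_conf, which is the documented way to deliver driver
    # logs to a DBFS path (here, a mount backed by external storage).
    new_cluster = {
        "spark_version": "11.3.x-scala2.12",
        "node_type_id": "Standard_DS3_v2",
        "num_workers": 2,
        "cluster_log_conf": {
            "dbfs": {"destination": "dbfs:/mnt/driver-logs"}
        },
    }

Note that the mount itself stays visible to all code running on the cluster, which is why hiding it from end-user packages points back at cloud-provider-side access controls, as the reply suggests.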
irfanijaz
by New Contributor
  • 661 Views
  • 0 replies
  • 0 kudos

Differently named storage accounts in different environments

Hi, I have a solution design question on which I am looking for some help. We have 2 environments in Azure (dev and prod), and each env has its own ADLS storage account with a different name, of course. Within Databricks code we are NOT leveraging the mou...

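One common pattern for the unanswered question above is to avoid hard-coding the account name and build the abfss:// path from a per-environment value. A minimal sketch, where the ENVIRONMENT variable, account names, and container are all hypothetical:

    import os

    # Assumption: each cluster (dev/prod) sets ENVIRONMENT in its environment variables.
    env = os.environ.get("ENVIRONMENT", "dev")
    account = {"dev": "adlsmydatadev", "prod": "adlsmydataprod"}[env]

    # Direct abfss access (no mount), with the account name parameterized.
    base = f"abfss://raw@{account}.dfs.core.windows.net"
    df = spark.read.format("delta").load(f"{base}/sales")  # spark is the notebook session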
Bhanu1
by New Contributor III
  • 4099 Views
  • 3 replies
  • 6 kudos

Resolved! Is it possible to mount different Azure Storage Accounts for different clusters in the same workspace?

We have a development and a production data lake. Is it possible to have a production or development cluster access only respective mounts using init scripts?

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 6 kudos

Yes, it is possible. Additionally, a mount is permanent and stored in DBFS, so it is enough to run it one time. You can have, for example, the following configuration: in Azure you can have 2 Databricks workspaces, and the cluster in every workspace can have an env variable that is...

2 More Replies
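A minimal sketch of the env-variable pattern the accepted reply describes, run once per workspace; the secret scope, account names, and tenant ID are placeholders:

    import os

    # Assumption: each workspace's clusters set ENV=dev or ENV=prod.
    env = os.environ.get("ENV", "dev")
    account = f"mydatalake{env}"  # hypothetical storage account names

    configs = {
        "fs.azure.account.auth.type": "OAuth",
        "fs.azure.account.oauth.provider.type":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id": dbutils.secrets.get("kv", f"sp-id-{env}"),
        "fs.azure.account.oauth2.client.secret": dbutils.secrets.get("kv", f"sp-secret-{env}"),
        "fs.azure.account.oauth2.client.endpoint":
            "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
    }

    # Mounts persist in DBFS, so guard against re-mounting on every run.
    if not any(m.mountPoint == "/mnt/data" for m in dbutils.fs.mounts()):
        dbutils.fs.mount(
            source=f"abfss://data@{account}.dfs.core.windows.net/",
            mount_point="/mnt/data",
            extra_configs=configs,
        )

Because a mount is visible to every cluster in its workspace, the per-environment isolation here really comes from having separate workspaces, as the reply notes, not from the init script alone.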
al_joe
by Contributor
  • 3422 Views
  • 2 replies
  • 0 kudos

Where / how does DBFS store files?

I tried to use %fs head to print the contents of a CSV file used in a training: %fs head "/mnt/path/file.csv" but got an error saying cannot head a directory!? Then I did %fs ls on the same CSV file and got a list of 4 files under a directory named as a ...

Latest Reply
User16753725182
Databricks Employee
  • 0 kudos

Hi @Al Jo, are you still seeing the error while printing the contents of the CSV file?

1 More Replies
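What the asker is seeing is expected behavior: Spark writes a "file" as a directory of part files plus commit markers, and %fs head only works on actual files. A small sketch for inspecting such a path (the path comes from the post; the rest is generic):

    # List what is really under the "CSV file" - typically part-*.csv files
    # plus _SUCCESS/_committed markers written by Spark.
    entries = dbutils.fs.ls("/mnt/path/file.csv")
    for e in entries:
        print(e.name, e.size)

    # Head one of the data files rather than the directory itself.
    part = next(e.path for e in entries if e.name.startswith("part-"))
    print(dbutils.fs.head(part, 1024))  # first 1 KB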
frank26364
by New Contributor III
  • 33021 Views
  • 5 replies
  • 4 kudos

Resolved! Export Databricks results to Blob in a csv file

Hello everyone, I want to export my data from Databricks to the blob. My Databricks commands select some PDFs from my blob, run Form Recognizer, and export the output results to my blob. Here is the code: %pip install azure.storage.blob %pip install...

Latest Reply
Anonymous
Not applicable
  • 4 kudos

@Francis Bouliane​ - Thank you for sharing the solution.

4 More Replies
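For reference, the usual shape of the upload step with azure.storage.blob looks like the sketch below; the secret scope, container, blob name, and the df variable are assumptions standing in for the asker's Form Recognizer output:

    from azure.storage.blob import BlobServiceClient

    # Assumption: a storage connection string kept in a Databricks secret scope.
    conn_str = dbutils.secrets.get(scope="kv", key="blob-connection-string")
    service = BlobServiceClient.from_connection_string(conn_str)
    blob = service.get_blob_client(container="results", blob="output.csv")

    # Serialize the results to CSV in memory and upload in one call.
    csv_bytes = df.toPandas().to_csv(index=False).encode("utf-8")
    blob.upload_blob(csv_bytes, overwrite=True)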
Anonymous
by Not applicable
  • 6260 Views
  • 2 replies
  • 4 kudos

Cluster does not have proper permissions to view DBFS mount point to Azure ADLS Gen 2.

I've created other mount points and am now trying to use the OAuth method. I'm able to define the mount point using the OAuth Mount to ADLS Gen 2 Storage. I've created an App Registration with a secret, added the App Registration as Contributor to the ...

Latest Reply
Gerbastanovic
New Contributor II
  • 4 kudos

Also check if you set the right permissions for the app on the container's ACL: https://docs.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-access-control

1 More Replies
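One detail worth flagging in the question above: the management-plane Contributor role does not by itself grant data access; the app registration generally needs a data-plane role such as Storage Blob Data Contributor, or explicit ACLs as the reply says. A quick way to test the credentials without the mount (the account name, secret scope, and tenant ID below are placeholders):

    acct = "mystorageacct"  # placeholder account name
    sfx = f"{acct}.dfs.core.windows.net"

    # Session-scoped OAuth config for direct abfss access - the same settings a
    # mount would use, but easier to iterate on while debugging permissions.
    spark.conf.set(f"fs.azure.account.auth.type.{sfx}", "OAuth")
    spark.conf.set(f"fs.azure.account.oauth.provider.type.{sfx}",
                   "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
    spark.conf.set(f"fs.azure.account.oauth2.client.id.{sfx}",
                   dbutils.secrets.get("kv", "sp-app-id"))
    spark.conf.set(f"fs.azure.account.oauth2.client.secret.{sfx}",
                   dbutils.secrets.get("kv", "sp-secret"))
    spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{sfx}",
                   "https://login.microsoftonline.com/<tenant-id>/oauth2/token")

    # A 403 here points at RBAC/ACL, not at the mount definition.
    dbutils.fs.ls(f"abfss://container@{sfx}/")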