Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

UC Volumes: writing xlsx file to volume

DylanStout
New Contributor III

How to write a DataFrame to a Volume in a catalog?

We tried the following code with our pandas DataFrame:

dbutils.fs.put('dbfs:/Volumes/xxxx/default/input_bestanden/x test.xlsx', pandasDf.to_excel('/Volumes/xxxx/default/input_bestanden/x test.xlsx'))
 
This results in the error:
([Errno 95] Operation not supported)

on

pandasDf.to_excel('/Volumes/xxxx/default/input_bestanden/x test.xlsx')

 
 
2 REPLIES

Kaniz_Fatma
Community Manager

Hi @DylanStout, the error “[Errno 95] Operation not supported” means the file-system operation pandas attempted is not supported on that path. Note also that pandasDf.to_excel writes the file itself and returns None, so wrapping it in dbutils.fs.put is redundant: dbutils.fs.put expects the file contents as a string for its second argument.

To write a DataFrame to a Databricks Volume, you’ll need a different approach.

To write a DataFrame to a Volume in a catalog in Databricks, you can follow these steps:

  1. Create a New Notebook:

    • Create a new notebook in your Databricks workspace and attach it to a cluster with access to Unity Catalog.

  2. Define Variables:

    • In your notebook, define the following variables:
      catalog = "<catalog_name>"
      schema = "<schema_name>"
      volume = "<volume_name>"
      file_name = "new_baby_names.csv"
      table_name = "baby_names"
      path_volume = f"/Volumes/{catalog}/{schema}/{volume}"
      path_table = f"{catalog}.{schema}"
      
      Replace <catalog_name>, <schema_name>, and <volume_name> with your actual Unity Catalog catalog, schema, and volume names. You’ll save the baby name data into the specified table later.
  3. Add Data to Your Unity Catalog Volume:

    • Upload the CSV file (for example, new_baby_names.csv) to the volume, e.g. through Catalog Explorer or with dbutils.fs.cp.

  4. Load Data into DataFrame:

    • Create a DataFrame (let’s call it df) by reading the CSV file from your Unity Catalog volume:
      df = spark.read.csv(f"{path_volume}/{file_name}", header=True, inferSchema=True)
      
  5. Insert Data into an Existing Table:

    • Insert the data from the DataFrame into an existing table in Unity Catalog:
      df.write.insertInto(f"{path_table}.{table_name}")
      

This should allow you to write your DataFrame to the specified volume in your Unity Catalog. If you encounter any issues, please check your permissions and ensure that Unity Catalog is enabled ....
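Taken together, steps 2–5 above amount to a short snippet. This is a minimal sketch with hypothetical catalog/schema/volume names (main, default, input_bestanden); the Spark calls only work on a running Databricks cluster with Unity Catalog enabled, so they are shown commented out:

```python
# Minimal sketch of steps 2-5 above. The catalog/schema/volume names are
# hypothetical placeholders; substitute your own.
catalog = "main"
schema = "default"
volume = "input_bestanden"
file_name = "new_baby_names.csv"
table_name = "baby_names"

# Volumes are exposed under a /Volumes/<catalog>/<schema>/<volume> path.
path_volume = f"/Volumes/{catalog}/{schema}/{volume}"
path_table = f"{catalog}.{schema}"

# On a running cluster you would then load the CSV and insert it:
# df = spark.read.csv(f"{path_volume}/{file_name}", header=True, inferSchema=True)
# df.write.insertInto(f"{path_table}.{table_name}")

print(path_volume)                   # /Volumes/main/default/input_bestanden
print(f"{path_table}.{table_name}")  # main.default.baby_names
```

The path construction is the part that trips people up: Volumes are addressed as a FUSE path for file APIs but as a three-level name (catalog.schema) for tables.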

Let me know if you need further assistance! 😊

 

Did you use an AI to create this answer? You are posting a solution that is irrelevant to the question asked.
