Data Engineering
UC Volumes: writing xlsx file to volume

DylanStout
New Contributor III

How to write a DataFrame to a Volume in a catalog?

We tried the following code with our pandas DataFrame:

dbutils.fs.put('dbfs:/Volumes/xxxx/default/input_bestanden/x test.xlsx', pandasDf.to_excel('/Volumes/xxxx/default/input_bestanden/x test.xlsx'))
 
This results in the error:
([Errno 95] Operation not supported)

on

pandasDf.to_excel('/Volumes/xxxx/default/input_bestanden/x test.xlsx')

 
 

Kaniz_Fatma
Community Manager

Hi @DylanStout, the error “[Errno 95] Operation not supported” typically means the filesystem behind the path cannot perform the kind of write being attempted: pandas’ Excel writer needs random-access writes, which the /Volumes FUSE mount does not support on some runtimes. Note also that dbutils.fs.put expects a string as its second argument, while pandasDf.to_excel(...) returns None, so wrapping the call in dbutils.fs.put isn’t needed at all.

To write an .xlsx file to a Databricks Volume, a more reliable approach is to write the file to local driver storage first and then copy it into the Volume; a sketch follows.
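For example, here is a minimal sketch, assuming the Volume path from your post and that an Excel engine such as openpyxl is available on the cluster (the sample DataFrame is only a placeholder for your real pandasDf):

    import shutil
    import pandas as pd

    # Placeholder data standing in for your real pandasDf
    pandasDf = pd.DataFrame({"name": ["Alice", "Bob"], "score": [1, 2]})

    local_path = "/tmp/x_test.xlsx"                                    # local driver disk
    volume_path = "/Volumes/xxxx/default/input_bestanden/x test.xlsx"  # Unity Catalog Volume (FUSE path)

    # The Excel writer needs random-access writes, so write to local disk first
    pandasDf.to_excel(local_path, index=False)

    # Then copy the finished file into the Volume
    shutil.copy(local_path, volume_path)

If your runtime does support direct writes to the /Volumes path, calling pandasDf.to_excel(volume_path, index=False) on its own may also work; the local-then-copy pattern above is the safer fallback.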

If you want to work with a Volume more broadly, for example loading a CSV file from the Volume into a Unity Catalog table, you can follow these steps (a consolidated sketch follows the list):

  1. Create a New Notebook:

    • In your Databricks workspace, create a Python notebook and attach it to a compute resource that has access to Unity Catalog.

  2. Define Variables:

    • In your notebook, define the following variables:
      catalog = "<catalog_name>"
      schema = "<schema_name>"
      volume = "<volume_name>"
      file_name = "new_baby_names.csv"
      table_name = "baby_names"
      path_volume = f"/Volumes/{catalog}/{schema}/{volume}"
      path_table = f"{catalog}.{schema}"
      
      Replace <catalog_name>, <schema_name>, and <volume_name> with the names of your Unity Catalog catalog, schema, and volume. You’ll load the baby name data into the specified table later.
  3. Add Data to Your Unity Catalog Volume:

    • Place the CSV file (here new_baby_names.csv) in the volume, for example by uploading it through Catalog Explorer or copying it there with dbutils.fs.cp.

  4. Load Data into a DataFrame:

    • Create a DataFrame (let’s call it df) by reading the CSV file from your Unity Catalog volume:
      df = spark.read.csv(f"{path_volume}/{file_name}", header=True, inferSchema=True)
      
  5. Insert Data into an Existing Table:

    • Insert the data from the DataFrame into an existing table in Unity Catalog:
      df.write.insertInto(path_table + "." + table_name)
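Putting the steps together, here is the consolidated sketch mentioned above. It assumes the CSV file is already in the Volume and that the target table already exists; the catalog, schema, and volume names are placeholders to replace with your own:

    catalog = "<catalog_name>"
    schema = "<schema_name>"
    volume = "<volume_name>"
    file_name = "new_baby_names.csv"
    table_name = "baby_names"

    path_volume = f"/Volumes/{catalog}/{schema}/{volume}"
    path_table = f"{catalog}.{schema}"

    # Read the CSV from the Unity Catalog Volume into a Spark DataFrame
    df = spark.read.csv(f"{path_volume}/{file_name}", header=True, inferSchema=True)

    # Append the rows into the existing Unity Catalog table
    df.write.insertInto(f"{path_table}.{table_name}")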
      

This should allow you to write your data to the specified volume and table in Unity Catalog. If you encounter any issues, check your permissions on the catalog, schema, and volume (typical grants are sketched below) and make sure Unity Catalog is enabled for your workspace.
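For the permissions check, these are the grants typically involved when reading from and writing to a Volume; the object names and the principal below are placeholders, and the statements must be run by someone allowed to grant them:

    # Placeholders throughout; adjust object names and the principal to your workspace
    spark.sql("GRANT USE CATALOG ON CATALOG <catalog_name> TO `user@example.com`")
    spark.sql("GRANT USE SCHEMA ON SCHEMA <catalog_name>.<schema_name> TO `user@example.com`")
    spark.sql("GRANT READ VOLUME, WRITE VOLUME ON VOLUME <catalog_name>.<schema_name>.<volume_name> TO `user@example.com`")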

Let me know if you need further assistance! 😊

 