Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Workspace API

CaptainJack
New Contributor II

Hello friends. 

I am having a problem with the Workspace API. I have many folders (200+) inside my /Workspace into which I would like to copy my whole Program folder, which contains 20 Spark scripts saved as Databricks notebooks. I tried the Workspace API and I am getting this error:

Could not parse request object: Failed to Decode VALUE_STRING as base 64.

How can I encode it as base64? Please note that these are Databricks notebooks. Thanks everyone for your help.

My request body (payload) looks like this:

payload = {
    "path": "/Workspace/Destination/Program",
    "format": "SOURCE",
    "language": "Python",
    "content": "/Workspace/Source/Program",
    "overwrite": True
}

 

response = requests.post("{}{}".format(workspace_url, api), data=json.dumps(payload), headers=auth)

1 ACCEPTED SOLUTION

Accepted Solutions

Hi @CaptainJack, you’re encountering an issue with the Workspace API in Databricks.

 

The error message you received, “Could not parse request object: Failed to Decode VALUE_STRING as base 64,” indicates that the API could not decode the value you supplied in the content field as base64.

 

Let’s break down the problem and find a solution:

 

Base64 Encoding:

  • Base64 encoding is commonly used to represent binary data (such as files or images) as text. When working with APIs, especially those that handle file content, you might need to encode or decode data as base64.
  • In your case, it appears that the content you’re sending (Databricks Notebooks) needs to be encoded in base64 format.
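
As a minimal sketch of what that encoding looks like with Python’s standard base64 module (the notebook source string here is just a placeholder):

import base64

# Base64 represents arbitrary bytes as plain ASCII text, which is what the API
# expects inside the JSON request body.
notebook_source = "print('hello')"  # placeholder for real notebook source
encoded = base64.b64encode(notebook_source.encode("utf-8")).decode("utf-8")
decoded = base64.b64decode(encoded).decode("utf-8")
assert decoded == notebook_source   # the encoding round-trips back to the original text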

Payload Structure:

  • Your payload includes several parameters, such as path, format, language, content, and overwrite.
  • The critical part here is the content field, which should contain the base64-encoded content of your Databricks Notebooks.
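
For illustration, the payload should be shaped like this, with content holding the base64-encoded notebook source rather than a workspace path (the encoded string below is a stand-in that simply encodes print('hello'), and the destination path is hypothetical):

payload = {
    "path": "/Workspace/Destination/Program/notebook_01",  # hypothetical destination path for one notebook
    "format": "SOURCE",
    "language": "PYTHON",                                  # the documented language values are uppercase, e.g. PYTHON
    "content": "cHJpbnQoJ2hlbGxvJyk=",                     # base64-encoded notebook source, not a path
    "overwrite": True
}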

Solution:

Before sending the payload, ensure that the content of your notebooks is correctly encoded in base64.

 

You can use Python’s base64 module to encode your notebook content. 
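
A sketch of that step, assuming the notebook source is available as a local file (the file name is hypothetical):

import base64

# Read the notebook source as bytes and base64-encode it for the content field.
with open("Program/notebook_01.py", "rb") as f:   # hypothetical local copy of one notebook's source
    encoded_content = base64.b64encode(f.read()).decode("utf-8")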

 

API Endpoint:

  • Ensure that the api variable you’re using (/api/2.0/workspace/import) corresponds to the correct API endpoint for importing notebooks.

Remember that the content field cannot simply reference a workspace path such as /Workspace/Source/Program; it must hold the base64-encoded source of the notebook itself, while the path field identifies the destination. If you follow these steps, your notebooks should be successfully imported using the Workspace API.
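
Putting it together for the whole Program folder, a sketch could look like the following. It assumes the notebook sources exist as local .py files in a Program directory, that workspace_url, api, and auth carry the same values as in your original request, and that the folder paths shown are hypothetical:

import base64
import json
import os
import requests

workspace_url = "https://<your-workspace-host>"                 # same value used in your original request
api = "/api/2.0/workspace/import"
auth = {"Authorization": "Bearer <personal-access-token>"}      # same auth headers used in your original request

source_dir = "Program"                                          # hypothetical local folder with the 20 notebook sources
dest_dir = "/Workspace/Destination/Program"

for filename in sorted(os.listdir(source_dir)):
    if not filename.endswith(".py"):
        continue

    # Encode each notebook's source before sending it.
    with open(os.path.join(source_dir, filename), "rb") as f:
        encoded_content = base64.b64encode(f.read()).decode("utf-8")

    payload = {
        "path": "{}/{}".format(dest_dir, os.path.splitext(filename)[0]),
        "format": "SOURCE",
        "language": "PYTHON",
        "content": encoded_content,                             # base64-encoded notebook source
        "overwrite": True
    }

    response = requests.post("{}{}".format(workspace_url, api), data=json.dumps(payload), headers=auth)
    response.raise_for_status()                                 # surface any import failure immediately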

 

Feel free to ask if you need further assistance! 😊

 


2 REPLIES

CaptainJack
New Contributor II

I am using this as api = /api/2.0/workspace/import


 
