Hello,
I am looking to replicate the functionality provided by the databricks_cli Python package using the Python SDK. Previously, using the databricks_cli WorkspaceApi object, I could use the import_workspace or import_workspace_dir methods to move a Python file, SQL file, or directory to my workspace. I am unsure how to do this using the SDK. I have tried a combination of the following:
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.workspace import ImportFormat, Language

# Build the client from the current notebook's context
ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
prod_w = WorkspaceClient(host=ctx.apiUrl().get(), token=ctx.apiToken().get())

down = prod_w.dbfs.download('/FileStore/tmp/my_file.py')

# Try this?
prod_w.workspace.import_('/Users/myworkspace@email.com/my_file', content=down,
                         format=ImportFormat.AUTO, language=Language.PYTHON,
                         overwrite=True)
# Or this?
prod_w.workspace.upload('/Users/myworkspace@email.com/my_file', content=down,
                        format=ImportFormat.AUTO, language=Language.PYTHON,
                        overwrite=True)
Depending on the variant, I get errors about the ImportFormat or about the argument passed to the content parameter, and on the occasions when a call does succeed, the imported file is empty.
Please don't get too caught up in the details of the code above; it's just a tangible example of what I'm trying to do: import a valid DBFS file into my workspace using the Databricks Python SDK. I'm hoping someone more knowledgeable about the SDK can provide a working example.
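For what it's worth, my current best guess, based on reading the SDK source and so far unverified, is that dbfs.download() returns a file-like handle that has to be read (passing the handle itself may be why my files came out empty), and that the low-level workspace.import_ call wants base64-encoded text. Here is the pattern I've been trying; the function and helper names are mine, and ImportFormat.SOURCE with an explicit language is my guess at the right combination:

```python
import base64
import io


def to_b64(data: bytes) -> str:
    # My understanding: the low-level import_ call expects
    # base64-encoded text, not raw bytes.
    return base64.b64encode(data).decode('utf-8')


def copy_dbfs_file_to_workspace(w, dbfs_path: str, ws_path: str) -> None:
    # Deferred import: only needed when actually talking to a workspace.
    from databricks.sdk.service.workspace import ImportFormat, Language

    # dbfs.download() returns a file-like handle, so read it to get the
    # bytes; I suspect passing the handle directly caused my empty files.
    with w.dbfs.download(dbfs_path) as f:
        data = f.read()

    # Option A: the low-level API call with base64 content.
    w.workspace.import_(
        ws_path,
        content=to_b64(data),
        format=ImportFormat.SOURCE,
        language=Language.PYTHON,
        overwrite=True,
    )

    # Option B (uncomment to try instead): the convenience wrapper,
    # which takes a file-like object and handles the encoding itself.
    # w.workspace.upload(ws_path, io.BytesIO(data), overwrite=True)
```

If that's roughly right, I'd still appreciate confirmation on whether workspace.upload is the preferred wrapper here.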
Thank you!
Kurt