Hi everyone,
I'm attempting to use MLflow experiment tracking from a local machine, but I'm running into trouble uploading artifacts.
I've tried sample code as simple as the following:
import mlflow
import os

# Credentials for the Databricks workspace (values redacted)
os.environ["DATABRICKS_HOST"] = "https://XXXXXX.cloud.databricks.com/"
os.environ["DATABRICKS_TOKEN"] = "dapiXXXXX"

# Use the Databricks workspace as the tracking server
mlflow.set_tracking_uri("databricks")
mlflow.set_experiment("XXXX")

with mlflow.start_run() as run:
    mlflow.log_param("param1", 5)
    mlflow.log_metric("foo", 1, step=0)
    mlflow.log_metric("foo", 2, step=1)
    mlflow.log_metric("foo", 3, step=2)
    mlflow.log_metric("foo", 4, step=3)
    mlflow.log_metric("foo", 5, step=4)
    mlflow.log_artifact("main.py")  # this is the call that fails
This code successfully created a new run in the target MLflow experiment and logged the parameter "param1" and the metric "foo" correctly. However, it failed to log the artifact with an error message like the following:
mlflow.exceptions.MlflowException: 403 Client Error: Forbidden for url: https://dbstorage-prod-whkxn.s3.ap-southeast-2.amazonaws.com/ws/xxxxxxxxxxxxxxxxx (an AWS presigned URL). Response text: <?xml version="1.0" encoding="UTF-8"?>
<Error><Code>AccessDenied</Code><Message>Access Denied</Message><RequestId>xxxxxxxxxxxxxxxx</RequestId><HostId>xxxxxxxxxxxxx</HostId></Error>
Do I need any further configuration to make artifact logging work?
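For what it's worth, here is a minimal probe I could run next to narrow things down, assuming the same DATABRICKS_HOST / DATABRICKS_TOKEN and experiment name as above (the "probe.txt" file name is just a placeholder). It prints where MLflow intends to upload artifacts for the run and tries logging a tiny in-memory artifact instead of main.py, to rule out a problem with the local file path as opposed to storage permissions:

import mlflow

mlflow.set_tracking_uri("databricks")
mlflow.set_experiment("XXXX")  # same experiment as in the snippet above

with mlflow.start_run() as run:
    # Show the artifact destination MLflow resolved for this run
    print("artifact_uri:", mlflow.get_artifact_uri())

    # Upload a small text artifact directly from memory
    mlflow.log_text("hello", "probe.txt")

If this probe fails with the same 403 on the presigned S3 URL, that would suggest the issue is on the storage/permissions side rather than anything about the file being logged.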