
Getting error when running databricks bundle deploy command

jitenjha11
New Contributor II

Hi all,

I am trying to implement an MLOps project using the https://github.com/databricks/mlops-stacks repo.

I have created an Azure Databricks workspace on the Premium (+ Role-based access controls) tier and am following the bundle creation and deployment steps in this URL: https://docs.databricks.com/aws/en/dev-tools/bundles/mlops-stacks
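
For reference, the CLI sequence I followed was roughly the one from that page (the project name and target below are mine, and the interactive prompts may differ slightly):

# initialize a new project from the MLOps Stacks bundle template (interactive prompts)
databricks bundle init mlops-stacks

# from the generated project directory, validate and then deploy to the dev target
cd my_mlops_project/my_mlops_project
databricks bundle validate -t dev
databricks bundle deploy -t dev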

Terraform provider version:
{
  "terraform": {
    "required_providers": {
      "databricks": {
        "source": "databricks/databricks",
        "version": "1.100.0"
      }
    }
  }
}
I have also tried deleting terraform.tfstate, both from the Databricks workspace UI and from the terminal.

My .databrickscfg file:
[DEFAULT]
host = https://adb-XXXXXXXXXXXXX.azuredatabricks.net/
token = dapXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXc
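
Authentication itself seems to be working; as a quick sanity check (my own verification step, not part of the template), the command below with the same DEFAULT profile returns my user details, consistent with the successful /api/2.0/preview/scim/v2/Me call in the debug output further down:

# confirm the CLI can authenticate with the DEFAULT profile
databricks current-user me --profile DEFAULT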

When running the databricks bundle deploy -t dev command, I get the errors below:

root@jiten:/home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/.databricks/bundle/dev/terraform# databricks bundle deploy -t dev
Warning: unknown field: description
at resources.experiments.experiment
in resources/ml-artifacts-resource.yml:28:7

Uploading bundle files to /Workspace/Users/jiteXXXX@g.com/.bundle/my_mlops_project/dev/files...
Deploying resources...
Error: terraform apply: exit status 1

Error: cannot create permissions: ACLs for job are disabled or not available in this tier

with databricks_permissions.job_batch_inference_job,
on bundle.tf.json line 268, in resource.databricks_permissions.job_batch_inference_job:
268: },


Error: cannot create permissions: ACLs for job are disabled or not available in this tier

with databricks_permissions.job_model_training_job,
on bundle.tf.json line 281, in resource.databricks_permissions.job_model_training_job:
281: },


Error: cannot create permissions: ACLs for job are disabled or not available in this tier

with databricks_permissions.job_write_feature_table_job,
on bundle.tf.json line 294, in resource.databricks_permissions.job_write_feature_table_job:
294: },

Error: cannot create permissions: ACLs for mlflowExperiment are disabled or not available in this tier

with databricks_permissions.mlflow_experiment_experiment,
on bundle.tf.json line 307, in resource.databricks_permissions.mlflow_experiment_experiment:
307: }


Error: cannot create registered model: No metastore assigned for the current workspace.

with databricks_registered_model.model,
on bundle.tf.json line 315, in resource.databricks_registered_model.model:
315: }

 

Updating deployment state...
root@jiten:/home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/.databricks/bundle/dev/t...

Output when running the command with --debug:

root@jiten:/home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/.databricks/bundle/dev/terraform# databricks bundle deploy -t dev --debug
21:14:39 Info: start pid=15488 version=0.281.0 args="databricks, bundle, deploy, -t, dev, --debug"
21:14:39 Debug: Found bundle root at /home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project (file /home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/databricks.yml) pid=15488
21:14:39 Info: Phase: load pid=15488
21:14:39 Debug: Apply pid=15488 mutator=EntryPoint
21:14:39 Debug: Apply pid=15488 mutator=scripts.preinit
21:14:39 Debug: No script defined for preinit, skipping pid=15488 mutator=scripts.preinit
21:14:39 Debug: Apply pid=15488 mutator=ProcessRootIncludes
21:14:39 Debug: Apply pid=15488 mutator=ProcessRootIncludes mutator=ProcessInclude(resources/batch-inference-workflow-resource.yml)
21:14:39 Debug: Apply pid=15488 mutator=ProcessRootIncludes mutator=ProcessInclude(resources/ml-artifacts-resource.yml)
Warning: unknown field: description
at resources.experiments.experiment
in resources/ml-artifacts-resource.yml:28:7

21:14:39 Debug: Apply pid=15488 mutator=ProcessRootIncludes mutator=ProcessInclude(resources/model-workflow-resource.yml)
21:14:39 Debug: Apply pid=15488 mutator=ProcessRootIncludes mutator=ProcessInclude(resources/feature-engineering-workflow-resource.yml)
21:14:39 Debug: Apply pid=15488 mutator=VerifyCliVersion
21:14:39 Debug: Apply pid=15488 mutator=EnvironmentsToTargets
21:14:39 Debug: Apply pid=15488 mutator=ComputeIdToClusterId
21:14:39 Debug: Apply pid=15488 mutator=InitializeVariables
21:14:39 Debug: Apply pid=15488 mutator=DefineDefaultTarget(default)
21:14:39 Debug: Apply pid=15488 mutator=validate:unique_resource_keys
21:14:39 Debug: Apply pid=15488 mutator=SelectTarget(dev)
21:14:39 Debug: Loading profile DEFAULT because of host match pid=15488
21:14:39 Debug: Apply pid=15488 mutator=<func>
21:14:39 Debug: Apply pid=15488 mutator=<func>
21:14:39 Info: Phase: initialize pid=15488
21:14:39 Debug: Apply pid=15488 mutator=validate:AllResourcesHaveValues
21:14:39 Debug: Apply pid=15488 mutator=validate:interpolation_in_auth_config
21:14:39 Debug: Apply pid=15488 mutator=validate:no_interpolation_in_bundle_name
21:14:39 Debug: Apply pid=15488 mutator=validate:scripts
21:14:39 Debug: Apply pid=15488 mutator=RewriteSyncPaths
21:14:39 Debug: Apply pid=15488 mutator=SyncDefaultPath
21:14:39 Debug: Apply pid=15488 mutator=SyncInferRoot
21:14:39 Debug: Apply pid=15488 mutator=InitializeCache
21:14:39 Debug: Apply pid=15488 mutator=PopulateCurrentUser
21:14:39 Debug: [Local Cache] using cache key: d251ef2678fa2f2c47ecaf2d7b8c7323fc81c48ba32c06cc1177ed90e9cb3e38 pid=15488 mutator=PopulateCurrentUser
21:14:39 Debug: [Local Cache] cache hit pid=15488 mutator=PopulateCurrentUser
21:14:39 Debug: GET /api/2.0/preview/scim/v2/Me
< HTTP/2.0 200 OK
< {
< "active": true,
< "displayName": "jj",
< "emails": [
< {
< "primary": true,
< "type": "work",
< "value": "jiXXX11@g.com"
< }
< ],
< "externalId": "b0bbed0c-6XXXXXXXXXXX-XXXXXXba6e",
< "groups": [
< {
< "$ref": "Groups/155617304374915",
< "display": "users",
< "type": "direct",
< "value": "155617304374915"
< },
< {
< "$ref": "Groups/153393333329242",
< "display": "admins",
< "type": "direct",
< "value": "153393333329242"
< }
< ],
< "id": "147985615315574",
< "name": {
< "familyName": "Jha",
< "givenName": "Jitendra"
< },
< "schemas": [
< "urn:ietf:params:scim:schemas:core:2.0:User",
< "urn:ietf:params:scim:schemas:extension:workspace:2.0:User"
< ],
< "userName": "jiXXX11@g.com"
< } pid=15488 mutator=PopulateCurrentUser sdk=true
21:14:39 Debug: [Local Cache] computed and stored result pid=15488 mutator=PopulateCurrentUser
21:14:39 Debug: Apply pid=15488 mutator=LoadGitDetails
21:14:39 Debug: Apply pid=15488 mutator=ApplySourceLinkedDeploymentPreset
21:14:39 Debug: Apply pid=15488 mutator=DefineDefaultWorkspaceRoot
21:14:39 Debug: Apply pid=15488 mutator=ExpandWorkspaceRoot
21:14:39 Debug: Apply pid=15488 mutator=DefaultWorkspacePaths
21:14:39 Debug: Apply pid=15488 mutator=PrependWorkspacePrefix
21:14:39 Debug: Apply pid=15488 mutator=RewriteWorkspacePrefix
21:14:39 Debug: Apply pid=15488 mutator=SetVariables
21:14:39 Debug: Apply pid=15488 mutator=ResolveVariableReferences
21:14:39 Debug: Apply pid=15488 mutator=ResolveLookupVariables
21:14:39 Debug: Apply pid=15488 mutator=ResolveVariableReferences
21:14:39 Debug: Apply pid=15488 mutator=validate:volume-path
21:14:39 Debug: Apply pid=15488 mutator=ApplyTargetMode
21:14:39 Info: Development mode: disabling deployment lock since bundle.deployment.lock.enabled is not set to true pid=15488 mutator=ApplyTargetMode
21:14:39 Debug: Apply pid=15488 mutator=ConfigureWSFS
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=ResolveVariableReferences(resources)
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=LogResourceReferences
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=NormalizePaths
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=TranslatePathsDashboards
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=validate:SingleNodeCluster
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=ResolveVariableReferences(resources)
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=ExpandPipelineGlobPaths
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=MergeJobClusters
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=MergeJobParameters
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=MergeJobTasks
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=MergePipelineClusters
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=MergeApps
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=CaptureSchemaDependency
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=ConfigureDashboardSerializedDashboard
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=JobClustersFixups
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=ClusterFixups
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=ModelServingEndpointFixups
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=SetRunAs
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=OverrideCompute
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=ApplyPresets
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.dashboards.*, parent_path, /Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/resources)"
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.dashboards.*, embed_credentials, false)"
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.volumes.*, volume_type, MANAGED)"
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.alerts.*, parent_path, /Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/resources)"
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.jobs.*, name, Untitled)"
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.jobs.*, max_concurrent_runs, 1)"
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.jobs.*.schedule, pause_status, UNPAUSED)"
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.jobs.*.trigger, pause_status, UNPAUSED)"
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.jobs.*.continuous, pause_status, UNPAUSED)"
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.jobs.*.task[*].dbt_task, schema, default)"
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.jobs.*.task[*].for_each_task.task.dbt_task, schema, default)"
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.jobs.*.job_clusters[*].new_cluster.workload_type.clients, notebooks, true)"
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.jobs.*.job_clusters[*].new_cluster.workload_type.clients, jobs, true)"
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.pipelines.*, edition, ADVANCED)"
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.pipelines.*, channel, CURRENT)"
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.sql_warehouses.*, auto_stop_mins, 120)"
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.sql_warehouses.*, enable_photon, true)"
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.sql_warehouses.*, max_num_clusters, 1)"
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.sql_warehouses.*, spot_instance_policy, COST_OPTIMIZED)"
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.apps.*, description, )"
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.clusters.*, autotermination_minutes, 60)"
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.clusters.*.workload_type.clients, notebooks, true)"
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.clusters.*.workload_type.clients, jobs, true)"
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=DefaultQueueing
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=DashboardFixups
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=ApplyBundlePermissions
21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=FixPermissions
21:14:39 Debug: Apply pid=15488 mutator=PythonMutator(load_resources)
21:14:39 Debug: Apply pid=15488 mutator=PythonMutator(apply_mutators)
21:14:39 Debug: Apply pid=15488 mutator=validate:required
21:14:39 Debug: Apply pid=15488 mutator=validate:enum
21:14:39 Debug: Apply pid=15488 mutator=validate:validate_dashboard_etags
21:14:39 Debug: Apply pid=15488 mutator=CheckPermissions
21:14:39 Debug: Apply pid=15488 mutator=TranslatePaths
21:14:39 Debug: Apply pid=15488 mutator=PythonWrapperWarning
21:14:39 Debug: Apply pid=15488 mutator=ApplyArtifactsDynamicVersion
21:14:39 Debug: Apply pid=15488 mutator=artifacts.Prepare
21:14:39 Info: No local tasks in databricks.yml config, skipping auto detect pid=15488 mutator=artifacts.Prepare
21:14:39 Debug: Apply pid=15488 mutator=apps.Validate
21:14:39 Debug: Apply pid=15488 mutator=ValidateTargetMode
21:14:39 Debug: Apply pid=15488 mutator=ValidateSharedRootPermissions
21:14:39 Debug: Apply pid=15488 mutator=metadata.AnnotateJobs
21:14:39 Debug: Apply pid=15488 mutator=metadata.AnnotatePipelines
21:14:39 Debug: Apply pid=15488 mutator=scripts.postinit
21:14:39 Debug: No script defined for postinit, skipping pid=15488 mutator=scripts.postinit
21:14:40 Debug: GET /api/2.0/workspace/get-status?path=/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/terraform.tfstate&return_export_info=true
< HTTP/2.0 200 OK
< {
< "created_at": 1767035426041,
< "modified_at": 1767039104962,
< "object_id": 490013057061183,
< "object_type": "FILE",
< "path": "/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/terraform.tfstate",
< "resource_id": "490013057061183",
< "size": 35097
< } pid=15488 sdk=true
21:14:40 Debug: GET /api/2.0/workspace/get-status?path=/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/resources.json&return_export_info=true
< HTTP/2.0 404 Not Found
< {
< "error_code": "RESOURCE_DOES_NOT_EXIST",
< "message": "Path (/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/resources.js... (18 more bytes)"
< } pid=15488 sdk=true
21:14:40 Debug: non-retriable error: Path (/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/resources.json) doesn't exist. pid=15488 sdk=true
21:14:40 Debug: GET /api/2.0/workspace-files/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/terraform.tfstate
< HTTP/2.0 200 OK
< <Streaming response> pid=15488 sdk=true
21:14:40 Debug: read terraform.tfstate: terraform.tfstate: remote state serial=74 lineage="6d87b75c-525a-aa1e-5ef1-346faf226db6" pid=15488
21:14:40 Info: Available resource state files (from least to most preferred): [terraform.tfstate: remote terraform state serial=74 lineage="6d87b75c-525a-aa1e-5ef1-346faf226db6" /home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/.databricks/bundle/dev/terraform/terraform.tfstate: local terraform state serial=74 lineage="6d87b75c-525a-aa1e-5ef1-346faf226db6"] pid=15488
21:14:40 Debug: Apply pid=15488 mutator=fast_validate(readonly)
21:14:40 Debug: ApplyParallel pid=15488 mutator=fast_validate(readonly) mutator=validate:job_cluster_key_defined
21:14:40 Debug: ApplyParallel pid=15488 mutator=fast_validate(readonly) mutator=validate:job_task_cluster_spec
21:14:40 Debug: ApplyParallel pid=15488 mutator=fast_validate(readonly) mutator=validate:artifact_paths
21:14:40 Info: Phase: build pid=15488
21:14:40 Debug: Apply pid=15488 mutator=scripts.prebuild
21:14:40 Debug: No script defined for prebuild, skipping pid=15488 mutator=scripts.prebuild
21:14:40 Debug: Apply pid=15488 mutator=artifacts.Build
21:14:40 Debug: Apply pid=15488 mutator=scripts.postbuild
21:14:40 Debug: No script defined for postbuild, skipping pid=15488 mutator=scripts.postbuild
21:14:40 Debug: Apply pid=15488 mutator=ResolveVariableReferences
21:14:40 Debug: Apply pid=15488 mutator=ResolveVariableReferences(resources)
21:14:40 Debug: Apply pid=15488 mutator=libraries.ExpandGlobReferences
21:14:40 Debug: Apply pid=15488 mutator=CheckForSameNameLibraries
21:14:40 Debug: Apply pid=15488 mutator=SwitchToPatchedWheels
21:14:40 Debug: Apply pid=15488 mutator=TransformWheelTask
21:14:40 Debug: Apply pid=15488 mutator=CheckDashboardsModifiedRemotely
21:14:40 Debug: Apply pid=15488 mutator=SecretScopeFixups
21:14:40 Debug: Apply pid=15488 mutator=deploy:state-pull
21:14:40 Info: Opening remote deployment state file pid=15488 mutator=deploy:state-pull
21:14:40 Debug: GET /api/2.0/workspace/get-status?path=/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/deployment.json&return_export_info=true
< HTTP/2.0 200 OK
< {
< "created_at": 1767035422053,
< "modified_at": 1767039101084,
< "object_id": 490013057061181,
< "object_type": "FILE",
< "path": "/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/deployment.json",
< "resource_id": "490013057061181",
< "size": 2732
< } pid=15488 mutator=deploy:state-pull sdk=true
21:14:40 Debug: GET /api/2.0/workspace-files/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/deployment.json
< HTTP/2.0 200 OK
< <Streaming response> pid=15488 mutator=deploy:state-pull sdk=true
21:14:40 Info: Local deployment state is the same or newer, ignoring remote state pid=15488 mutator=deploy:state-pull
21:14:40 Debug: Apply pid=15488 mutator=ValidateGitDetails
21:14:40 Debug: Apply pid=15488 mutator=check-running-resources
21:14:40 Info: Phase: deploy pid=15488
21:14:40 Debug: Apply pid=15488 mutator=scripts.predeploy
21:14:40 Debug: No script defined for predeploy, skipping pid=15488 mutator=scripts.predeploy
21:14:40 Debug: Apply pid=15488 mutator=lock:acquire
21:14:40 Info: Skipping; locking is disabled pid=15488 mutator=lock:acquire
21:14:40 Debug: Apply pid=15488 mutator=artifacts.CleanUp
21:14:40 Debug: POST /api/2.0/workspace/delete
> {
> "path": "/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/artifacts/.internal",
> "recursive": true
> }
< HTTP/2.0 200 OK
< {} pid=15488 mutator=artifacts.CleanUp sdk=true
21:14:40 Debug: POST /api/2.0/workspace/mkdirs
> {
> "path": "/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/artifacts/.internal"
> }
< HTTP/2.0 200 OK
< {} pid=15488 mutator=artifacts.CleanUp sdk=true
21:14:40 Debug: Apply pid=15488 mutator=libraries.Upload
21:14:40 Debug: Apply pid=15488 mutator=files.Upload
Uploading bundle files to /Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/files...
21:14:40 Debug: GET /api/2.0/workspace/get-status?path=/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/files
< HTTP/2.0 200 OK
< {
< "object_id": 490013057061179,
< "object_type": "DIRECTORY",
< "path": "/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/files",
< "resource_id": "490013057061179"
< } pid=15488 mutator=files.Upload sdk=true
21:14:40 Debug: Path /Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/files has type directory (ID: 490013057061179) pid=15488 mutator=files.Upload
21:14:40 Info: Uploaded bundle files pid=15488 mutator=files.Upload
21:14:40 Debug: Apply pid=15488 mutator=deploy:state-update
21:14:40 Info: Loading deployment state from /home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/.databricks/bundle/dev/deployment.json pid=15488 mutator=deploy:state-update
21:14:40 Debug: Apply pid=15488 mutator=deploy:state-push
21:14:40 Info: Writing local deployment state file to remote state directory pid=15488 mutator=deploy:state-push
21:14:40 Debug: POST /api/2.0/workspace-files/import-file/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/deployment.json?overwrite=true
> {
> "cli_version": "0.281.0",
> "files": [
> {
> "is_notebook": true,
> "local_path": "deployment/batch_inference/notebooks/BatchInference.py"
> },
> {
> "is_notebook": false,
> "local_path": "feature_engineering/features/__init__.py"
> },
> {
> "is_notebook": false,
> "local_path": "project_params.json"
> },
> {
> "is_notebook": false,
> "local_path": "requirements.txt"
> },
> {
> "is_notebook": false,
> "local_path": "tests/training/__init__.py"
> },
> {
> "is_notebook": true,
> "local_path": "training/notebooks/TrainWithFeatureStore.py"
> },
> {
> "is_notebook": true,
> "local_path": "validation/notebooks/ModelValidation.py"
> },
> {
> "is_notebook": false,
> "local_path": "validation/validation.py"
> },
> {
> "is_notebook": true,
> "local_path": "feature_engineering/notebooks/GenerateAndWriteFeatures.py"
> },
> {
> "is_notebook": false,
> "local_path": "pytest.ini"
> },
> {
> "is_notebook": false,
> "local_path": "resources/model-workflow-resource.yml"
> },
> {
> "is_notebook": false,
> "local_path": "resources/monitoring-resource.yml"
> },
> {
> "is_notebook": false,
> "local_path": "feature_engineering/features/pickup_features.py"
> },
> {
> "is_notebook": false,
> "local_path": "tests/feature_engineering/pickup_features_test.py"
> },
> {
> "is_notebook": false,
> "local_path": "README.md"
> },
> "... (21 additional elements)"
> ],
> "id": "5d0dd384-d632-4fda-bf10-d6e604f7292c",
> "seq": 26,
> "timestamp": "2025-12-29T20:14:40.647459783Z",
> "version": 1
> }
< HTTP/2.0 200 OK pid=15488 mutator=deploy:state-push sdk=true
21:14:40 Debug: Apply pid=15488 mutator=ApplyWorkspaceRootPermissions
21:14:40 Debug: Apply pid=15488 mutator=trackUsedCompute
21:14:40 Debug: Apply pid=15488 mutator=deploy:resource_path_mkdir
21:14:40 Debug: Apply pid=15488 mutator=terraform.Interpolate
21:14:40 Debug: Apply pid=15488 mutator=terraform.Write
21:14:40 Debug: job normalization diagnostic: unknown field: permissions pid=15488 mutator=terraform.Write
21:14:40 Debug: job normalization diagnostic: unknown field: permissions pid=15488 mutator=terraform.Write
21:14:40 Debug: job normalization diagnostic: unknown field: permissions pid=15488 mutator=terraform.Write
21:14:40 Debug: experiment normalization diagnostic: unknown field: permissions pid=15488 mutator=terraform.Write
21:14:40 Debug: registered model normalization diagnostic: unknown field: grants pid=15488 mutator=terraform.Write
21:14:40 Debug: Apply pid=15488 mutator=terraform.Plan
21:14:40 Debug: Using Terraform at /home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/.databricks/bundle/dev/bin/terraform pid=15488 mutator=terraform.Plan
21:14:40 Debug: DATABRICKS_TF_CLI_CONFIG_FILE is not defined pid=15488 mutator=terraform.Plan
21:14:40 Debug: Environment variables for Terraform: DATABRICKS_HOST, DATABRICKS_TOKEN, DATABRICKS_AUTH_TYPE, HOME, PATH, DATABRICKS_USER_AGENT_EXTRA pid=15488 mutator=terraform.Plan
21:14:42 Debug: Planning complete and persisted at /home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/.databricks/bundle/dev/terraform/plan
pid=15488 mutator=terraform.Plan
Deploying resources...
21:14:43 Debug: Apply pid=15488 mutator=terraform.Apply
Error: terraform apply: exit status 1

Error: cannot create permissions: ACLs for job are disabled or not available in this tier

with databricks_permissions.job_batch_inference_job,
on bundle.tf.json line 268, in resource.databricks_permissions.job_batch_inference_job:
268: },


Error: cannot create permissions: ACLs for job are disabled or not available in this tier

with databricks_permissions.job_model_training_job,
on bundle.tf.json line 281, in resource.databricks_permissions.job_model_training_job:
281: },


Error: cannot create permissions: ACLs for job are disabled or not available in this tier

with databricks_permissions.job_write_feature_table_job,
on bundle.tf.json line 294, in resource.databricks_permissions.job_write_feature_table_job:
294: },


Error: cannot create permissions: ACLs for mlflowExperiment are disabled or not available in this tier

with databricks_permissions.mlflow_experiment_experiment,
on bundle.tf.json line 307, in resource.databricks_permissions.mlflow_experiment_experiment:
307: }


Error: cannot create registered model: No metastore assigned for the current workspace.

with databricks_registered_model.model,
on bundle.tf.json line 315, in resource.databricks_registered_model.model:
315: }

 

Updating deployment state...
21:14:44 Debug: POST /api/2.0/workspace-files/import-file/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/terraform.tfstate?overwrite=true
> {
> "check_results": null,
> "lineage": "6d87b75c-525a-aa1e-5ef1-346faf226db6",
> "outputs": {},
> "resources": [
> {
> "instances": null,
> "mode": "managed",
> "name": "registered_model_model",
> "provider": "provider[\"registry.terraform.io/databricks/databricks\"]",
> "type": "databricks_grants"
> },
> {
> "instances": [
> {
> "attributes": {
> "always_running": false,
> "budget_policy_id": null,
> "continuous": null,
> "control_run_state": false,
> "dbt_task": null,
> "deployment": [
> {
> "kind": "BUNDLE",
> "metadata_file_path": "/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/metadata.json"
> }
> ],
> "description": null,
> "edit_mode": "UI_LOCKED",
> "email_notifications": [
> {
> "no_alert_for_skipped_runs": false,
> "on_duration_warning_threshold_exceeded": null,
> "on_failure": null,
> "on_start": null,
> "on_streaming_backlog_exceeded": null,
> "on_success": null
> }
> ],
> "environment": null,
> "existing_cluster_id": null,
> "format": "MULTI_TASK",
> "git_source": null,
> "health": null,
> "id": "837432832789473",
> "job_cluster": null,
> "library": null,
> "max_concurrent_runs": 4,
> "max_retries": 0,
> "min_retry_interval_millis": 0,
> "name": "[dev jiXXX11] dev-my_mlops_project-batch-inference-job",
> "new_cluster": null,
> "notebook_task": null,
> "notification_settings": null,
> "parameter": null,
> "performance_target": null,
> "pipeline_task": null,
> "provider_config": null,
> "python_wheel_task": null,
> "queue": [
> {
> "enabled": true
> }
> ],
> "retry_on_timeout": false,
> "run_as": [
> {
> "service_principal_name": "",
> "user_name": "jiXXX11@g.com"
> }
> ],
> "run_job_task": null,
> "schedule": [
> {
> "pause_status": "PAUSED",
> "quartz_cron_expression": "0 0 11 * * ?",
> "timezone_id": "UTC"
> }
> ],
> "spark_jar_task": null,
> "spark_python_task": null,
> "spark_submit_task": null,
> "tags": {
> "dev": "jiXXX11"
> },
> "task": [
> {
> "clean_rooms_notebook_task": null,
> "condition_task": null,
> "dashboard_task": null,
> "dbt_cloud_task": null,
> "dbt_platform_task": null,
> "dbt_task": null,
> "depends_on": null,
> "description": "",
> "disable_auto_optimization": false,
> "disabled": false,
> "email_notifications": [
> {
> "no_alert_for_skipped_runs": false,
> "on_duration_warning_threshold_exceeded": null,
> "on_failure": null,
> "on_start": null,
> "on_streaming_backlog_exceeded": null,
> "on_success": null
> }
> ],
> "environment_key": "",
> "existing_cluster_id": "",
> "for_each_task": null,
> "gen_ai_compute_task": null,
> "health": null,
> "job_cluster_key": "",
> "library": null,
> "max_retries": 0,
> "min_retry_interval_millis": 0,
> "new_cluster": [
> {
> "__apply_policy_default_values_allow_list": null,
> "apply_policy_default_values": false,
> "autoscale": null,
> "aws_attributes": null,
> "azure_attributes": [
> {
> "availability": "ON_DEMAND_AZURE",
> "first_on_demand": 0,
> "log_analytics_info": null,
> "spot_bid_max_price": 0
> }
> ],
> "cluster_id": "",
> "cluster_log_conf": null,
> "cluster_mount_info": null,
> "cluster_name": "",
> "custom_tags": {
> "clusterSource": "mlops-stacks_0.4"
> },
> "data_security_mode": "SINGLE_USER",
> "docker_image": null,
> "driver_instance_pool_id": "",
> "driver_node_type_id": "",
> "enable_elastic_disk": true,
> "enable_local_disk_encryption": false,
> "gcp_attributes": null,
> "idempotency_token": "",
> "init_scripts": null,
> "instance_pool_id": "",
> "is_single_node": false,
> "kind": "",
> "library": null,
> "node_type_id": "Standard_D3_v2",
> "num_workers": 3,
> "policy_id": "",
> "provider_config": null,
> "remote_disk_throughput": 0,
> "runtime_engine": "",
> "single_user_name": "",
> "spark_conf": {},
> "spark_env_vars": {},
> "spark_version": "15.3.x-cpu-ml-scala2.12",
> "ssh_public_keys": null,
> "total_initial_remote_disk_size": 0,
> "use_ml_runtime": false,
> "workload_type": null
> }
> ],
> "notebook_task": [
> {
> "base_parameters": {
> "env": "dev",
> "git_source_info": "url:; branch:; commit:",
> "input_table_name": "dev.my_mlops_project.feature_store_inference_input",
> "model_name": "dev.my_mlops_project.my_mlops_project-model",
> "output_table_name": "dev.my_mlops_project.predictions"
> },
> "notebook_path": "/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/files/deployment/batch_i... (33 more bytes)",
> "source": "WORKSPACE",
> "warehouse_id": ""
> }
> ],
> "notification_settings": null,
> "pipeline_task": null,
> "power_bi_task": null,
> "python_wheel_task": null,
> "retry_on_timeout": false,
> "run_if": "ALL_SUCCESS",
> "run_job_task": null,
> "spark_jar_task": null,
> "spark_python_task": null,
> "spark_submit_task": null,
> "sql_task": null,
> "task_key": "batch_inference_job",
> "timeout_seconds": 0,
> "webhook_notifications": null
> }
> ],
> "timeout_seconds": 0,
> "timeouts": null,
> "trigger": null,
> "url": "https://adb-7405612555097742.2.azuredatabricks.net/#job/837432832789473",
> "usage_policy_id": null,
> "webhook_notifications": [
> {
> "on_duration_warning_threshold_exceeded": null,
> "on_failure": null,
> "on_start": null,
> "on_streaming_backlog_exceeded": null,
> "on_success": null
> }
> ]
> },
> "private": "eyJlMmJmYjczMC1lY2FhLTExZTYtOGY4OC0zNDM2M2JjN2M0YzAiOnsiY3JlYXRlIjoxODAwMDAwMDAwMDAwLCJ1cGRhdGUi... (52 more bytes)",
> "schema_version": 2,
> "sensitive_attributes": null
> }
> ],
> "mode": "managed",
> "name": "batch_inference_job",
> "provider": "provider[\"registry.terraform.io/databricks/databricks\"]",
> "type": "databricks_job"
> },
> "... (3 additional elements)"
> ],
> "serial": 79,
> "terraform_version": "1.5.5",
> "version": 4
> }
< HTTP/2.0 200 OK pid=15488 sdk=true
21:14:44 Debug: Apply pid=15488 mutator=lock:release
21:14:44 Info: Skipping; locking is disabled pid=15488 mutator=lock:release
21:14:44 Debug: failed execution pid=15488 exit_code=1
21:14:44 Debug: POST /telemetry-ext
> {
> "items": null,
> "protoLogs": [
> "{\"frontend_log_event_id\":\"ee12ba17-3006-4b4f-821c-26ae931fd547\",\"entry\":{\"databricks_cli_log\":{\"... (581 more bytes)"
> ],
> "uploadTime": 1767039284849
> }
< HTTP/2.0 200 OK
< {
< "errors": null,
< "numProtoSuccess": 1,
< "numRealtimeSuccess": 0,
< "numSuccess": 0
< } pid=15488 sdk=true
21:14:44 Debug: All 1 logs uploaded successfully pid=15488
root@jiten:/home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/.databricks/bundle/dev/terraform#

Kindly help me fix this, and also advise how I can implement an MLOps project for Azure Databricks with a GitLab CI/CD pipeline. Please share any document or URL that I can follow to implement it.

0 REPLIES
