<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Getting error when running databricks bundle deploy command in Machine Learning</title>
    <link>https://community.databricks.com/t5/machine-learning/getting-error-when-running-databricks-deploy-bundle-command/m-p/142676#M4507</link>
    <description>&lt;DIV class=""&gt;&lt;DIV class=""&gt;&lt;P&gt;Hi all,&lt;/P&gt;&lt;P&gt;I am trying to implement an MLOps project using the&amp;nbsp;&lt;A href="https://github.com/databricks/mlops-stacks" target="_blank" rel="nofollow noopener noreferrer"&gt;https://github.com/databricks/mlops-stacks&lt;/A&gt;&amp;nbsp;repo.&lt;/P&gt;&lt;P&gt;I have created an Azure Databricks workspace on the&amp;nbsp;Premium (+ Role-based access controls) tier and followed the bundle creation and deployment steps at:&amp;nbsp;&lt;A href="https://docs.databricks.com/aws/en/dev-tools/bundles/mlops-stacks" target="_blank" rel="nofollow noopener noreferrer"&gt;https://docs.databricks.com/aws/en/dev-tools/bundles/mlops-stacks&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Terraform provider version:&amp;nbsp;&lt;BR /&gt;{&lt;BR /&gt;"terraform": {&lt;BR /&gt;"required_providers": {&lt;BR /&gt;"databricks": {&lt;BR /&gt;"source": "databricks/databricks",&lt;BR /&gt;"version": "1.100.0"&lt;BR /&gt;}&lt;BR /&gt;}&lt;BR /&gt;}&lt;BR /&gt;}&lt;BR /&gt;&lt;BR /&gt;I have also tried deleting terraform.tfstate from both the Databricks UI and the terminal.&lt;/P&gt;&lt;P&gt;My .databrickscfg file:&amp;nbsp;&lt;BR /&gt;[DEFAULT]&lt;BR /&gt;host = &lt;A href="https://adb-XXXXXXXXXXXXX.azuredatabricks.net/" target="_blank" rel="nofollow noopener noreferrer"&gt;https://adb-XXXXXXXXXXXXX.azuredatabricks.net/&lt;/A&gt;&lt;BR /&gt;token = dapXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXc&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;When running the databricks bundle deploy -t dev command, I get the error below:&lt;/P&gt;&lt;P&gt;root@jiten:/home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/.databricks/bundle/dev/terraform# databricks bundle deploy -t dev&lt;BR /&gt;Warning: unknown field: description&lt;BR /&gt;at resources.experiments.experiment&lt;BR /&gt;in resources/ml-artifacts-resource.yml:28:7&lt;/P&gt;&lt;P&gt;Uploading bundle files to /Workspace/Users/jiteXXXX@g.com/.bundle/my_mlops_project/dev/files...&lt;BR /&gt;Deploying resources...&lt;BR /&gt;Error: terraform apply: exit status 
1&lt;/P&gt;&lt;P&gt;Error: cannot create permissions: ACLs for job are disabled or not available in this tier&lt;/P&gt;&lt;P&gt;with databricks_permissions.job_batch_inference_job,&lt;BR /&gt;on bundle.tf.json line 268, in resource.databricks_permissions.job_batch_inference_job:&lt;BR /&gt;268: },&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Error: cannot create permissions: ACLs for job are disabled or not available in this tier&lt;/P&gt;&lt;P&gt;with databricks_permissions.job_model_training_job,&lt;BR /&gt;on bundle.tf.json line 281, in resource.databricks_permissions.job_model_training_job:&lt;BR /&gt;281: },&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Error: cannot create permissions: ACLs for job are disabled or not available in this tier&lt;/P&gt;&lt;P&gt;with databricks_permissions.job_write_feature_table_job,&lt;BR /&gt;on bundle.tf.json line 294, in resource.databricks_permissions.job_write_feature_table_job:&lt;BR /&gt;294: },&lt;/P&gt;&lt;P&gt;Error: cannot create permissions: ACLs for mlflowExperiment are disabled or not available in this tier&lt;/P&gt;&lt;P&gt;with databricks_permissions.mlflow_experiment_experiment,&lt;BR /&gt;on bundle.tf.json line 307, in resource.databricks_permissions.mlflow_experiment_experiment:&lt;BR /&gt;307: }&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Error: cannot create registered model: No metastore assigned for the current workspace.&lt;/P&gt;&lt;P&gt;with databricks_registered_model.model,&lt;BR /&gt;on bundle.tf.json line 315, in resource.databricks_registered_model.model:&lt;BR /&gt;315: }&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Updating deployment state...&lt;BR /&gt;root@jiten:/home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/.databricks/bundle/dev/terraform&lt;/P&gt;&lt;P&gt;When running the debug command, I get the following 
output&lt;/P&gt;&lt;P&gt;root@jiten:/home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/.databricks/bundle/dev/terraform# databricks bundle deploy -t dev --debug&lt;BR /&gt;21:14:39 Info: start pid=15488 version=0.281.0 args="databricks, bundle, deploy, -t, dev, --debug"&lt;BR /&gt;21:14:39 Debug: Found bundle root at /home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project (file /home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/databricks.yml) pid=15488&lt;BR /&gt;21:14:39 Info: Phase: load pid=15488&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=EntryPoint&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=scripts.preinit&lt;BR /&gt;21:14:39 Debug: No script defined for preinit, skipping pid=15488 mutator=scripts.preinit&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessRootIncludes&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessRootIncludes mutator=ProcessInclude(resources/batch-inference-workflow-resource.yml)&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessRootIncludes mutator=ProcessInclude(resources/ml-artifacts-resource.yml)&lt;BR /&gt;Warning: unknown field: description&lt;BR /&gt;at resources.experiments.experiment&lt;BR /&gt;in resources/ml-artifacts-resource.yml:28:7&lt;/P&gt;&lt;P&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessRootIncludes mutator=ProcessInclude(resources/model-workflow-resource.yml)&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessRootIncludes mutator=ProcessInclude(resources/feature-engineering-workflow-resource.yml)&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=VerifyCliVersion&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=EnvironmentsToTargets&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ComputeIdToClusterId&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=InitializeVariables&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=DefineDefaultTarget(default)&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 
mutator=validate:unique_resource_keys&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=SelectTarget(dev)&lt;BR /&gt;21:14:39 Debug: Loading profile DEFAULT because of host match pid=15488&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=&amp;lt;func&amp;gt;&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=&amp;lt;func&amp;gt;&lt;BR /&gt;21:14:39 Info: Phase: initialize pid=15488&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=validate:AllResourcesHaveValues&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=validate:interpolation_in_auth_config&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=validate:no_interpolation_in_bundle_name&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=validate:scripts&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=RewriteSyncPaths&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=SyncDefaultPath&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=SyncInferRoot&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=InitializeCache&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=PopulateCurrentUser&lt;BR /&gt;21:14:39 Debug: [Local Cache] using cache key: d251ef2678fa2f2c47ecaf2d7b8c7323fc81c48ba32c06cc1177ed90e9cb3e38 pid=15488 mutator=PopulateCurrentUser&lt;BR /&gt;21:14:39 Debug: [Local Cache] cache hit pid=15488 mutator=PopulateCurrentUser&lt;BR /&gt;21:14:39 Debug: GET /api/2.0/preview/scim/v2/Me&lt;BR /&gt;&amp;lt; HTTP/2.0 200 OK&lt;BR /&gt;&amp;lt; {&lt;BR /&gt;&amp;lt; "active": true,&lt;BR /&gt;&amp;lt; "displayName": "jj",&lt;BR /&gt;&amp;lt; "emails": [&lt;BR /&gt;&amp;lt; {&lt;BR /&gt;&amp;lt; "primary": true,&lt;BR /&gt;&amp;lt; "type": "work",&lt;BR /&gt;&amp;lt; "value": "jiXXX11@g.com"&lt;BR /&gt;&amp;lt; }&lt;BR /&gt;&amp;lt; ],&lt;BR /&gt;&amp;lt; "externalId": "b0bbed0c-6XXXXXXXXXXX-XXXXXXba6e",&lt;BR /&gt;&amp;lt; "groups": [&lt;BR /&gt;&amp;lt; {&lt;BR /&gt;&amp;lt; "$ref": "Groups/155617304374915",&lt;BR /&gt;&amp;lt; "display": "users",&lt;BR /&gt;&amp;lt; "type": "direct",&lt;BR /&gt;&amp;lt; 
"value": "155617304374915"&lt;BR /&gt;&amp;lt; },&lt;BR /&gt;&amp;lt; {&lt;BR /&gt;&amp;lt; "$ref": "Groups/153393333329242",&lt;BR /&gt;&amp;lt; "display": "admins",&lt;BR /&gt;&amp;lt; "type": "direct",&lt;BR /&gt;&amp;lt; "value": "153393333329242"&lt;BR /&gt;&amp;lt; }&lt;BR /&gt;&amp;lt; ],&lt;BR /&gt;&amp;lt; "id": "147985615315574",&lt;BR /&gt;&amp;lt; "name": {&lt;BR /&gt;&amp;lt; "familyName": "Jha",&lt;BR /&gt;&amp;lt; "givenName": "Jitendra"&lt;BR /&gt;&amp;lt; },&lt;BR /&gt;&amp;lt; "schemas": [&lt;BR /&gt;&amp;lt; "urn:ietf:params:scim:schemas:core:2.0:User",&lt;BR /&gt;&amp;lt; "urn:ietf:params:scim:schemas:extension:workspace:2.0:User"&lt;BR /&gt;&amp;lt; ],&lt;BR /&gt;&amp;lt; "userName": "jiXXX11@g.com"&lt;BR /&gt;&amp;lt; } pid=15488 mutator=PopulateCurrentUser sdk=true&lt;BR /&gt;21:14:39 Debug: [Local Cache] computed and stored result pid=15488 mutator=PopulateCurrentUser&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=LoadGitDetails&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ApplySourceLinkedDeploymentPreset&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=DefineDefaultWorkspaceRoot&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ExpandWorkspaceRoot&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=DefaultWorkspacePaths&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=PrependWorkspacePrefix&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=RewriteWorkspacePrefix&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=SetVariables&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ResolveVariableReferences&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ResolveLookupVariables&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ResolveVariableReferences&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=validate:volume-path&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ApplyTargetMode&lt;BR /&gt;21:14:39 Info: Development mode: disabling deployment lock since bundle.deployment.lock.enabled is not set to true pid=15488 
mutator=ApplyTargetMode&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ConfigureWSFS&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=ResolveVariableReferences(resources)&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=LogResourceReferences&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=NormalizePaths&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=TranslatePathsDashboards&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=validate:SingleNodeCluster&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=ResolveVariableReferences(resources)&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=ExpandPipelineGlobPaths&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=MergeJobClusters&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=MergeJobParameters&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=MergeJobTasks&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=MergePipelineClusters&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=MergeApps&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=CaptureSchemaDependency&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=ConfigureDashboardSerializedDashboard&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=JobClustersFixups&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=ClusterFixups&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=ModelServingEndpointFixups&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 
mutator=ProcessStaticResources mutator=SetRunAs&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=OverrideCompute&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=ApplyPresets&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.dashboards.*, parent_path, /Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/resources)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.dashboards.*, embed_credentials, false)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.volumes.*, volume_type, MANAGED)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.alerts.*, parent_path, /Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/resources)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.jobs.*, name, Untitled)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.jobs.*, max_concurrent_runs, 1)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.jobs.*.schedule, pause_status, UNPAUSED)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.jobs.*.trigger, pause_status, UNPAUSED)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.jobs.*.continuous, pause_status, UNPAUSED)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.jobs.*.task[*].dbt_task, schema, default)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources 
mutator="SetDefaultMutator(resources.jobs.*.task[*].for_each_task.task.dbt_task, schema, default)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.jobs.*.job_clusters[*].new_cluster.workload_type.clients, notebooks, true)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.jobs.*.job_clusters[*].new_cluster.workload_type.clients, jobs, true)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.pipelines.*, edition, ADVANCED)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.pipelines.*, channel, CURRENT)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.sql_warehouses.*, auto_stop_mins, 120)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.sql_warehouses.*, enable_photon, true)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.sql_warehouses.*, max_num_clusters, 1)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.sql_warehouses.*, spot_instance_policy, COST_OPTIMIZED)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.apps.*, description, )"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.clusters.*, autotermination_minutes, 60)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.clusters.*.workload_type.clients, notebooks, true)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.clusters.*.workload_type.clients, jobs, true)"&lt;BR 
/&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=DefaultQueueing&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=DashboardFixups&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=ApplyBundlePermissions&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=FixPermissions&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=PythonMutator(load_resources)&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=PythonMutator(apply_mutators)&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=validate:required&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=validate:enum&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=validate:validate_dashboard_etags&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=CheckPermissions&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=TranslatePaths&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=PythonWrapperWarning&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ApplyArtifactsDynamicVersion&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=artifacts.Prepare&lt;BR /&gt;21:14:39 Info: No local tasks in databricks.yml config, skipping auto detect pid=15488 mutator=artifacts.Prepare&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=apps.Validate&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ValidateTargetMode&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ValidateSharedRootPermissions&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=metadata.AnnotateJobs&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=metadata.AnnotatePipelines&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=scripts.postinit&lt;BR /&gt;21:14:39 Debug: No script defined for postinit, skipping pid=15488 mutator=scripts.postinit&lt;BR /&gt;21:14:40 Debug: GET /api/2.0/workspace/get-status?path=/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/terraform.tfstate&amp;amp;return_export_info=true&lt;BR /&gt;&amp;lt; 
HTTP/2.0 200 OK&lt;BR /&gt;&amp;lt; {&lt;BR /&gt;&amp;lt; "created_at": 1767035426041,&lt;BR /&gt;&amp;lt; "modified_at": 1767039104962,&lt;BR /&gt;&amp;lt; "object_id": 490013057061183,&lt;BR /&gt;&amp;lt; "object_type": "FILE",&lt;BR /&gt;&amp;lt; "path": "/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/terraform.tfstate",&lt;BR /&gt;&amp;lt; "resource_id": "490013057061183",&lt;BR /&gt;&amp;lt; "size": 35097&lt;BR /&gt;&amp;lt; } pid=15488 sdk=true&lt;BR /&gt;21:14:40 Debug: GET /api/2.0/workspace/get-status?path=/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/resources.json&amp;amp;return_export_info=true&lt;BR /&gt;&amp;lt; HTTP/2.0 404 Not Found&lt;BR /&gt;&amp;lt; {&lt;BR /&gt;&amp;lt; "error_code": "RESOURCE_DOES_NOT_EXIST",&lt;BR /&gt;&amp;lt; "message": "Path (/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/resources.js... (18 more bytes)"&lt;BR /&gt;&amp;lt; } pid=15488 sdk=true&lt;BR /&gt;21:14:40 Debug: non-retriable error: Path (/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/resources.json) doesn't exist. 
pid=15488 sdk=true&lt;BR /&gt;21:14:40 Debug: GET /api/2.0/workspace-files/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/terraform.tfstate&lt;BR /&gt;&amp;lt; HTTP/2.0 200 OK&lt;BR /&gt;&amp;lt; &amp;lt;Streaming response&amp;gt; pid=15488 sdk=true&lt;BR /&gt;21:14:40 Debug: read terraform.tfstate: terraform.tfstate: remote state serial=74 lineage="6d87b75c-525a-aa1e-5ef1-346faf226db6" pid=15488&lt;BR /&gt;21:14:40 Info: Available resource state files (from least to most preferred): [terraform.tfstate: remote terraform state serial=74 lineage="6d87b75c-525a-aa1e-5ef1-346faf226db6" /home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/.databricks/bundle/dev/terraform/terraform.tfstate: local terraform state serial=74 lineage="6d87b75c-525a-aa1e-5ef1-346faf226db6"] pid=15488&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=fast_validate(readonly)&lt;BR /&gt;21:14:40 Debug: ApplyParallel pid=15488 mutator=fast_validate(readonly) mutator=validate:job_cluster_key_defined&lt;BR /&gt;21:14:40 Debug: ApplyParallel pid=15488 mutator=fast_validate(readonly) mutator=validate:job_task_cluster_spec&lt;BR /&gt;21:14:40 Debug: ApplyParallel pid=15488 mutator=fast_validate(readonly) mutator=validate:artifact_paths&lt;BR /&gt;21:14:40 Info: Phase: build pid=15488&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=scripts.prebuild&lt;BR /&gt;21:14:40 Debug: No script defined for prebuild, skipping pid=15488 mutator=scripts.prebuild&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=artifacts.Build&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=scripts.postbuild&lt;BR /&gt;21:14:40 Debug: No script defined for postbuild, skipping pid=15488 mutator=scripts.postbuild&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=ResolveVariableReferences&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=ResolveVariableReferences(resources)&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=libraries.ExpandGlobReferences&lt;BR /&gt;21:14:40 Debug: Apply 
pid=15488 mutator=CheckForSameNameLibraries&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=SwitchToPatchedWheels&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=TransformWheelTask&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=CheckDashboardsModifiedRemotely&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=SecretScopeFixups&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=deploy:state-pull&lt;BR /&gt;21:14:40 Info: Opening remote deployment state file pid=15488 mutator=deploy:state-pull&lt;BR /&gt;21:14:40 Debug: GET /api/2.0/workspace/get-status?path=/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/deployment.json&amp;amp;return_export_info=true&lt;BR /&gt;&amp;lt; HTTP/2.0 200 OK&lt;BR /&gt;&amp;lt; {&lt;BR /&gt;&amp;lt; "created_at": 1767035422053,&lt;BR /&gt;&amp;lt; "modified_at": 1767039101084,&lt;BR /&gt;&amp;lt; "object_id": 490013057061181,&lt;BR /&gt;&amp;lt; "object_type": "FILE",&lt;BR /&gt;&amp;lt; "path": "/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/deployment.json",&lt;BR /&gt;&amp;lt; "resource_id": "490013057061181",&lt;BR /&gt;&amp;lt; "size": 2732&lt;BR /&gt;&amp;lt; } pid=15488 mutator=deploy:state-pull sdk=true&lt;BR /&gt;21:14:40 Debug: GET /api/2.0/workspace-files/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/deployment.json&lt;BR /&gt;&amp;lt; HTTP/2.0 200 OK&lt;BR /&gt;&amp;lt; &amp;lt;Streaming response&amp;gt; pid=15488 mutator=deploy:state-pull sdk=true&lt;BR /&gt;21:14:40 Info: Local deployment state is the same or newer, ignoring remote state pid=15488 mutator=deploy:state-pull&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=ValidateGitDetails&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=check-running-resources&lt;BR /&gt;21:14:40 Info: Phase: deploy pid=15488&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=scripts.predeploy&lt;BR /&gt;21:14:40 Debug: No script defined for predeploy, skipping pid=15488 mutator=scripts.predeploy&lt;BR /&gt;21:14:40 
Debug: Apply pid=15488 mutator=lock:acquire&lt;BR /&gt;21:14:40 Info: Skipping; locking is disabled pid=15488 mutator=lock:acquire&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=artifacts.CleanUp&lt;BR /&gt;21:14:40 Debug: POST /api/2.0/workspace/delete&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "path": "/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/artifacts/.internal",&lt;BR /&gt;&amp;gt; "recursive": true&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;lt; HTTP/2.0 200 OK&lt;BR /&gt;&amp;lt; {} pid=15488 mutator=artifacts.CleanUp sdk=true&lt;BR /&gt;21:14:40 Debug: POST /api/2.0/workspace/mkdirs&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "path": "/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/artifacts/.internal"&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;lt; HTTP/2.0 200 OK&lt;BR /&gt;&amp;lt; {} pid=15488 mutator=artifacts.CleanUp sdk=true&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=libraries.Upload&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=files.Upload&lt;BR /&gt;Uploading bundle files to /Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/files...&lt;BR /&gt;21:14:40 Debug: GET /api/2.0/workspace/get-status?path=/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/files&lt;BR /&gt;&amp;lt; HTTP/2.0 200 OK&lt;BR /&gt;&amp;lt; {&lt;BR /&gt;&amp;lt; "object_id": 490013057061179,&lt;BR /&gt;&amp;lt; "object_type": "DIRECTORY",&lt;BR /&gt;&amp;lt; "path": "/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/files",&lt;BR /&gt;&amp;lt; "resource_id": "490013057061179"&lt;BR /&gt;&amp;lt; } pid=15488 mutator=files.Upload sdk=true&lt;BR /&gt;21:14:40 Debug: Path /Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/files has type directory (ID: 490013057061179) pid=15488 mutator=files.Upload&lt;BR /&gt;21:14:40 Info: Uploaded bundle files pid=15488 mutator=files.Upload&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=deploy:state-update&lt;BR /&gt;21:14:40 Info: Loading deployment state from 
/home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/.databricks/bundle/dev/deployment.json pid=15488 mutator=deploy:state-update&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=deploy:state-push&lt;BR /&gt;21:14:40 Info: Writing local deployment state file to remote state directory pid=15488 mutator=deploy:state-push&lt;BR /&gt;21:14:40 Debug: POST /api/2.0/workspace-files/import-file/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/deployment.json?overwrite=true&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "cli_version": "0.281.0",&lt;BR /&gt;&amp;gt; "files": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": true,&lt;BR /&gt;&amp;gt; "local_path": "deployment/batch_inference/notebooks/BatchInference.py"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": false,&lt;BR /&gt;&amp;gt; "local_path": "feature_engineering/features/__init__.py"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": false,&lt;BR /&gt;&amp;gt; "local_path": "project_params.json"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": false,&lt;BR /&gt;&amp;gt; "local_path": "requirements.txt"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": false,&lt;BR /&gt;&amp;gt; "local_path": "tests/training/__init__.py"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": true,&lt;BR /&gt;&amp;gt; "local_path": "training/notebooks/TrainWithFeatureStore.py"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": true,&lt;BR /&gt;&amp;gt; "local_path": "validation/notebooks/ModelValidation.py"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": false,&lt;BR /&gt;&amp;gt; "local_path": "validation/validation.py"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": true,&lt;BR /&gt;&amp;gt; "local_path": 
"feature_engineering/notebooks/GenerateAndWriteFeatures.py"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": false,&lt;BR /&gt;&amp;gt; "local_path": "pytest.ini"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": false,&lt;BR /&gt;&amp;gt; "local_path": "resources/model-workflow-resource.yml"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": false,&lt;BR /&gt;&amp;gt; "local_path": "resources/monitoring-resource.yml"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": false,&lt;BR /&gt;&amp;gt; "local_path": "feature_engineering/features/pickup_features.py"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": false,&lt;BR /&gt;&amp;gt; "local_path": "tests/feature_engineering/pickup_features_test.py"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": false,&lt;BR /&gt;&amp;gt; "local_path": "README.md"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; "... 
(21 additional elements)"&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "id": "5d0dd384-d632-4fda-bf10-d6e604f7292c",&lt;BR /&gt;&amp;gt; "seq": 26,&lt;BR /&gt;&amp;gt; "timestamp": "2025-12-29T20:14:40.647459783Z",&lt;BR /&gt;&amp;gt; "version": 1&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;lt; HTTP/2.0 200 OK pid=15488 mutator=deploy:state-push sdk=true&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=ApplyWorkspaceRootPermissions&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=trackUsedCompute&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=deploy:resource_path_mkdir&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=terraform.Interpolate&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=terraform.Write&lt;BR /&gt;21:14:40 Debug: job normalization diagnostic: unknown field: permissions pid=15488 mutator=terraform.Write&lt;BR /&gt;21:14:40 Debug: job normalization diagnostic: unknown field: permissions pid=15488 mutator=terraform.Write&lt;BR /&gt;21:14:40 Debug: job normalization diagnostic: unknown field: permissions pid=15488 mutator=terraform.Write&lt;BR /&gt;21:14:40 Debug: experiment normalization diagnostic: unknown field: permissions pid=15488 mutator=terraform.Write&lt;BR /&gt;21:14:40 Debug: registered model normalization diagnostic: unknown field: grants pid=15488 mutator=terraform.Write&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=terraform.Plan&lt;BR /&gt;21:14:40 Debug: Using Terraform at /home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/.databricks/bundle/dev/bin/terraform pid=15488 mutator=terraform.Plan&lt;BR /&gt;21:14:40 Debug: DATABRICKS_TF_CLI_CONFIG_FILE is not defined pid=15488 mutator=terraform.Plan&lt;BR /&gt;21:14:40 Debug: Environment variables for Terraform: DATABRICKS_HOST, DATABRICKS_TOKEN, DATABRICKS_AUTH_TYPE, HOME, PATH, DATABRICKS_USER_AGENT_EXTRA pid=15488 mutator=terraform.Plan&lt;BR /&gt;21:14:42 Debug: Planning complete and persisted at 
/home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/.databricks/bundle/dev/terraform/plan&lt;BR /&gt;pid=15488 mutator=terraform.Plan&lt;BR /&gt;Deploying resources...&lt;BR /&gt;21:14:43 Debug: Apply pid=15488 mutator=terraform.Apply&lt;BR /&gt;Error: terraform apply: exit status 1&lt;/P&gt;&lt;P&gt;Error: cannot create permissions: ACLs for job are disabled or not available in this tier&lt;/P&gt;&lt;P&gt;with databricks_permissions.job_batch_inference_job,&lt;BR /&gt;on bundle.tf.json line 268, in resource.databricks_permissions.job_batch_inference_job:&lt;BR /&gt;268: },&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Error: cannot create permissions: ACLs for job are disabled or not available in this tier&lt;/P&gt;&lt;P&gt;with databricks_permissions.job_model_training_job,&lt;BR /&gt;on bundle.tf.json line 281, in resource.databricks_permissions.job_model_training_job:&lt;BR /&gt;281: },&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Error: cannot create permissions: ACLs for job are disabled or not available in this tier&lt;/P&gt;&lt;P&gt;with databricks_permissions.job_write_feature_table_job,&lt;BR /&gt;on bundle.tf.json line 294, in resource.databricks_permissions.job_write_feature_table_job:&lt;BR /&gt;294: },&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Error: cannot create permissions: ACLs for mlflowExperiment are disabled or not available in this tier&lt;/P&gt;&lt;P&gt;with databricks_permissions.mlflow_experiment_experiment,&lt;BR /&gt;on bundle.tf.json line 307, in resource.databricks_permissions.mlflow_experiment_experiment:&lt;BR /&gt;307: }&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Error: cannot create registered model: No metastore assigned for the current workspace.&lt;/P&gt;&lt;P&gt;with databricks_registered_model.model,&lt;BR /&gt;on bundle.tf.json line 315, in resource.databricks_registered_model.model:&lt;BR /&gt;315: }&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Updating deployment state...&lt;BR /&gt;21:14:44 Debug: POST 
/api/2.0/workspace-files/import-file/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/terraform.tfstate?overwrite=true&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "check_results": null,&lt;BR /&gt;&amp;gt; "lineage": "6d87b75c-525a-aa1e-5ef1-346faf226db6",&lt;BR /&gt;&amp;gt; "outputs": {},&lt;BR /&gt;&amp;gt; "resources": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "instances": null,&lt;BR /&gt;&amp;gt; "mode": "managed",&lt;BR /&gt;&amp;gt; "name": "registered_model_model",&lt;BR /&gt;&amp;gt; "provider": "provider[\"registry.terraform.io/databricks/databricks\"]",&lt;BR /&gt;&amp;gt; "type": "databricks_grants"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "instances": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "attributes": {&lt;BR /&gt;&amp;gt; "always_running": false,&lt;BR /&gt;&amp;gt; "budget_policy_id": null,&lt;BR /&gt;&amp;gt; "continuous": null,&lt;BR /&gt;&amp;gt; "control_run_state": false,&lt;BR /&gt;&amp;gt; "dbt_task": null,&lt;BR /&gt;&amp;gt; "deployment": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "kind": "BUNDLE",&lt;BR /&gt;&amp;gt; "metadata_file_path": "/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/metadata.json"&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "description": null,&lt;BR /&gt;&amp;gt; "edit_mode": "UI_LOCKED",&lt;BR /&gt;&amp;gt; "email_notifications": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "no_alert_for_skipped_runs": false,&lt;BR /&gt;&amp;gt; "on_duration_warning_threshold_exceeded": null,&lt;BR /&gt;&amp;gt; "on_failure": null,&lt;BR /&gt;&amp;gt; "on_start": null,&lt;BR /&gt;&amp;gt; "on_streaming_backlog_exceeded": null,&lt;BR /&gt;&amp;gt; "on_success": null&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "environment": null,&lt;BR /&gt;&amp;gt; "existing_cluster_id": null,&lt;BR /&gt;&amp;gt; "format": "MULTI_TASK",&lt;BR /&gt;&amp;gt; "git_source": null,&lt;BR /&gt;&amp;gt; "health": null,&lt;BR /&gt;&amp;gt; "id": 
"837432832789473",&lt;BR /&gt;&amp;gt; "job_cluster": null,&lt;BR /&gt;&amp;gt; "library": null,&lt;BR /&gt;&amp;gt; "max_concurrent_runs": 4,&lt;BR /&gt;&amp;gt; "max_retries": 0,&lt;BR /&gt;&amp;gt; "min_retry_interval_millis": 0,&lt;BR /&gt;&amp;gt; "name": "[dev jiXXX11] dev-my_mlops_project-batch-inference-job",&lt;BR /&gt;&amp;gt; "new_cluster": null,&lt;BR /&gt;&amp;gt; "notebook_task": null,&lt;BR /&gt;&amp;gt; "notification_settings": null,&lt;BR /&gt;&amp;gt; "parameter": null,&lt;BR /&gt;&amp;gt; "performance_target": null,&lt;BR /&gt;&amp;gt; "pipeline_task": null,&lt;BR /&gt;&amp;gt; "provider_config": null,&lt;BR /&gt;&amp;gt; "python_wheel_task": null,&lt;BR /&gt;&amp;gt; "queue": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "enabled": true&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "retry_on_timeout": false,&lt;BR /&gt;&amp;gt; "run_as": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "service_principal_name": "",&lt;BR /&gt;&amp;gt; "user_name": "jiXXX11@g.com"&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "run_job_task": null,&lt;BR /&gt;&amp;gt; "schedule": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "pause_status": "PAUSED",&lt;BR /&gt;&amp;gt; "quartz_cron_expression": "0 0 11 * * ?",&lt;BR /&gt;&amp;gt; "timezone_id": "UTC"&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "spark_jar_task": null,&lt;BR /&gt;&amp;gt; "spark_python_task": null,&lt;BR /&gt;&amp;gt; "spark_submit_task": null,&lt;BR /&gt;&amp;gt; "tags": {&lt;BR /&gt;&amp;gt; "dev": "jiXXX11"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; "task": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "clean_rooms_notebook_task": null,&lt;BR /&gt;&amp;gt; "condition_task": null,&lt;BR /&gt;&amp;gt; "dashboard_task": null,&lt;BR /&gt;&amp;gt; "dbt_cloud_task": null,&lt;BR /&gt;&amp;gt; "dbt_platform_task": null,&lt;BR /&gt;&amp;gt; "dbt_task": null,&lt;BR /&gt;&amp;gt; "depends_on": null,&lt;BR /&gt;&amp;gt; "description": "",&lt;BR /&gt;&amp;gt; 
"disable_auto_optimization": false,&lt;BR /&gt;&amp;gt; "disabled": false,&lt;BR /&gt;&amp;gt; "email_notifications": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "no_alert_for_skipped_runs": false,&lt;BR /&gt;&amp;gt; "on_duration_warning_threshold_exceeded": null,&lt;BR /&gt;&amp;gt; "on_failure": null,&lt;BR /&gt;&amp;gt; "on_start": null,&lt;BR /&gt;&amp;gt; "on_streaming_backlog_exceeded": null,&lt;BR /&gt;&amp;gt; "on_success": null&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "environment_key": "",&lt;BR /&gt;&amp;gt; "existing_cluster_id": "",&lt;BR /&gt;&amp;gt; "for_each_task": null,&lt;BR /&gt;&amp;gt; "gen_ai_compute_task": null,&lt;BR /&gt;&amp;gt; "health": null,&lt;BR /&gt;&amp;gt; "job_cluster_key": "",&lt;BR /&gt;&amp;gt; "library": null,&lt;BR /&gt;&amp;gt; "max_retries": 0,&lt;BR /&gt;&amp;gt; "min_retry_interval_millis": 0,&lt;BR /&gt;&amp;gt; "new_cluster": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "__apply_policy_default_values_allow_list": null,&lt;BR /&gt;&amp;gt; "apply_policy_default_values": false,&lt;BR /&gt;&amp;gt; "autoscale": null,&lt;BR /&gt;&amp;gt; "aws_attributes": null,&lt;BR /&gt;&amp;gt; "azure_attributes": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "availability": "ON_DEMAND_AZURE",&lt;BR /&gt;&amp;gt; "first_on_demand": 0,&lt;BR /&gt;&amp;gt; "log_analytics_info": null,&lt;BR /&gt;&amp;gt; "spot_bid_max_price": 0&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "cluster_id": "",&lt;BR /&gt;&amp;gt; "cluster_log_conf": null,&lt;BR /&gt;&amp;gt; "cluster_mount_info": null,&lt;BR /&gt;&amp;gt; "cluster_name": "",&lt;BR /&gt;&amp;gt; "custom_tags": {&lt;BR /&gt;&amp;gt; "clusterSource": "mlops-stacks_0.4"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; "data_security_mode": "SINGLE_USER",&lt;BR /&gt;&amp;gt; "docker_image": null,&lt;BR /&gt;&amp;gt; "driver_instance_pool_id": "",&lt;BR /&gt;&amp;gt; "driver_node_type_id": "",&lt;BR /&gt;&amp;gt; "enable_elastic_disk": true,&lt;BR /&gt;&amp;gt; 
"enable_local_disk_encryption": false,&lt;BR /&gt;&amp;gt; "gcp_attributes": null,&lt;BR /&gt;&amp;gt; "idempotency_token": "",&lt;BR /&gt;&amp;gt; "init_scripts": null,&lt;BR /&gt;&amp;gt; "instance_pool_id": "",&lt;BR /&gt;&amp;gt; "is_single_node": false,&lt;BR /&gt;&amp;gt; "kind": "",&lt;BR /&gt;&amp;gt; "library": null,&lt;BR /&gt;&amp;gt; "node_type_id": "Standard_D3_v2",&lt;BR /&gt;&amp;gt; "num_workers": 3,&lt;BR /&gt;&amp;gt; "policy_id": "",&lt;BR /&gt;&amp;gt; "provider_config": null,&lt;BR /&gt;&amp;gt; "remote_disk_throughput": 0,&lt;BR /&gt;&amp;gt; "runtime_engine": "",&lt;BR /&gt;&amp;gt; "single_user_name": "",&lt;BR /&gt;&amp;gt; "spark_conf": {},&lt;BR /&gt;&amp;gt; "spark_env_vars": {},&lt;BR /&gt;&amp;gt; "spark_version": "15.3.x-cpu-ml-scala2.12",&lt;BR /&gt;&amp;gt; "ssh_public_keys": null,&lt;BR /&gt;&amp;gt; "total_initial_remote_disk_size": 0,&lt;BR /&gt;&amp;gt; "use_ml_runtime": false,&lt;BR /&gt;&amp;gt; "workload_type": null&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "notebook_task": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "base_parameters": {&lt;BR /&gt;&amp;gt; "env": "dev",&lt;BR /&gt;&amp;gt; "git_source_info": "url:; branch:; commit:",&lt;BR /&gt;&amp;gt; "input_table_name": "dev.my_mlops_project.feature_store_inference_input",&lt;BR /&gt;&amp;gt; "model_name": "dev.my_mlops_project.my_mlops_project-model",&lt;BR /&gt;&amp;gt; "output_table_name": "dev.my_mlops_project.predictions"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; "notebook_path": "/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/files/deployment/batch_i... 
(33 more bytes)",&lt;BR /&gt;&amp;gt; "source": "WORKSPACE",&lt;BR /&gt;&amp;gt; "warehouse_id": ""&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "notification_settings": null,&lt;BR /&gt;&amp;gt; "pipeline_task": null,&lt;BR /&gt;&amp;gt; "power_bi_task": null,&lt;BR /&gt;&amp;gt; "python_wheel_task": null,&lt;BR /&gt;&amp;gt; "retry_on_timeout": false,&lt;BR /&gt;&amp;gt; "run_if": "ALL_SUCCESS",&lt;BR /&gt;&amp;gt; "run_job_task": null,&lt;BR /&gt;&amp;gt; "spark_jar_task": null,&lt;BR /&gt;&amp;gt; "spark_python_task": null,&lt;BR /&gt;&amp;gt; "spark_submit_task": null,&lt;BR /&gt;&amp;gt; "sql_task": null,&lt;BR /&gt;&amp;gt; "task_key": "batch_inference_job",&lt;BR /&gt;&amp;gt; "timeout_seconds": 0,&lt;BR /&gt;&amp;gt; "webhook_notifications": null&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "timeout_seconds": 0,&lt;BR /&gt;&amp;gt; "timeouts": null,&lt;BR /&gt;&amp;gt; "trigger": null,&lt;BR /&gt;&amp;gt; "url": "&lt;A href="https://adb-7405612555097742.2.azuredatabricks.net/#job/837432832789473" target="_blank" rel="nofollow noopener noreferrer"&gt;https://adb-7405612555097742.2.azuredatabricks.net/#job/837432832789473&lt;/A&gt;",&lt;BR /&gt;&amp;gt; "usage_policy_id": null,&lt;BR /&gt;&amp;gt; "webhook_notifications": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "on_duration_warning_threshold_exceeded": null,&lt;BR /&gt;&amp;gt; "on_failure": null,&lt;BR /&gt;&amp;gt; "on_start": null,&lt;BR /&gt;&amp;gt; "on_streaming_backlog_exceeded": null,&lt;BR /&gt;&amp;gt; "on_success": null&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;gt; ]&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; "private": "eyJlMmJmYjczMC1lY2FhLTExZTYtOGY4OC0zNDM2M2JjN2M0YzAiOnsiY3JlYXRlIjoxODAwMDAwMDAwMDAwLCJ1cGRhdGUi... 
(52 more bytes)",&lt;BR /&gt;&amp;gt; "schema_version": 2,&lt;BR /&gt;&amp;gt; "sensitive_attributes": null&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "mode": "managed",&lt;BR /&gt;&amp;gt; "name": "batch_inference_job",&lt;BR /&gt;&amp;gt; "provider": "provider[\"registry.terraform.io/databricks/databricks\"]",&lt;BR /&gt;&amp;gt; "type": "databricks_job"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; "... (3 additional elements)"&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "serial": 79,&lt;BR /&gt;&amp;gt; "terraform_version": "1.5.5",&lt;BR /&gt;&amp;gt; "version": 4&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;lt; HTTP/2.0 200 OK pid=15488 sdk=true&lt;BR /&gt;21:14:44 Debug: Apply pid=15488 mutator=lock:release&lt;BR /&gt;21:14:44 Info: Skipping; locking is disabled pid=15488 mutator=lock:release&lt;BR /&gt;21:14:44 Debug: failed execution pid=15488 exit_code=1&lt;BR /&gt;21:14:44 Debug: POST /telemetry-ext&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "items": null,&lt;BR /&gt;&amp;gt; "protoLogs": [&lt;BR /&gt;&amp;gt; "{\"frontend_log_event_id\":\"ee12ba17-3006-4b4f-821c-26ae931fd547\",\"entry\":{\"databricks_cli_log\":{\"... (581 more bytes)"&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "uploadTime": 1767039284849&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;lt; HTTP/2.0 200 OK&lt;BR /&gt;&amp;lt; {&lt;BR /&gt;&amp;lt; "errors": null,&lt;BR /&gt;&amp;lt; "numProtoSuccess": 1,&lt;BR /&gt;&amp;lt; "numRealtimeSuccess": 0,&lt;BR /&gt;&amp;lt; "numSuccess": 0&lt;BR /&gt;&amp;lt; } pid=15488 sdk=true&lt;BR /&gt;21:14:44 Debug: All 1 logs uploaded successfully pid=15488&lt;BR /&gt;root@jiten:/home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/.databricks/bundle/dev/terraform#&lt;/P&gt;&lt;P&gt;Kindly help me fix this. Also, how can I implement an MLOps project for Azure Databricks with a GitLab CI/CD pipeline? Please share a document or URL that I can follow.&lt;/P&gt;&lt;/DIV&gt;&lt;/DIV&gt;</description>
    <pubDate>Tue, 30 Dec 2025 07:39:49 GMT</pubDate>
    <dc:creator>jitenjha11</dc:creator>
    <dc:date>2025-12-30T07:39:49Z</dc:date>
    <item>
      <title>Getting error when running databricks deploy bundle command</title>
      <link>https://community.databricks.com/t5/machine-learning/getting-error-when-running-databricks-deploy-bundle-command/m-p/142676#M4507</link>
      <description>&lt;DIV class=""&gt;&lt;DIV class=""&gt;&lt;P&gt;Hi all,&lt;/P&gt;&lt;P&gt;I am trying to implement an MLOps project using the&amp;nbsp;&lt;A href="https://github.com/databricks/mlops-stacks" target="_blank" rel="nofollow noopener noreferrer"&gt;https://github.com/databricks/mlops-stacks&lt;/A&gt;&amp;nbsp;repo.&lt;/P&gt;&lt;P&gt;I have created an Azure Databricks workspace on the Premium (+ Role-based access controls) tier and followed the bundle creation and deployment steps at this URL:&amp;nbsp;&lt;A href="https://docs.databricks.com/aws/en/dev-tools/bundles/mlops-stacks" target="_blank" rel="nofollow noopener noreferrer"&gt;https://docs.databricks.com/aws/en/dev-tools/bundles/mlops-stacks&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Terraform provider version:&amp;nbsp;&lt;BR /&gt;{&lt;BR /&gt;"terraform": {&lt;BR /&gt;"required_providers": {&lt;BR /&gt;"databricks": {&lt;BR /&gt;"source": "databricks/databricks",&lt;BR /&gt;"version": "1.100.0"&lt;BR /&gt;&lt;BR /&gt;I have also tried deleting terraform.tfstate from both the Databricks GUI and the terminal.&lt;/P&gt;&lt;P&gt;databrickscfg file:&amp;nbsp;&lt;BR /&gt;[DEFAULT]&lt;BR /&gt;host = &lt;A href="https://adb-XXXXXXXXXXXXX.azuredatabricks.net/" target="_blank" rel="nofollow noopener noreferrer"&gt;https://adb-XXXXXXXXXXXXX.azuredatabricks.net/&lt;/A&gt;&lt;BR /&gt;token = dapXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXc&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;When running the databricks bundle deploy -t dev command, I get the error below:&lt;/P&gt;&lt;P&gt;root@jiten:/home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/.databricks/bundle/dev/terraform# databricks bundle deploy -t dev&lt;BR /&gt;Warning: unknown field: description&lt;BR /&gt;at resources.experiments.experiment&lt;BR /&gt;in resources/ml-artifacts-resource.yml:28:7&lt;/P&gt;&lt;P&gt;Uploading bundle files to /Workspace/Users/jiteXXXX@g.com/.bundle/my_mlops_project/dev/files...&lt;BR /&gt;Deploying resources...&lt;BR /&gt;Error: terraform apply: exit status 
1&lt;/P&gt;&lt;P&gt;Error: cannot create permissions: ACLs for job are disabled or not available in this tier&lt;/P&gt;&lt;P&gt;with databricks_permissions.job_batch_inference_job,&lt;BR /&gt;on bundle.tf.json line 268, in resource.databricks_permissions.job_batch_inference_job:&lt;BR /&gt;268: },&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Error: cannot create permissions: ACLs for job are disabled or not available in this tier&lt;/P&gt;&lt;P&gt;with databricks_permissions.job_model_training_job,&lt;BR /&gt;on bundle.tf.json line 281, in resource.databricks_permissions.job_model_training_job:&lt;BR /&gt;281: },&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Error: cannot create permissions: ACLs for job are disabled or not available in this tier&lt;/P&gt;&lt;P&gt;with databricks_permissions.job_write_feature_table_job,&lt;BR /&gt;on bundle.tf.json line 294, in resource.databricks_permissions.job_write_feature_table_job:&lt;BR /&gt;294: },&lt;/P&gt;&lt;P&gt;Error: cannot create permissions: ACLs for mlflowExperiment are disabled or not available in this tier&lt;/P&gt;&lt;P&gt;with databricks_permissions.mlflow_experiment_experiment,&lt;BR /&gt;on bundle.tf.json line 307, in resource.databricks_permissions.mlflow_experiment_experiment:&lt;BR /&gt;307: }&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Error: cannot create registered model: No metastore assigned for the current workspace.&lt;/P&gt;&lt;P&gt;with databricks_registered_model.model,&lt;BR /&gt;on bundle.tf.json line 315, in resource.databricks_registered_model.model:&lt;BR /&gt;315: }&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Updating deployment state...&lt;BR /&gt;root@jiten:/home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/.databricks/bundle/dev/terraform#&lt;/P&gt;&lt;P&gt;When running the debug command I get the following 
output&lt;/P&gt;&lt;P&gt;root@jiten:/home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/.databricks/bundle/dev/terraform# databricks bundle deploy -t dev --debug&lt;BR /&gt;21:14:39 Info: start pid=15488 version=0.281.0 args="databricks, bundle, deploy, -t, dev, --debug"&lt;BR /&gt;21:14:39 Debug: Found bundle root at /home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project (file /home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/databricks.yml) pid=15488&lt;BR /&gt;21:14:39 Info: Phase: load pid=15488&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=EntryPoint&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=scripts.preinit&lt;BR /&gt;21:14:39 Debug: No script defined for preinit, skipping pid=15488 mutator=scripts.preinit&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessRootIncludes&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessRootIncludes mutator=ProcessInclude(resources/batch-inference-workflow-resource.yml)&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessRootIncludes mutator=ProcessInclude(resources/ml-artifacts-resource.yml)&lt;BR /&gt;Warning: unknown field: description&lt;BR /&gt;at resources.experiments.experiment&lt;BR /&gt;in resources/ml-artifacts-resource.yml:28:7&lt;/P&gt;&lt;P&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessRootIncludes mutator=ProcessInclude(resources/model-workflow-resource.yml)&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessRootIncludes mutator=ProcessInclude(resources/feature-engineering-workflow-resource.yml)&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=VerifyCliVersion&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=EnvironmentsToTargets&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ComputeIdToClusterId&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=InitializeVariables&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=DefineDefaultTarget(default)&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 
mutator=validate:unique_resource_keys&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=SelectTarget(dev)&lt;BR /&gt;21:14:39 Debug: Loading profile DEFAULT because of host match pid=15488&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=&amp;lt;func&amp;gt;&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=&amp;lt;func&amp;gt;&lt;BR /&gt;21:14:39 Info: Phase: initialize pid=15488&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=validate:AllResourcesHaveValues&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=validate:interpolation_in_auth_config&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=validate:no_interpolation_in_bundle_name&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=validate:scripts&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=RewriteSyncPaths&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=SyncDefaultPath&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=SyncInferRoot&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=InitializeCache&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=PopulateCurrentUser&lt;BR /&gt;21:14:39 Debug: [Local Cache] using cache key: d251ef2678fa2f2c47ecaf2d7b8c7323fc81c48ba32c06cc1177ed90e9cb3e38 pid=15488 mutator=PopulateCurrentUser&lt;BR /&gt;21:14:39 Debug: [Local Cache] cache hit pid=15488 mutator=PopulateCurrentUser&lt;BR /&gt;21:14:39 Debug: GET /api/2.0/preview/scim/v2/Me&lt;BR /&gt;&amp;lt; HTTP/2.0 200 OK&lt;BR /&gt;&amp;lt; {&lt;BR /&gt;&amp;lt; "active": true,&lt;BR /&gt;&amp;lt; "displayName": "jj",&lt;BR /&gt;&amp;lt; "emails": [&lt;BR /&gt;&amp;lt; {&lt;BR /&gt;&amp;lt; "primary": true,&lt;BR /&gt;&amp;lt; "type": "work",&lt;BR /&gt;&amp;lt; "value": "jiXXX11@g.com"&lt;BR /&gt;&amp;lt; }&lt;BR /&gt;&amp;lt; ],&lt;BR /&gt;&amp;lt; "externalId": "b0bbed0c-6XXXXXXXXXXX-XXXXXXba6e",&lt;BR /&gt;&amp;lt; "groups": [&lt;BR /&gt;&amp;lt; {&lt;BR /&gt;&amp;lt; "$ref": "Groups/155617304374915",&lt;BR /&gt;&amp;lt; "display": "users",&lt;BR /&gt;&amp;lt; "type": "direct",&lt;BR /&gt;&amp;lt; 
"value": "155617304374915"&lt;BR /&gt;&amp;lt; },&lt;BR /&gt;&amp;lt; {&lt;BR /&gt;&amp;lt; "$ref": "Groups/153393333329242",&lt;BR /&gt;&amp;lt; "display": "admins",&lt;BR /&gt;&amp;lt; "type": "direct",&lt;BR /&gt;&amp;lt; "value": "153393333329242"&lt;BR /&gt;&amp;lt; }&lt;BR /&gt;&amp;lt; ],&lt;BR /&gt;&amp;lt; "id": "147985615315574",&lt;BR /&gt;&amp;lt; "name": {&lt;BR /&gt;&amp;lt; "familyName": "Jha",&lt;BR /&gt;&amp;lt; "givenName": "Jitendra"&lt;BR /&gt;&amp;lt; },&lt;BR /&gt;&amp;lt; "schemas": [&lt;BR /&gt;&amp;lt; "urn:ietf:params:scim:schemas:core:2.0:User",&lt;BR /&gt;&amp;lt; "urn:ietf:params:scim:schemas:extension:workspace:2.0:User"&lt;BR /&gt;&amp;lt; ],&lt;BR /&gt;&amp;lt; "userName": "jiXXX11@g.com"&lt;BR /&gt;&amp;lt; } pid=15488 mutator=PopulateCurrentUser sdk=true&lt;BR /&gt;21:14:39 Debug: [Local Cache] computed and stored result pid=15488 mutator=PopulateCurrentUser&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=LoadGitDetails&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ApplySourceLinkedDeploymentPreset&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=DefineDefaultWorkspaceRoot&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ExpandWorkspaceRoot&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=DefaultWorkspacePaths&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=PrependWorkspacePrefix&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=RewriteWorkspacePrefix&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=SetVariables&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ResolveVariableReferences&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ResolveLookupVariables&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ResolveVariableReferences&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=validate:volume-path&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ApplyTargetMode&lt;BR /&gt;21:14:39 Info: Development mode: disabling deployment lock since bundle.deployment.lock.enabled is not set to true pid=15488 
mutator=ApplyTargetMode&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ConfigureWSFS&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=ResolveVariableReferences(resources)&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=LogResourceReferences&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=NormalizePaths&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=TranslatePathsDashboards&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=validate:SingleNodeCluster&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=ResolveVariableReferences(resources)&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=ExpandPipelineGlobPaths&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=MergeJobClusters&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=MergeJobParameters&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=MergeJobTasks&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=MergePipelineClusters&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=MergeApps&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=CaptureSchemaDependency&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=ConfigureDashboardSerializedDashboard&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=JobClustersFixups&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=ClusterFixups&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=ModelServingEndpointFixups&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 
mutator=ProcessStaticResources mutator=SetRunAs&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=OverrideCompute&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=ApplyPresets&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.dashboards.*, parent_path, /Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/resources)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.dashboards.*, embed_credentials, false)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.volumes.*, volume_type, MANAGED)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.alerts.*, parent_path, /Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/resources)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.jobs.*, name, Untitled)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.jobs.*, max_concurrent_runs, 1)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.jobs.*.schedule, pause_status, UNPAUSED)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.jobs.*.trigger, pause_status, UNPAUSED)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.jobs.*.continuous, pause_status, UNPAUSED)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.jobs.*.task[*].dbt_task, schema, default)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources 
mutator="SetDefaultMutator(resources.jobs.*.task[*].for_each_task.task.dbt_task, schema, default)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.jobs.*.job_clusters[*].new_cluster.workload_type.clients, notebooks, true)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.jobs.*.job_clusters[*].new_cluster.workload_type.clients, jobs, true)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.pipelines.*, edition, ADVANCED)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.pipelines.*, channel, CURRENT)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.sql_warehouses.*, auto_stop_mins, 120)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.sql_warehouses.*, enable_photon, true)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.sql_warehouses.*, max_num_clusters, 1)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.sql_warehouses.*, spot_instance_policy, COST_OPTIMIZED)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.apps.*, description, )"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.clusters.*, autotermination_minutes, 60)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.clusters.*.workload_type.clients, notebooks, true)"&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator="SetDefaultMutator(resources.clusters.*.workload_type.clients, jobs, true)"&lt;BR 
/&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=DefaultQueueing&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=DashboardFixups&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=ApplyBundlePermissions&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ProcessStaticResources mutator=FixPermissions&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=PythonMutator(load_resources)&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=PythonMutator(apply_mutators)&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=validate:required&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=validate:enum&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=validate:validate_dashboard_etags&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=CheckPermissions&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=TranslatePaths&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=PythonWrapperWarning&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ApplyArtifactsDynamicVersion&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=artifacts.Prepare&lt;BR /&gt;21:14:39 Info: No local tasks in databricks.yml config, skipping auto detect pid=15488 mutator=artifacts.Prepare&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=apps.Validate&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ValidateTargetMode&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=ValidateSharedRootPermissions&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=metadata.AnnotateJobs&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=metadata.AnnotatePipelines&lt;BR /&gt;21:14:39 Debug: Apply pid=15488 mutator=scripts.postinit&lt;BR /&gt;21:14:39 Debug: No script defined for postinit, skipping pid=15488 mutator=scripts.postinit&lt;BR /&gt;21:14:40 Debug: GET /api/2.0/workspace/get-status?path=/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/terraform.tfstate&amp;amp;return_export_info=true&lt;BR /&gt;&amp;lt; 
HTTP/2.0 200 OK&lt;BR /&gt;&amp;lt; {&lt;BR /&gt;&amp;lt; "created_at": 1767035426041,&lt;BR /&gt;&amp;lt; "modified_at": 1767039104962,&lt;BR /&gt;&amp;lt; "object_id": 490013057061183,&lt;BR /&gt;&amp;lt; "object_type": "FILE",&lt;BR /&gt;&amp;lt; "path": "/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/terraform.tfstate",&lt;BR /&gt;&amp;lt; "resource_id": "490013057061183",&lt;BR /&gt;&amp;lt; "size": 35097&lt;BR /&gt;&amp;lt; } pid=15488 sdk=true&lt;BR /&gt;21:14:40 Debug: GET /api/2.0/workspace/get-status?path=/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/resources.json&amp;amp;return_export_info=true&lt;BR /&gt;&amp;lt; HTTP/2.0 404 Not Found&lt;BR /&gt;&amp;lt; {&lt;BR /&gt;&amp;lt; "error_code": "RESOURCE_DOES_NOT_EXIST",&lt;BR /&gt;&amp;lt; "message": "Path (/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/resources.js... (18 more bytes)"&lt;BR /&gt;&amp;lt; } pid=15488 sdk=true&lt;BR /&gt;21:14:40 Debug: non-retriable error: Path (/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/resources.json) doesn't exist. 
pid=15488 sdk=true&lt;BR /&gt;21:14:40 Debug: GET /api/2.0/workspace-files/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/terraform.tfstate&lt;BR /&gt;&amp;lt; HTTP/2.0 200 OK&lt;BR /&gt;&amp;lt; &amp;lt;Streaming response&amp;gt; pid=15488 sdk=true&lt;BR /&gt;21:14:40 Debug: read terraform.tfstate: terraform.tfstate: remote state serial=74 lineage="6d87b75c-525a-aa1e-5ef1-346faf226db6" pid=15488&lt;BR /&gt;21:14:40 Info: Available resource state files (from least to most preferred): [terraform.tfstate: remote terraform state serial=74 lineage="6d87b75c-525a-aa1e-5ef1-346faf226db6" /home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/.databricks/bundle/dev/terraform/terraform.tfstate: local terraform state serial=74 lineage="6d87b75c-525a-aa1e-5ef1-346faf226db6"] pid=15488&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=fast_validate(readonly)&lt;BR /&gt;21:14:40 Debug: ApplyParallel pid=15488 mutator=fast_validate(readonly) mutator=validate:job_cluster_key_defined&lt;BR /&gt;21:14:40 Debug: ApplyParallel pid=15488 mutator=fast_validate(readonly) mutator=validate:job_task_cluster_spec&lt;BR /&gt;21:14:40 Debug: ApplyParallel pid=15488 mutator=fast_validate(readonly) mutator=validate:artifact_paths&lt;BR /&gt;21:14:40 Info: Phase: build pid=15488&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=scripts.prebuild&lt;BR /&gt;21:14:40 Debug: No script defined for prebuild, skipping pid=15488 mutator=scripts.prebuild&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=artifacts.Build&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=scripts.postbuild&lt;BR /&gt;21:14:40 Debug: No script defined for postbuild, skipping pid=15488 mutator=scripts.postbuild&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=ResolveVariableReferences&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=ResolveVariableReferences(resources)&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=libraries.ExpandGlobReferences&lt;BR /&gt;21:14:40 Debug: Apply 
pid=15488 mutator=CheckForSameNameLibraries&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=SwitchToPatchedWheels&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=TransformWheelTask&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=CheckDashboardsModifiedRemotely&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=SecretScopeFixups&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=deploy:state-pull&lt;BR /&gt;21:14:40 Info: Opening remote deployment state file pid=15488 mutator=deploy:state-pull&lt;BR /&gt;21:14:40 Debug: GET /api/2.0/workspace/get-status?path=/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/deployment.json&amp;amp;return_export_info=true&lt;BR /&gt;&amp;lt; HTTP/2.0 200 OK&lt;BR /&gt;&amp;lt; {&lt;BR /&gt;&amp;lt; "created_at": 1767035422053,&lt;BR /&gt;&amp;lt; "modified_at": 1767039101084,&lt;BR /&gt;&amp;lt; "object_id": 490013057061181,&lt;BR /&gt;&amp;lt; "object_type": "FILE",&lt;BR /&gt;&amp;lt; "path": "/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/deployment.json",&lt;BR /&gt;&amp;lt; "resource_id": "490013057061181",&lt;BR /&gt;&amp;lt; "size": 2732&lt;BR /&gt;&amp;lt; } pid=15488 mutator=deploy:state-pull sdk=true&lt;BR /&gt;21:14:40 Debug: GET /api/2.0/workspace-files/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/deployment.json&lt;BR /&gt;&amp;lt; HTTP/2.0 200 OK&lt;BR /&gt;&amp;lt; &amp;lt;Streaming response&amp;gt; pid=15488 mutator=deploy:state-pull sdk=true&lt;BR /&gt;21:14:40 Info: Local deployment state is the same or newer, ignoring remote state pid=15488 mutator=deploy:state-pull&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=ValidateGitDetails&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=check-running-resources&lt;BR /&gt;21:14:40 Info: Phase: deploy pid=15488&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=scripts.predeploy&lt;BR /&gt;21:14:40 Debug: No script defined for predeploy, skipping pid=15488 mutator=scripts.predeploy&lt;BR /&gt;21:14:40 
Debug: Apply pid=15488 mutator=lock:acquire&lt;BR /&gt;21:14:40 Info: Skipping; locking is disabled pid=15488 mutator=lock:acquire&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=artifacts.CleanUp&lt;BR /&gt;21:14:40 Debug: POST /api/2.0/workspace/delete&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "path": "/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/artifacts/.internal",&lt;BR /&gt;&amp;gt; "recursive": true&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;lt; HTTP/2.0 200 OK&lt;BR /&gt;&amp;lt; {} pid=15488 mutator=artifacts.CleanUp sdk=true&lt;BR /&gt;21:14:40 Debug: POST /api/2.0/workspace/mkdirs&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "path": "/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/artifacts/.internal"&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;lt; HTTP/2.0 200 OK&lt;BR /&gt;&amp;lt; {} pid=15488 mutator=artifacts.CleanUp sdk=true&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=libraries.Upload&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=files.Upload&lt;BR /&gt;Uploading bundle files to /Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/files...&lt;BR /&gt;21:14:40 Debug: GET /api/2.0/workspace/get-status?path=/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/files&lt;BR /&gt;&amp;lt; HTTP/2.0 200 OK&lt;BR /&gt;&amp;lt; {&lt;BR /&gt;&amp;lt; "object_id": 490013057061179,&lt;BR /&gt;&amp;lt; "object_type": "DIRECTORY",&lt;BR /&gt;&amp;lt; "path": "/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/files",&lt;BR /&gt;&amp;lt; "resource_id": "490013057061179"&lt;BR /&gt;&amp;lt; } pid=15488 mutator=files.Upload sdk=true&lt;BR /&gt;21:14:40 Debug: Path /Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/files has type directory (ID: 490013057061179) pid=15488 mutator=files.Upload&lt;BR /&gt;21:14:40 Info: Uploaded bundle files pid=15488 mutator=files.Upload&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=deploy:state-update&lt;BR /&gt;21:14:40 Info: Loading deployment state from 
/home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/.databricks/bundle/dev/deployment.json pid=15488 mutator=deploy:state-update&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=deploy:state-push&lt;BR /&gt;21:14:40 Info: Writing local deployment state file to remote state directory pid=15488 mutator=deploy:state-push&lt;BR /&gt;21:14:40 Debug: POST /api/2.0/workspace-files/import-file/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/deployment.json?overwrite=true&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "cli_version": "0.281.0",&lt;BR /&gt;&amp;gt; "files": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": true,&lt;BR /&gt;&amp;gt; "local_path": "deployment/batch_inference/notebooks/BatchInference.py"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": false,&lt;BR /&gt;&amp;gt; "local_path": "feature_engineering/features/__init__.py"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": false,&lt;BR /&gt;&amp;gt; "local_path": "project_params.json"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": false,&lt;BR /&gt;&amp;gt; "local_path": "requirements.txt"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": false,&lt;BR /&gt;&amp;gt; "local_path": "tests/training/__init__.py"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": true,&lt;BR /&gt;&amp;gt; "local_path": "training/notebooks/TrainWithFeatureStore.py"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": true,&lt;BR /&gt;&amp;gt; "local_path": "validation/notebooks/ModelValidation.py"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": false,&lt;BR /&gt;&amp;gt; "local_path": "validation/validation.py"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": true,&lt;BR /&gt;&amp;gt; "local_path": 
"feature_engineering/notebooks/GenerateAndWriteFeatures.py"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": false,&lt;BR /&gt;&amp;gt; "local_path": "pytest.ini"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": false,&lt;BR /&gt;&amp;gt; "local_path": "resources/model-workflow-resource.yml"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": false,&lt;BR /&gt;&amp;gt; "local_path": "resources/monitoring-resource.yml"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": false,&lt;BR /&gt;&amp;gt; "local_path": "feature_engineering/features/pickup_features.py"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": false,&lt;BR /&gt;&amp;gt; "local_path": "tests/feature_engineering/pickup_features_test.py"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "is_notebook": false,&lt;BR /&gt;&amp;gt; "local_path": "README.md"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; "... 
(21 additional elements)"&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "id": "5d0dd384-d632-4fda-bf10-d6e604f7292c",&lt;BR /&gt;&amp;gt; "seq": 26,&lt;BR /&gt;&amp;gt; "timestamp": "2025-12-29T20:14:40.647459783Z",&lt;BR /&gt;&amp;gt; "version": 1&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;lt; HTTP/2.0 200 OK pid=15488 mutator=deploy:state-push sdk=true&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=ApplyWorkspaceRootPermissions&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=trackUsedCompute&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=deploy:resource_path_mkdir&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=terraform.Interpolate&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=terraform.Write&lt;BR /&gt;21:14:40 Debug: job normalization diagnostic: unknown field: permissions pid=15488 mutator=terraform.Write&lt;BR /&gt;21:14:40 Debug: job normalization diagnostic: unknown field: permissions pid=15488 mutator=terraform.Write&lt;BR /&gt;21:14:40 Debug: job normalization diagnostic: unknown field: permissions pid=15488 mutator=terraform.Write&lt;BR /&gt;21:14:40 Debug: experiment normalization diagnostic: unknown field: permissions pid=15488 mutator=terraform.Write&lt;BR /&gt;21:14:40 Debug: registered model normalization diagnostic: unknown field: grants pid=15488 mutator=terraform.Write&lt;BR /&gt;21:14:40 Debug: Apply pid=15488 mutator=terraform.Plan&lt;BR /&gt;21:14:40 Debug: Using Terraform at /home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/.databricks/bundle/dev/bin/terraform pid=15488 mutator=terraform.Plan&lt;BR /&gt;21:14:40 Debug: DATABRICKS_TF_CLI_CONFIG_FILE is not defined pid=15488 mutator=terraform.Plan&lt;BR /&gt;21:14:40 Debug: Environment variables for Terraform: DATABRICKS_HOST, DATABRICKS_TOKEN, DATABRICKS_AUTH_TYPE, HOME, PATH, DATABRICKS_USER_AGENT_EXTRA pid=15488 mutator=terraform.Plan&lt;BR /&gt;21:14:42 Debug: Planning complete and persisted at 
/home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/.databricks/bundle/dev/terraform/plan&lt;BR /&gt;pid=15488 mutator=terraform.Plan&lt;BR /&gt;Deploying resources...&lt;BR /&gt;21:14:43 Debug: Apply pid=15488 mutator=terraform.Apply&lt;BR /&gt;Error: terraform apply: exit status 1&lt;/P&gt;&lt;P&gt;Error: cannot create permissions: ACLs for job are disabled or not available in this tier&lt;/P&gt;&lt;P&gt;with databricks_permissions.job_batch_inference_job,&lt;BR /&gt;on bundle.tf.json line 268, in resource.databricks_permissions.job_batch_inference_job:&lt;BR /&gt;268: },&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Error: cannot create permissions: ACLs for job are disabled or not available in this tier&lt;/P&gt;&lt;P&gt;with databricks_permissions.job_model_training_job,&lt;BR /&gt;on bundle.tf.json line 281, in resource.databricks_permissions.job_model_training_job:&lt;BR /&gt;281: },&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Error: cannot create permissions: ACLs for job are disabled or not available in this tier&lt;/P&gt;&lt;P&gt;with databricks_permissions.job_write_feature_table_job,&lt;BR /&gt;on bundle.tf.json line 294, in resource.databricks_permissions.job_write_feature_table_job:&lt;BR /&gt;294: },&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Error: cannot create permissions: ACLs for mlflowExperiment are disabled or not available in this tier&lt;/P&gt;&lt;P&gt;with databricks_permissions.mlflow_experiment_experiment,&lt;BR /&gt;on bundle.tf.json line 307, in resource.databricks_permissions.mlflow_experiment_experiment:&lt;BR /&gt;307: }&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Error: cannot create registered model: No metastore assigned for the current workspace.&lt;/P&gt;&lt;P&gt;with databricks_registered_model.model,&lt;BR /&gt;on bundle.tf.json line 315, in resource.databricks_registered_model.model:&lt;BR /&gt;315: }&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Updating deployment state...&lt;BR /&gt;21:14:44 Debug: POST 
/api/2.0/workspace-files/import-file/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/terraform.tfstate?overwrite=true&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "check_results": null,&lt;BR /&gt;&amp;gt; "lineage": "6d87b75c-525a-aa1e-5ef1-346faf226db6",&lt;BR /&gt;&amp;gt; "outputs": {},&lt;BR /&gt;&amp;gt; "resources": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "instances": null,&lt;BR /&gt;&amp;gt; "mode": "managed",&lt;BR /&gt;&amp;gt; "name": "registered_model_model",&lt;BR /&gt;&amp;gt; "provider": "provider[\"registry.terraform.io/databricks/databricks\"]",&lt;BR /&gt;&amp;gt; "type": "databricks_grants"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "instances": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "attributes": {&lt;BR /&gt;&amp;gt; "always_running": false,&lt;BR /&gt;&amp;gt; "budget_policy_id": null,&lt;BR /&gt;&amp;gt; "continuous": null,&lt;BR /&gt;&amp;gt; "control_run_state": false,&lt;BR /&gt;&amp;gt; "dbt_task": null,&lt;BR /&gt;&amp;gt; "deployment": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "kind": "BUNDLE",&lt;BR /&gt;&amp;gt; "metadata_file_path": "/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/state/metadata.json"&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "description": null,&lt;BR /&gt;&amp;gt; "edit_mode": "UI_LOCKED",&lt;BR /&gt;&amp;gt; "email_notifications": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "no_alert_for_skipped_runs": false,&lt;BR /&gt;&amp;gt; "on_duration_warning_threshold_exceeded": null,&lt;BR /&gt;&amp;gt; "on_failure": null,&lt;BR /&gt;&amp;gt; "on_start": null,&lt;BR /&gt;&amp;gt; "on_streaming_backlog_exceeded": null,&lt;BR /&gt;&amp;gt; "on_success": null&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "environment": null,&lt;BR /&gt;&amp;gt; "existing_cluster_id": null,&lt;BR /&gt;&amp;gt; "format": "MULTI_TASK",&lt;BR /&gt;&amp;gt; "git_source": null,&lt;BR /&gt;&amp;gt; "health": null,&lt;BR /&gt;&amp;gt; "id": 
"837432832789473",&lt;BR /&gt;&amp;gt; "job_cluster": null,&lt;BR /&gt;&amp;gt; "library": null,&lt;BR /&gt;&amp;gt; "max_concurrent_runs": 4,&lt;BR /&gt;&amp;gt; "max_retries": 0,&lt;BR /&gt;&amp;gt; "min_retry_interval_millis": 0,&lt;BR /&gt;&amp;gt; "name": "[dev jiXXX11] dev-my_mlops_project-batch-inference-job",&lt;BR /&gt;&amp;gt; "new_cluster": null,&lt;BR /&gt;&amp;gt; "notebook_task": null,&lt;BR /&gt;&amp;gt; "notification_settings": null,&lt;BR /&gt;&amp;gt; "parameter": null,&lt;BR /&gt;&amp;gt; "performance_target": null,&lt;BR /&gt;&amp;gt; "pipeline_task": null,&lt;BR /&gt;&amp;gt; "provider_config": null,&lt;BR /&gt;&amp;gt; "python_wheel_task": null,&lt;BR /&gt;&amp;gt; "queue": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "enabled": true&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "retry_on_timeout": false,&lt;BR /&gt;&amp;gt; "run_as": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "service_principal_name": "",&lt;BR /&gt;&amp;gt; "user_name": "jiXXX11@g.com"&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "run_job_task": null,&lt;BR /&gt;&amp;gt; "schedule": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "pause_status": "PAUSED",&lt;BR /&gt;&amp;gt; "quartz_cron_expression": "0 0 11 * * ?",&lt;BR /&gt;&amp;gt; "timezone_id": "UTC"&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "spark_jar_task": null,&lt;BR /&gt;&amp;gt; "spark_python_task": null,&lt;BR /&gt;&amp;gt; "spark_submit_task": null,&lt;BR /&gt;&amp;gt; "tags": {&lt;BR /&gt;&amp;gt; "dev": "jiXXX11"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; "task": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "clean_rooms_notebook_task": null,&lt;BR /&gt;&amp;gt; "condition_task": null,&lt;BR /&gt;&amp;gt; "dashboard_task": null,&lt;BR /&gt;&amp;gt; "dbt_cloud_task": null,&lt;BR /&gt;&amp;gt; "dbt_platform_task": null,&lt;BR /&gt;&amp;gt; "dbt_task": null,&lt;BR /&gt;&amp;gt; "depends_on": null,&lt;BR /&gt;&amp;gt; "description": "",&lt;BR /&gt;&amp;gt; 
"disable_auto_optimization": false,&lt;BR /&gt;&amp;gt; "disabled": false,&lt;BR /&gt;&amp;gt; "email_notifications": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "no_alert_for_skipped_runs": false,&lt;BR /&gt;&amp;gt; "on_duration_warning_threshold_exceeded": null,&lt;BR /&gt;&amp;gt; "on_failure": null,&lt;BR /&gt;&amp;gt; "on_start": null,&lt;BR /&gt;&amp;gt; "on_streaming_backlog_exceeded": null,&lt;BR /&gt;&amp;gt; "on_success": null&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "environment_key": "",&lt;BR /&gt;&amp;gt; "existing_cluster_id": "",&lt;BR /&gt;&amp;gt; "for_each_task": null,&lt;BR /&gt;&amp;gt; "gen_ai_compute_task": null,&lt;BR /&gt;&amp;gt; "health": null,&lt;BR /&gt;&amp;gt; "job_cluster_key": "",&lt;BR /&gt;&amp;gt; "library": null,&lt;BR /&gt;&amp;gt; "max_retries": 0,&lt;BR /&gt;&amp;gt; "min_retry_interval_millis": 0,&lt;BR /&gt;&amp;gt; "new_cluster": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "__apply_policy_default_values_allow_list": null,&lt;BR /&gt;&amp;gt; "apply_policy_default_values": false,&lt;BR /&gt;&amp;gt; "autoscale": null,&lt;BR /&gt;&amp;gt; "aws_attributes": null,&lt;BR /&gt;&amp;gt; "azure_attributes": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "availability": "ON_DEMAND_AZURE",&lt;BR /&gt;&amp;gt; "first_on_demand": 0,&lt;BR /&gt;&amp;gt; "log_analytics_info": null,&lt;BR /&gt;&amp;gt; "spot_bid_max_price": 0&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "cluster_id": "",&lt;BR /&gt;&amp;gt; "cluster_log_conf": null,&lt;BR /&gt;&amp;gt; "cluster_mount_info": null,&lt;BR /&gt;&amp;gt; "cluster_name": "",&lt;BR /&gt;&amp;gt; "custom_tags": {&lt;BR /&gt;&amp;gt; "clusterSource": "mlops-stacks_0.4"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; "data_security_mode": "SINGLE_USER",&lt;BR /&gt;&amp;gt; "docker_image": null,&lt;BR /&gt;&amp;gt; "driver_instance_pool_id": "",&lt;BR /&gt;&amp;gt; "driver_node_type_id": "",&lt;BR /&gt;&amp;gt; "enable_elastic_disk": true,&lt;BR /&gt;&amp;gt; 
"enable_local_disk_encryption": false,&lt;BR /&gt;&amp;gt; "gcp_attributes": null,&lt;BR /&gt;&amp;gt; "idempotency_token": "",&lt;BR /&gt;&amp;gt; "init_scripts": null,&lt;BR /&gt;&amp;gt; "instance_pool_id": "",&lt;BR /&gt;&amp;gt; "is_single_node": false,&lt;BR /&gt;&amp;gt; "kind": "",&lt;BR /&gt;&amp;gt; "library": null,&lt;BR /&gt;&amp;gt; "node_type_id": "Standard_D3_v2",&lt;BR /&gt;&amp;gt; "num_workers": 3,&lt;BR /&gt;&amp;gt; "policy_id": "",&lt;BR /&gt;&amp;gt; "provider_config": null,&lt;BR /&gt;&amp;gt; "remote_disk_throughput": 0,&lt;BR /&gt;&amp;gt; "runtime_engine": "",&lt;BR /&gt;&amp;gt; "single_user_name": "",&lt;BR /&gt;&amp;gt; "spark_conf": {},&lt;BR /&gt;&amp;gt; "spark_env_vars": {},&lt;BR /&gt;&amp;gt; "spark_version": "15.3.x-cpu-ml-scala2.12",&lt;BR /&gt;&amp;gt; "ssh_public_keys": null,&lt;BR /&gt;&amp;gt; "total_initial_remote_disk_size": 0,&lt;BR /&gt;&amp;gt; "use_ml_runtime": false,&lt;BR /&gt;&amp;gt; "workload_type": null&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "notebook_task": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "base_parameters": {&lt;BR /&gt;&amp;gt; "env": "dev",&lt;BR /&gt;&amp;gt; "git_source_info": "url:; branch:; commit:",&lt;BR /&gt;&amp;gt; "input_table_name": "dev.my_mlops_project.feature_store_inference_input",&lt;BR /&gt;&amp;gt; "model_name": "dev.my_mlops_project.my_mlops_project-model",&lt;BR /&gt;&amp;gt; "output_table_name": "dev.my_mlops_project.predictions"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; "notebook_path": "/Workspace/Users/jiXXX11@g.com/.bundle/my_mlops_project/dev/files/deployment/batch_i... 
(33 more bytes)",&lt;BR /&gt;&amp;gt; "source": "WORKSPACE",&lt;BR /&gt;&amp;gt; "warehouse_id": ""&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "notification_settings": null,&lt;BR /&gt;&amp;gt; "pipeline_task": null,&lt;BR /&gt;&amp;gt; "power_bi_task": null,&lt;BR /&gt;&amp;gt; "python_wheel_task": null,&lt;BR /&gt;&amp;gt; "retry_on_timeout": false,&lt;BR /&gt;&amp;gt; "run_if": "ALL_SUCCESS",&lt;BR /&gt;&amp;gt; "run_job_task": null,&lt;BR /&gt;&amp;gt; "spark_jar_task": null,&lt;BR /&gt;&amp;gt; "spark_python_task": null,&lt;BR /&gt;&amp;gt; "spark_submit_task": null,&lt;BR /&gt;&amp;gt; "sql_task": null,&lt;BR /&gt;&amp;gt; "task_key": "batch_inference_job",&lt;BR /&gt;&amp;gt; "timeout_seconds": 0,&lt;BR /&gt;&amp;gt; "webhook_notifications": null&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "timeout_seconds": 0,&lt;BR /&gt;&amp;gt; "timeouts": null,&lt;BR /&gt;&amp;gt; "trigger": null,&lt;BR /&gt;&amp;gt; "url": "&lt;A href="https://adb-7405612555097742.2.azuredatabricks.net/#job/837432832789473" target="_blank" rel="nofollow noopener noreferrer"&gt;https://adb-7405612555097742.2.azuredatabricks.net/#job/837432832789473&lt;/A&gt;",&lt;BR /&gt;&amp;gt; "usage_policy_id": null,&lt;BR /&gt;&amp;gt; "webhook_notifications": [&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "on_duration_warning_threshold_exceeded": null,&lt;BR /&gt;&amp;gt; "on_failure": null,&lt;BR /&gt;&amp;gt; "on_start": null,&lt;BR /&gt;&amp;gt; "on_streaming_backlog_exceeded": null,&lt;BR /&gt;&amp;gt; "on_success": null&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;gt; ]&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; "private": "eyJlMmJmYjczMC1lY2FhLTExZTYtOGY4OC0zNDM2M2JjN2M0YzAiOnsiY3JlYXRlIjoxODAwMDAwMDAwMDAwLCJ1cGRhdGUi... 
(52 more bytes)",&lt;BR /&gt;&amp;gt; "schema_version": 2,&lt;BR /&gt;&amp;gt; "sensitive_attributes": null&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "mode": "managed",&lt;BR /&gt;&amp;gt; "name": "batch_inference_job",&lt;BR /&gt;&amp;gt; "provider": "provider[\"registry.terraform.io/databricks/databricks\"]",&lt;BR /&gt;&amp;gt; "type": "databricks_job"&lt;BR /&gt;&amp;gt; },&lt;BR /&gt;&amp;gt; "... (3 additional elements)"&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "serial": 79,&lt;BR /&gt;&amp;gt; "terraform_version": "1.5.5",&lt;BR /&gt;&amp;gt; "version": 4&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;lt; HTTP/2.0 200 OK pid=15488 sdk=true&lt;BR /&gt;21:14:44 Debug: Apply pid=15488 mutator=lock:release&lt;BR /&gt;21:14:44 Info: Skipping; locking is disabled pid=15488 mutator=lock:release&lt;BR /&gt;21:14:44 Debug: failed execution pid=15488 exit_code=1&lt;BR /&gt;21:14:44 Debug: POST /telemetry-ext&lt;BR /&gt;&amp;gt; {&lt;BR /&gt;&amp;gt; "items": null,&lt;BR /&gt;&amp;gt; "protoLogs": [&lt;BR /&gt;&amp;gt; "{\"frontend_log_event_id\":\"ee12ba17-3006-4b4f-821c-26ae931fd547\",\"entry\":{\"databricks_cli_log\":{\"... (581 more bytes)"&lt;BR /&gt;&amp;gt; ],&lt;BR /&gt;&amp;gt; "uploadTime": 1767039284849&lt;BR /&gt;&amp;gt; }&lt;BR /&gt;&amp;lt; HTTP/2.0 200 OK&lt;BR /&gt;&amp;lt; {&lt;BR /&gt;&amp;lt; "errors": null,&lt;BR /&gt;&amp;lt; "numProtoSuccess": 1,&lt;BR /&gt;&amp;lt; "numRealtimeSuccess": 0,&lt;BR /&gt;&amp;lt; "numSuccess": 0&lt;BR /&gt;&amp;lt; } pid=15488 sdk=true&lt;BR /&gt;21:14:44 Debug: All 1 logs uploaded successfully pid=15488&lt;BR /&gt;root@jiten:/home/jiten/Desktop/ml_project/my_mlops_project/my_mlops_project/.databricks/bundle/dev/terraform#&lt;/P&gt;&lt;P&gt;Kindly help me fix this error. Also, how can I implement an MLOps project for Azure Databricks with a GitLab CI/CD pipeline? Please share a document or URL I can follow.&lt;/P&gt;&lt;/DIV&gt;&lt;/DIV&gt;</description>
      <pubDate>Tue, 30 Dec 2025 07:39:49 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/getting-error-when-running-databricks-deploy-bundle-command/m-p/142676#M4507</guid>
      <dc:creator>jitenjha11</dc:creator>
      <dc:date>2025-12-30T07:39:49Z</dc:date>
    </item>
    <item>
      <title>Re: Getting error when running databricks deploy bundle command</title>
      <link>https://community.databricks.com/t5/machine-learning/getting-error-when-running-databricks-deploy-bundle-command/m-p/142683#M4509</link>
      <description>&lt;P&gt;Hi, I think this may be a duplicate of another question, but I'm posting the same answer here for transparency.&lt;/P&gt;
&lt;P&gt;Hi, the first thing to check is that the user or service principal you're running the job as has the correct permissions: it needs workspace access and the cluster creation entitlement toggled on.&lt;/P&gt;
&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="emma_s_0-1767089492831.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/22574iAF6BBDB69994AC49/image-size/medium?v=v2&amp;amp;px=400" role="button" title="emma_s_0-1767089492831.png" alt="emma_s_0-1767089492831.png" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Next, check that a metastore is assigned to the workspace. You can do this in the account console; the metastore column should not be blank.&lt;/P&gt;
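&lt;P&gt;If you prefer the command line, both checks can be done with the Databricks CLI. This is a sketch assuming the newer Go-based CLI with an already-configured profile; command names and output shapes can vary between CLI versions, so treat it as illustrative:&lt;/P&gt;

```
# Confirm which user/principal the CLI authenticates as
databricks current-user me

# Show the Unity Catalog metastore assigned to this workspace
# (an error or empty result suggests no metastore is attached)
databricks metastores summary
```

&lt;P&gt;If the second command reports no metastore, that matches the "No metastore assigned for the current workspace" error in the Terraform output.&lt;/P&gt;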
&lt;P&gt;As for the MLOps project, the documentation in the repo you've linked above is your best starting point.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 30 Dec 2025 10:11:38 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/getting-error-when-running-databricks-deploy-bundle-command/m-p/142683#M4509</guid>
      <dc:creator>emma_s</dc:creator>
      <dc:date>2025-12-30T10:11:38Z</dc:date>
    </item>
    <item>
      <title>Re: Getting error when running databricks deploy bundle command</title>
      <link>https://community.databricks.com/t5/machine-learning/getting-error-when-running-databricks-deploy-bundle-command/m-p/143036#M4516</link>
      <description>&lt;P data-start="197" data-end="301"&gt;This is expected behavior with mlops-stacks and not an issue with your Terraform version or the CLI.&lt;/P&gt;
&lt;P data-start="303" data-end="860"&gt;The main problem is that your Azure Databricks workspace does not have Unity Catalog enabled or assigned. The mlops-stacks templates assume Unity Catalog by default. Because of that, Terraform tries to set permissions on jobs and MLflow experiments and also tries to create a registered model, all of which require a Unity Catalog metastore. When the workspace doesn’t have a metastore, Databricks returns errors like “ACLs for job / mlflowExperiment are disabled or not available in this tier” and “No metastore assigned for the current workspace”.&lt;/P&gt;
&lt;P data-start="862" data-end="1150"&gt;Once you create and attach a Unity Catalog metastore to the workspace, these errors go away and the bundle deploy works as expected. After assigning the metastore, make sure you also set a default catalog and schema and that your user or service principal has the required privileges.&lt;/P&gt;
&lt;P data-start="1152" data-end="1480"&gt;If you just want to get things running quickly in a dev setup without Unity Catalog, you can temporarily remove the permissions blocks from the job and MLflow experiment resources and skip registered model creation, but this is only a workaround. The official and supported path for mlops-stacks is with Unity Catalog enabled.&lt;/P&gt;
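&lt;P&gt;For illustration, the workaround above means deleting or commenting out blocks like the following in the generated resources/*.yml files. This is a hypothetical sketch of what such a block typically looks like in an mlops-stacks bundle; the resource and file names in your project will differ:&lt;/P&gt;

```yaml
# resources/batch-inference-workflow-resource.yml (illustrative path)
resources:
  jobs:
    batch_inference_job:
      name: batch-inference-job
      # Removing this permissions block avoids the
      # "ACLs for job are disabled or not available in this tier"
      # error on workspaces without the required setup:
      # permissions:
      #   - level: CAN_VIEW
      #     group_name: users
```

&lt;P&gt;Again, this only suppresses the symptom; attaching a Unity Catalog metastore is the supported fix.&lt;/P&gt;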
&lt;P data-start="1482" data-end="1802"&gt;For CI/CD on Azure Databricks with GitLab, the recommended approach is to use a service principal (not a user PAT) and Databricks Bundles. The Databricks Bundles CI/CD documentation applies directly to GitLab, and the mlops-stacks repo has workflow examples you can easily translate from GitHub Actions to GitLab CI.&lt;/P&gt;
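&lt;P&gt;To make the GitLab side concrete, here is a minimal, hypothetical .gitlab-ci.yml sketch. It assumes DATABRICKS_HOST and Azure service principal credentials (ARM_CLIENT_ID, ARM_CLIENT_SECRET, ARM_TENANT_ID) are stored as masked CI/CD variables, and that your bundle defines dev as a target; adapt stages and targets to your project:&lt;/P&gt;

```yaml
# Hypothetical .gitlab-ci.yml for a Databricks Asset Bundle
stages:
  - validate
  - deploy

default:
  image: ubuntu:22.04
  before_script:
    # Install the Databricks CLI via the official install script
    - apt-get update && apt-get install -y curl
    - curl -fsSL https://raw.githubusercontent.com/databricks/setup-cli/main/install.sh | sh

validate_bundle:
  stage: validate
  script:
    - databricks bundle validate -t dev

deploy_dev:
  stage: deploy
  script:
    - databricks bundle deploy -t dev
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"'
```

&lt;P&gt;With the ARM_* variables set, the CLI authenticates as the service principal without any PAT, which is the recommended setup for CI/CD.&lt;/P&gt;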
&lt;P data-start="1804" data-end="1824"&gt;Docs that helped me:&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;
&lt;P&gt;Unity Catalog setup: &lt;A href="https://docs.databricks.com/data-governance/unity-catalog/get-started.html" target="_blank" rel="nofollow noopener noreferrer"&gt;https://docs.databricks.com/data-governance/unity-catalog/get-started.html&lt;/A&gt;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;MLflow + Unity Catalog: &lt;A href="https://docs.databricks.com/mlflow/models.html" target="_blank" rel="nofollow noopener noreferrer"&gt;https://docs.databricks.com/mlflow/models.html&lt;/A&gt;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;Bundles CI/CD: &lt;A href="https://docs.databricks.com/dev-tools/bundles/ci-cd.html" target="_blank" rel="nofollow noopener noreferrer"&gt;https://docs.databricks.com/dev-tools/bundles/ci-cd.html&lt;/A&gt;&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;P data-start="2077" data-end="2178" data-is-last-node="" data-is-only-node=""&gt;In short: enable Unity Catalog, redeploy the bundle, and the errors you’re seeing should be resolved.&lt;/P&gt;</description>
      <pubDate>Mon, 05 Jan 2026 15:50:40 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/getting-error-when-running-databricks-deploy-bundle-command/m-p/143036#M4516</guid>
      <dc:creator>iyashk-DB</dc:creator>
      <dc:date>2026-01-05T15:50:40Z</dc:date>
    </item>
  </channel>
</rss>

