Lakebase Discussions
Ask questions, share challenges, and connect with others working on Lakebase. From troubleshooting to best practices, this is where conversations happen.

BUG: Agent deployment fails with PERMISSION_DENIED for Lakebase dependency when created via databric

andres920310
New Contributor III

Summary

We are encountering a PERMISSION_DENIED error when deploying a Databricks Agent that uses Lakebase for agent memory, even though the endpoint creator has all documented permissions.

The failure happens during serving endpoint creation, which is triggered from a job task running a notebook that uses the databricks-agents library.

Based on the documentation and our permission setup, this deployment should succeed, but it fails at the point where Databricks attempts to grant permissions to the served entity's service principal.

We believe this is a Databricks bug related to Lakebase permission handling during agent deployment.


Environment

  • Cloud: AWS

  • Deployment mechanism:

    • Databricks Asset Bundles (DABs)

    • Job task running a notebook that uses databricks-agents to create the serving endpoint

  • Features involved:

    • Databricks Agents Framework

    • Lakebase (used for agent memory)

    • MLflow model logging with resource dependencies


What We’re Doing

  1. Create a Lakebase instance to be used as agent memory

  2. Log an agent model using MLflow, explicitly declaring the Lakebase dependency:

     
    from mlflow.models.resources import DatabricksLakebase

    mlflow.pyfunc.log_model(
        ...,
        resources=[
            DatabricksLakebase(database_instance_name="agent-memory"),
        ],
    )
  3. Deploy the agent by:

    • Executing a job task calling the databricks-agents library to create or update a serving endpoint


Permissions Setup (Confirmed)


Expected Behavior

  • The serving endpoint should be created successfully

  • Databricks should be able to grant the served entity’s service principal access to the Lakebase dependency automatically


Actual Behavior

The deployment fails during served entity creation, with the following error:

    Endpoint update failed
    Failed to deploy agent_model_1: Pre-deployment setup for served entity with name 'agent_model_1' and version '1' failed. Error: Served entity service creation failed. This often happens due to failure to grant the service principal associated with the served entity permission to access one or more Databricks product resources. Error: PERMISSION_DENIED: Failed to change permissions for SP 3e86aa94-20e8-4a99-aa48-7e4ae9fb895f. Reason: PERMISSION_DENIED: Endpoint creator doesn't have permission to access dependency type: LAKEBASE with name: agent-memory

Why We Believe This Is a Bug

  • The same user:

    • Creates the Lakebase

    • Logs the model

    • Executes the job

    • Creates the serving endpoint

  • The Lakebase dependency is explicitly declared at model logging time using mlflow.models.resources.DatabricksLakebase

  • The user has:

    • databricks_superuser role

    • Explicit CAN_MANAGE permissions

  • The error occurs when Databricks internally attempts to grant permissions to the served entity’s service principal

  • The error message claims the endpoint creator lacks permission, which contradicts:

    • Actual permissions

    • Documented requirements

This suggests a bug in one of the following areas:

  • Permission validation for Lakebase dependencies during agent deployment

  • Service principal permission propagation for Lakebase

  • Handling of Lakebase as a dependency type in the Agents framework


Request

  • Can the Databricks team confirm whether:

    • This is a known issue with Agents + Lakebase?

    • There are additional (currently undocumented) permissions required?

  • If this is a bug, we would appreciate help escalating this to the relevant engineering team.

We’re happy to provide workspace details or a full repro privately if needed.

1 ACCEPTED SOLUTION


Thanks for your answer @pradeep_singh,

We are using MLflow version 3.8.1.

Update: We were told by a Databricks employee that this is a well-known issue, and that currently only workspace admins can properly pass through credentials during agent endpoint creation. We tested again after making the endpoint creator a workspace admin, and it worked.

P.S. Another bug we found: even after passing scale_to_zero_enabled=True to agents.deploy(), the endpoint is not correctly configured to scale to zero. We had to change that manually through the UI.
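For anyone hitting the same scale-to-zero issue, the endpoint config can also be patched programmatically instead of through the UI. This is an untested sketch using the Databricks SDK; the endpoint and entity names are placeholders, and it assumes the default endpoint naming used by agents.deploy.

```python
# Untested workaround sketch: force scale-to-zero on an existing serving
# endpoint via the Databricks SDK (endpoint/entity names are placeholders).
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.serving import ServedEntityInput

w = WorkspaceClient()
w.serving_endpoints.update_config(
    name="agents_agent_model_1",  # placeholder endpoint name
    served_entities=[
        ServedEntityInput(
            entity_name="catalog.schema.agent_model_1",  # placeholder UC model
            entity_version="1",
            scale_to_zero_enabled=True,
        )
    ],
)
```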


2 REPLIES

pradeep_singh
Contributor

Can you check the MLflow version you are using?


https://docs.databricks.com/aws/en/generative-ai/agent-framework/agent-authentication#supported-reso...

 

Thank You
Pradeep Singh - https://www.linkedin.com/in/dbxdev
