Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

Swastick
by New Contributor III
  • 62 Views
  • 1 reply
  • 0 kudos

Token report page

Hi All, I'm looking for the API behind this token report page in the Databricks admin page so that I can pull the same data into my notebook. There is an API at the workspace level, /api/2.0/token-management/tokens, but not at the account level. Can anyone point me to the right API?

Latest Reply
Raman_Unifeye
Contributor III
  • 0 kudos

This page is an internal aggregation view; there is no single public API endpoint that generates this "Token Report." The Databricks Account Console performs a proprietary backend query that reaches into all your workspaces to build this view. To repl...

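To approximate that account-wide view yourself, one option (a sketch, not an official Databricks method; the workspace URLs and admin tokens are placeholders) is to call the workspace-level endpoint in each workspace and merge the results:

```python
import json
import urllib.request

def list_workspace_tokens(host, token):
    """Call the workspace-level Token Management API (admin only)."""
    req = urllib.request.Request(
        f"{host}/api/2.0/token-management/tokens",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp).get("token_infos", [])

def merge_token_report(per_workspace):
    """Flatten {workspace_url: [token_info, ...]} into report rows."""
    rows = []
    for host, infos in per_workspace.items():
        for info in infos:
            rows.append({
                "workspace": host,
                "owner": info.get("owner_id"),
                "created": info.get("creation_time"),
                "expires": info.get("expiry_time"),
            })
    return rows

# Usage (placeholders): build the per-workspace dict, then merge.
# report = merge_token_report(
#     {h: list_workspace_tokens(h, t) for h, t in workspaces.items()})
```

You would need an admin credential valid in each workspace; the account console presumably does the same fan-out on the backend.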
RameshRetnasamy
by Contributor
  • 25376 Views
  • 27 replies
  • 26 kudos

Resolved! Unable to login to Azure Databricks Account Console

I have a personal Azure pay-as-you-go subscription in which I have the 'Global Administrator' role. I am also the Databricks account administrator. Until two weeks ago, I was able to access the Databricks account console without any issues, but I am f...

Administration & Architecture
account-console
Databricks
Latest Reply
mits1
New Contributor
  • 26 kudos

Thanks Dustin. This solves my issue too, but I want to know WHY this happened. I used an email ID (say, xx@gmail.com) to log in to Azure; using the same ID/user I have deployed Databricks and am able to launch the workspace, but not the account console. What's so s...

26 More Replies
sparkplug
by New Contributor III
  • 1039 Views
  • 12 replies
  • 5 kudos

Resolved! I need a switch to turn off Data Apps in databricks workspaces

Hi, How do I disable Data Apps on my workspace? It is really annoying that Databricks pushes new features without any option to disable them. At least you should have some tools to control access before rolling it out. It seems you only care about fe...

Latest Reply
Louis_Frolio
Databricks Employee
  • 5 kudos

@Raman_Unifeye , I don't have visibility into the roadmap. However, if you are a customer you can always log a feature request. Cheers, Louis.

11 More Replies
Carsten03
by New Contributor III
  • 24795 Views
  • 11 replies
  • 6 kudos

Resolved! Run workflow using git integration with service principal

Hi, I want to run a dbt workflow task and would like to use the Git integration for that. Using my personal user I am able to do so, but I run my workflows using a service principal. I added Git credentials and the repository using Terraform. I a...

Latest Reply
Coffee77
Contributor III
  • 6 kudos

On the other hand, here is another approach you could use: configure your tasks with relative paths to notebooks and deploy all of them with DAB. Your job will then reference the deployed notebook directly, with no need to access Git from jobs/notebooks. That is deleg...

10 More Replies
hv_sg3
by New Contributor
  • 108 Views
  • 1 reply
  • 1 kudos

Enable Compute Policy Management and Compute Policy Admin Role

Hi, I have an account with an Enterprise plan and wanted to change some features of the compute policy for a cluster I wanted to create in a workspace I am an admin of. But I cannot, because the fields are read-only. Co-Pilot directed me to look for an ...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @hv_sg3, That's weird. As a workspace admin you should be able to do that. Could you attach some screenshots?

Marco37
by Contributor II
  • 2757 Views
  • 13 replies
  • 6 kudos

Resolved! Install python packages from Azure DevOps feed with service principal authentication

At the moment I install Python packages from our Azure DevOps feed with a PAT token as the authentication mechanism. This works well, but I want to use a service principal instead of the PAT token. I have created an Azure service principal and assigned it...

Latest Reply
FilipD
New Contributor II
  • 6 kudos

I'm kind of late to the party, but what is the suggested way of retrieving the access token right now? Using some Bash or Python code stored in a global init script or cluster-scoped init scripts? I don't want to store this code in the notebook. The idea is to block...

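For the token-retrieval step, a common pattern (a sketch, assuming the standard Microsoft Entra ID client-credentials flow; tenant/client IDs and the secret are placeholders you would read from a secret scope, never hard-code) is to request a token scoped to Azure DevOps' well-known resource ID:

```python
import json
import urllib.parse
import urllib.request

# Well-known Azure DevOps resource ID, documented by Microsoft.
DEVOPS_SCOPE = "499b84ac-1321-427f-aa17-267ca6975798/.default"

def build_token_request(tenant_id, client_id, client_secret):
    """Build the client-credentials token request for Entra ID."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    payload = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": DEVOPS_SCOPE,
    }
    return url, payload

def fetch_devops_token(tenant_id, client_id, client_secret):
    """POST the form-encoded request and return the bearer token."""
    url, payload = build_token_request(tenant_id, client_id, client_secret)
    data = urllib.parse.urlencode(payload).encode()
    with urllib.request.urlopen(url, data=data, timeout=30) as resp:
        return json.load(resp)["access_token"]
```

A cluster-scoped init script could then inject the token into the pip index URL (or pip.conf) for the DevOps feed, keeping it out of notebooks.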
12 More Replies
APJESK
by New Contributor III
  • 87 Views
  • 1 reply
  • 0 kudos

Can anyone share Databricks security model documentation or best-practice references

Can anyone share Databricks security model documentation or best-practice references

Latest Reply
Coffee77
Contributor III
  • 0 kudos

Here is the official Databricks security documentation: https://docs.databricks.com/aws/en/security/ Do you need to dive deeper into any specific area?

chandru44
by New Contributor II
  • 146 Views
  • 1 reply
  • 1 kudos

Moving Databricks Metastore Storage Account Between Azure Subscriptions

I have two Azure subscriptions: one for Prod and another for Non-Prod. During the initial setup of the Non-Production Databricks Workspace, I configured the metastore storage account in the Non-Prod subscription. However, I now want to move this meta...

Latest Reply
Coffee77
Contributor III
  • 1 kudos

Assuming the metastore is the same for your DEV and PROD environments, and what you want is just to use the same storage account + container to hold managed tables, volumes, etc., in theory you just need to copy all content from your source storage ac...

margarita_shir
by New Contributor
  • 155 Views
  • 1 reply
  • 0 kudos

aws databricks with frontend private link

In the AWS Databricks documentation, frontend PrivateLink assumes a separate transit VPC connected via Direct Connect/VPN. However, I'm implementing a different architecture using Tailscale for private network access. My setup: Tailscale subnet router de...

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Hello @margarita_shir, Short answer: yes. If your clients can privately reach the existing Databricks "Workspace (including REST API)" interface endpoint, you can reuse that same VPC endpoint for front-end (user) access. You must not try to use the se...

pdiamond
by Contributor
  • 131 Views
  • 1 reply
  • 2 kudos

Resolved! Lakebase query history / details

Is there somewhere in Databricks where I can see details about queries run against one of my Lakebase databases (similar to the query history system tables)? What I'm ultimately trying to figure out is where the time is being spent between when I issue the ...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 2 kudos

Hi @pdiamond, Currently in beta there's a feature that lets you monitor active queries: https://docs.databricks.com/aws/en/oltp/projects/active-queries Also in beta there's a Lakebase SQL editor that will allow you to analyze queries: https://docs.databr...

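Because Lakebase is Postgres-compatible, a stopgap while those features are in beta is to query the standard Postgres system views directly (a sketch, assuming pg_stat_activity is exposed on your instance; connection details are placeholders):

```python
# Sketch: inspect in-flight queries on a Postgres-compatible Lakebase instance.
ACTIVE_QUERIES_SQL = """
SELECT pid,
       now() - query_start AS runtime,
       state,
       query
FROM pg_stat_activity
WHERE state <> 'idle'
ORDER BY runtime DESC;
"""

def describe_row(pid, runtime, state, query, width=60):
    """Format one pg_stat_activity row for quick eyeballing."""
    return f"[{pid}] {state} {runtime}: {query[:width]}"

# With any Postgres driver (e.g. psycopg, not imported here):
#   cur.execute(ACTIVE_QUERIES_SQL)
#   for row in cur.fetchall():
#       print(describe_row(*row))
```

This shows server-side runtime only; client-side latency (network, driver, connection setup) would still have to be measured separately.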
RDE305
by New Contributor II
  • 244 Views
  • 1 reply
  • 0 kudos

A single DLT for Ingest - feedback on this architecture

What are your thoughts on this Databricks pipeline design? Different facilities will send me backups of a proprietary transactional database containing tens of thousands of tables. Each facility may have differences in how these tables are populated o...

Latest Reply
nayan_wylde
Esteemed Contributor
  • 0 kudos

Your design shows strong alignment with Medallion Architecture principles and addresses schema variability well, but there are some scalability and governance considerations worth discussing. Also, pre-Bronze, building a schema registry early is e...

FabianGutierrez
by Contributor
  • 1363 Views
  • 3 replies
  • 0 kudos

Looking for experiences with DABS CLI Deployment, Terraform and Security

Hi Community, I hope my topic finds you well. Within our Databricks landscape we decided to use DABS (Databricks Asset Bundles); however, we found out (the hard way) that it uses Terraform for deployment purposes. This is a concern now for security and ...

Latest Reply
Coffee77
Contributor III
  • 0 kudos

Always try to use service principals to deploy your asset bundles. If desired, take a look here: https://www.youtube.com/watch?v=5WreXn0zbt8 Concerning the Terraform state, it is indeed generated; take a look at this picture extracted from one of my deplo...

2 More Replies
maikel
by New Contributor III
  • 270 Views
  • 3 replies
  • 1 kudos

Agent outside databricks communication with databricks MCP server

Hello Community! I have the following use case in my project: User -> AI agent -> MCP Server -> Databricks data from Unity Catalog. The AI agent is not created in Databricks; the MCP server is created in Databricks and should expose tools to get data fr...

Latest Reply
mark_ott
Databricks Employee
  • 1 kudos

Hopefully this helps... You can securely connect your external AI agent to a Model Context Protocol (MCP) server and Unity Catalog while maintaining strong control over authentication and resource management. The method depends on whether MCP is outs...

2 More Replies
eshwari
by New Contributor III
  • 185 Views
  • 1 reply
  • 1 kudos

Restricting Catalog and External Location Visibility Across Databricks Workspaces

I am facing the exact same issue, but I don't want to create a separate metastore, and I have added the environment name as a prefix to all external locations. All the locatio...

Latest Reply
mark_ott
Databricks Employee
  • 1 kudos

You can hide or scope external locations and catalogs so they are only visible within their respective Databricks workspaces—even when using a shared metastore—by using "workspace binding" (also called isolation mode or workspace-catalog/workspace-ex...

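The binding step can be scripted against the Unity Catalog workspace bindings endpoint (a sketch; host, token, securable names, and workspace IDs are placeholders, and the exact securable_type spelling should be verified against the current API reference):

```python
import json
import urllib.request

def binding_payload(workspace_ids, binding_type="BINDING_TYPE_READ_WRITE"):
    """Request body for the 'update workspace bindings' call."""
    return {"add": [{"workspace_id": wid, "binding_type": binding_type}
                    for wid in workspace_ids]}

def bind_securable(host, token, securable_type, securable_name, workspace_ids):
    """Bind a securable (e.g. a catalog or external location) to workspaces.

    Note: a catalog must also have isolation_mode set to ISOLATED
    (PATCH /api/2.1/unity-catalog/catalogs/{name}) for bindings to apply.
    """
    body = json.dumps(binding_payload(workspace_ids)).encode()
    req = urllib.request.Request(
        f"{host}/api/2.1/unity-catalog/bindings/{securable_type}/{securable_name}",
        data=body,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="PATCH",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)
```

Run once per environment so each external location (DEV-prefixed, PROD-prefixed, etc.) is bound only to its own workspace.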
erigaud
by Honored Contributor
  • 2765 Views
  • 5 replies
  • 4 kudos

Resolved! DLT-Asset bundle : Pipelines do not support a setting a run_as user that is different from the owner

Hello! We're using Databricks Asset Bundles to deploy to several environments via a DevOps pipeline. The service principal running the CI/CD pipeline and creating the job (owner) is not the same as the SPN that will run the jobs (run_as). This...

Latest Reply
Coffee77
Contributor III
  • 4 kudos

Maybe I'm not catching this or missing something else, but I've got the following job in one of my demo workspaces: the creator is my user, and the job runs as a service principal account. Those are different identities. I got this by deploying the job with...

4 More Replies