Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

ambigus9
by Contributor
  • 656 Views
  • 1 reply
  • 0 kudos

RStudio on Dedicated Cluster: Invalid Access Token

Hello! I currently have RStudio installed on a dedicated cluster on Azure Databricks; here are the specs. I must emphasize the access mode: Manual and Dedicated to a Group. Here, we install RStudio using a notebook with the following...

Latest Reply
ambigus9
Contributor
  • 0 kudos

Hello! It's me again. I'm also getting the following error after testing a connection to Databricks using sparklyr: Error: ! java.lang.IllegalStateException: No Unity API token found in Unity Scope. Run `sparklyr::spark_last_error()` to see the full ...

KLin
by New Contributor III
  • 902 Views
  • 7 replies
  • 1 kudos

Resolved! Unable to Pinpoint where network traffic originates from in GCP

Hi everyone, I have a question regarding networking. A bit of background first: for security reasons, the current allow-policy from GCP to our on-prem infrastructure is being replaced by a deny-policy for traffic originating from GCP. Therefore, access...

Latest Reply
Alberto_Umana
Databricks Employee
  • 1 kudos

Hi @KLin, happy to help! The reason traffic originates from the pods subnet for clusters/SQL warehouses without the x-databricks-nextgen-cluster tag (still using GKE), and from the node subnet for clusters with the GCE tag, is due to the underly...
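To pinpoint which ranges to allow on-prem, one option is to read the primary (node) and secondary (pod/service) CIDRs straight off the workspace subnet. A minimal sketch with the google-cloud-compute client; the project, region, and subnet names below are placeholders:

```python
# Hedged sketch: list the node (primary) and pod/service (secondary) CIDR ranges
# of the VPC subnet used by the Databricks workspace, so the on-prem firewall
# rules can be scoped to the ranges traffic actually originates from.
from google.cloud import compute_v1

client = compute_v1.SubnetworksClient()
subnet = client.get(
    project="my-gcp-project",       # placeholder
    region="europe-west1",          # placeholder
    subnetwork="databricks-subnet"  # placeholder
)

print("Node (primary) range:", subnet.ip_cidr_range)
for rng in subnet.secondary_ip_ranges:
    # Secondary ranges back the legacy GKE pod/service CIDRs discussed above.
    print(f"Secondary range {rng.range_name}: {rng.ip_cidr_range}")
```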

6 More Replies
chandru44
by New Contributor
  • 266 Views
  • 1 reply
  • 0 kudos

Guidance on Populating the cloud_infra_cost Table in System Catalog

In the system catalog, there are three tables: cloud_infra_cost, list_prices, and usage. While the list_prices and usage tables contain cost-related information, the cloud_infra_cost table is currently empty. I am using AWS cloud. Can anyone provide ...

Latest Reply
skumarrana
Databricks Employee
  • 0 kudos

The cloud_infra_cost system table is in private preview. Please reach out to your Databricks account team so they can help you enroll in the private preview.
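While cloud_infra_cost is gated behind the preview, a rough way to approximate spend from the tables that are already populated is to join usage with list_prices. A hedged sketch; the column names are taken from the public system-tables docs and are worth double-checking in your workspace:

```python
# Approximate Databricks list cost (not cloud infra cost) per workspace/day/SKU
# by joining the billing usage and list_prices system tables.
estimated_cost = spark.sql("""
    SELECT
      u.workspace_id,
      u.usage_date,
      u.sku_name,
      SUM(u.usage_quantity * p.pricing.default) AS estimated_list_cost
    FROM system.billing.usage AS u
    JOIN system.billing.list_prices AS p
      ON u.sku_name = p.sku_name
     AND u.usage_end_time >= p.price_start_time
     AND (p.price_end_time IS NULL OR u.usage_end_time < p.price_end_time)
    GROUP BY u.workspace_id, u.usage_date, u.sku_name
    ORDER BY u.usage_date
""")
estimated_cost.show(20, truncate=False)
```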

jonas_braun
by New Contributor II
  • 297 Views
  • 1 reply
  • 0 kudos

Asset Bundle: inject job start_time parameter

Hey! I'm deploying a job with Databricks Asset Bundles. When the PySpark task is started on a job cluster, I want the Python code to read the job start_time and select the right data sources based on that parameter. Ideally, I would read the parameter f...
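One possible pattern, sketched below: pass a dynamic value reference such as {{job.start_time.iso_datetime}} as a task parameter in the bundle's job definition, then parse it in the task's entry point. The parameter name --start_time and the exact reference key are assumptions to verify against the dynamic value reference docs.

```python
# Hypothetical entry point for a spark_python_task.
# Assumes the bundle passes something like:
#   parameters: ["--start_time", "{{job.start_time.iso_datetime}}"]
# (the reference key and timestamp format should be checked against the docs).
import argparse
from datetime import datetime

parser = argparse.ArgumentParser()
parser.add_argument("--start_time", required=True,
                    help="Job start time injected via a dynamic value reference")
args = parser.parse_args()

# Derive, for example, a partition date to pick the right data sources.
start_time = datetime.fromisoformat(args.start_time)
partition_date = start_time.date().isoformat()
print(f"Selecting sources for partition {partition_date}")
```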

Latest Reply
jonas_braun
New Contributor II
  • 0 kudos

The Databricks CLI version is v0.239.1.

alesventus
by Contributor
  • 209 Views
  • 1 reply
  • 0 kudos

Move metastore to another azure subscription

Hi, We need to migrate our metastore with Unity Catalog to a new Azure subscription while remaining in the same Azure region. Currently, we have two workspaces attached to a single Unity Catalog. I’m looking for the best approach to move the metastor...

Latest Reply
Nivethan_Venkat
Contributor
  • 0 kudos

Hi @alesventus, there are a few points to consider before migrating from one metastore to another. We need to see how the catalogs, schemas, and tables are created as of now. If you have created everything as managed, like managed catalogs, schemas, and...

mnorland
by New Contributor III
  • 1014 Views
  • 1 reply
  • 0 kudos

Resolved! Custom VPC Subranges for New GCP Databricks Deployment

What Pods and Services subranges would you recommend for a /22 subnet for a custom VPC for a new GCP Databricks deployment in the GCE era?  

Latest Reply
mnorland
New Contributor III
  • 0 kudos

The secondary ranges are there to support legacy GKE clusters. While required in the UI, they can be empty in Terraform (per a source) for new deployments, as clusters are GCE now. (There is a green GCE label next to the cluster name.) When observing the ...

Jeff4
by New Contributor
  • 306 Views
  • 0 replies
  • 0 kudos

Unable to create workspace using API

Hi all, I'm trying to automate the deployment of Databricks into GCP. To streamline the process, I created a standalone project to hold the service accounts SA1 and SA2, with the second one then being manually populated into the Databricks ac...
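For reference, a rough sketch of driving GCP workspace creation through the Account API with plain REST calls. The endpoint path follows the documented Account API pattern, but the payload fields and the token used below are assumptions to verify against the GCP workspace-creation docs:

```python
# Hedged sketch: create a GCP workspace via the Databricks Account API.
# ACCOUNT_ID, TOKEN, and the payload fields are placeholders/assumptions;
# check the Account API reference for the exact schema your account expects.
import requests

ACCOUNT_ID = "<databricks-account-id>"
TOKEN = "<oauth-token-for-the-provisioning-service-account>"

payload = {
    "workspace_name": "my-gcp-workspace",
    "location": "us-central1",
    # Assumed field: points the workspace at the GCP project holding its resources.
    "cloud_resource_container": {"gcp": {"project_id": "my-gcp-project"}},
}

resp = requests.post(
    f"https://accounts.gcp.databricks.com/api/2.0/accounts/{ACCOUNT_ID}/workspaces",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json())
```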

hartenc
by New Contributor II
  • 345 Views
  • 2 replies
  • 0 kudos

Workflow job runs are disabled

I'm not totally clear on the financial details, but from what I've been told: a few months ago our contract with Databricks expired and changed into a per-month subscription. In those months there was a problem with payments due to bills being sent to a wr...

Latest Reply
hartenc
New Contributor II
  • 0 kudos

We contacted them, but were told that we could only use community support unless we got a premium support subscription (not sure about the exact term, somebody else asked them). Our account ID is ddcb191f-aff5-4ba5-be46-41adf1705e03. If the workspace...

1 More Reply
Georgi
by New Contributor
  • 210 Views
  • 1 reply
  • 0 kudos

How to set a static IP to a cluster

Is there a way to set a static IP for a cluster on the Databricks instance? I'm trying to establish a connection with a service outside AWS, and it seems the only way to allow inbound connections is by adding the IP to a set of rules. Thanks! I couldn’t f...

Latest Reply
Takuya-Omi
Valued Contributor III
  • 0 kudos

Hi @Georgi, Databricks clusters on AWS don’t have a built-in way to assign a static IP address. Instead, the typical workaround is to route all outbound traffic from your clusters through a NAT Gateway (or similar solution) that has an Elastic IP ass...
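A minimal boto3 sketch of that workaround (the subnet and route-table IDs are placeholders): allocate an Elastic IP, attach it to a NAT gateway in a public subnet, and route the cluster subnets' outbound traffic through it so the external service sees one stable IP.

```python
# Hedged sketch: give Databricks cluster egress a stable public IP via a NAT gateway.
# Replace the placeholder IDs with the public subnet and the private route table
# used by your Databricks VPC.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# 1. Allocate an Elastic IP (this becomes the stable outbound address).
eip = ec2.allocate_address(Domain="vpc")

# 2. Create a NAT gateway in a public subnet using that Elastic IP.
nat = ec2.create_nat_gateway(
    SubnetId="subnet-0123456789abcdef0",   # public subnet (placeholder)
    AllocationId=eip["AllocationId"],
)
nat_id = nat["NatGateway"]["NatGatewayId"]
ec2.get_waiter("nat_gateway_available").wait(NatGatewayIds=[nat_id])

# 3. Send the Databricks subnets' internet-bound traffic through the NAT gateway.
ec2.create_route(
    RouteTableId="rtb-0123456789abcdef0",  # private route table (placeholder)
    DestinationCidrBlock="0.0.0.0/0",
    NatGatewayId=nat_id,
)
print(f"Allowlist this IP on the external service: {eip['PublicIp']}")
```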

mzs
by New Contributor II
  • 888 Views
  • 1 reply
  • 1 kudos

Resolved! Understanding Azure frontend private link endpoints

Hi, I've been reading up on Private Link (https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/private-link) and have some questions: In the standard deployment, do the transit VNet (frontend private endpoint) and Databricks work...

Latest Reply
Zubisid
New Contributor II
  • 1 kudos

Below are the answers to your questions: 1) No, they don’t have to be in the same subscription. You can have the transit VNet (with the front-end Private Endpoint) in one subscription and the Databricks workspace in another, as long as you set up the...

mzs
by New Contributor II
  • 607 Views
  • 2 replies
  • 2 kudos

Using a proxy server to install packages from PyPI in Azure Databricks

Hi, I'm setting up a workspace in Azure and would like to put some restrictions in place on outbound Internet access to reduce the risk of data exfiltration from notebooks and jobs. I plan to use VNet Injection and SCC + back-end private link for comp...

Latest Reply
mzs
New Contributor II
  • 2 kudos

Thanks Isi, this is great info. I'll update once I've tried it.

1 More Reply
meshko
by New Contributor II
  • 612 Views
  • 4 replies
  • 1 kudos

Help understanding RAM utilization graph

I am trying to understand the following graph Databricks is showing me, and failing: what is that constant, lightly shaded area close to 138 GB? It is not explained in the "Usage type" legend. The job is running completely on the driver node, not utilizi...

Latest Reply
koji_kawamura
Databricks Employee
  • 1 kudos

Hi @meshko, the lightly shaded area represents the total available RAM. The tooltip shows it when you hover the mouse over it.

3 More Replies
dofrey
by New Contributor II
  • 723 Views
  • 1 reply
  • 1 kudos

Create account group with terraform without account admin permissions

I’m trying to create an account-level group in Databricks using Terraform. When creating a group via the UI, it automatically becomes an account-level group that can be reused across workspaces. However, I’m struggling to achieve the same using Terra...

Latest Reply
fazetu01
New Contributor II
  • 1 kudos

I am also interested in the solution for this! Workspace-level groups cannot be used to grant permissions on Unity Catalog resources, so I also need to be able to create account-level groups in Terraform while not being an account admin.
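For comparison, outside Terraform the same account-level group can be created with the Python SDK's AccountClient. Note this still requires sufficient account-level privileges, which is the crux of the question; a hedged sketch:

```python
# Hedged sketch: create an account-level group with databricks-sdk's AccountClient.
# This does not sidestep the account-admin requirement discussed above; credentials
# are read from environment variables or a Databricks config profile.
from databricks.sdk import AccountClient

a = AccountClient(
    host="https://accounts.azuredatabricks.net",  # or the AWS/GCP accounts host
    account_id="<account-id>",
)

group = a.groups.create(display_name="data-engineers")
print(group.id, group.display_name)
```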
