Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

AlbertWang
by Valued Contributor
  • 1983 Views
  • 4 replies
  • 1 kudos

Resolved! How to use Databricks CLI as a service principal?

Hi all, I have a question about how to use the Databricks CLI in my local environment as a service principal. I have installed the Databricks CLI and configured the file `.databrickscfg` as shown below. [DEFAULT] host = https://adb-123123123.1.azuredatabr...

Latest Reply
Stefan-Koch
Valued Contributor II
  • 1 kudos

Got you. I found a working solution. Try this one: [devsp] azure_workspace_resource_id = /subscriptions/bc0cd1..././.../Databricks/workspaces/my-workspace azure_tenant_id = bc0cd1... azure_client_id = fa0cd1... azure_client_secr...
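As a rough companion to that reply, a minimal sketch of the same Azure service-principal authentication using the Databricks SDK for Python, which reads the same unified-auth fields the CLI expects in a `.databrickscfg` profile. The host, tenant, client ID, and secret below are placeholders, not values from the thread.

```python
# Sketch: authenticate as an Azure service principal with the Databricks SDK.
# The same field names (host, azure_tenant_id, azure_client_id, azure_client_secret)
# can also go into a .databrickscfg profile for the CLI.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient(
    host="https://adb-123123123.1.azuredatabricks.net",    # placeholder workspace URL
    azure_tenant_id="<tenant-id>",                         # placeholder
    azure_client_id="<service-principal-application-id>",  # placeholder
    azure_client_secret="<service-principal-secret>",      # placeholder
)

# If authentication succeeds, this should print the service principal's application ID.
print(w.current_user.me().user_name)
```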

3 More Replies
LauJohansson
by Contributor
  • 454 Views
  • 1 replies
  • 1 kudos

Terraform - Azure Databricks workspace without NAT gateway

Hi all, I have experienced an increase in costs, even when not using Databricks compute. It is due to the NAT gateway that is (suddenly) automatically deployed. When creating Azure Databricks workspaces using Terraform, a NAT gateway is created. When ...

Latest Reply
saurabh18cs
Valued Contributor III
  • 1 kudos

Try adding more properties. Also, ensure that the subnets used by Azure Databricks do not have settings that require a NAT gateway. Consider using private endpoints for Azure Databricks to avoid the need for a NAT gateway. infrastructure_encryptio...

Mayank1
by New Contributor
  • 1305 Views
  • 4 replies
  • 0 kudos
Latest Reply
case-k
New Contributor III
  • 0 kudos

Thank you so much!! I solved this by reinstalling the Chrome browser. I got this issue last week and could not solve it even after waiting, clearing the cache, restarting, etc. It worked in another browser, so I reinstalled Chrome and it worked. Thank you 

3 More Replies
jreh
by New Contributor II
  • 1713 Views
  • 6 replies
  • 0 kudos

Exact cost for job execution calculation

Hi everybody, I want to calculate the exact cost of a single job execution. All examples I can find on the internet use the tables system.billing.usage and system.billing.list_prices. It makes sense to calculate the sum of DBUs consumed and multi...

Latest Reply
jreh
New Contributor II
  • 0 kudos

@radothede, I've clarified this with Databricks and my assumption was correct. The formula sum(usage_quantity * list_prices.pricing.default) is only right if the time window in the usage table is 1 hour. For every window that is not 1 hour, the fract...
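For context, a sketch of the per-job cost query this thread revolves around, runnable in a Databricks notebook. The join keys follow the documented system.billing table schemas and the job ID is a placeholder, so treat it as a starting point rather than the thread's final formula.

```python
# Sketch: estimate the list-price cost of one job from the billing system tables.
# The job_id is a placeholder; columns follow the documented system.billing schemas.
job_cost = spark.sql("""
    SELECT
        u.usage_metadata.job_id AS job_id,
        SUM(u.usage_quantity * p.pricing.default) AS estimated_list_cost
    FROM system.billing.usage AS u
    JOIN system.billing.list_prices AS p
      ON u.sku_name = p.sku_name
     AND u.cloud = p.cloud
     AND u.usage_unit = p.usage_unit
     AND u.usage_end_time >= p.price_start_time
     AND (p.price_end_time IS NULL OR u.usage_end_time < p.price_end_time)
    WHERE u.usage_metadata.job_id = '123456789'   -- placeholder job ID
    GROUP BY u.usage_metadata.job_id
""")
job_cost.show()
```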

5 More Replies
adil-shiva
by New Contributor II
  • 751 Views
  • 4 replies
  • 0 kudos

Interface for Databricks CE

My Databricks CE interface does not have the quick guides or toggles for Data Science & Engineering/Machine Learning. This is what it looks like, and I want to see the quick guides. 

Latest Reply
LauJohansson
Contributor
  • 0 kudos

What cloud provider do you use? 

3 More Replies
GuyPerson
by New Contributor
  • 1402 Views
  • 1 replies
  • 0 kudos

Resolved! Docs/Info for metastore, artifact blob storage endpoints etc for Azure Databricks

Hi! This is probably a very newbie type of question and my Google skills seem to be lacking, but is there any in-depth documentation/explanation of the metastore, artifact blob storage, system tables storage, log blob storage and event hubs services...

Latest Reply
shashank853
Databricks Employee
  • 0 kudos

Hi, you can check below. System Tables Storage. Purpose: system tables storage is used to store system-level metadata and configuration data for the Azure Databricks workspace. Data stored: this includes metadata related to Unity Catalog, cluster c...

Fiabane
by New Contributor III
  • 1697 Views
  • 2 replies
  • 3 kudos

Resolved! Databricks Asset Bundles + Artifacts + Poetry

Hello, I've configured DABs on our project successfully. Moreover, I could switch from setuptools to Poetry almost successfully. In the project's databricks.yml I configured it as the documentation suggested; I've just changed the name of the arti...

Latest Reply
filipniziol
Esteemed Contributor
  • 3 kudos

Hi @Fiabane, could you first check: Do you see your .whl file in your artifacts folder? Could you try to install the package by running this in your notebook: %pip install <path to your wheel> As far as I understand, you want to have a job ...
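A hypothetical notebook-side version of those two checks. The bundle artifact path is a placeholder and depends on the bundle name, target, and deploying user, so adjust it to wherever `databricks bundle deploy` actually uploaded the wheel.

```python
# Hypothetical check, run from a Databricks notebook: look for the wheel the bundle
# uploaded, then install it. The path is a placeholder; the real location depends on
# your bundle name, target, and deploying user.
import glob
import subprocess
import sys

wheels = glob.glob(
    "/Workspace/Users/<user>/.bundle/<bundle_name>/<target>/artifacts/**/*.whl",
    recursive=True,
)
print(wheels)

if wheels:
    # Equivalent to running `%pip install <path to your wheel>` in a notebook cell.
    subprocess.check_call([sys.executable, "-m", "pip", "install", wheels[0]])
```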

1 More Replies
UnaiUribarri
by New Contributor II
  • 819 Views
  • 2 replies
  • 0 kudos

Databricks Kryo setup

I would like to consolidate all our Spark jobs in Databricks. One of the jobs currently running in Azure HDInsight is not working properly as a Databricks JAR job. It uses Spark 3.3 RDDs and requires configuring Kryo serialisation. There...

Latest Reply
dilsan77
New Contributor II
  • 0 kudos

Integrating Spark tasks with Databricks can greatly improve your workflow. For tasks that require Kryo serialization, make sure you configure your Spark session correctly. You may need to adjust the serialization settings in your Spark configuration....
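As a rough illustration of the settings that reply alludes to, a sketch of the standard Kryo properties in PySpark. For a Databricks JAR job these usually belong in the job cluster's Spark config rather than in application code, and the registrator class name is a placeholder.

```python
# Sketch: standard Kryo serialization settings, shown via SparkSession.builder.
# For a Databricks JAR job these are typically set in the job cluster's Spark config,
# since the serializer must be in place before the cluster starts.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    # Placeholder: your own KryoRegistrator that registers the RDD classes you serialize.
    .config("spark.kryo.registrator", "com.example.MyKryoRegistrator")
    .config("spark.kryoserializer.buffer.max", "512m")
    .getOrCreate()
)
```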

1 More Replies
Jim-Shady
by New Contributor II
  • 459 Views
  • 1 replies
  • 2 kudos

Azure Databricks Classic Compute Plane Firewall

I’m designing a compute plane configuration that will align our data platform with internal policies from a security perspective. As part of this exercise I'm documenting how permissible inbound and outbound traffic is controlled using NSG rules,...

Latest Reply
michael569gardn
New Contributor III
  • 2 kudos

@Jim-Shady wrote: I’m designing a compute plane configuration that will align our data platform with internal policies from a security perspective. As part of this exercise I'm documenting how permissible inbound and outbound traffic is controlled...

PabloCSD
by Valued Contributor
  • 3026 Views
  • 1 replies
  • 0 kudos

Resolved! How to deploy to Databricks Assets Bundle from Azure DevOps using Service Principal?

I have a CI/CD process that deploys to staging after a Pull Request (PR) to main. It works using a Personal Access Token in Azure Pipelines. From local, deploying with a Service Principal works (https://community.databricks.com/t5/administration-a...

Latest Reply
PabloCSD
Valued Contributor
  • 0 kudos

I needed to deploy a job using CI/CD Azure Pipelines without using OAuth; this is the way: First, you need to have configured the Service Principal. For that, you generate it in your workspace, and with this you will have: A host, which is your wo...
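A rough sketch of what such a pipeline step can boil down to, assuming the CLI's documented Azure service-principal environment variables. The host and credential values are placeholders and would normally come from Azure Pipelines secret variables.

```python
# Sketch: drive `databricks bundle deploy` from a pipeline script, authenticating
# as an Azure service principal via the CLI's environment variables.
# All values are placeholders; in Azure Pipelines they would come from secret variables.
import os
import subprocess

env = dict(
    os.environ,
    DATABRICKS_HOST="https://adb-123123123.1.azuredatabricks.net",  # placeholder
    ARM_TENANT_ID="<tenant-id>",                                    # placeholder
    ARM_CLIENT_ID="<service-principal-application-id>",             # placeholder
    ARM_CLIENT_SECRET="<service-principal-secret>",                 # placeholder
)

# Deploy the bundle to the staging target.
subprocess.run(["databricks", "bundle", "deploy", "-t", "staging"], env=env, check=True)
```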

Derek_Czarny
by New Contributor III
  • 908 Views
  • 5 replies
  • 1 kudos

Resolved! Creating Groups with API and Python

I am working on a notebook to help me create Azure Databricks Groups.  When I create a group in a workspace using the UI, it automatically creates the group at the account level and links them.  When I create a group using the API, and I create the w...
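For reference, a hedged sketch of creating the group at the account level, which is what the workspace UI does behind the scenes, using the Databricks SDK for Python instead of the workspace-level API. The account host, account ID, and group name are placeholders.

```python
# Sketch: create a group at the Databricks account level (not in a single workspace),
# mirroring what the workspace UI does automatically. Host, account ID, and the group
# name are placeholders; account-level credentials are still required (for example a
# service principal or CLI profile with account admin rights).
from databricks.sdk import AccountClient

a = AccountClient(
    host="https://accounts.azuredatabricks.net",  # Azure Databricks account console
    account_id="<databricks-account-id>",         # placeholder
)

group = a.groups.create(display_name="data-engineers")  # placeholder group name
print(group.id)
```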

  • 908 Views
  • 5 replies
  • 1 kudos
Latest Reply
Derek_Czarny
New Contributor III
  • 1 kudos

That was it, thank you.  I was looking at the wrong details.  I really appreciate it.

4 More Replies
umccanna
by New Contributor III
  • 1287 Views
  • 5 replies
  • 0 kudos

Resolved! Unable to Create Job Task Using Git Provider Invalid Path

I am attempting to create a task in a job using Git Provider as the source, with GitHub as the provider. The repo is a private repo. Regardless of how I enter the path to the notebook, I receive the same error that the notebook path is invalid and o...
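Although the thread eventually resolved itself without a config change, here is a hedged sketch of defining a Git-sourced notebook task with the Python SDK for comparison. The repo URL, branch, cluster ID, and notebook path are placeholders; the path is relative to the repo root and omits the file extension.

```python
# Sketch: a job task whose notebook comes from a GitHub repo (Git provider source).
# All identifiers are placeholders. The notebook path is relative to the repo root
# and omits the .py/.ipynb extension.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()

created = w.jobs.create(
    name="git-provider-notebook-job",
    git_source=jobs.GitSource(
        git_url="https://github.com/<org>/<private-repo>",
        git_provider=jobs.GitProvider.GIT_HUB,
        git_branch="main",
    ),
    tasks=[
        jobs.Task(
            task_key="run_notebook",
            existing_cluster_id="<cluster-id>",
            notebook_task=jobs.NotebookTask(
                notebook_path="notebooks/my_notebook",  # relative path, no extension
                source=jobs.Source.GIT,
            ),
        )
    ],
)
print(created.job_id)
```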

Latest Reply
umccanna
New Contributor III
  • 0 kudos

Like I said in a previous response, this started working automatically a few days ago with no changes on our end. The developer who was working on this decided to try it one more time and it just worked, no error this time. I don't know if Databri...

4 More Replies
RameshSolanki
by New Contributor II
  • 809 Views
  • 1 replies
  • 0 kudos

Bring data from databricks to sharepoint list using the Power Automate

Good afternoon to all; I am new to this community. We are trying to bring data from Databricks to a SharePoint list using the Power Automate app (create a workflow and trigger it when there is a new record or an existing record is modified in the source table in...

Latest Reply
RameshSolanki
New Contributor II
  • 0 kudos

Hi all, can anyone assist me with this request? Thanks in advance.

vhazeleger
by New Contributor
  • 482 Views
  • 1 replies
  • 0 kudos

Tabs for notebooks

Browsing this page of the documentation, the displayed GIF shows a notebook that is opened in its own tab. I've been looking for how to enable this feature in my own workspace, but cannot find it. Does anyone know how to enable this feature?

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

Nope. It seems that is some kind of new version of the UI. In the SQL editor one can open multiple tabs, but for Python notebooks I have no idea.
