Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

UnaiUribarri
by New Contributor II
  • 1761 Views
  • 2 replies
  • 0 kudos

Databricks Kryo setup

I would like to consolidate all our Spark jobs in Databricks. One of the jobs currently running in Azure HDInsight is not working properly as a Databricks JAR job. It uses Spark 3.3 RDDs and requires configuring Kryo serialisation. There...

Latest Reply
dilsan77
New Contributor II
  • 0 kudos

Integrating Spark tasks with Databricks can greatly improve your workflow. For tasks that require Kryo serialization, make sure you configure your Spark session correctly. You may need to adjust the serialization settings in your Spark configuration....
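A minimal sketch of the settings involved (the property names are standard Spark configuration keys; the buffer size and registrator class below are illustrative placeholders, not values from this thread):

```python
# Sketch: Spark configuration keys for Kryo serialization, as they would be
# entered in a Databricks cluster's "Spark config" box or a job cluster spec.
kryo_conf = {
    "spark.serializer": "org.apache.spark.serializer.KryoSerializer",
    "spark.kryoserializer.buffer.max": "512m",  # placeholder size
    # Only needed if your job registers custom classes with Kryo:
    "spark.kryo.registrator": "com.example.MyKryoRegistrator",  # placeholder class
}

# The Databricks Spark-config box expects one "key value" pair per line:
conf_lines = "\n".join(f"{k} {v}" for k, v in sorted(kryo_conf.items()))
print(conf_lines)
```

For a JAR job these can also be set on the job cluster definition rather than in code, which avoids the settings being applied too late (after the SparkContext already exists).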

1 More Replies
Jim-Shady
by New Contributor II
  • 857 Views
  • 1 reply
  • 2 kudos

Azure Databricks Classic Compute Plane Firewall

I’m designing a compute plane configuration that will align our data platform with our internal security policies. As part of this exercise I'm documenting how permissible inbound and outbound traffic is controlled using NSG rules,...

Latest Reply
michael569gardn
New Contributor III
  • 2 kudos

@Jim-Shady wrote:I’m designing a compute plane configuration that will align our data platform with internal policies from a security perspective. As part of this exercise I'm documenting how the permissible traffic inbound and outbound is controlled...

PabloCSD
by Valued Contributor II
  • 5099 Views
  • 1 reply
  • 0 kudos

Resolved! How to deploy to Databricks Assets Bundle from Azure DevOps using Service Principal?

I have a CI/CD process that deploys to staging after a Pull Request (PR) to main. It works using a Personal Access Token with Azure Pipelines. Deploying from local with a Service Principal works (https://community.databricks.com/t5/administration-a...

Latest Reply
PabloCSD
Valued Contributor II
  • 0 kudos

I needed to deploy a job using CI/CD Azure Pipelines without using OAuth; this is the way: first you need to have the Service Principal configured, which you generate in your workspace. With this you will have: A host: which is your wo...
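One possible shape of the pipeline step, sketched in Python: the Databricks CLI can authenticate as a service principal through environment variables. The workspace URL and the `AZP_SP_*` variable names below are placeholders for whatever your Azure Pipeline actually defines.

```python
import os

# Sketch: environment for a pipeline step that deploys an asset bundle as a
# service principal. Host URL and AZP_SP_* secret-variable names are
# placeholders, not values from this thread.
env = {
    **os.environ,
    "DATABRICKS_HOST": "https://adb-1234567890123456.7.azuredatabricks.net",
    "DATABRICKS_CLIENT_ID": os.environ.get("AZP_SP_CLIENT_ID", "<client-id>"),
    "DATABRICKS_CLIENT_SECRET": os.environ.get("AZP_SP_CLIENT_SECRET", "<client-secret>"),
}

# The command the step would run from the bundle's root directory:
cmd = ["databricks", "bundle", "deploy", "--target", "staging"]
# subprocess.run(cmd, env=env, check=True)  # uncomment inside the pipeline
```

In practice the same three environment variables can be set directly on the pipeline task, in which case the step reduces to a single `databricks bundle deploy` script line.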

umccanna
by New Contributor III
  • 2460 Views
  • 5 replies
  • 0 kudos

Resolved! Unable to Create Job Task Using Git Provider Invalid Path

I am attempting to create a task in a job using the Git Provider as the source, with GitHub as the provider. The repo is a private repo. Regardless of how I enter the path to the notebook, I receive the same error that the notebook path is invalid and o...

Latest Reply
umccanna
New Contributor III
  • 0 kudos

As I said in a previous response, this started working automatically a few days ago with no changes on our end. The developer who was working on this decided to try it one more time and it just worked, with no error this time. I don't know if Databri...

4 More Replies
RameshSolanki
by New Contributor II
  • 1285 Views
  • 1 reply
  • 0 kudos

Bring data from databricks to sharepoint list using the Power Automate

Good afternoon to all; I am new to this community. We are trying to bring data from Databricks to a SharePoint list using the Power Automate app (create a workflow and trigger it when there is a new record or an existing record is modified in the source table in...

Latest Reply
RameshSolanki
New Contributor II
  • 0 kudos

Hi all, can anyone assist me with this request? Thanks in advance.

vhazeleger
by New Contributor
  • 827 Views
  • 1 reply
  • 0 kudos

Tabs for notebooks

Browsing this page of the documentation, the displayed GIF shows a notebook opened in its own tab. I've been looking for how to enable this feature in my own workspace, but cannot find it. Does anyone know how to enable this feature?

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

Nope. It seems that is some kind of new version of the UI. In the SQL editor one can open multiple tabs, but for Python notebooks I have no idea.

PabloCSD
by Valued Contributor II
  • 1082 Views
  • 1 reply
  • 0 kudos

How to generate an Azure Subscription from a Databricks Generated Service Principal?

Hello, I currently have a Service Principal (SP) Client_Id and its associated secret, which I generated directly from my workspace in Databricks. I was following this post: https://github.com/databricks/cli/issues/1722, but I don't know how to generate ...

Latest Reply
VictoriaTha
New Contributor II
  • 0 kudos

Learn to create an Azure Subscription from a Databricks-generated Service Principal, a vital step in Azure infrastructure management.

tcmx
by New Contributor II
  • 1761 Views
  • 1 reply
  • 0 kudos

Resolved! Restrictions on setting environment variables in Compute Policies

As recommended by Databricks, we are trying to use compute policies to set environment variables, used by our notebooks, across clusters. However, when specifying a JSON string as an env var, we get this error upon applying the policy t...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

This is because you use Shared access mode, which enables multiple users to use the cluster simultaneously. However, there are features that do not work on Shared access mode clusters: https://docs.databricks.com/en/compute/access-mode-limitations....
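For reference, a sketch of the general shape of a policy rule that pins an environment variable whose value is itself a JSON document: the inner JSON must be serialized to a string first, so the policy ends up double-encoded. The `MY_APP_CONFIG` name and its contents are placeholders, not values from this thread.

```python
import json

# Sketch: a compute-policy rule fixing an environment variable whose value is
# itself JSON. Note the value is a JSON *string*, not a nested object.
app_config = {"feature_flags": {"new_parser": True}, "retries": 3}

policy = {
    "spark_env_vars.MY_APP_CONFIG": {
        "type": "fixed",
        "value": json.dumps(app_config),  # serialized to a string first
    }
}

policy_json = json.dumps(policy, indent=2)
print(policy_json)
```

Whether the variable is then visible to notebooks still depends on the cluster's access mode, per the limitations page linked above.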

stevanovic
by New Contributor
  • 2917 Views
  • 1 reply
  • 0 kudos

One Azure Tenant with Multiple Azure Databricks Accounts

Hi there, we have one Azure tenant with multiple subscriptions; each subscription is a project in itself. At this moment we have only one Azure Databricks account, and all workspaces (created under different subscriptions) are associated with it. Can ...

Latest Reply
radothede
Valued Contributor II
  • 0 kudos

Hello @stevanovic, as far as I understand, in Azure you can create one Databricks account per tenant, meaning, for example, that Unity Catalog is also a tenant-level resource. There is a fantastic blog post available here: https://community.databricks.com/t5/t...

Kayla
by Valued Contributor II
  • 2201 Views
  • 3 replies
  • 2 kudos

Resolved! Silly question-Easy way to show full notebook path or owner in UI?

We have a few people working in Databricks right now in different clones of the same repository. Occasionally we'll have multiple people with the same branch open: one working, another just has it open to see what it looks like, that sort of deal. This has...

Latest Reply
radothede
Valued Contributor II
  • 2 kudos

Hi @Kayla, I think the easiest way to check the current notebook's location once it is open is to hover the mouse cursor over the notebook's name (top left, "ADE 3.1 - Streaming Deduplication" in this case) and wait about 1-2 seconds; after that...

2 More Replies
sharat_n
by New Contributor
  • 1592 Views
  • 1 reply
  • 1 kudos

Delta Lake: Running Delete and writes concurrently

Is it safe to run a delete query when there are active writes to a Delta Lake table? Next question: is it safe to run a vacuum while writes are being done actively?

Latest Reply
radothede
Valued Contributor II
  • 1 kudos

Hello @sharat_n, yes, it is generally safe to run a DELETE query on a Delta Lake table while active writes are happening. Delta Lake is designed with ACID transactions, meaning operations like DELETE, UPDATE, and MERGE are atomic and isolated. In other...
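The vacuum half of the question can be made concrete with a sketch: VACUUM alongside active writes is generally safe as long as the retention window is at least as long as your longest-running reader or transaction, since only files older than the window are deleted. The three-level table name below is a placeholder.

```python
# Sketch: a conservative VACUUM that keeps Delta's default 7-day (168-hour)
# retention window, so concurrent readers and writers are not broken.
# "main.sales.orders" is a placeholder table name.
retention_hours = 168
vacuum_sql = f"VACUUM main.sales.orders RETAIN {retention_hours} HOURS"
# spark.sql(vacuum_sql)  # run on a cluster with access to the table
print(vacuum_sql)
```

Lowering the retention below the default requires an explicit safety override and risks breaking long-running queries, so it is best avoided while writes are active.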

SolaireOfAstora
by New Contributor
  • 3657 Views
  • 0 replies
  • 0 kudos

Databricks report error: unexpected end of stream, read 0 bytes from 4 (socket was closed by server)

Has anyone encountered this error and knows how to resolve it? "Unexpected end of stream, read 0 bytes from 4 (socket was closed by server)." This occurs in Databricks while generating reports. I've already adjusted the wait_timeout to 28,800, and both ...

chaaaaaarlie
by New Contributor II
  • 2295 Views
  • 2 replies
  • 1 kudos

Resolved! Missing 'Permissions' settings for Delta Live Tables Pipelines

Context: Azure Databricks, I am account admin, workspace admin, and pipeline owner (as confirmed via the API and visually in the Pipelines screen). When attempting to grant CAN_MANAGE access to developers for our DLT pipelines via the Databricks web ...

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

If you click the pipeline's link and then open the kebab menu, you will see the Permissions page.

1 More Replies
sjs
by New Contributor II
  • 1056 Views
  • 1 reply
  • 0 kudos

Unable to create a databricks workspace

I am unable to create a Databricks workspace with VNet injection. I get this error: { "status": "Failed", "error": { "code": "InternalServerError", "message": "INTERNAL_ERROR: Unexpected error: Cannot call getCertifiedMetastoreFo...

Latest Reply
sjs
New Contributor II
  • 0 kudos

The issue resolved itself when I created a new resource group dedicated to just Databricks. I don't know why that worked. If anyone knows what went wrong, I would appreciate feedback!
