Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.


Forum Posts

ksenija
by Contributor
  • 870 Views
  • 3 replies
  • 0 kudos

Connecting Databricks and Azure Devops

Hi everyone, when I tried to create a new Databricks job that uses a notebook from a repo, it asked me to set up Azure DevOps Services (personal access token) in Linked Accounts under my username. And now every time I want to create a new branch or ...

Latest Reply
NandiniN
Honored Contributor
  • 0 kudos

Hi @ksenija, it looks like the Git credentials for the job use a different account and are missing. The job is configured to use {a particular user}, but this account has credentials for {another configured user}. So you need to update the Git details...
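For reference, a minimal sketch of re-pointing the stored credential via the Git Credentials REST API - the host, tokens, and username below are placeholders, and the endpoint shape should be verified against the current API docs:

    # Sketch: point the workspace Git credential at the account the job expects.
    # HOST, the bearer token, and the PAT are placeholders, not real values.
    import requests

    HOST = "https://adb-<workspace-id>.azuredatabricks.net"
    HEADERS = {"Authorization": "Bearer <databricks-pat>"}

    # List existing Git credentials to find the one the job picks up.
    creds = requests.get(f"{HOST}/api/2.0/git-credentials", headers=HEADERS).json()
    cred_id = creds["credentials"][0]["credential_id"]

    # Replace it with a PAT for the account the job is configured to run as.
    requests.patch(
        f"{HOST}/api/2.0/git-credentials/{cred_id}",
        headers=HEADERS,
        json={
            "git_provider": "azureDevOpsServices",
            "git_username": "<devops-user>",
            "personal_access_token": "<azure-devops-pat>",
        },
    )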

2 More Replies
JameDavi_51481
by New Contributor III
  • 1412 Views
  • 4 replies
  • 2 kudos

Ad hoc workflows - managing resource usage on shared clusters

We run a shared cluster for general-purpose ad hoc analytics, which I assume is a relatively common way to keep costs down. However, the technical experience of this cluster's users varies a lot, so we run into situations whe...

Latest Reply
xorbix_rshiva
Contributor
  • 2 kudos

Here's another idea: configure a Personal Compute policy and restrict the inexperienced users from attaching to the shared cluster. Then grant unrestricted cluster creation permissions only to trusted users. You can override the default personal comp...
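For illustration, a rough sketch of creating such a policy through the Cluster Policies API - the policy name, node type, and limits are assumptions, not values from this thread:

    # Sketch: a restrictive policy for ad hoc users, created via the REST API.
    # HOST, the token, and every limit below are illustrative placeholders.
    import json
    import requests

    HOST = "https://adb-<workspace-id>.azuredatabricks.net"
    HEADERS = {"Authorization": "Bearer <databricks-pat>"}

    policy = {
        # Cap auto-termination so idle ad hoc clusters shut down.
        "autotermination_minutes": {"type": "range", "maxValue": 60},
        # Only allow a small node type, keeping per-user cost bounded.
        "node_type_id": {"type": "allowlist", "values": ["Standard_DS3_v2"]},
    }

    requests.post(
        f"{HOST}/api/2.0/policies/clusters/create",
        headers=HEADERS,
        json={"name": "adhoc-restricted", "definition": json.dumps(policy)},
    )

Users held to a policy like this can then be denied attach permission on the shared cluster, while trusted users keep unrestricted cluster creation.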

3 More Replies
Kaz
by New Contributor II
  • 820 Views
  • 1 reply
  • 0 kudos

Databricks CLI/SDKs not returning all logs even when less than 5 MB

We're currently using the Python SDK, but the same problem exists in the Databricks CLI. The documentation states that when using workspace.jobs.get_run_output().logs, the last 5 MB of the logs are returned. However, we notice that the logs are truncat...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Kaz, The truncation of logs in Databricks occurs when the size of the JSON representation exceeds 100 KB. When this limit is reached, the values are truncated, and the string “truncated” is appended to the affected entries. In rare cases where a ...
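A small sketch of detecting the truncation from the Jobs API so a fallback (e.g., cluster log delivery to DBFS or cloud storage) can kick in - the host, token, and run_id are placeholders:

    # Sketch: check whether run output was truncated before trusting it.
    import requests

    HOST = "https://adb-<workspace-id>.azuredatabricks.net"
    HEADERS = {"Authorization": "Bearer <databricks-pat>"}

    out = requests.get(
        f"{HOST}/api/2.1/jobs/runs/get-output",
        headers=HEADERS,
        params={"run_id": 12345},  # placeholder run id
    ).json()

    if out.get("logs_truncated"):
        # Full logs won't come back this way; configure cluster log
        # delivery and read the files from storage instead.
        print("Logs truncated; falling back to delivered log files.")
    else:
        print(out.get("logs", ""))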

746837
by New Contributor II
  • 3537 Views
  • 3 replies
  • 0 kudos

Databricks and SMTP

Using Databricks as an AWS partner, I'm trying to run a Python script to validate email addresses. Whenever it gets to the SMTP portion, it times out. I am able to telnet from Python to the POP servers and get a response, and I can ping domains and get replies, b...
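One thing worth checking: cloud providers, AWS included, commonly block outbound port 25 by default, which would match a timeout only at the SMTP step. A minimal sketch for testing the submission port (587) with STARTTLS instead - the server, user, and password are placeholders:

    # Sketch: probe SMTP over the submission port with STARTTLS.
    # Port 25 egress is often blocked on cloud VMs; 587 usually is not.
    import smtplib

    with smtplib.SMTP("smtp.example.com", 587, timeout=10) as server:
        server.starttls()
        server.login("<user>", "<password>")
        print(server.noop())  # a (250, ...) reply means the session is alive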

Latest Reply
Babu_Krishnan
Contributor
  • 0 kudos

@746837, did you resolve this issue?

2 More Replies
Kutbuddin
by New Contributor II
  • 2520 Views
  • 5 replies
  • 1 kudos

Resolved! Stream query termination using the availableNow trigger and toTable.

We are running a streaming job in Databricks with custom streaming logic that consumes a CDC stream from Mongo and appends to a Delta table. At the end of the streaming job we have internal checkpointing logic that creates an entry in a table w...

Latest Reply
Kutbuddin
New Contributor II
  • 1 kudos

I was expecting spark.sql(f"insert into table {internal_tab_name} values({dt})") to execute at the end, after the streaming query had written to the table. What I observed: the Spark SQL query spark.sql(f"insert into table {internal_tab_name} values({d...
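For anyone hitting the same race, one way to enforce the expected ordering is to block on the streaming query before running the bookkeeping insert - `df`, the table names, and the checkpoint path below are placeholders:

    # Sketch: run the availableNow batch to completion, then insert.
    from datetime import datetime

    query = (
        df.writeStream
          .trigger(availableNow=True)
          .option("checkpointLocation", "/tmp/checkpoints/cdc_stream")
          .toTable("target_delta_table")
    )

    # Block until every available micro-batch has been committed...
    query.awaitTermination()

    # ...and only then record the internal checkpoint entry.
    dt = datetime.now().isoformat()
    spark.sql(f"INSERT INTO internal_tab_name VALUES ('{dt}')")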

4 More Replies
phguk
by New Contributor III
  • 1006 Views
  • 2 replies
  • 0 kudos

Efficient methods to make a temporary copy of a table

I'm using a tool (SAS) that doesn't inherently support time travel - that's to say it doesn't generate SQL including Timestamp or Version (for example). An obvious work-around could be to first copy/clone the version of the table, which SAS can then ...

Latest Reply
daniel_sahal
Esteemed Contributor
  • 0 kudos

@phguk I think that Shallow Clone would be the best solution here.
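For reference, the rough SQL shape of that approach - a zero-copy snapshot pinned to a time-travel version that SAS can then query by name (table names and version number are placeholders):

    # Sketch: create a named snapshot of a past version via shallow clone.
    spark.sql("""
        CREATE OR REPLACE TABLE analytics.my_table_snapshot
        SHALLOW CLONE analytics.my_table VERSION AS OF 42
    """)

A shallow clone copies only metadata, so creating (and later dropping) the snapshot is cheap compared to a full copy.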

1 More Replies
radothede
by New Contributor III
  • 1432 Views
  • 5 replies
  • 3 kudos

Monitoring VM costs using cluster pools

Hello, with reference to the docs https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/usage-detail-tags, cluster tags are not propagated to VMs created within a pool. Is there any workaround for monitoring VM costs when using cluster pools (j...

Administration & Architecture
cluster pools
costs
vm
Latest Reply
radothede
New Contributor III
  • 3 kudos

Dear @Kaniz_Fatma, as you mentioned, Databricks does not provide out-of-the-box support for VM usage monitoring for job clusters created from a cluster pool. If we really want to use a cluster pool, I would consider: 1) splitting the pool into separate poo...
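Since custom tags set on the pool itself (unlike cluster tags) do propagate to the pool's VMs, a rough sketch of option 1 - one tagged pool per team - might look like this; the pool name, node type, and tag values are assumptions:

    # Sketch: a dedicated pool per team; pool-level custom_tags reach
    # the VMs and carry the cost attribution. All values are placeholders.
    import requests

    HOST = "https://adb-<workspace-id>.azuredatabricks.net"
    HEADERS = {"Authorization": "Bearer <databricks-pat>"}

    requests.post(
        f"{HOST}/api/2.0/instance-pools/create",
        headers=HEADERS,
        json={
            "instance_pool_name": "team-a-jobs-pool",
            "node_type_id": "Standard_DS3_v2",
            "custom_tags": {"cost_center": "team-a"},
        },
    )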

4 More Replies
radothede
by New Contributor III
  • 2984 Views
  • 1 reply
  • 0 kudos

Databricks DBU pre-purchase

Hello there, are pre-purchased DBUs still valid? Can we use them? https://learn.microsoft.com/en-us/azure/cost-management-billing/reservations/reservation-discount-databricks Can someone please explain how this works in practice, with an example? What if I pre-puc...

Administration & Architecture
DBCU
optimize costs
Pre-purchase
reservation
Latest Reply
radothede
New Contributor III
  • 0 kudos

@Kaniz_Fatma, could you please look at this one? Thank you in advance.

Snoonan
by Contributor
  • 2844 Views
  • 3 replies
  • 0 kudos

Resolved! Secret management

Hi all, I am trying to use secrets to connect to my Azure storage account. I want to be able to read the data from the storage account using a PySpark notebook. Does anyone have experience setting up such a connection, or good documentation for doing so? I ha...

Latest Reply
DonatienTessier
Contributor
  • 0 kudos

Hi Sean, there are two ways to handle secret scopes:
  • Databricks-backed scopes: the scope is tied to a workspace. You will have to handle updating the secrets yourself.
  • Azure Key Vault-backed scopes: the scope is tied to a Key Vault. It means that you configur...
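A minimal sketch of the documented read path once a scope exists - fetching a service principal secret from the scope and wiring up the ABFS OAuth configs (scope, key, tenant, app id, account, and container names are all placeholders):

    # Sketch: read ADLS Gen2 from a notebook with a secret-scope credential.
    sp_secret = dbutils.secrets.get(scope="my-kv-scope", key="sp-client-secret")

    acct = "mystorageaccount"
    spark.conf.set(f"fs.azure.account.auth.type.{acct}.dfs.core.windows.net", "OAuth")
    spark.conf.set(
        f"fs.azure.account.oauth.provider.type.{acct}.dfs.core.windows.net",
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    )
    spark.conf.set(f"fs.azure.account.oauth2.client.id.{acct}.dfs.core.windows.net", "<app-id>")
    spark.conf.set(f"fs.azure.account.oauth2.client.secret.{acct}.dfs.core.windows.net", sp_secret)
    spark.conf.set(
        f"fs.azure.account.oauth2.client.endpoint.{acct}.dfs.core.windows.net",
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
    )

    df = spark.read.parquet(f"abfss://mycontainer@{acct}.dfs.core.windows.net/some/path/")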

2 More Replies
Aiden-Z
by New Contributor III
  • 2379 Views
  • 4 replies
  • 1 kudos

Resolved! Init script failure after workspace upload

We have a pipeline in Azure DevOps that deploys init scripts to the workspace folder on an Azure Databricks resource using the workspace API (/api/2.0/workspace/import); we use format "AUTO" and overwrite "true" to achieve this. After being uploaded ...

Latest Reply
Aiden-Z
New Contributor III
  • 1 kudos

If anyone else comes across this problem: the issue was that a deployment PowerShell script was changing LF to CRLF in the init script before upload. The solution was to upload with LF line endings in the pipeline.
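A minimal sketch of guarding against that in the pipeline itself - normalizing to LF just before the workspace import call (host, token, and paths are placeholders):

    # Sketch: force LF endings so a Windows build agent's CRLF can't
    # break the init script's shebang line. All values are placeholders.
    import base64
    import requests

    HOST = "https://adb-<workspace-id>.azuredatabricks.net"
    HEADERS = {"Authorization": "Bearer <databricks-pat>"}

    with open("init.sh", "rb") as f:
        content = f.read().replace(b"\r\n", b"\n")  # CRLF -> LF

    requests.post(
        f"{HOST}/api/2.0/workspace/import",
        headers=HEADERS,
        json={
            "path": "/Shared/init-scripts/init.sh",
            "format": "AUTO",
            "overwrite": True,
            "content": base64.b64encode(content).decode("ascii"),
        },
    )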

3 More Replies
pawelmitrus
by Contributor
  • 2113 Views
  • 4 replies
  • 0 kudos

Azure Databricks account API can't auth

Hi, does anyone know about any existing issue with the Azure Databricks account API? I cannot do the following: 1. log in with the CLI `databricks auth login --account-id <acc_id>`; this is what I get: https://adb-MY_ID.azuredatabricks.net/oidc/accounts/MY_ACC_ID/v1/auth...

Latest Reply
pawelmitrus
Contributor
  • 0 kudos

UPDATE: Solved. The very same solution started to work today when running a pipeline with Terraform: M2M auth with a service principal with federated auth. That's option 2 from my post above. When trying to follow these steps https://learn.microsoft.com/en-us/azure/...

3 More Replies
NadithK
by Contributor
  • 1901 Views
  • 2 replies
  • 0 kudos

Public exposure for clusters in SCC enabled workspaces

Hi, we are facing a requirement where we need to somehow expose one of our Databricks clusters to an external service. Our organization's cyber team is running a security audit of all of the resources we use, and they have some tools which they use to r...

Latest Reply
NadithK
Contributor
  • 0 kudos

Hi @Kaniz_Fatma, thank you very much for the reply, but I don't think this actually resolves our concern. All these solutions talk about utilizing the Databricks cluster to access/read data in Databricks. They focus on getting to the Databricks data t...

1 More Replies
harripy
by New Contributor III
  • 3300 Views
  • 7 replies
  • 0 kudos

Databricks SQL connectivity in Python with Service Principals

Tried to use M2M OAuth connectivity on Databricks SQL Warehouse in Python:

    from databricks.sdk.core import Config, oauth_service_principal
    from databricks import sql
    ....
    config = Config(host=f"https://{host}", client_...
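For comparison, the documented M2M pattern for the SQL connector looks roughly like this - host, http_path, client id, and secret below are placeholders:

    # Sketch: M2M OAuth with a service principal via databricks-sql-connector.
    from databricks.sdk.core import Config, oauth_service_principal
    from databricks import sql

    host = "adb-<workspace-id>.azuredatabricks.net"
    config = Config(
        host=f"https://{host}",
        client_id="<service-principal-client-id>",
        client_secret="<oauth-secret>",
    )

    def credential_provider():
        return oauth_service_principal(config)

    with sql.connect(
        server_hostname=host,
        http_path="/sql/1.0/warehouses/<warehouse-id>",
        credentials_provider=credential_provider,
    ) as conn:
        with conn.cursor() as cur:
            cur.execute("SELECT 1")
            print(cur.fetchall())

One thing worth double-checking is whether client_secret is the Databricks-generated OAuth secret for the service principal rather than the Azure AD client secret; the two are easy to mix up.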

Latest Reply
kavyakavuri
New Contributor II
  • 0 kudos

I am facing the same issue, with the same error logs as @harripy. Can you please help, @Yeshwanth @Dani?

6 More Replies
leungi
by Contributor
  • 3595 Views
  • 2 replies
  • 1 kudos

Resolved! Install system libraries on the cluster

The `Library` option in cluster config allows installation of language-specific libraries - e.g., from PyPI for Python or CRAN for R. Some of these libraries - e.g., `sf` - require system libraries such as `libudunits2-dev` and `libgdal-dev`. How may one install...

Latest Reply
feiyun0112
Honored Contributor
  • 1 kudos

You can install them in an init script: https://docs.databricks.com/en/init-scripts/index.html
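A minimal sketch, assuming an Ubuntu-based runtime; the package names come from the question, while the staging path (a Unity Catalog volume) is an assumption:

    # Sketch: stage a cluster-scoped init script that installs the system
    # packages `sf` needs. The volume path is a placeholder.
    script = """#!/bin/bash
    set -e
    apt-get update
    apt-get install -y libudunits2-dev libgdal-dev
    """

    dbutils.fs.put("/Volumes/main/default/scripts/install-geo-deps.sh", script, True)

Then point the cluster's Advanced options > Init scripts at that path.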

1 More Replies
otydos
by New Contributor II
  • 2342 Views
  • 2 replies
  • 0 kudos

Authenticate with Terraform to the Databricks account level using Azure MSI (system-assigned)

Hello, I want to authenticate with Terraform to the Databricks account level with the Azure Managed Identity (system-assigned) of my Azure VM, to perform operations like creating a group. I followed different tutorials and the documentation on Azure and Databricks...

Latest Reply
DonatienTessier
Contributor
  • 0 kudos

Hello, on my side I always have to add the provider in each resource block. You can try that:

    resource "databricks_group" "xxxxx" {
      provider     = databricks.accounts
      display_name = "xxxxx"
    }

About authentication, you can also try to add: auth_type ...

1 More Replies
