- 5177 Views
- 5 replies
- 1 kudos
Resolved! Stream query termination using availableNow trigger and toTable.
We are running a streaming job in Databricks with custom streaming logic which consumes a CDC stream from MongoDB and appends to a Delta table. At the end of the streaming job we have internal checkpointing logic which creates an entry into a table w...
- 1 kudos
I was expecting spark.sql(f"insert into table {internal_tab_name} values({dt})") to execute at the end, after the streaming query had written to the table. What I observed: the Spark SQL query spark.sql(f"insert into table {internal_tab_name} values({d...
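For anyone hitting the same ordering issue, here is a minimal sketch of the accepted approach, assuming a PySpark structured streaming job run in a Databricks notebook. All table, checkpoint, and variable names below are placeholders; the point is to block on awaitTermination() so the availableNow run commits fully before the internal checkpoint row is inserted.

```python
# Placeholder source/target/checkpoint names; in the real job the source is the MongoDB CDC stream.
cdc_df = spark.readStream.table("main.default.raw_mongo_cdc")

query = (
    cdc_df.writeStream
    .format("delta")
    .option("checkpointLocation", "/Volumes/main/default/chk/mongo_cdc")
    .trigger(availableNow=True)
    .toTable("main.default.cdc_target")
)

# Block until the availableNow run has committed everything, then record the checkpoint row.
query.awaitTermination()

internal_tab_name = "main.default.stream_checkpoints"  # placeholder
dt = "2024-06-01T00:00:00"                             # placeholder timestamp
spark.sql(f"insert into table {internal_tab_name} values('{dt}')")
```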
- 1820 Views
- 2 replies
- 0 kudos
Efficient methods to make a temporary copy of a table
I'm using a tool (SAS) that doesn't inherently support time travel - that's to say it doesn't generate SQL including Timestamp or Version (for example). An obvious work-around could be to first copy/clone the version of the table, which SAS can then ...
- 0 kudos
@phguk I think that Shallow Clone would be the best solution here.
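A minimal sketch of that suggestion, with placeholder table names and version: a shallow clone pins a specific time-travel version as a regular table that SAS can query, without copying data files.

```python
# Placeholder names/version; a shallow clone copies only metadata, not data files.
spark.sql("""
  CREATE OR REPLACE TABLE analytics.orders_snapshot
  SHALLOW CLONE analytics.orders VERSION AS OF 42
""")

# Drop it once the SAS job is finished:
# spark.sql("DROP TABLE analytics.orders_snapshot")
```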
- 1179 Views
- 0 replies
- 0 kudos
Databricks CLI/SDKs not returning all logs even when less than 5 MB
We're currently using the Python SDK, but the same problem exists in the Databricks CLI. The documentation states that when using workspace.jobs.get_run_output().logs, the last 5 MB of these logs are returned. However, we notice that the logs are truncat...
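One workaround that may help for multi-task jobs (a sketch with the Python SDK; the run ID is a placeholder): fetch the output per task run rather than for the parent run, since get_run_output operates on individual task runs.

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
parent_run_id = 123456789  # placeholder job run ID

# Each task in a multi-task job has its own run_id; querying output per task
# may return more complete logs than querying the parent run directly.
run = w.jobs.get_run(run_id=parent_run_id)
for task in run.tasks or []:
    output = w.jobs.get_run_output(run_id=task.run_id)
    print(task.task_key, len(output.logs or ""))
```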
- 3151 Views
- 3 replies
- 3 kudos
Monitoring VM costs using cluster pools
Hello, with reference to the docs https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/usage-detail-tags, cluster tags are not propagated to VMs when they are created within a pool. Is there any workaround for monitoring VM costs when using cluster pools (j...
- 3 kudos
Dear @Retired_mod, as you mentioned, Databricks does not provide out-of-the-box support for VM usage monitoring for job clusters created from a cluster pool. If we really want to use cluster pools, I would consider: 1) splitting the pool into separate poo...
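A sketch of the "one pool per team" idea using the Python SDK (pool name, node type, and tag are placeholders): tags defined on the pool itself are applied to the pool's VMs, so separate pools with their own custom tags let Azure cost analysis split the VM spend.

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Placeholder names; create one pool per team/project, each carrying its own cost tag.
pool = w.instance_pools.create(
    instance_pool_name="etl-team-a-pool",
    node_type_id="Standard_DS3_v2",
    custom_tags={"cost_center": "team-a"},
)
print(pool.instance_pool_id)
```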
- 4248 Views
- 1 reply
- 0 kudos
Databricks DBU pre-purchase
Hello there, are pre-purchased DBUs still valid? Can we use them? https://learn.microsoft.com/en-us/azure/cost-management-billing/reservations/reservation-discount-databricks Can someone please explain how it works in practice, with an example? What if I pre-puc...
- 0 kudos
@Retired_mod could you please kindly look at this one? Thank you in advance.
- 5361 Views
- 3 replies
- 0 kudos
Resolved! Secret management
Hi all, I am trying to use secrets to connect to my Azure storage account. I want to be able to read the data from the storage account using a PySpark notebook. Does anyone have experience setting up such a connection, or good documentation for doing so? I ha...
- 0 kudos
Hi Sean, there are two ways to handle secret scopes. Databricks-backed scopes: the scope is tied to a workspace, and you have to handle updating the secrets yourself. Azure Key Vault-backed scopes: the scope is tied to a Key Vault. It means that you configur...
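For reference, a minimal sketch of reading from ADLS Gen2 in a notebook with a secret-backed storage account key; the scope, key, account, and container names below are placeholders, and the scope can be Databricks-backed or Key Vault-backed.

```python
# Placeholder names; dbutils and spark are available in a Databricks notebook.
storage_account = "mystorageacct"
access_key = dbutils.secrets.get(scope="my-scope", key="storage-account-key")

spark.conf.set(
    f"fs.azure.account.key.{storage_account}.dfs.core.windows.net",
    access_key,
)

df = spark.read.parquet(
    f"abfss://mycontainer@{storage_account}.dfs.core.windows.net/path/to/data"
)
```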
- 3651 Views
- 3 replies
- 1 kudos
Resolved! Init script failure after workspace upload
We have a pipeline in Azure DevOps that deploys init scripts to the workspace folder on an Azure Databricks resource using the workspace API (/api/2.0/workspace/import); we use format "AUTO" and overwrite "true" to achieve this. After being uploaded ...
- 1 kudos
If anyone else comes across this problem: the issue was that a deployment PowerShell script was changing LF to CRLF in the init script before upload. The solution was to upload with LF line endings in the pipeline.
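A sketch of that fix for anyone scripting the upload directly (the host, token, and paths are placeholders): normalize line endings to LF before calling the workspace import API so the init script's shebang stays valid on the cluster.

```python
import base64
import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
token = "<token>"                                             # placeholder PAT / AAD token
local_path = "scripts/install_deps.sh"                        # placeholder local file
workspace_path = "/Shared/init-scripts/install_deps.sh"       # placeholder workspace path

with open(local_path, "rb") as f:
    content = f.read().replace(b"\r\n", b"\n")  # force LF endings before upload

resp = requests.post(
    f"{host}/api/2.0/workspace/import",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "path": workspace_path,
        "format": "AUTO",
        "overwrite": True,
        "content": base64.b64encode(content).decode("ascii"),
    },
)
resp.raise_for_status()
```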
- 3612 Views
- 4 replies
- 0 kudos
Azure Databricks account API can't authenticate
Hi, does anyone know about any existing issue with the Azure Databricks account API? I cannot do the following: 1. log in with the CLI `databricks auth login --account-id <acc_id>`; this is what I get: https://adb-MY_ID.azuredatabricks.net/oidc/accounts/MY_ACC_ID/v1/auth...
- 0 kudos
UPDATE: Solved. The very same solution started to work today when running a pipeline with Terraform - M2M auth with a service principal using federated auth. That's option 2 from my post above. When trying to follow these steps https://learn.microsoft.com/en-us/azure/...
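For reference, a sketch of M2M authentication against the account API using the Python SDK with an Azure service principal; the account ID, application ID, secret, and tenant ID are placeholders. Roughly the same values can also be supplied to the CLI or Terraform provider via environment variables.

```python
from databricks.sdk import AccountClient

# Placeholder IDs/secret; service-principal (M2M) auth against the Azure account console.
a = AccountClient(
    host="https://accounts.azuredatabricks.net",
    account_id="<databricks-account-id>",
    azure_client_id="<application-id>",
    azure_client_secret="<client-secret>",
    azure_tenant_id="<tenant-id>",
)

# Simple smoke test: list account-level groups.
for group in a.groups.list():
    print(group.display_name)
```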
- 2574 Views
- 1 reply
- 0 kudos
Public exposure for clusters in SCC enabled workspaces
Hi, we are facing a requirement where we need to somehow expose one of our Databricks clusters to an external service. Our organization's cyber team is running a security audit of all of the resources we use, and they have some tools which they use to r...
- 0 kudos
Hi @Retired_mod, thank you very much for the reply, but I don't think this actually resolves our concern. All these solutions talk about utilizing the Databricks cluster to access/read data in Databricks. They focus on getting to the Databricks data t...
- 5174 Views
- 2 replies
- 1 kudos
Resolved! Install system libraries on the cluster
The `Library` option in cluster config allows installation of language-specific libraries - e.g., PyPI for Python, CRAN for R. Some of these libraries - e.g., `sf` - require system libraries - e.g., `libudunits2-dev`, `libgdal-dev`. How may one install...
- 1 kudos
You can install them in an init script: https://docs.databricks.com/en/init-scripts/index.html
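A sketch of that approach (the volume path and package list are placeholders): write a cluster-scoped init script that apt-installs the system packages, then reference it in the cluster configuration.

```python
# Placeholder volume path; the init script runs as root on each node at cluster start.
init_script = """#!/bin/bash
set -e
apt-get update
apt-get install -y libudunits2-dev libgdal-dev
"""

dbutils.fs.put("/Volumes/main/default/init/install_geo_libs.sh", init_script, True)
# Then add this path under the cluster's Advanced options > Init scripts.
```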
- 4087 Views
- 2 replies
- 0 kudos
Authenticate with Terraform to Databricks account level using Azure MSI (system-assigned)
Hello, I want to authenticate with Terraform to the Databricks account level with the Azure Managed Identity (system-assigned) of my Azure VM, to perform operations like creating a group. I followed different tutorials and the documentation on Azure and Databricks...
- 0 kudos
Hello, on my side I always have to add the provider in each resource block. You can try that: resource "databricks_group" "xxxxx" { provider = databricks.accounts display_name = "xxxxx" } About authentication, you can also try to add: auth_type ...
- 3864 Views
- 2 replies
- 0 kudos
Problem loading catalog data from a multi-node cluster after changing the VNet IP range in Azure Databricks
We've changed the address range for the VNet and subnet that the Azure Databricks workspace (Standard SKU) was using; after that, when we try to access the catalog data we get a socket closed error. This error occurs only with a multi-node cluster; for single ...
- 0 kudos
Yes, it is mentioned that we cannot change the VNet. I've changed the range within the same VNet but not the VNet itself. Is there any troubleshooting that I can do to find this issue? The problem is, I don't want to recreate the workspace. It is a worst-case s...
- 6524 Views
- 2 replies
- 0 kudos
Enable automatic schema evolution for Delta Lake merge for an SQL warehouse
Hello! We tried to update our integration scripts and use SQL warehouses instead of general compute clusters to fetch and update data, but we faced a problem. We use automatic schema evolution when we merge tables, but with a SQL warehouse, when we try...
- 0 kudos
Why can we not enable autoMerge in a SQL warehouse when my tables are Delta tables?
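For comparison, a sketch of what works on a general-purpose (interactive or job) cluster, with placeholder table names: the session config enables schema evolution for the merge, and it is this session setting that appears to be the sticking point on a SQL warehouse.

```python
# Placeholder table names; run on a general-purpose cluster where session configs are allowed.
spark.conf.set("spark.databricks.delta.schema.autoMerge.enabled", "true")

spark.sql("""
  MERGE INTO main.default.target AS t
  USING main.default.source AS s
  ON t.id = s.id
  WHEN MATCHED THEN UPDATE SET *
  WHEN NOT MATCHED THEN INSERT *
""")
```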
- 18415 Views
- 4 replies
- 1 kudos
Resolved! databricks OAuth is not supported for this host
I'm trying to deploy using Databricks Asset Bundles via an Azure DevOps pipeline. I keep getting this error when trying to use OAuth: Error: default auth: oauth-m2m: oidc: databricks OAuth is not supported for this host. Config: host=https://<workspac...
- 1 kudos
Hi @bradleyjamrozik, thank you for posting your question. You will need to use ARM_ variables to make it work, specifically ARM_CLIENT_ID, ARM_TENANT_ID, and ARM_CLIENT_SECRET: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth#environment-3 f...
- 2411 Views
- 1 reply
- 0 kudos
Terraform for Databricks
Hi all, I can't find guidance on how to create a Databricks access connector for connecting catalogs to external data locations using Terraform. Also, I want to create my catalogs, set up external locations, etc. using Terraform. Has anyone got a good r...
Labels:
- Access control: 1
- Apache spark: 1
- AWS: 5
- Azure: 7
- Azure databricks: 5
- Billing: 2
- Cluster: 1
- Compliance: 1
- Data Ingestion & connectivity: 5
- Databricks Runtime: 1
- Databricks SQL: 2
- DBFS: 1
- Dbt: 1
- Delta: 4
- Delta Sharing: 1
- DLT Pipeline: 1
- GA: 1
- Gdpr: 1
- Github: 1
- Partner: 38
- Public Preview: 1
- Service Principals: 1
- Unity Catalog: 1
- Workspace: 2