- 1651 Views
- 3 replies
- 0 kudos
Unity Catalog read issue
Hello, our company is running a POC of Unity Catalog with Azure as the provider. We have 2 subscriptions, each containing one Databricks workspace and one ADLS Gen2 account. Initially we have the default `hive_metastore` connected to the ADLS Gen2. I've created a secret scope a...
Hello, the UC catalog container is sitting in Sub B. Basically we have a hub-and-spoke configuration for each subscription. Each subscription can access any resource inside its own subscription through some private endpoints (PE), but to access another subscription's resou...
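For anyone landing here: below is a minimal sketch of the classic secret-scope pattern the original post describes, i.e. reading ADLS Gen2 from a notebook with a service principal whose secret sits in a secret scope. All names (scope, key, storage account, container, tenant) are placeholders, not values from this thread; with Unity Catalog the recommended route is storage credentials and external locations instead.

```python
# Minimal sketch, assuming a service principal whose client secret is
# stored in a secret scope. Scope, key, account, container, and tenant
# names are placeholders.
account = "mystorageaccount"
secret = dbutils.secrets.get(scope="my-scope", key="sp-client-secret")

spark.conf.set(f"fs.azure.account.auth.type.{account}.dfs.core.windows.net", "OAuth")
spark.conf.set(
    f"fs.azure.account.oauth.provider.type.{account}.dfs.core.windows.net",
    "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
)
spark.conf.set(f"fs.azure.account.oauth2.client.id.{account}.dfs.core.windows.net", "<sp-client-id>")
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{account}.dfs.core.windows.net", secret)
spark.conf.set(
    f"fs.azure.account.oauth2.client.endpoint.{account}.dfs.core.windows.net",
    "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
)

# Read something to confirm connectivity end to end.
df = spark.read.format("delta").load(f"abfss://mycontainer@{account}.dfs.core.windows.net/path/to/data")
```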
- 767 Views
- 1 replies
- 0 kudos
Using JARs from Google Cloud Artifact Registry in Databricks for Job Execution
We have our CI/CD pipelines set up in Google Cloud using Cloud Build, and we are publishing our artifacts to a private repository in Google Cloud's Artifact Registry. I want to use these JAR files to create and run jobs in Databricks. However, when I ...
To integrate your CI/CD pipeline with Databricks, you have a couple of options: 1. Using Artifact Registry as an intermediary: currently, Databricks does not directly support Google Artifact Registry URLs. However, you can use an intermediate storage ...
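As an illustration of the intermediary approach, here is a hedged sketch that assumes the Cloud Build pipeline has already copied the JAR from Artifact Registry to a GCS bucket, and then registers it as a job library via the Databricks Python SDK. Bucket, class, and cluster settings are made up for the example; Databricks on GCP can install libraries from gs:// paths.

```python
# Hedged sketch: the JAR was already pushed to GCS by the CI pipeline;
# all names below are illustrative.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import compute, jobs

w = WorkspaceClient()
w.jobs.create(
    name="run-my-jar",
    tasks=[
        jobs.Task(
            task_key="main",
            spark_jar_task=jobs.SparkJarTask(main_class_name="com.example.Main"),
            libraries=[compute.Library(jar="gs://my-artifacts/releases/my-app-1.0.0.jar")],
            new_cluster=compute.ClusterSpec(
                spark_version="14.3.x-scala2.12",
                node_type_id="n2-standard-4",
                num_workers=1,
            ),
        )
    ],
)
```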
- 2027 Views
- 1 replies
- 1 kudos
Resolved! Enable git credential for entire Databricks account
Hello, I have a question regarding Git authentication in the context of a Databricks job. I know that a GitHub Personal Access Token can be generated for a user using the GitHub App authorization flow. My team is currently configuring Terraform templates ...
@tonypiazza good day! To authorize access to all private repositories within a GitHub organization at the account level, without configuring Git credentials on a per-user basis, follow these steps: 1. Use a GitHub machine user: create a GitHub machi...
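A minimal sketch of the machine-user pattern, assuming you authenticate the SDK as the service principal that will run the jobs and register the machine user's PAT as its Git credential; the username and token below are placeholders.

```python
# Sketch: run this authenticated as the service principal that will own
# the jobs (e.g. via DATABRICKS_CLIENT_ID / DATABRICKS_CLIENT_SECRET);
# the username and PAT belong to the hypothetical machine user.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
w.git_credentials.create(
    git_provider="gitHub",
    git_username="my-org-machine-user",
    personal_access_token="<machine-user-pat>",
)
```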
- 2593 Views
- 2 replies
- 0 kudos
Resolved! Terraform databricks_grants errors on external_location
We are using Terraform to set up Unity Catalog external locations, and when using databricks_grants to set permissions on the external locations it throws the following error: Error: cannot create grants: permissions for external_location-test_locatio...
I figured out my issue... The principal name is case-sensitive, and if the input value doesn't match the case of the email address or group name in the workspace/account, it throws that ambiguous error.
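If it helps anyone verify casing, here is a small sketch (using the Databricks Python SDK, not Terraform) that prints group and user names exactly as the workspace stores them, so the values passed to databricks_grants can be matched case for case.

```python
# Sketch: print principals exactly as the workspace stores them, then
# copy the casing verbatim into the Terraform principal argument.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
for group in w.groups.list():
    print("group:", group.display_name)
for user in w.users.list():
    print("user:", user.user_name)
```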
- 5012 Views
- 3 replies
- 0 kudos
Set up alerts to monitor cost in Databricks
Hi Team Tech, recently we had a huge bill in Databricks, and we want to have an alerting system in place to monitor cost. Please can someone help me understand how to set up email notifications for cost alerts in Databricks. As we need to set up an em...
This is an old thread, but for posterity... @Francis what you were looking for is not available in the workspace settings but in the account console. The unified way of tracking costs within Databricks is through System Tables, specifically billable us...
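As a sketch of what that looks like, the query below aggregates DBUs per day and SKU from system.billing.usage (a documented system table); a Databricks SQL alert can then be attached to a query like this with whatever threshold suits your budget.

```python
# Illustrative cost query over the billing system table; adjust the
# grouping and thresholds to taste before wiring it to a SQL alert.
daily = spark.sql("""
    SELECT usage_date, sku_name, SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    GROUP BY usage_date, sku_name
    ORDER BY usage_date DESC
""")
display(daily)
```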
- 1063 Views
- 3 replies
- 0 kudos
Connecting Databricks and Azure DevOps
Hi everyone, when I tried to create a new Databricks job that uses a notebook from a repo, it asked me to set up Azure DevOps Services (Personal Access Token) in Linked Accounts under my username. And now every time I want to create a new branch or ...
Hi @ksenija, it looks like the Git credentials for the job use a different account or are missing. The job is configured to use {a particular user} but this account has credentials for {another configured user}. So you need to update the Git details...
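For reference, a hedged sketch of rotating the stored Git credential with the Python SDK, run as the user the job executes as; the provider value, username, and PAT are placeholders.

```python
# Hedged sketch: replace the stored Azure DevOps credential so it
# matches the account the job is configured to run as.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # authenticated as the job's run-as user
for cred in w.git_credentials.list():
    w.git_credentials.update(
        credential_id=cred.credential_id,
        git_provider="azureDevOpsServices",
        git_username="user@example.com",       # hypothetical
        personal_access_token="<new-ado-pat>",  # hypothetical
    )
```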
- 4527 Views
- 3 replies
- 0 kudos
Databricks and SMTP
Using Databricks as an AWS partner, I'm trying to run a Python script to validate email addresses. Whenever it gets to the SMTP portion, it times out. I am able to telnet from Python to the POP servers and get a response, and I can ping domains and get replies, b...
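To make the failure mode concrete, here is a minimal sketch of the kind of SMTP connectivity check described above; host, port, and timeout are placeholders. One common culprit worth ruling out: many clouds, AWS included, restrict outbound port 25 by default, so an authenticated relay on 587/465 is usually the way to go.

```python
# Minimal connectivity probe; host, port, and timeout are placeholders.
import smtplib

try:
    with smtplib.SMTP("smtp.example.com", 587, timeout=10) as server:
        server.ehlo()
        server.starttls()
        print(server.noop())  # (250, b'...') means the session is alive
except (smtplib.SMTPException, OSError) as exc:
    print(f"SMTP connection failed: {exc}")
```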
- 3156 Views
- 5 replies
- 1 kudos
Resolved! Stream query termination using the availableNow trigger and toTable.
We are running a streaming job in Databricks with custom streaming logic which consumes a CDC stream from Mongo and appends to a Delta table. At the end of the streaming job we have an internal checkpointing logic which creates an entry in a table w...
I was expecting spark.sql(f"insert into table {internal_tab_name} values({dt})") to execute at the end, after the streaming query had written to the table. What I observed: the Spark SQL query spark.sql(f"insert into table {internal_tab_name} values({d...
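The usual fix for that ordering is to block on the streaming query before the bookkeeping insert. A sketch, with df standing in for the Mongo CDC stream built earlier in the job and the paths/table names as placeholders:

```python
# Sketch: df stands in for the CDC stream; internal_tab_name and dt are
# the same bookkeeping variables as in the snippet above.
query = (
    df.writeStream
      .trigger(availableNow=True)
      .option("checkpointLocation", "/tmp/checkpoints/cdc")
      .toTable("target_table")
)
query.awaitTermination()  # blocks until the available-now batch is done

# Only now does the bookkeeping insert run, after the data has landed.
spark.sql(f"insert into table {internal_tab_name} values({dt})")
```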
- 1155 Views
- 2 replies
- 0 kudos
Efficient methods to make a temporary copy of a table
I'm using a tool (SAS) that doesn't inherently support time travel - that's to say it doesn't generate SQL including Timestamp or Version (for example). An obvious work-around could be to first copy/clone the version of the table, which SAS can then ...
@phguk I think that Shallow Clone would be the best solution here.
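For anyone wanting the concrete shape of that, a sketch of a shallow clone pinned to a table version, which the external tool can then query by name; catalog, schema, table, and version are placeholders:

```python
# Sketch: pin a point-in-time snapshot under a plain table name that
# SAS can query without any time-travel syntax.
spark.sql("""
    CREATE OR REPLACE TABLE main.analytics.orders_snapshot
    SHALLOW CLONE main.analytics.orders VERSION AS OF 42
""")
```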
- 911 Views
- 0 replies
- 0 kudos
Databricks CLI/SDKs not returning all logs even when less than 5 MB
We're currently using the Python SDK, but the same problem exists in the Databricks CLI. The documentation states that when using workspace.jobs.get_run_output().logs, the last 5 MB of these logs are returned. However, we notice that the logs are truncat...
- 1834 Views
- 3 replies
- 3 kudos
Monitoring VM costs using cluster pools
Hello, with reference to the docs at https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/usage-detail-tags, cluster tags are not propagated to VMs created within a pool. Is there any workaround for monitoring VM costs using cluster pools (j...
Dear @Retired_mod, as you mentioned, Databricks does not provide out-of-the-box support for VM usage monitoring for job clusters created from a cluster pool. If we really want to use a cluster pool, I would consider: 1) splitting the pool into separate poo...
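One thing that may help with the per-pool split: custom tags set on the pool itself should propagate to the pool's VMs (unlike cluster tags). A hedged sketch with the Python SDK, with pool name, node type, and tags as placeholders:

```python
# Hedged sketch: one pool per team/job family, each with its own tags
# so the underlying VM costs can be attributed.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
w.instance_pools.create(
    instance_pool_name="team-a-jobs-pool",
    node_type_id="Standard_DS3_v2",
    custom_tags={"team": "team-a", "cost_center": "1234"},
)
```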
- 3346 Views
- 1 replies
- 0 kudos
Databricks DBU pre-purchase
Hello there, are pre-purchased DBUs still valid? Can we use them? https://learn.microsoft.com/en-us/azure/cost-management-billing/reservations/reservation-discount-databricks Can someone please explain how it works in practice, by example? What if I pre-puc...
@Retired_mod could you please kindly look at this one? Thank You in advance.
- 3378 Views
- 3 replies
- 0 kudos
Resolved! Secret management
Hi all, I am trying to use secrets to connect to my Azure storage account. I want to be able to read the data from the storage account using a PySpark notebook. Does anyone have experience setting up such a connection, or good documentation for doing so? I ha...
Hi Sean, there are two ways to handle secret scopes: Databricks-backed scopes: the scope is tied to a workspace, and you will have to handle the update of the secrets yourself. Azure Key Vault-backed scopes: the scope is tied to a Key Vault, which means that you configur...
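A sketch of creating a Key Vault-backed scope with the Python SDK; the resource ID and DNS name are placeholders for your vault, and note that creating AKV-backed scopes generally requires Azure AD (not PAT) authentication. Reading the secret afterwards is the same for either backend:

```python
# Sketch: create an Azure Key Vault-backed scope; resource_id and
# dns_name are placeholders for your vault.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.workspace import (
    AzureKeyVaultSecretScopeMetadata,
    ScopeBackendType,
)

w = WorkspaceClient()
w.secrets.create_scope(
    scope="my-akv-scope",
    scope_backend_type=ScopeBackendType.AZURE_KEYVAULT,
    backend_azure_keyvault=AzureKeyVaultSecretScopeMetadata(
        resource_id="/subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<vault>",
        dns_name="https://<vault>.vault.azure.net/",
    ),
)

# Reading is identical for both scope types inside a notebook:
storage_key = dbutils.secrets.get(scope="my-akv-scope", key="storage-key")
```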
- 2751 Views
- 3 replies
- 1 kudos
Resolved! Init script failure after workspace upload
We have a pipeline in Azure DevOps that deploys init scripts to the workspace folder on an Azure Databricks resource using the workspace API (/api/2.0/workspace/import); we use format "AUTO" and overwrite "true" to achieve this. After being uploaded ...
If anyone else comes across this problem: the issue was a deployment PowerShell script changing LF to CRLF in the init script before upload. The solution was to upload with LF line endings in the pipeline.
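For anyone automating the same thing, a sketch that normalizes CRLF to LF before calling the /api/2.0/workspace/import endpoint mentioned above; host, token, and paths are placeholders:

```python
# Sketch: force LF endings, then import via the same REST endpoint the
# pipeline uses.
import base64
import pathlib

import requests

content = pathlib.Path("init.sh").read_bytes().replace(b"\r\n", b"\n")
resp = requests.post(
    "https://<workspace-host>/api/2.0/workspace/import",
    headers={"Authorization": "Bearer <token>"},
    json={
        "path": "/Shared/init-scripts/init.sh",
        "format": "AUTO",
        "overwrite": True,
        "content": base64.b64encode(content).decode(),
    },
    timeout=30,
)
resp.raise_for_status()
```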
- 2580 Views
- 4 replies
- 0 kudos
Azure Databricks account API can't auth
Hi, does anyone know about any existing issue with the Azure Databricks account API? I cannot do the following: 1. Log in with the CLI via `databricks auth login --account-id <acc_id>`; this is what I get: https://adb-MY_ID.azuredatabricks.net/oidc/accounts/MY_ACC_ID/v1/auth...
UPDATE: Solved. The very same solution started to work today when running a pipeline with Terraform (M2M auth with a service principal using federated auth). That's option 2 from my post above. When trying to follow these steps: https://learn.microsoft.com/en-us/azure/...
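For reference, a sketch of the M2M account-level auth that ended up working here, using the Python SDK's AccountClient (the same variables drive the CLI and the Terraform provider); the variant below uses an OAuth client secret rather than federated auth, and every ID is a placeholder:

```python
# Sketch of account-level M2M auth; all IDs below are placeholders.
from databricks.sdk import AccountClient

a = AccountClient(
    host="https://accounts.azuredatabricks.net",
    account_id="<acc_id>",
    client_id="<sp-application-id>",
    client_secret="<sp-oauth-secret>",
)
# List workspaces to confirm the service principal can authenticate.
print([ws.workspace_name for ws in a.workspaces.list()])
```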