- 14377 Views
- 4 replies
- 1 kudos
Resolved! databricks OAuth is not supported for this host
I'm trying to deploy using Databricks Asset Bundles via an Azure DevOps pipeline. I keep getting this error when trying to use OAuth: Error: default auth: oauth-m2m: oidc: databricks OAuth is not supported for this host. Config: host=https://<workspac...
Hi @bradleyjamrozik, thank you for posting your question. You will need to use the ARM_ environment variables to make it work, specifically ARM_CLIENT_ID, ARM_TENANT_ID, and ARM_CLIENT_SECRET. https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth#environment-3 f...
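As a sketch, the service-principal credentials can be passed to the bundle deploy step as environment variables in the pipeline; the ARM_ names come from the answer above, while the `$(sp-*)` pipeline-variable names are hypothetical placeholders:

```yaml
# Azure DevOps pipeline step (sketch) — $(sp-*) are placeholder variable names
- script: |
    databricks bundle deploy -t dev
  env:
    ARM_CLIENT_ID: $(sp-client-id)
    ARM_TENANT_ID: $(sp-tenant-id)
    ARM_CLIENT_SECRET: $(sp-client-secret)
```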
- 2009 Views
- 1 replies
- 0 kudos
Terraform for Databricks
Hi all, I can't find guidance on how to create a Databricks access connector for connecting catalogs to external data locations using Terraform. Also, I want to create my catalogs, set up external locations, etc. using Terraform. Has anyone got a good r...
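A minimal sketch of the pattern, assuming the `azurerm` and `databricks` Terraform providers; resource and attribute names should be checked against the current provider docs, and all names/URLs here are placeholders:

```hcl
# Sketch: access connector -> storage credential -> external location
resource "azurerm_databricks_access_connector" "uc" {
  name                = "uc-access-connector" # placeholder
  resource_group_name = azurerm_resource_group.rg.name
  location            = azurerm_resource_group.rg.location
  identity {
    type = "SystemAssigned"
  }
}

resource "databricks_storage_credential" "ext" {
  name = "ext-cred" # placeholder
  azure_managed_identity {
    access_connector_id = azurerm_databricks_access_connector.uc.id
  }
}

resource "databricks_external_location" "ext" {
  name            = "ext-location" # placeholder
  url             = "abfss://container@account.dfs.core.windows.net/path" # placeholder
  credential_name = databricks_storage_credential.ext.name
}
```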
- 1493 Views
- 1 replies
- 1 kudos
keyrings.google-artifactregistry-auth fails to install backend on runtimes > 10.4
We run Databricks on GCP. We store our private Python packages in the Google Artifact Registry. When we need to install the private packages, we use a global init script to install `keyring` and `keyrings.google-artifactregistry-auth`. Then we `pip inst...
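A sketch of the init-script shape the question describes (cluster configuration fragment; the index URL placeholders are illustrative, not the poster's actual values):

```shell
#!/bin/bash
# Cluster init script sketch: install the keyring backend first,
# so later private-package installs can authenticate to Artifact Registry.
pip install keyring keyrings.google-artifactregistry-auth
# Private packages would then be installed via the repository index, e.g.:
# pip install --index-url https://<region>-python.pkg.dev/<project>/<repo>/simple/ <package>
```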
- 953 Views
- 1 replies
- 0 kudos
SQL Warehouse tag list from system table?
Hello, is there a way to get the tags of SQL Warehouse clusters from system tables, like you can with system.compute.clusters? Thanks,
Answering my own question: system.billing.usage.custom_tags['cluster-owner']. @databricks: I don't really understand the logic here.
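A sketch of the pattern that answer describes, reading warehouse tags out of the billing system table (the 'cluster-owner' tag key comes from the thread; the column names assume the documented system.billing.usage schema):

```sql
-- Distinct warehouses and their 'cluster-owner' tag, per the billing usage table
SELECT DISTINCT
       usage_metadata.warehouse_id,
       custom_tags['cluster-owner'] AS owner
FROM system.billing.usage
WHERE usage_metadata.warehouse_id IS NOT NULL
```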
- 4907 Views
- 2 replies
- 0 kudos
Resolved! Databricks SSO Azure AD
Hello, I'm trying to test SSO with Azure AD. The SSO test passes on Databricks and I can connect to Databricks using SSO. When I try to obtain a token with Postman, I get the following error message: {"error_description":"OAuth application with ...
Hello, the issue was with Postman. In Postman you don't give the client ID from your IdP but the client ID from Databricks "App connections". It is working well now. Thank you.
- 3658 Views
- 1 replies
- 0 kudos
Databricks on-premises (GDCE)
Hello, are there any plans to support Databricks on GDCE or another private cloud-native stack/hardware on premises? Regards, Patrick
- 1123 Views
- 2 replies
- 1 kudos
Interconnected notebooks
How do I use interconnected notebooks in Databricks?
Do you mean running one notebook from another and using variables and functions defined in the other one? If that's what you're seeking, try the magic command %run followed by the notebook path. You can find some documentation about it here: https://docs.da...
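A minimal notebook-cell sketch of that %run pattern (the `./helpers` path and `greet` function are hypothetical names, not from the thread):

```python
# Cell in the calling notebook. %run executes ./helpers (a hypothetical
# sibling notebook that defines greet()) in the current namespace.
%run ./helpers

# greet() is now available because %run shares definitions with the caller.
greet("world")
```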
- 1846 Views
- 2 replies
- 0 kudos
Asset Bundles -> creation of Azure DevOps pipeline
If you choose mlops-stacks in Asset Bundles, it will create many nice things for you out of the box, including a pipeline to deploy to dev/stage/prod. #databricks
Thank you for sharing this @Hubert-Dudek
- 1391 Views
- 1 replies
- 0 kudos
Databricks on Azure JDBC
Hello Databricks team, I have a question regarding Databricks on Azure configuration using JDBC ([Simba][SparkJDBCDriver](700100)). I am getting the error message: java.sql.SQLException: [Simba][SparkJDBCDriver](700100) Connection timeout expired. De...
Check your network connection. Try `%sh nc -zv {hostname} {port}`.
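The same reachability check can also be done in Python from a notebook; this sketch (the function name is mine) mirrors what `nc -zv` tests, a plain TCP connect to the JDBC host and port:

```python
import socket

def can_connect(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers refused connections, DNS failures, and timeouts.
        return False

# Example (hypothetical workspace hostname):
# can_connect("adb-1234567890.1.azuredatabricks.net", 443)
```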
- 1798 Views
- 0 replies
- 0 kudos
Unity Catalog - Created UC and linked it to my DEV storage account for the entire org
Hello everyone, I was the lead in a data platform modernization project. This was my first time administering Databricks and I got myself into quite the situation. Essentially I made the mistake of linking our enterprise-wide Unity Catalog to our DEV Azu...
- 2453 Views
- 1 replies
- 0 kudos
Resolved! Creating Databricks workspace
Hi all, I am creating a Databricks workspace that has its own virtual network. When I create it I get this error: 'The workspace 'xxxxxx' is in a failed state and cannot be launched. Please review error details in the activity log tab and retry your ope...
Hi all, I resolved the issue. My subnets did not have the correct delegations. Thanks, Sean
- 6146 Views
- 3 replies
- 0 kudos
New admin question: How do you enable R on an existing cluster?
Hello Community. I have a user trying to use R who receives the error message illustrated in the attachment. I can't seem to find correct documentation on enabling R on an existing cluster. Would anyone be able to point me in the right direction? Than...
- 2692 Views
- 2 replies
- 1 kudos
Resolved! How do we get the list of users who accessed a specific table or view in Unity Catalog in the last 6 months?
We have a business use case where we want to track users who accessed a specific table in Unity Catalog in the last 6 months. Is there a way we can pull this data?
Yes, the system tables will have all the details.
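A sketch of such a lookup against the audit system table, assuming audit system tables are enabled in the workspace; the action names and the fully qualified table name are illustrative and should be checked against the audit-log schema reference:

```sql
-- Users who read a given table in the last 6 months (illustrative names)
SELECT DISTINCT user_identity.email
FROM system.access.audit
WHERE service_name = 'unityCatalog'
  AND action_name = 'getTable'                               -- illustrative action
  AND request_params.full_name_arg = 'main.sales.orders'     -- hypothetical table
  AND event_time >= add_months(current_date(), -6)
```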
- 1432 Views
- 3 replies
- 0 kudos
REST endpoint for Databricks audit logs
I am trying to find the official documentation link for getting Databricks audit logs, but I am unable to find it. I referred to: https://docs.databricks.com/en/administration-guide/account-settings/audit-logs.html and https://docs.databricks.com/api/workspace/introductio...
@madhura I could not find any endpoint that can be used to get the audit logs. However, you can enable system tables in your workspace and read the data just like you read from any other table. Please check this to enable system tables: https...
- 2110 Views
- 1 replies
- 0 kudos
Databricks Job alerts
I'm currently running jobs on job clusters and would like these jobs to time out after 168 hours (7 days), at which point a new job cluster will be assigned. This timeout is specifically to ensure that jobs don't run on the same cluster for too long,...
@Priyam1 Good day! Based on the information provided, it seems that we do not have a direct way to mute notifications for timed-out jobs while still receiving alerts for job failures. You can reduce the number of notifications sent by filtering out no...
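As a sketch, the job-settings fields involved look like this (a Jobs API configuration fragment; the email address is a placeholder, and 604800 seconds is the 7-day timeout from the question):

```json
{
  "timeout_seconds": 604800,
  "email_notifications": {
    "on_failure": ["team@example.com"],
    "no_alert_for_skipped_runs": true
  }
}
```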