- 4234 Views
- 3 replies
- 1 kudos
Can I configure Notebook Result Downloads with the Databricks CLI, API, or Terraform provider?
I'm a Databricks admin and I'm looking for a way to automate some workspace security settings, namely: notebook result download, SQL result download, and notebook table clipboard features. I can't find these options in the Databricks Terraform provider, Da...
Hi @Redford, with the Databricks API you can toggle the following features: notebook result download (key name: enableResultsDownloading) and notebook table clipboard features (key name: enableNotebo...
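A minimal sketch of toggling one of these settings through the workspace-conf endpoint, based on the key name in the reply above. The workspace URL and token are placeholders you must supply; this only builds the request, it does not send it.

```python
# Sketch: flipping a workspace security setting via PATCH /api/2.0/workspace-conf.
# Host and token are hypothetical placeholders; the key name comes from the reply.
import json
import urllib.request

def build_request(host: str, token: str, settings: dict) -> urllib.request.Request:
    """Build a PATCH request against the workspace-conf endpoint."""
    return urllib.request.Request(
        url=f"{host}/api/2.0/workspace-conf",
        data=json.dumps(settings).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="PATCH",
    )

req = build_request(
    "https://example.cloud.databricks.com",  # hypothetical workspace URL
    "dapi-xxx",                              # hypothetical PAT
    {"enableResultsDownloading": "false"},   # key name from the reply above
)
print(req.get_method(), req.full_url)
```

To actually apply it you would pass `req` to `urllib.request.urlopen`; the same shape works for the other keys mentioned in the thread.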
- 349 Views
- 7 replies
- 0 kudos
Unknown geo-redundancy storage events (and costs) in the Azure Databricks resource group
Hi all, I'm after some guidance on how to identify massive (100,000%) spikes in bandwidth usage (and related costs) in the Azure Databricks provisioned/managed resource group storage account, and how to stop them. These blips are adding 30-50% to our monthly costs...
Thanks for opening a case with us; we will have a look at it.
- 232 Views
- 6 replies
- 0 kudos
Unity Catalog metastore is created in an undesired storage account
I came to know that our Unity Catalog metastore has been created in the default storage account of our Databricks workspace, and this storage account has some system deny-access policies, so we don't have access to see the data inside. I'm w...
You will need to back up the current metastore, including the metadata, and then recreate the catalogs, schemas, and tables on the new metastore.
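A hedged sketch of the recreation step in Databricks SQL. All catalog, schema, and table names here are placeholders, and the backup location is an assumption; how you back up the data is up to you.

```sql
-- Recreate the object hierarchy on the new metastore (names are hypothetical).
CREATE CATALOG IF NOT EXISTS sales_catalog;
CREATE SCHEMA IF NOT EXISTS sales_catalog.bronze;

-- One option for repopulating a table from a backed-up Delta location:
CREATE TABLE sales_catalog.bronze.orders
  DEEP CLONE delta.`abfss://backup@newstorageaccount.dfs.core.windows.net/orders`;
```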
- 276 Views
- 10 replies
- 0 kudos
Update existing metastore in AWS Databricks
Hello team, I am unable to update the existing metastore in my AWS Databricks. I have a new AWS account and I am trying to update my existing workspace; however, I am unable to update the S3 bucket details and network configuration (greyed out) in the...
Unfortunately there is no way to move the state of the workspace manually, so the solution will be to recreate the workspace and migrate the data.
- 125 Views
- 3 replies
- 1 kudos
How Can a Workspace Admin Grant Workspace Admin Permissions to a Group?
I want to grant Workspace Admin permissions to a group instead of individual users, but I haven’t found a way to do this. I considered assigning permissions by adding the group to the Databricks-managed 'admins' group (establishing a parent-child rel...
No problem! I will check internally whether there is a feature request of this nature. You can use the "admins" group for adding admin users or SPs.
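A sketch of adding a member to the "admins" group through the SCIM Groups API, which is one scripted way to do what the reply suggests. The host, token, group id, and user id are all hypothetical placeholders; this builds the request without sending it.

```python
# Sketch: add a member to a group via PATCH /api/2.0/preview/scim/v2/Groups/{id}.
# All identifiers below are hypothetical placeholders.
import json
import urllib.request

def build_add_member_request(host, token, group_id, user_id):
    payload = {
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": [
            {"op": "add", "value": {"members": [{"value": user_id}]}}
        ],
    }
    return urllib.request.Request(
        f"{host}/api/2.0/preview/scim/v2/Groups/{group_id}",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/scim+json",
        },
        method="PATCH",
    )

req = build_add_member_request(
    "https://example.cloud.databricks.com", "dapi-xxx", "123", "456")
print(req.get_method(), req.full_url)
```

Note this adds individual members; as the thread discusses, nesting one group inside "admins" to inherit admin rights is the part that has no supported equivalent.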
- 161 Views
- 2 replies
- 1 kudos
Resolved! DLT TABLES schema mapping
How do we map the tables in a Delta Live Tables pipeline to bronze, silver, and gold schemas? Is it possible to store the DLT tables in different schemas?
- 120 Views
- 2 replies
- 0 kudos
Azure DevOps Repos Databricks update via pipeline not working
Hi all, I'm working with Azure DevOps and Databricks, using an app registration which has permissions on Azure DevOps and inside Databricks as manager and user, and is in the admins group, so it has permission over the repos. I'm building a pipeline to update or c...
Hello @TatiMun, thanks for your question. Can we review the following? Verify the remote URL: double-check that the remote Git repo URL associated with the REPO_ID in Databricks is correct and accessible. Check PAT permissions: ensure that the Personal Acc...
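For reference, a sketch of the call such a pipeline typically makes: updating a repo checkout via the Repos API (`PATCH /api/2.0/repos/{repo_id}`). The host, token, repo id, and branch are placeholders; this only constructs the request.

```python
# Sketch: move a Databricks repo to a given branch from CI.
# Host, token, repo id, and branch name are hypothetical placeholders.
import json
import urllib.request

def build_repo_update(host, token, repo_id, branch):
    return urllib.request.Request(
        f"{host}/api/2.0/repos/{repo_id}",
        data=json.dumps({"branch": branch}).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="PATCH",
    )

req = build_repo_update(
    "https://adb-123.azuredatabricks.net", "dapi-xxx", 42, "main")
print(req.get_method(), req.full_url)
```

If this call fails, the checks above (remote URL reachable, PAT scopes sufficient) are the usual first suspects.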
- 238 Views
- 6 replies
- 1 kudos
Resolved! Unable to Pinpoint where network traffic originates from in GCP
Hi everyone, I have a question regarding networking. A bit of background first: for security reasons, the current allow-policy from GCP to our on-prem infrastructure is being replaced by a deny-policy for traffic originating from GCP. Therefore access...
Hi @KLin, happy to help! The reason why traffic originates from the pods subnet for clusters/SQL warehouses without the x-databricks-nextgen-cluster tag (still using GKE), and from the node subnet for clusters with the GCE tag, is due to the underly...
- 2732 Views
- 5 replies
- 0 kudos
"Azure Container Does Not Exist" when cloning repositories in Azure Databricks
Good morning, I need some help with the following issue: I created a new Azure Databricks resource using the VNet-injection procedure (here). I then proceeded to link my Azure DevOps account using a personal access token. If I try to clone a reposito...
I found my error was related to my use of Pulumi (or, in this other answer, Terraform).
- 1101 Views
- 2 replies
- 1 kudos
Security considerations for OAuth secrets when using a service principal to authenticate with Databricks
What are the security considerations we need to keep in mind when we want to use OAuth secrets with a service principal to access Azure Databricks while identity federation is disabled and the workspace is not yet onboarded onto Unity Catalog? Can we co...
Any updates on this? I'm also struggling with the OAuth security considerations, specifically with updating the OAuth secrets. I'm currently using an SP to access the Databricks workspace for DevOps purposes through the Databricks CLI. I have the SP set up to renew ...
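On the rotation question specifically, one scripted option is the account-level OAuth secrets API. A hedged sketch follows; the account host, account id, service principal id, and token are all placeholders, and the exact endpoint path is my assumption from that API, so verify it against your account console before relying on it. The code only builds the request.

```python
# Sketch: mint a new OAuth secret for a service principal via the account API.
# POST /api/2.0/accounts/{account_id}/servicePrincipals/{sp_id}/credentials/secrets
# All identifiers below are hypothetical placeholders.
import urllib.request

def build_secret_request(account_id: str, sp_id: int, token: str) -> urllib.request.Request:
    url = (
        "https://accounts.azuredatabricks.net/api/2.0/accounts/"
        f"{account_id}/servicePrincipals/{sp_id}/credentials/secrets"
    )
    return urllib.request.Request(
        url,
        data=b"{}",
        headers={"Authorization": f"Bearer {token}"},
        method="POST",
    )

req = build_secret_request("aaaa-bbbb", 12345, "dapi-xxx")
print(req.get_method(), req.full_url)
```

Create the new secret, roll it out to consumers, then delete the old one, so rotation never leaves the SP without a valid credential.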
- 167 Views
- 4 replies
- 0 kudos
Database Error in model Couldn't initialize file system for path abfss://
Recently, the following error occurs when running dbt: Database Error in model un_unternehmen_sat (models/2_un/partner/sats/un_unternehmen_sat.sql) Couldn't initialize file system for path abfss://dp-ext-fab@stcssdpextfabprd.dfs.core.windows.net/__unitys...
Hi @Th0r, here is the explanation: shallow clones in Databricks rely on references to the data files of the original table. If the original table is dropped, recreated, or altered in a way that changes its underlying files, the shallow clone's references ...
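To illustrate the failure mode described above (table names are hypothetical):

```sql
-- A shallow clone copies only metadata and keeps pointing at the source
-- table's data files.
CREATE TABLE analytics.orders_clone SHALLOW CLONE analytics.orders;

-- If analytics.orders is later dropped, overwritten, or vacuumed, the clone's
-- file references go stale and reads can fail with storage errors such as
-- "Couldn't initialize file system for path abfss://...".
SELECT COUNT(*) FROM analytics.orders_clone;
```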
- 91 Views
- 0 replies
- 0 kudos
Databricks Asset Bundles: a new way for amazing ETL
They say having the right tools at your disposal can make all the difference when navigating complex terrain. For organizations leveraging Databricks, simplifying deployment and scaling operations is often a key challenge. Over the years, I've explor...
- 2095 Views
- 2 replies
- 0 kudos
UserAgentEntry added to JDBC URL but not visible in Audit logs
Hi, as part of Databricks best practices, I have added 'UserAgentEntry' to the JDBC URL that is created when we execute SQL statements through the JDBC driver. Sample URL: jdbc:databricks://<host>:443;httpPath=<httpPath>;AuthMech=3;UID=token;...
Sorry, I was mistaken; please ignore the previous response. The correct one is: jdbc:databricks://<host>:443;httpPath=<httpPath>;AuthMech=3;UID=token;PWD=<token>;UserAgentEntry=<ApplicationName/Year>;
- 167 Views
- 2 replies
- 1 kudos
Resolved! Exhausted Server when deploying a Databricks Assets Bundle (DAB)
Hello, I'm currently inspecting the code with a colleague, and when trying to deploy the DAB it gets stuck: (.venv) my_user@my_pc my-dab-project % databricks bundle deploy -t=dev -p=my-dab-project-prod Building wheel... Uploading my-dab-project-...
You are using a venv; it has far too many files and does not need to be included in the bundle. Try adding a sync exclude for "venv" to your databricks.yml. Hope it helps.
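Expanded as a databricks.yml fragment, that exclude looks like the following. The directory names are examples; match whatever your virtual environment folder is actually called (the prompt above suggests ".venv"):

```yaml
# databricks.yml — keep the local virtual environment out of bundle sync
sync:
  exclude:
    - ".venv"
    - "venv"
```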
- 180 Views
- 3 replies
- 0 kudos
Delete the AWS Databricks account
I created an AWS Databricks account from the AWS Marketplace, but I cancelled the subscription after the 14-day free trial. However, I still see the account. How do I delete the Databricks account associated with my ema...
@dhruv1 As mentioned, it would be best to reach out to support for assistance: https://help.databricks.com/s/signuprequest