- 860 Views
- 1 replies
- 0 kudos
Can I edit the ADLSg2 storage location for a schema?
I want to alter the schema and basically point it to a new path in the data lake #UnityCatalog
- 0 kudos
I don't think so. You can alter the owner and dbproperties using the ALTER SCHEMA command, but not the location. https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-schema
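For reference, a minimal sketch of what that command does and does not allow, run from a notebook where `spark` is available; the catalog, schema, and path names are placeholders:

```python
# Supported by ALTER SCHEMA in Unity Catalog: owner and schema properties.
spark.sql("ALTER SCHEMA main.my_schema SET OWNER TO `data-eng-team`")
spark.sql("ALTER SCHEMA main.my_schema SET DBPROPERTIES ('team' = 'data-eng')")

# Not supported: changing the storage location of an existing schema. The
# usual workaround is to create a new schema at the desired path and migrate
# the tables into it.
spark.sql("""
    CREATE SCHEMA main.my_schema_v2
    MANAGED LOCATION 'abfss://container@account.dfs.core.windows.net/new/path'
""")
```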
- 2797 Views
- 0 replies
- 0 kudos
Struggling with UC Volume Paths
I am trying to set up my volumes and give them paths in the data lake, but I keep getting this message: Input path url 'abfss://my-container@my-storage-account.dfs.core.windows.net/' overlaps with managed storage within 'CreateVolume' call. There WAS some...
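No reply was posted, but this error generally means the chosen URL falls inside a catalog's or schema's managed-storage root. A minimal sketch of the usual way around it, with placeholder names (an external volume's path must also sit under an external location you have privileges on):

```python
# A managed volume needs no path at all; its files live in managed storage.
spark.sql("CREATE VOLUME main.my_schema.landing")

# An external volume needs a path that does NOT overlap any managed-storage
# root, so point it at a dedicated subdirectory rather than the container root.
spark.sql("""
    CREATE EXTERNAL VOLUME main.my_schema.raw_files
    LOCATION 'abfss://my-container@my-storage-account.dfs.core.windows.net/volumes/raw_files'
""")
```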
- 3404 Views
- 0 replies
- 0 kudos
Error: cannot create permissions: invalid character '<' looking for beginning of value
I'm trying to use Terraform to assign a cluster policy to an account-level group (sync'd from AAD via SCIM). My provider is configured like this: provider "databricks" { alias = "azure_account" host = "accounts.azuredatabricks.net" account_id = "%DATABRICKS...
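No reply was posted. The '<' in the error usually means the provider received an HTML page (for example a login redirect) instead of JSON. A minimal sketch for checking account-level auth outside Terraform, assuming the databricks-sdk Python package; every credential shown is a placeholder:

```python
from databricks.sdk import AccountClient

account = AccountClient(
    host="https://accounts.azuredatabricks.net",  # account console, not a workspace URL
    account_id="<account-id>",                    # placeholder
    azure_client_id="<sp-client-id>",             # placeholder service principal
    azure_client_secret="<sp-secret>",            # placeholder
    azure_tenant_id="<tenant-id>",                # placeholder
)

# If this lists the account-level groups, the credentials are fine and the
# problem is likely in the Terraform provider block (e.g. a missing https://
# on host, or an account-level resource using a workspace-level provider).
for group in account.groups.list():
    print(group.display_name)
```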
- 8479 Views
- 0 replies
- 0 kudos
[Possible Bug] Repo Notebooks being modified without human interaction
Our production workspace has several Repos integrated with GitHub. These repos always point to master and should never be modified manually by a human directly in the workspace, as the pulls are triggered by a GitHub Actions workflow. This workflow cal...
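For context, a minimal sketch of the kind of call such a workflow typically makes via the Repos API; the host, token, and repo id are placeholders:

```python
import requests

host = "https://<workspace-url>"   # placeholder
token = "<api-token>"              # placeholder
repo_id = 123                      # placeholder

# PATCH /api/2.0/repos/{repo_id} checks the repo out at the latest commit of
# the given branch, i.e. it performs the "pull" mentioned above.
resp = requests.patch(
    f"{host}/api/2.0/repos/{repo_id}",
    headers={"Authorization": f"Bearer {token}"},
    json={"branch": "master"},
)
resp.raise_for_status()
```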
- 1959 Views
- 1 replies
- 1 kudos
Resolved! Terraform Repos Git URL Allow List
Hi, I am provisioning Databricks workspaces using Terraform and want to add a specific GitHub repo URL that can be used. In the UI there is an option for that, but when it comes to Terraform there is nothing specific. I came across the custom_config option here ...
- 1 kudos
Hello, this can normally be achieved using this Terraform resource:

```hcl
resource "databricks_workspace_conf" "this" {
  custom_config = {
    "enableProjectsAllowList" : true,
    "projectsAllowList" : "url1,url2,url3",
  }
}
```

Cheers
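The same setting can presumably also be applied directly through the workspace-conf REST API that this resource wraps; a minimal sketch with placeholder host and token:

```python
import requests

host = "https://<workspace-url>"   # placeholder
token = "<api-token>"              # placeholder

# PATCH /api/2.0/workspace-conf takes flat string key/value pairs.
resp = requests.patch(
    f"{host}/api/2.0/workspace-conf",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "enableProjectsAllowList": "true",
        "projectsAllowList": "url1,url2,url3",
    },
)
resp.raise_for_status()
```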
- 8008 Views
- 0 replies
- 0 kudos
Databricks workspace in our own VNET
We have set up a Databricks workspace in our own Azure VNET, including a private endpoint. Connecting to the WS works fine (through the private IP address). However, when creating my first cluster, I run into this problem: "ADD_NODES_FAILED...Failed to...
- 623 Views
- 0 replies
- 0 kudos
Data Marketplace private exchange
I want to use Data Marketplace, but only in private/local mode, so I don't want to publish any products outside my organization. I know I can create a private listing, but it can be done only from the provider console. I'm added to the marketplace role but not s...
- 4979 Views
- 1 replies
- 2 kudos
Assigning Databricks Account Admin role to User group
At my current organization, we have a few users with the Databricks Account admin role assigned. But as per our company policy, individual users should not be given such elevated privileges. They should be given to user groups so that users in those ...
- 2 kudos
Hi Kaniz, thank you for the feedback. I was able to find the solution below from the article you mentioned: https://docs.databricks.com/en/administration-guide/users-groups/groups.html#assign-account-admin-roles-to-a-group Seems we can use Databrick...
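For anyone landing here, a minimal sketch of the group-based assignment that the linked doc describes, via the account-level SCIM API; the account id, group id, and token are placeholders:

```python
import requests

account_host = "https://accounts.azuredatabricks.net"
account_id = "<account-id>"        # placeholder
group_id = "<group-id>"            # placeholder
token = "<account-admin-token>"    # placeholder

# Add the account_admin role to a group, so membership in the group grants it.
resp = requests.patch(
    f"{account_host}/api/2.0/accounts/{account_id}/scim/v2/Groups/{group_id}",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": [
            {"op": "add", "path": "roles", "value": [{"value": "account_admin"}]}
        ],
    },
)
resp.raise_for_status()
```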
- 1183 Views
- 2 replies
- 1 kudos
Global search programatically
Hi! In the workspace header there is a search box that allows us to look for text in all notebooks in the workspace. Is there a way, via CLI or API, to call the global search https://<workspace-domain>/graphql/SearchGql so the result can be analysed a...
- 1 kudos
If you have checked the notebooks into a Git repo, searching in the Git repo (or via its API) might save you.
- 1433 Views
- 1 replies
- 1 kudos
Resolved! REST API workspace list content doesn't work with Queries
Hi, I'm trying to export the SQL queries in certain folders in the workspace, but the list content API call GET /api/2.0/workspace/list doesn't work with queries. How should I export only the queries in a certain folder in the workspace? Thank you very much.
- 1 kudos
@Xyguo - Currently, exporting a SQL query file is not supported. Kindly create an idea by following the article listed - https://docs.databricks.com/en/resources/ideas.html#create-an-idea-in-the-ideas-portal to raise a feature reques...
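Until then, a partial workaround might be the legacy SQL Queries API, which returns query text, although it lists queries across the workspace rather than per folder; a minimal sketch with placeholder host and token:

```python
import requests

host = "https://<workspace-url>"   # placeholder
token = "<api-token>"              # placeholder
headers = {"Authorization": f"Bearer {token}"}

page = 1
while True:
    resp = requests.get(
        f"{host}/api/2.0/preview/sql/queries",
        headers=headers,
        params={"page": page, "page_size": 25},
    )
    resp.raise_for_status()
    results = resp.json().get("results", [])
    if not results:
        break
    for q in results:
        print(q["name"], "--", q.get("query"))  # query name and SQL text
    page += 1
```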
- 7666 Views
- 2 replies
- 1 kudos
Resolved! Multiple orphan vms in managed resource group after starting and terminating my personal cluster
Hi, today I had a problem starting the cluster, and therefore I started and terminated it multiple times. The problem is that each of these actions started a new VM in the managed resource group but never turned it off. Therefore I ended up with mul...
- 1 kudos
It always takes a few minutes, even 10, before the machine is terminated in Azure. It is better to set up a pool of machines in Databricks and use them from there, so we will not be keeping zombie machines.
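A minimal sketch of creating such a pool, assuming the databricks-sdk Python package; the name, node type, and sizing are placeholder choices:

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # reads host/token from the environment or CLI config

pool = w.instance_pools.create(
    instance_pool_name="shared-pool",          # placeholder
    node_type_id="Standard_DS3_v2",            # placeholder Azure node type
    min_idle_instances=0,                      # keep no idle VMs warm
    idle_instance_autotermination_minutes=15,  # reclaim idle VMs quickly
)
print(pool.instance_pool_id)
```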
- 1826 Views
- 0 replies
- 1 kudos
Databricks Enhances Job Monitoring with Duration Thresholds for Workflow
Databricks has introduced Duration Thresholds for workflows. This new addition allows users to set time limits for workflow execution, significantly improving monitoring of job performance. When a job exceeds the preset duration, the system triggers ...
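A minimal sketch of configuring such a threshold, assuming the Jobs API health rules exposed by the databricks-sdk Python package; the job name, notebook path, and cluster id are placeholders:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()

w.jobs.create(
    name="nightly-etl",  # placeholder
    tasks=[
        jobs.Task(
            task_key="main",
            notebook_task=jobs.NotebookTask(notebook_path="/path/to/notebook"),
            existing_cluster_id="<cluster-id>",  # placeholder
        )
    ],
    # Trigger a notification if the run exceeds one hour.
    health=jobs.JobsHealthRules(
        rules=[
            jobs.JobsHealthRule(
                metric=jobs.JobsHealthMetric.RUN_DURATION_SECONDS,
                op=jobs.JobsHealthOperator.GREATER_THAN,
                value=3600,
            )
        ]
    ),
)
```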
- 651 Views
- 0 replies
- 0 kudos
Multi bronze layers : multi client data
Data is coming in from multiple sources belonging to multiple customers. Should I create a single storage account, name it bronze, and put each client's data in a separate container, then merge the data in the silver layer? What's the best p...
- 4910 Views
- 1 replies
- 0 kudos
Resolved! Can't set account admin using Terraform
I want to set the account admin role for a service principal in order to create the Unity Catalog metastore. The Terraform code looks like this: data "databricks_service_principal" "application" { count = var.environment == "dev" ? 1 : 0 application_...
- 3586 Views
- 2 replies
- 0 kudos
INVALID_PARAMETER_VALUE.LOCATION_OVERLAP: overlaps with managed storage error with S3 paths
We're trying to read from an S3 bucket using Unity Catalog and are selectively getting "INVALID_PARAMETER_VALUE.LOCATION_OVERLAP: overlaps with managed storage" errors within the same bucket. This works: dbutils.fs.ls("s3://BUCKETNAME/dev/he...
- 0 kudos
Hi, could you please elaborate on the issue here? Running the list command on a managed directory is not supported in Unity Catalog; catalog/schema storage locations are reserved for managed storage. Please tag @Debayan in your next comment, which wi...
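To see which reserved root a failing path overlaps, something like this minimal sketch may help (placeholder names, run from a notebook):

```python
# External locations registered in the metastore, with their URLs.
display(spark.sql("SHOW EXTERNAL LOCATIONS"))

# The managed-storage roots reserved by a catalog or schema.
display(spark.sql("DESCRIBE CATALOG EXTENDED my_catalog"))
display(spark.sql("DESCRIBE SCHEMA EXTENDED my_catalog.my_schema"))
```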