- 9314 Views
- 2 replies
- 1 kudos
Resolved! Any way to move the unity catalog to a new external storage location?
Dear Databricks Community, The question is about changing an existing Unity Catalog to a new storage location. For example: an existing Unity Catalog (i.e. catalog1) including schemas and volumes. The catalog is based on an external location (i....
https://docs.databricks.com/ja/sql/language-manual/sql-ref-syntax-ddl-alter-location.html
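For readers landing here, the linked page covers `ALTER EXTERNAL LOCATION ... SET URL`. A minimal sketch run from a notebook might look like the following; the location name and storage URL are placeholders, and data already written under the old path is not moved for you.

```python
# Minimal sketch (location name and URL are placeholders): re-point an existing
# Unity Catalog external location at new storage. Requires sufficient privileges
# on the external location. Existing files are NOT copied; move the data
# separately (e.g. DEEP CLONE or a storage-level copy) before switching.
spark.sql("""
  ALTER EXTERNAL LOCATION `catalog1_ext_loc`
  SET URL 'abfss://newcontainer@newstorageaccount.dfs.core.windows.net/catalog1'
  FORCE  -- skips the safety check that the old URL is no longer referenced
""")
```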
- 252 Views
- 3 replies
- 1 kudos
cloud_infra_costs
I was looking at the system catalog and realized that there is an empty table called cloud_infra_costs. Could you tell me what this is for and why it is empty?
Thanks for replying. This makes sense. Any idea why it is empty and what to do to populate this?
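If it helps narrow down the "why is it empty" question, a quick check from a notebook is to compare it with a billing table that is known to receive data once system schemas are enabled. The three-level name below is an assumption about where the table lives; adjust it to the schema where you found it.

```python
# Minimal sketch (the schema qualifier is an assumption -- use the schema where
# you saw the table): check whether cloud_infra_costs simply has no rows yet,
# versus system.billing.usage, which is populated when billing system tables are enabled.
spark.sql("SELECT COUNT(*) AS n FROM system.billing.cloud_infra_costs").show()
spark.sql("SELECT MAX(usage_date) AS latest_usage FROM system.billing.usage").show()
```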
- 360 Views
- 5 replies
- 2 kudos
Ingress/Egress private endpoint
Hello, We have configured our Databricks environment with private endpoint connections injected into our VNET, which includes two subnets (public and private). We have disabled public IPs and are using Network Security Groups (NSGs) on the subnet, as...
@Fkebbati First, traffic costs in Azure are not reported as a separate Resource Type, but appended to the main resource causing the traffic. If you want to distinguish them, use for instance the Service Name. In this case the traffic cost is appended to Databricks...
- 11738 Views
- 1 replies
- 2 kudos
Authentication for Databricks Apps
Databricks Apps allows us to define dependencies & an entrypoint to execute a Python application like Gradio, Streamlit, etc. It seems I can also run a FastAPI application and access it via an authenticated browser, which is potentially a very powerful c...
According to the response in this Reddit thread, it appears that this feature is not yet supported: https://www.reddit.com/r/databricks/comments/1g5ni8e/comment/lsceh4n/
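For context on what already works today: a FastAPI app deployed as a Databricks App is reached through the platform's authenticating proxy, and the proxy forwards identity headers to the app. A minimal sketch is below; the exact header names are an assumption to verify against the current Databricks Apps documentation.

```python
# Minimal sketch, assuming the Apps reverse proxy forwards identity headers
# (the header names here are an assumption -- check the Databricks Apps docs).
from fastapi import FastAPI, Request

app = FastAPI()

@app.get("/whoami")
async def whoami(request: Request):
    # Browser requests only reach the app after Databricks has authenticated
    # the user; the proxy then injects headers describing that user.
    return {
        "email": request.headers.get("x-forwarded-email"),
        "user": request.headers.get("x-forwarded-preferred-username"),
        "has_user_token": "x-forwarded-access-token" in request.headers,
    }
```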
- 698 Views
- 3 replies
- 0 kudos
Databricks (GCP) Cluster not resolving Hostname into IP address
We have #mongodb hosts that must be resolved to private internal load balancer IPs (of another cluster), and we are unable to add host aliases in the Databricks GKE cluster so that Spark can connect to MongoDB and resolve t...
Also found this - https://community.databricks.com/t5/data-engineering/why-i-m-getting-connection-timeout-when-connecting-to-mongodb/m-p/14868
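One common workaround when you cannot change DNS is a cluster-scoped init script that pins the hostnames in /etc/hosts on every node. The sketch below writes such a script to a Unity Catalog volume; the hostnames, IPs and volume path are all hypothetical.

```python
# Minimal sketch (hostnames, IPs and the volume path are hypothetical): generate a
# cluster-scoped init script that maps MongoDB hostnames to internal load balancer IPs.
hosts = {
    "mongo-0.internal.example.com": "10.20.0.11",
    "mongo-1.internal.example.com": "10.20.0.12",
}
body = "\n".join(f'echo "{ip} {name}" >> /etc/hosts' for name, ip in hosts.items())
script = "#!/bin/bash\nset -e\n" + body + "\n"

# Store the script, then reference it under the cluster's Advanced options > Init scripts.
dbutils.fs.put("/Volumes/main/ops/init_scripts/add_mongo_hosts.sh", script, True)
```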
- 749 Views
- 3 replies
- 0 kudos
Data leakage risk when using the Azure Databricks workspace
Context: We are utilizing an Azure Databricks workspace for data management and model serving within our project, with delegated VNet and subnets configured specifically for this workspace. However, we are consistently observing malicious flow entries...
Hello everyone! We have worked with our security team, Microsoft, and other customers who have seen similar log messages. This log message is very misleading, as it appears to state that the malicious URI was detected within your network — this would...
- 291 Views
- 1 replies
- 0 kudos
Unable to add a databricks permission to existing policy
Hi, We're using the databricks Terraform provider v1.49.1 to manage our Azure Databricks cluster and other resources. We're having an issue setting permissions with the Terraform resource "databricks_permissions", where the error indicates that the clust...
Is this cluster policy a custom policy? If you try, for testing purposes, to modify it in the UI, does it allow you to
- 299 Views
- 1 replies
- 0 kudos
Compute terminated. Reason: Control Plane Request Failure
Hi, I have started to get this error: "Failed to get instance bootstrap steps from the Databricks Control Plane. Please check that instances have connectivity to the Databricks Control Plane." and I suspect it has to do with networking. I am just a...
Are you still facing issues? I would suggest checking the security groups and making sure they match with: https://docs.databricks.com/en/security/network/classic/customer-managed-vpc.html#security-groups
Additionally, check if the inbound and outbound addr...
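A quick way to test the connectivity side from an instance in the same subnet (or a notebook on a cluster that does start) is a plain TCP probe on port 443. The hostnames below are placeholders; use the control-plane and relay addresses listed for your region in the Databricks IP/domain documentation.

```python
# Minimal sketch (hostnames are placeholders): check outbound TCP reachability
# to the Databricks control plane on port 443.
import socket

def can_reach(host: str, port: int = 443, timeout: float = 5.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError as exc:
        print(f"{host}:{port} unreachable: {exc}")
        return False

for host in ["<workspace-url>.cloud.databricks.com", "<regional-scc-relay-host>"]:
    print(host, "reachable" if can_reach(host) else "blocked")
```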
- 392 Views
- 1 replies
- 0 kudos
Updating databricks git repo from github action - how to
Hi, My company is migrating from Azure DevOps to GitHub, and we have a pipeline in Azure DevOps which updates/syncs Databricks repos whenever a pull request is made to the development branch. The Azure DevOps pipeline (which works) looks like this: trigge...
It seems like the issue you're encountering is related to missing Git provider credentials when trying to update the Databricks repo via GitHub Actions. Based on the context provided, here are a few steps you can take to resolve this issue: Verify...
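Once the Git provider credential is in place, the actual "sync the repo" step can be a few lines of the Databricks Python SDK run inside the GitHub Actions job. A minimal sketch, assuming DATABRICKS_HOST and DATABRICKS_TOKEN are set in the workflow environment; the DATABRICKS_REPO_ID variable and the branch name are hypothetical.

```python
# Minimal sketch using databricks-sdk: update a workspace repo to a given branch.
# The identity behind the token must already have a GitHub Git credential registered,
# otherwise the underlying pull will fail with a credentials error.
import os
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up DATABRICKS_HOST / DATABRICKS_TOKEN from the environment

w.repos.update(
    repo_id=int(os.environ["DATABRICKS_REPO_ID"]),  # hypothetical workflow variable
    branch="development",
)
```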
- 127 Views
- 1 replies
- 0 kudos
Customer Facing Integration
Is Databricks intended to be used in customer-facing application architectures? I have heard that Databricks is primarily intended to be internally facing. Is this true? If you are using it for customer-facing ML applications, what tool stack are yo...
Hi @hucklebarryrees, Databricks is indeed primarily designed as an analytical platform rather than a transactional system. It’s optimized for data processing, machine learning, and analytics rather than handling high-frequency, parallel transactional...
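To make the split concrete: the customer-facing service usually lives outside Databricks and calls a Model Serving endpoint over REST. A minimal sketch is below; the endpoint name, environment variables and payload are all hypothetical.

```python
# Minimal sketch (endpoint name, host/token variables and payload are hypothetical):
# a backend service calling a Databricks Model Serving endpoint over REST.
import os
import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-xxxx.azuredatabricks.net
token = os.environ["DATABRICKS_TOKEN"]  # ideally a service principal token, not a user PAT

resp = requests.post(
    f"{host}/serving-endpoints/churn-model/invocations",
    headers={"Authorization": f"Bearer {token}"},
    json={"dataframe_records": [{"customer_id": 123, "tenure_months": 8}]},
    timeout=10,
)
resp.raise_for_status()
print(resp.json())
```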
- 963 Views
- 2 replies
- 1 kudos
Resolved! Azure Databricks Unity Catalog - Cannot access Managed Volume in notebook
The problem: After setting up Unity Catalog and a managed Volume, I can upload/download files to/from the volume in the Databricks Workspace UI. However, I cannot access the volume from a notebook. I created an All-purpose compute and ran dbutils.fs.ls("/Vo...
I found the reason and a solution, but I feel this is a bug, and I wonder what the best practice is. When I enable the ADLS Gen2 storage account's public network access from all networks as shown below, I can access the volume from a notebook. However, if I enable the...
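For anyone hitting the same thing, a quick way to confirm it is a networking problem rather than a permissions problem is to run the listing from a cluster and look at the error type. The catalog/schema/volume names below are hypothetical.

```python
# Minimal sketch (catalog/schema/volume names are hypothetical): if the UI works but
# this call times out or gets a 403 from storage, the storage account firewall /
# private endpoint setup is the likely culprit rather than Unity Catalog grants.
try:
    display(dbutils.fs.ls("/Volumes/main/raw/landing"))
except Exception as exc:
    print(type(exc).__name__, exc)
```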
- 246 Views
- 4 replies
- 0 kudos
Unity Group management, Group: Manager role
We would like to have the ability to assign an individual and/or group to the "Group: Manager" role, providing them with the ability to add/remove users without the need to be an account or workspace administrator. Ideally this would be an option fo...
Thanks @NandiniN, we have looked through that documentation and still have not been able to get anything to work without the user also being an account or workspace admin. The way I'm interpreting the documentation (screenshot) is that the API currently...
- 269 Views
- 2 replies
- 0 kudos
List files in Databricks Workspace with Databricks CLI
I want to list all files in my Workspace with the CLI. There's a command for it: databricks fs ls dbfs:/ When I run this, I get this result: I can then list the content of databricks-datasets, but no other directory. How can I list the content of the Wo...
I know it's possible with the Databricks SDK, but I want to solve it with the CLI in the terminal.
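For the CLI route, the Workspace tree is listed with `databricks workspace list` rather than `databricks fs ls` (which only sees DBFS). The subprocess wrapper below is only to keep the example in Python; in the terminal you would run the command directly, and it assumes the CLI is already configured.

```python
# Minimal sketch: list the Workspace tree via the Databricks CLI.
# Terminal equivalent: databricks workspace list /
import subprocess

out = subprocess.run(
    ["databricks", "workspace", "list", "/"],
    capture_output=True, text=True, check=True,
)
print(out.stdout)
```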
- 136 Views
- 1 replies
- 0 kudos
Enabling Object Lock for the S3 bucket that is delivering audit logs
Hello Community, I am trying to enable Object Lock on the S3 bucket to which the audit logs are delivered, but the following error occurs if Object Lock is enabled when the delivery settings are enabled: > {"error_code":"PERMISSION_DENIED","message":"Fai...
Hi @hiro12, Enabling Object Lock on an S3 bucket after configuring the delivery settings should not affect the ongoing delivery of audit logs. But I would say it is better to understand the root cause of the error. The error you encountered when ena...
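As a first debugging step it can help to confirm what Object Lock configuration the bucket actually has before touching the delivery settings. A minimal boto3 sketch, with a hypothetical bucket name:

```python
# Minimal sketch (bucket name is hypothetical): inspect the bucket's Object Lock
# mode/default retention and versioning state (Object Lock requires versioning).
import boto3

s3 = boto3.client("s3")
bucket = "my-audit-log-bucket"

print(s3.get_object_lock_configuration(Bucket=bucket).get("ObjectLockConfiguration"))
print(s3.get_bucket_versioning(Bucket=bucket))
```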
- 1579 Views
- 8 replies
- 2 kudos
Open Delta Sharing and Deletion Vectors
Hi, Just experimenting with open Delta Sharing and running into a few technical traps. Mainly that if deletion vectors are enabled on a Delta table (which they are by default now), we get errors when trying to query a table (specifically with Power BI)...
@NandiniN we are talking about a Power BI connection, so you cannot set that option. @F_Goudarzi I have just tried it out with PBI Desktop version 2.132.1053.0 and it is working (I did not disable Deletion Vectors on my table). I also tried with the last versio...
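If the recipient-side client cannot be upgraded, the table-side workaround discussed in threads like this is to turn deletion vectors off and purge the existing ones. A minimal sketch with a hypothetical table name; note that the purge rewrites data files and affects the table for all readers.

```python
# Minimal sketch (table name is hypothetical): disable deletion vectors on a shared
# table and rewrite files that still carry them so older readers can consume it.
spark.sql("""
  ALTER TABLE catalog1.sales.orders
  SET TBLPROPERTIES ('delta.enableDeletionVectors' = false)
""")
spark.sql("REORG TABLE catalog1.sales.orders APPLY (PURGE)")
```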