- 256 Views · 3 replies · 0 kudos
Databricks Workspace Access and Permissions
Hi Team, the GCP Databricks URL https://accounts.gcp.databricks.com/ is linked to the GCP Billing Account. We have two clients with separate GCP Organizations: client1.example.com and client2.example.com. Both GCP Organizations share the sam...
@karthiknuvepro The Databricks account should be handled by a third-party cloud administration team. The workspace admins can work with them to set up the necessary cloud resources to support their catalogs and user additions/removals from their selected a...
- 5 Views · 0 replies · 0 kudos
Querying view gives spurious error
When trying to query a view, my_view, we sometimes see a spurious error. This seems to occur after the table underlying the view has been updated. The error persists for a while and then it seems to fix itself. Error running query: [42501] [Simba][Ha...
- 34 Views · 0 replies · 0 kudos
Reading delta table from ADLS gen2 using ABFS driver
Scenario: I have an ADLS Gen2 account and am trying to read a Delta table using the ABFS driver. I am using Databricks serverless compute. There are no firewalls in place, as I am working with sample data. There's network line of sight between Databricks serv...
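The setup described above can be sketched as follows; the storage account, container, and table path are hypothetical placeholders, not names from the original post.

```python
# Sketch: building the abfss:// URI used to read a Delta table from ADLS Gen2.
# "mycontainer", "mystorageacct", and the table path are hypothetical.

def abfss_path(container: str, account: str, relative: str) -> str:
    """Build an abfss:// URI for a location in an ADLS Gen2 account."""
    return f"abfss://{container}@{account}.dfs.core.windows.net/{relative}"

path = abfss_path("mycontainer", "mystorageacct", "tables/my_delta_table")

# On a Databricks cluster, the read itself would then be:
#   df = spark.read.format("delta").load(path)
```

Note that on serverless compute, access to ADLS Gen2 is typically granted through Unity Catalog external locations or storage credentials rather than cluster-scoped Spark configs, which is a common source of errors in this scenario.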
- 219 Views · 2 replies · 1 kudo
How to delete or clean Legacy Hive Metastore after successful completion of UC migration
Say we have completed the migration of tables from Hive Metastore to UC. All the users, jobs, and clusters are switched to UC, and there is no more activity on the legacy Hive Metastore. What is the best recommendation on deleting or cleaning the Hive Metastor...
Hi @bhanu_dp, if I understood correctly, all the tables/objects in your Hive Metastore are now migrated to a UC catalog. If you have used UCX for the migration, you might not see the process/workflow for removing the Hive Metastore from your workspace as...
- 740 Views · 3 replies · 2 kudos
Privileged Identity Management for Databricks with Microsoft Entra ID
Privileged Identity Management (PIM) can be used to secure access to critical Databricks roles with Just-in-Time (JIT) access. This approach helps organizations enforce time-bound permissions, approval workflows, and centralized auditing for sensitiv...
Thanks for sharing this, it is helpful. However, onboarding the AAD group as an account admin under the Databricks account is not straightforward and is also not clearly explained in the blog.
- 241 Views · 1 reply · 0 kudos
Databricks App : Limitations
I have some questions regarding Databricks Apps. 1) Can we use a framework other than those mentioned in the documentation (Streamlit, Flask, Dash, Gradio, Shiny)? 2) Can we allocate more than 2 vCPUs and 6 GB of memory to an app? 3) Any other programming language o...
1) You can use most Python-based application frameworks, including some beyond those mentioned above (reference here). 2) Currently, app capacity is limited to 2 vCPUs and 6 GB of RAM. However, future updates may introduce options for scaling out an...
- 12164 Views · 2 replies · 2 kudos
Authentication for Databricks Apps
Databricks Apps allows us to define dependencies & an entrypoint to execute a Python application like Gradio, Streamlit, etc. It seems I can also run a FastAPI application and access via an authenticated browser which is potentially a very powerful c...
Hello, thank you for your questions and answers regarding this topic. Is this feature available right now, or is it still not supported? Thank you in advance.
- 193 Views · 2 replies · 1 kudo
Databricks Network Policies
Hi Databricks community. I have 2 questions that I'd appreciate if you could shed some light on: Are the new Network Policies in the Databricks account only applicable to serverless compute, or are they workspace-wide policies which apply to all other comp...
I am not support staff, just a regular customer like you, but here is what I know: 1. Yes, serverless egress control only applies to serverless. There is another upcoming change you'll need to make for your classic compute, announced by Microsoft at Default outbou...
- 1203 Views · 2 replies · 1 kudo
Vocareum Lab issue
Hi, I have taken a Lab Subscription from Databricks Academy. It worked fine for me for one day, but as I try to log in today, it is not starting up. I really appreciate any help in this regard. I have things planned to do on the lab, and when it is not sta...
Hi, was this resolved? Can I know the solution, please? Also, is there any restriction on the number of days to use this lab? I am getting a 'Session ended' message even after I log in again. Regards, Deepa
- 61 Views · 0 replies · 0 kudos
From Google Cloud Storage
Hi everyone, I'm new to Databricks and am trying to connect my Google Cloud Storage bucket to my Databricks workspace. I have a 43 GB CSV file stored in a GCP bucket that I want to work with. Here's what I've done so far: Bucket setup: I created a GCP bu...
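As a rough sketch of reading such a file from Spark, assuming a hypothetical bucket name and object path (not taken from the original post); the reader options shown are standard Spark CSV options:

```python
# Sketch: reader configuration for a large CSV in a GCS bucket.
# "my-bucket" and the object path are hypothetical.

gcs_path = "gs://my-bucket/data/large_file.csv"

# For a 43 GB file, supplying an explicit schema avoids the extra full pass
# over the data that schema inference would otherwise make.
csv_options = {
    "header": "true",
    "inferSchema": "false",
}

# With bucket access in place (a service account, or on Unity Catalog an
# external location), the read would be:
#   df = spark.read.options(**csv_options).schema(my_schema).csv(gcs_path)
```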
- 130 Views · 0 replies · 0 kudos
Create account group with terraform without account admin permissions
I’m trying to create an account-level group in Databricks using Terraform. When creating a group via the UI, it automatically becomes an account-level group that can be reused across workspaces. However, I’m struggling to achieve the same using Terra...
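A minimal sketch of the usual Terraform approach, assuming the `databricks` provider is configured against the account console (which does require account-admin rights, or a service principal granted that role); the `account_id` below is a placeholder:

```hcl
# Account-level provider: groups created through the account API are
# account groups, reusable across workspaces. account_id is a placeholder.
provider "databricks" {
  alias      = "account"
  host       = "https://accounts.cloud.databricks.com"
  account_id = "00000000-0000-0000-0000-000000000000"
}

resource "databricks_group" "data_engineers" {
  provider     = databricks.account
  display_name = "data-engineers"
}
```

A workspace-scoped provider, by contrast, generally cannot create account-level groups, which matches the behavior described above.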
- 152 Views · 1 reply · 0 kudos
Convert Account to Self-managed
I am in the process of setting up a new Databricks account for AWS commercial. I mistakenly set up the account with the email databricks-external-nonprod-account-owner@slingshotaerospace.com to not be self-managed, and I would like for this new accoun...
Or better yet if we could delete it so I can re-create the account.
- 143 Views · 0 replies · 0 kudos
Migrate to a new account
Hey Team, we're looking into migrating our current Databricks solution from one AWS account (us-east-1 region) to another (eu-central-1 region). I have no documentation left about how the current solution was provisioned, but I can see CloudFormation...
- 169 Views · 2 replies · 0 kudos
AWS Secrets Manager access
Hi, I am trying to establish a method of accessing secrets from AWS Secrets Manager and understand this can be done with boto, as suggested by AWS. We have created all of the relevant IAM roles, instance profiles, etc. Accessing S3 with this method is ...
Hi @Alberto_Umana, yes, the role has the SecretsManagerReadWrite policy. Also, within my further investigation I tried running it via a Personal Cluster, and it worked! Basically, 3 scenarios:
- Shared Cluster with applied InstanceProfile - secrets failing
- ...
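For reference, a minimal boto3 sketch of reading a secret via whatever credentials the cluster exposes (e.g. an instance profile); the region default and secret names are hypothetical:

```python
import json

def parse_secret(response: dict) -> dict:
    """Decode the JSON payload of a GetSecretValue response."""
    return json.loads(response["SecretString"])

def get_secret(name: str, region: str = "us-east-1") -> dict:
    """Fetch a secret from AWS Secrets Manager. Requires boto3 and working
    AWS credentials (e.g. an attached instance profile) at call time."""
    import boto3  # imported lazily; only needed when actually calling AWS
    client = boto3.client("secretsmanager", region_name=region)
    return parse_secret(client.get_secret_value(SecretId=name))

# Example payload shape returned by Secrets Manager:
#   {"SecretString": "{\"user\": \"svc\", \"password\": \"...\"}"}
```

The shared-vs-personal-cluster difference described above is consistent with shared (standard access mode) clusters not exposing instance-profile credentials to user code the way single-user clusters do.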
- 111 Views · 1 reply · 0 kudos
Communication between multiple region workspaces & metastore
Hi everyone, we currently have a workspace and metastore in the East US region, and we're planning to set up another workspace and metastore in the Canada region. Additionally, we need to be able to access data from the Canada region within the US regio...
Hello @Dnirmania! Delta Sharing is ideal for read-only, cross-platform data access without duplication. Direct metastore connections offer low-latency access between Databricks workspaces under a unified governance model. Additionally, you may explor...