- 3368 Views
- 1 replies
- 0 kudos
Managing Databricks workspace permissions
I need assistance writing API/Python code to manage a Databricks workspace permissions database (Unity Catalog). The task involves obtaining a list of workspace details from the account console, which includes various details like workspace name,...
Here's a start: https://docs.databricks.com/api/workspace/workspacebindings/updatebindings As far as coding goes, I use curl. See the attachment for the syntax. Note the example in the attachment is for Workspace notebooks, as opposed to Workspace envir...
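For reference, here is a rough Python sketch of what such a curl call would construct (the endpoint path is the one from the linked API docs; the host, catalog name, and workspace ID are placeholders, not real values):

```python
import json

# Sketch only: builds the request for the Unity Catalog workspace-bindings call
# (PATCH /api/2.1/unity-catalog/bindings/{securable_type}/{securable_name}).
# The host, catalog name, and workspace ID below are placeholders.

def build_bindings_update(host, securable_type, securable_name, add_workspace_ids):
    """Return (url, payload) for an update-bindings request."""
    url = f"{host}/api/2.1/unity-catalog/bindings/{securable_type}/{securable_name}"
    payload = {
        "add": [
            {"workspace_id": wid, "binding_type": "BINDING_TYPE_READ_WRITE"}
            for wid in add_workspace_ids
        ]
    }
    return url, payload

url, payload = build_bindings_update(
    "https://adb-1111111111111111.11.azuredatabricks.net",
    "catalog",
    "my_catalog",
    [2222222222222222],
)
print(url)
print(json.dumps(payload))
```

Sending it would then be something like `requests.patch(url, headers={"Authorization": f"Bearer {token}"}, json=payload)` with a PAT, mirroring what the curl version does.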
- 205 Views
- 1 replies
- 0 kudos
Get hardware metrics like CPU usage, memory usage and send it to Azure Monitor
Hello guys, I would like to get hardware metrics like server load distribution, CPU utilization, and memory utilization, and send them to Azure Monitor. Is there any way to do this? Can you help me with this? Thanks.
@xzero-trustx wrote: Hello guys, I would like to get hardware metrics like server load distribution, CPU utilization, memory utilization and send it to Azure Monitor. Is there any way to do this? Can you help me with this doubt? Thanks.
Hello! Yes, you ca...
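One possible shape for this: collect the metrics on the cluster (e.g. with `psutil`) and push them to a custom table via the Azure Monitor Logs Ingestion API (`azure-monitor-ingestion` package, `LogsIngestionClient`). A minimal sketch of the row-building part only; the column names here are assumptions that would have to match your data collection rule (DCR) stream:

```python
import json
from datetime import datetime, timezone

# Sketch only: shapes hardware metrics as rows for an Azure Monitor Logs
# Ingestion custom table. The row schema (TimeGenerated/Computer/Metric/Value)
# is an assumed custom schema, not a fixed Azure Monitor requirement.

def make_metric_rows(cpu_percent, mem_percent, host):
    """Build log rows for one sampling interval."""
    ts = datetime.now(timezone.utc).isoformat()
    return [
        {"TimeGenerated": ts, "Computer": host, "Metric": "cpu_percent", "Value": cpu_percent},
        {"TimeGenerated": ts, "Computer": host, "Metric": "mem_percent", "Value": mem_percent},
    ]

rows = make_metric_rows(42.5, 71.0, "driver-node")
print(json.dumps(rows, indent=2))
# On a cluster, the inputs could come from psutil.cpu_percent() and
# psutil.virtual_memory().percent, sampled in a background loop or init script,
# and the rows uploaded with LogsIngestionClient.upload(...).
```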
- 270 Views
- 1 replies
- 0 kudos
Databricks and AWS CodeArtifact
Hello, I saw multiple topics about it, but I need explanations and a solution. In my context, we have developers developing Python projects, like X. In Databricks, we have a cluster with a library of the main project A that is dependent on X.p...
I saw that the solution may be in the init script, but it's not really easy to work with. I mean, there's no log generated from the bash script, so this is not an easy way to solve my problem. Maybe you have some advice about it?
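On the logging point: an init script can capture its own output to a file, which makes failures much easier to debug. A minimal sketch (the log path is just an example; pairing it with cluster log delivery keeps the file after the cluster terminates):

```shell
#!/bin/bash
# Sketch: make an init script log everything it does.
# /tmp is only an example location; with cluster log delivery configured,
# writing under the delivery path preserves the log.
LOG=/tmp/init_script.log
exec >>"$LOG" 2>&1          # redirect all stdout/stderr from here on
echo "init script started: $(date)"
# ... your real setup here, e.g. pip install against the CodeArtifact index ...
echo "init script finished: $(date)"
```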
- 209 Views
- 1 replies
- 1 kudos
Issue running notebook from another notebook in Job cluster
Hi, I have a situation where I can run my notebook without any issue when I use a 'normal' cluster. However, when I run the exact same notebook in a job cluster it fails. It fails at the point where it runs the cell: `%run ../utils/some_other_notebook` And...
Not sure what went wrong, but after pulling the sources (notebooks) again from Git it now works both for my 'normal' cluster and the 'job' cluster. Case closed for me...
- 289 Views
- 2 replies
- 5 kudos
Can Databricks Asset Bundles be used with a Standard Tier workspace?
Hi, yes, Databricks Asset Bundles (DABs) can be used with a Standard Tier Databricks workspace. The use of DABs is not directly tied to the workspace pricing tier but rather to the configuration of your workspace and integration with CI/CD pipelines.
- 183 Views
- 2 replies
- 1 kudos
Meta data for cloned table
Hi guys! How can I find out whether a table is a clone, ideally by querying some metadata (information_schema or the like)? Thanks! Sebastian
Hi @SeBaFlu, to determine whether a table in Databricks is a clone (created using Delta Lake's CREATE TABLE CLONE), you can use Delta Lake's metadata and the DESCRIBE HISTORY command.
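Concretely, the creating operation in the history is reported as `CLONE` for cloned tables. A small pure-Python sketch of the check; on Databricks the rows would come from `spark.sql("DESCRIBE HISTORY my_table").collect()`, but here they are plain dicts for illustration:

```python
# Sketch: decide whether a Delta table was created as a clone by scanning
# its DESCRIBE HISTORY output for a CLONE operation at the lowest version.

def created_by_clone(history_rows):
    """True if the table's creating operation (lowest version) was a CLONE."""
    if not history_rows:
        return False
    first = min(history_rows, key=lambda r: r["version"])
    return first["operation"] == "CLONE"

history = [
    {"version": 1, "operation": "WRITE"},
    {"version": 0, "operation": "CLONE"},
]
print(created_by_clone(history))  # → True
```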
- 522 Views
- 2 replies
- 0 kudos
Resolved! Download Dashboard as PDF
I see several references in the Databricks documentation to exporting a dashboard as a PDF, yet I have no format options when I download it; it creates a JSON file. Is there a way to download a dashboard as a PDF?
Actually, you can download a dashboard as a PDF if it is a legacy dashboard. It does not appear as an option for an AI/BI dashboard.
- 513 Views
- 4 replies
- 0 kudos
Resolved! Databricks apps, data plane configuration not supported
Unable to create an app; I get a 'This workspace has a data plane configuration that is not yet supported' message. Is there something specific I should look for, configuration-wise, to correct the issue? Azure hosted, virtual network.
Hello @rew_data, you might want to check whether your region is available for Databricks Apps; please refer to: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/ The error message "This workspace has a data plane configuratio...
- 183 Views
- 1 replies
- 1 kudos
REST API List dashboard schedules - 501 NOT IMPLEMENTED
When I try to retrieve the dashboard scheduling info via the REST API List dashboard schedules, I receive the following `501 NOT IMPLEMENTED` response: { "error_code": "NOT_IMPLEMENTED", "message": "This API is not yet supported." } But e.g. the...
Hello @gyorgyjelinek, I just tried testing on my end and got the same failure as yours: python3 list_dashboardID.py Error 501: {"error_code":"NOT_IMPLEMENTED","message":"This API is not yet supported."} This endpoint might not be fully supported yet ...
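Until the endpoint is supported, a caller can at least detect the 501 cleanly and fall back to something else. A hedged sketch of the response handling only (the exception name is made up for illustration; the endpoint in the comment is the documented path):

```python
import json

# Sketch: wrap the response from
#   GET /api/2.0/lakeview/dashboards/{dashboard_id}/schedules
# so a 501 NOT_IMPLEMENTED surfaces as a clear, recoverable condition.

class NotImplementedByWorkspace(Exception):
    """Raised when the workspace reports the API as not yet supported."""

def parse_schedules_response(status_code, body_text):
    body = json.loads(body_text) if body_text else {}
    if status_code == 501 or body.get("error_code") == "NOT_IMPLEMENTED":
        raise NotImplementedByWorkspace(body.get("message", "not supported"))
    if status_code != 200:
        raise RuntimeError(f"HTTP {status_code}: {body}")
    return body.get("schedules", [])

try:
    parse_schedules_response(
        501, '{"error_code":"NOT_IMPLEMENTED","message":"This API is not yet supported."}'
    )
except NotImplementedByWorkspace as e:
    print("fallback needed:", e)
```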
- 240 Views
- 1 replies
- 1 kudos
Restrict a Workspace User from Creating/Managing Databricks Jobs
Hello Databricks team, I currently have a workspace user, and I want to disable their ability to create or manage Databricks jobs entirely. Specifically, I would like to prevent the user from accessing the "Create Job" option in the Databricks UI or v...
Hello @neointab, Currently, Databricks does not offer a direct workspace-level setting to restrict job creation for specific users. However, there are some workarounds and related controls that can be considered: Cluster Creation Restrictions: One ap...
- 154 Views
- 1 replies
- 0 kudos
Resolved! Challenge isolating Databricks workspace with single Unity Catalog metastore for multiple workspaces
Hello Community, I am currently managing multiple workspaces for various projects and facing challenges in achieving data asset isolation between these workspaces. My goal is to ensure that data sharing happens exclusively through Delta Sharing. The cu...
Hi @iskidet01, you can use workspace-catalog bindings: https://learn.microsoft.com/en-us/azure/databricks/catalogs/#workspace-catalog-binding. When you create a catalog, you can assign it to specific workspaces instead of "All workspaces have access...
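In API terms, the two calls involved look roughly like this (the catalog name and workspace ID are placeholders; a catalog's isolation mode must be switched from open before bindings take effect):

```
# 1) Mark the catalog as isolated (the default is open to all workspaces):
PATCH /api/2.1/unity-catalog/catalogs/my_catalog
{"isolation_mode": "ISOLATED"}

# 2) Bind it to the workspaces that should see it:
PATCH /api/2.1/unity-catalog/bindings/catalog/my_catalog
{"add": [{"workspace_id": 1234567890123456, "binding_type": "BINDING_TYPE_READ_WRITE"}]}
```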
- 288 Views
- 3 replies
- 0 kudos
PAT needed but not allowed in "Advanced Data Engineering - 6.5L Deploy pipeline with the CLI Lab"
It is stated in the lab notebook that: "Run the setup: Run the setup script for this lesson by running the cell below. This will ensure that: the Databricks CLI is installed, authentication is configured, and a pipeline is created." However, when I tried to run the ...
... and if I start with step 5, using workspace-level authorisation, I ended up with "localhost refused to connect." in the generated link.
- 181 Views
- 1 replies
- 0 kudos
Will Lakehouse Federation between Databricks and Snowflake support Azure Entra ID?
The Lakehouse Federation between Databricks and Snowflake looks promising, but the lack of support for Azure Entra ID as an identity provider (IdP) is a big limitation for enterprises standardized on it. Managing separate OAuth flows or using Snowflak...
Hello @martkev, Currently, Azure Databricks does not support using Azure Entra ID (formerly Azure Active Directory) directly as an identity provider (IdP) for federated queries on Snowflake. The only supported OAuth integration for Snowflake is Snowf...
- 297 Views
- 2 replies
- 2 kudos
Resolved! system schemas permission
Hi, I'm an account admin on Databricks, and when I try to set SELECT permission for system schemas I get "PERMISSION_DENIED: User is not an owner of Schema 'system.compute'." When I try to set permission for the system catalog, I get "Requires ownership o...
- 331 Views
- 3 replies
- 2 kudos
Networking configuration of Azure Databricks managed storage account
Hi all, I created an Azure Databricks workspace, and the workspace creates an Azure Databricks managed storage account. The networking configuration of the storage account is "Enabled from all networks". Shall I change it to "Enabled from selected virtu...
You don't need a view on the subnets themselves. Regarding disabling key access, you could use any of the other authentication methods listed here: https://learn.microsoft.com/en-us/azure/databricks/connect/storage/azure-storage#connect-to-azure-data-lak...
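For example, one of those alternatives is OAuth with a service principal. A sketch of the Spark configuration keys involved (the storage account, application ID, secret scope, and tenant ID are all placeholders):

```
fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net OAuth
fs.azure.account.oauth.provider.type.<storage-account>.dfs.core.windows.net org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
fs.azure.account.oauth2.client.id.<storage-account>.dfs.core.windows.net <application-id>
fs.azure.account.oauth2.client.secret.<storage-account>.dfs.core.windows.net {{secrets/<scope>/<key>}}
fs.azure.account.oauth2.client.endpoint.<storage-account>.dfs.core.windows.net https://login.microsoftonline.com/<tenant-id>/oauth2/token
```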