- 914 Views
- 1 replies
- 0 kudos
Get hardware metrics like CPU usage, memory usage and send it to Azure Monitor
Hello guys, I would like to get hardware metrics like server load distribution, CPU utilization, and memory utilization, and send them to Azure Monitor. Is there any way to do this? Can you help me with this? Thanks.
Hello! Yes, you ca...
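Since the reply is truncated, here is a minimal sketch of one common approach: collect the values on the host and POST them to Azure Monitor's custom metrics ingestion endpoint (`https://<region>.monitoring.azure.com/<resourceId>/metrics`). The function below only builds the request body; the field names follow the documented `baseData` schema, but verify them against the current Azure Monitor docs before relying on this.

```python
import datetime
import json

def build_metric_payload(metric, namespace, value, dim_names, dim_values):
    """Build a custom-metric body for Azure Monitor's metrics ingestion API.
    Field names (time/data/baseData/series) follow the documented schema;
    treat this as a sketch to check against the official reference."""
    now = datetime.datetime.now(datetime.timezone.utc)
    return {
        "time": now.strftime("%Y-%m-%dT%H:%M:%SZ"),
        "data": {
            "baseData": {
                "metric": metric,
                "namespace": namespace,
                "dimNames": dim_names,
                # A single sample: min == max == sum, count == 1.
                "series": [{
                    "dimValues": dim_values,
                    "min": value, "max": value, "sum": value, "count": 1,
                }],
            }
        },
    }

# Example: one CPU-utilization sample for a hypothetical host "vm-1".
payload = build_metric_payload("CpuPercent", "HardwareMetrics", 42.5, ["host"], ["vm-1"])
print(json.dumps(payload, indent=2))
```

In practice you would gather the actual values with something like `psutil` and POST the payload with a bearer token obtained from Entra ID.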
- 894 Views
- 1 replies
- 1 kudos
Issue running notebook from another notebook in Job cluster
Hi, I have a situation where I can run my notebook without any issue when I use a 'normal' cluster. However, when I run the exact same notebook on a job cluster, it fails. It fails at the point where it runs the cell: `%run ../utils/some_other_notebook` And...
Not sure what went wrong, but after pulling the sources (notebooks) again from Git it now works both for my 'normal' cluster and the 'job' cluster. Case closed for me...
- 1249 Views
- 2 replies
- 5 kudos
Hi, yes, Databricks Asset Bundles (DABs) can be used with a Standard Tier Databricks workspace. The use of DABs is not directly tied to the workspace pricing tier, but rather to the configuration of your workspace and its integration with CI/CD pipelines.
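For reference, a minimal `databricks.yml` is enough to start with bundles on any tier; the bundle name, target name, and workspace host below are placeholders:

```yaml
# databricks.yml - minimal bundle definition (all names are placeholders)
bundle:
  name: my_project

targets:
  dev:
    mode: development
    workspace:
      host: https://adb-1234567890123456.7.azuredatabricks.net
```

From the directory containing the file, `databricks bundle validate` checks the configuration and `databricks bundle deploy -t dev` deploys it.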
- 806 Views
- 2 replies
- 1 kudos
Meta data for cloned table
Hi, guys! How can I find out whether a table is a clone - hopefully by querying some metadata (information_schema or the like)? Thanks! Sebastian
Hi @SeBaFlu, to determine whether a table in Databricks is a clone (created using Delta Lake's CREATE TABLE CLONE), you can use Delta Lake's metadata and the DESCRIBE HISTORY command.
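A clone shows up in the table history as a `CLONE` operation. The sketch below scans history rows for that event; the sample rows are hypothetical, and on Databricks you would obtain real ones with `spark.sql("DESCRIBE HISTORY my_table")` as noted in the docstring.

```python
def find_clone_event(history_rows):
    """Given rows from `DESCRIBE HISTORY <table>` (each row as a dict with an
    'operation' column), return the first CLONE event, or None. On Databricks:
        rows = [r.asDict() for r in spark.sql("DESCRIBE HISTORY my_table").collect()]
    """
    for row in history_rows:
        if row.get("operation") == "CLONE":
            return row
    return None

# Hypothetical history of a shallow-cloned table (only relevant columns shown).
history = [
    {"version": 1, "operation": "WRITE", "operationParameters": {}},
    {"version": 0, "operation": "CLONE",
     "operationParameters": {"source": "main.default.src_table", "isShallow": "true"}},
]
event = find_clone_event(history)
print(event["operationParameters"]["source"])  # main.default.src_table
```

The `operationParameters` of the CLONE row also record the source table and whether the clone was shallow, which answers the "is this table a clone, and of what?" question directly.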
- 3758 Views
- 2 replies
- 0 kudos
Resolved! Download Dashboard as PDF
I see several references in the Databricks documentation to exporting a Dashboard as a PDF, yet I have no format options when I download it; it creates a JSON file. Is there a way to download a Dashboard as a PDF?
Actually, you can download a Dashboard as a PDF if it is a legacy dashboard. The option does not appear for an AI/BI dashboard.
- 1765 Views
- 4 replies
- 0 kudos
Resolved! Databricks apps, data plane configuration not supported
Unable to create an app; I get a 'This workspace has a data plane configuration that is not yet supported' message. Is there something specific I should look for, configuration-wise, to correct the issue? Azure hosted, virtual network.
Hello @rew_data, you might want to check whether your region is available for Databricks Apps; please refer to: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/ The error message "This workspace has a data plane configuratio...
- 678 Views
- 1 replies
- 1 kudos
REST API List dashboard schedules - 501 NOT IMPLEMENTED
When I try to retrieve the dashboard scheduling info via the REST API List dashboard schedules endpoint, I receive the following `501 NOT IMPLEMENTED` response: { "error_code": "NOT_IMPLEMENTED", "message": "This API is not yet supported." } But e.g. the...
Hello @gyorgyjelinek, I just tried testing on my end and got the same failure as yours: python3 list_dashboardID.py Error 501: {"error_code":"NOT_IMPLEMENTED","message":"This API is not yet supported."} This endpoint might not be fully supported yet ...
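For anyone reproducing this, the call pattern looks like the sketch below. It only builds the request; the endpoint path matches the published Lakeview API shape, and a 501 NOT_IMPLEMENTED response indicates the endpoint is not enabled for the workspace yet, not that the request is malformed. Host, token, and dashboard ID are placeholders.

```python
def list_schedules_request(host: str, dashboard_id: str) -> dict:
    """Build the request for the List dashboard schedules endpoint
    (GET /api/2.0/lakeview/dashboards/{dashboard_id}/schedules).
    Verify the path against the current Databricks REST API reference."""
    return {
        "method": "GET",
        "url": f"{host}/api/2.0/lakeview/dashboards/{dashboard_id}/schedules",
        "headers": {"Authorization": "Bearer <token>"},  # placeholder token
    }

req = list_schedules_request("https://adb-123.azuredatabricks.net", "abc123")
print(req["url"])
```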
- 1709 Views
- 1 replies
- 0 kudos
Resolved! Challenge isolating Databricks workspaces with a single Unity Catalog metastore for multiple workspaces
Hello Community, I am currently managing multiple workspaces for various projects and facing challenges in achieving data asset isolation between these workspaces. My goal is to ensure that data sharing happens exclusively through Delta Sharing. The cu...
Hi iskidet01, you can use workspace-catalog bindings: https://learn.microsoft.com/en-us/azure/databricks/catalogs/#workspace-catalog-binding. When you create a catalog, you can assign it to specific workspaces instead of "All workspaces have access...
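If you want to script the binding rather than use the UI, the workspace-bindings API takes a payload like the one built below. This is a sketch: the endpoint path and field names are my reading of the catalog workspace-bindings API and should be verified against the current REST API reference; the workspace ID is a placeholder.

```python
def bind_catalog_payload(assign_ids, unassign_ids=()):
    """Payload for the catalog workspace-bindings API
    (PATCH /api/2.1/unity-catalog/workspace-bindings/catalogs/{catalog_name}).
    The catalog must be in ISOLATED isolation mode for bindings to take effect.
    Endpoint path and field names are assumptions - check the API docs."""
    return {
        "assign_workspaces": list(assign_ids),
        "unassign_workspaces": list(unassign_ids),
    }

# Bind the catalog to one workspace (placeholder workspace ID).
payload = bind_catalog_payload([1234567890123456])
print(payload)
```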
- 1139 Views
- 3 replies
- 0 kudos
PAT needed but not allowed in "Advanced Data Engineering - 6.5L Deploy pipeline with the CLI Lab"
It is stated in the lab notebook that: "Run the setup. Run the setup script for this lesson by running the cell below. This will ensure that: the Databricks CLI is installed; authentication is configured; a pipeline is created." However, when I tried to run the ...
... and if I start with step 5, using workspace-level authorisation, I end up with "localhost refused to connect." in the generated link.
- 1234 Views
- 1 replies
- 0 kudos
Will Lakehouse Federation between Databricks and Snowflake support Azure Entra ID?
The Lakehouse Federation between Databricks and Snowflake looks promising, but the lack of support for Azure Entra ID as an identity provider (IdP) is a big limitation for enterprises standardized on it. Managing separate OAuth flows or using Snowflak...
Hello @martkev, Currently, Azure Databricks does not support using Azure Entra ID (formerly Azure Active Directory) directly as an identity provider (IdP) for federated queries on Snowflake. The only supported OAuth integration for Snowflake is Snowf...
- 3257 Views
- 2 replies
- 2 kudos
Resolved! system schemas permission
Hi, I'm an account admin on Databricks, and when I try to set SELECT permission for system schemas I get "PERMISSION_DENIED: User is not an owner of Schema 'system.compute'." When I try to set permission for the system catalog, I get "Requires ownership o...
- 2944 Views
- 3 replies
- 2 kudos
Networking configuration of Azure Databricks managed storage account
Hi all, I created an Azure Databricks workspace, and the workspace creates an Azure Databricks managed storage account. The networking configuration of the storage account is "Enabled from all networks". Shall I change it to "Enabled from selected virtu...
You don't need View on the subnets themselves. Regarding disabling key access, you could use any of the other authentication methods listed here: https://learn.microsoft.com/en-us/azure/databricks/connect/storage/azure-storage#connect-to-azure-data-lak...
- 2069 Views
- 2 replies
- 0 kudos
Create Databricks managed service principal programmatically?
For the current Databricks service principal API or the Databricks SDK, an ID is required. However, when dealing with Databricks-managed service principals, you typically only have the name. For registering with cloud providers, like Microsoft Entra ...
Have you found a solution on how to programmatically create a Databricks managed service principal?
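One route worth checking: the workspace SCIM API creates a Databricks-managed service principal from just a display name, and the generated ID comes back in the response. The request body below is a sketch based on the SCIM ServicePrincipal schema; verify the schema URN and endpoint against the current API reference, and the display name is a placeholder.

```python
def create_sp_request(display_name: str) -> dict:
    """Request body for creating a Databricks-managed service principal via
    the workspace SCIM API (POST /api/2.0/preview/scim/v2/ServicePrincipals).
    Only a display name is needed; Databricks generates the application ID
    and returns it in the response. Schema URN is an assumption to verify."""
    return {
        "schemas": ["urn:ietf:params:scim:schemas:core:2.0:ServicePrincipal"],
        "displayName": display_name,
        "active": True,
    }

body = create_sp_request("my-managed-sp")
print(body["displayName"])
```

The Databricks SDK wraps the same operation, so once the SDK call accepts a display name alone this becomes a one-liner there as well.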
- 1917 Views
- 2 replies
- 2 kudos
Resolved! Default schema in SQL Editor is not 'default' when unity catalog is set as default catalog
In workspace settings: Workspace admin - Advanced - Other - "Default catalog for the workspace" is set to something other than hive_metastore; it is set to a Unity Catalog catalog. The expected behaviour is copied here from the related More info panel: "Se...
Hi @Alberto_Umana, thank you for the explanation. I mark your comment as the accepted solution as it contains the current implementation logic and the workaround. Good to know that the More info panel is a bit misleading as of now, because the SQL Ed...
- 4082 Views
- 10 replies
- 3 kudos
Error "Integrating Apache Spark with Databricks Unity Catalog Assets via Open APIs" on Azure
Great blog post: https://community.databricks.com/t5/technical-blog/integrating-apache-spark-with-databricks-unity-catalog-assets/ba-p/97533 I have attempted to reproduce this with Azure Databricks and ADLS Gen2 as the storage backend. Although I'm ab...
Thanks @dkushari, I looked at the GitHub issue you posted, but it has to do specifically with DELTA_UNSUPPORTED_SCHEMA_DURING_READ when streaming *from* a Delta table. The specific error I'm seeing is a key error for the Azure storage account hosting t...
Labels:
- Access control (1)
- Apache spark (1)
- Azure (7)
- Azure databricks (5)
- Billing (2)
- Cluster (1)
- Compliance (1)
- Data Ingestion & connectivity (5)
- Databricks Runtime (1)
- Databricks SQL (2)
- DBFS (1)
- Dbt (1)
- Delta Sharing (1)
- DLT Pipeline (1)
- GA (1)
- Gdpr (1)
- Github (1)
- Partner (64)
- Public Preview (1)
- Service Principals (1)
- Unity Catalog (1)
- Workspace (2)