- 660 Views
- 2 replies
- 1 kudos
[Metastore] Can't assign metastore to workspace in cross-tenant Azure environment
Hi all, hope you are doing well. We have a metastore in tenant A and workspaces in tenant B. Tenant B has the Contributor role on tenant A to access the metastore. For some reason, we cannot assign the metastore from tenant A to the workspace in tenan...
Hi @data_bricklayer, To address this, ensure the Contributor Role in tenant B has the right permissions for tenant A's metastore. Verify that the workspace in tenant B is properly registered and visible in tenant A and that both are in the same regio...
- 1267 Views
- 3 replies
- 4 kudos
dbutils secrets break matplotlib in recent Databricks runtimes
Using a Databricks cluster running Databricks Runtime version 15.4 LTS Beta, and with no init scripts, the following code works fine: import numpy as np; import pandas as pd; df = pd.DataFrame(np.random.randn(2000, 10)); df.plot(subplots=True, figsize=(10, ...
OK, I understand the error now: this specific secret is a small alphanumeric string, so the probability of it appearing in the base64 of a big PNG is quite large. Is there any workaround for this? I know I can't turn off the redaction of secrets. Can I d...
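The reply's point, that a short alphanumeric secret is quite likely to appear by chance inside the base64 encoding of a large PNG, can be checked with a quick simulation. This is a standalone sketch, not Databricks-specific; the lengths and trial counts are illustrative:

```python
import base64
import os
import random
import string

def hit_rate(secret_len: int, blob_bytes: int, trials: int = 200) -> float:
    """Fraction of trials in which a random alphanumeric 'secret' of the
    given length appears inside the base64 of a random blob (e.g. a PNG)."""
    alphabet = string.ascii_letters + string.digits
    hits = 0
    for _ in range(trials):
        secret = "".join(random.choices(alphabet, k=secret_len))
        blob = base64.b64encode(os.urandom(blob_bytes)).decode()
        if secret in blob:
            hits += 1
    return hits / trials

# A 3-character secret collides with a ~100 KB image fairly often; an
# 8-character secret essentially never does.
```

So if the secret value is under your control, simply making it longer avoids the accidental redaction of plot output.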
- 1363 Views
- 1 replies
- 0 kudos
Obtain access of Azure metastore storage account to configure Lifecycle management
I recently set up an Azure Databricks workspace with an automatically created metastore and metastore-level root storage within the metastore blob storage account. All the catalogs, schemas, and tables/volumes have been created without a specified or...
- 1933 Views
- 2 replies
- 0 kudos
Databricks connection pooling using Python
I have a Python application which I need to connect with Databricks (Databricks SQL Connector). The process for that is quite simple: 1. connect with DBR, 2. execute your query, 3. get the result, 4. close the connection. Example from https://docs.databric...
Hi @Retired_mod, thanks for your reply. I have 2 follow-up questions for you - When you say "consider managing connection lifetimes by closing and recreating them periodically" - the problem I face with this is when my API request comes in and I crea...
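A minimal pooling wrapper along the lines discussed in this thread might look like the following. This is a sketch only: `make_conn` stands in for however you create a Databricks SQL Connector connection (e.g. a `sql.connect(...)` call with your hostname, HTTP path and token), and the pool size and timeout are illustrative.

```python
import queue

class ConnectionPool:
    """Hand out already-open connections and take them back afterwards,
    instead of opening and closing one connection per API request."""

    def __init__(self, make_conn, size: int = 4):
        self._pool = queue.Queue(maxsize=size)
        for _ in range(size):
            self._pool.put(make_conn())  # open all connections up front

    def run(self, sql: str):
        conn = self._pool.get(timeout=30)  # block until a connection is free
        try:
            cur = conn.cursor()
            try:
                cur.execute(sql)
                return cur.fetchall()
            finally:
                cur.close()
        finally:
            self._pool.put(conn)  # always return the connection to the pool
```

In a real service you would also handle connections the server has idled out, for example by catching errors in `run()` and replacing the dead connection with a fresh `make_conn()`.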
- 3546 Views
- 5 replies
- 4 kudos
Resolved! UCX installation error
I'm getting the following error when trying to install UCX on a specific workspace. I have already installed UCX from my machine on another workspace and it works; now on this workspace I'm getting this error: ERROR [d.l.blueprint.parallel][installing_das...
Issue was resolved in lsql v0.7.4, a UCX dependency: https://github.com/databrickslabs/lsql/releases/tag/v0.7.4
1. Upgrade the Databricks CLI: https://github.com/databricks/cli/releases
2. Upgrade UCX: `databricks labs install ucx`
- 716 Views
- 1 replies
- 0 kudos
Vocareum lab showing "no workspace found"
Hi, I bought the Generative AI Engineering with Databricks lab subscription for $75 just to get hands-on experience. The labs worked for one day and then stopped working. Please activate the labs ASAP; I am preparing for the certification exam an...
Hi @cleversuresh, Thank you for sharing your concern with us! To expedite your request, please list your concerns on our ticketing portal. Our support staff would be able to act faster on the resolution (our standard resolution time is 24-48 hours)...
- 2093 Views
- 5 replies
- 2 kudos
Promoting assets from Dev workspace to Prod
What is the best way to promote Databricks objects from one workspace to another? For example, DABs are great for code and models. However, SQL alerts do not seem to fall into DABs. The API allows for retrieving queries and creating them via API in t...
https://docs.databricks.com/api/workspace/queries
https://docs.databricks.com/en/sql/dbsql-api-latest.html#changes-to-the-alerts-api
The API path is now /api/2.0/sql/alerts, replacing the legacy path of /api/2.0/preview/sql/alerts. https://docs.databr...
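For scripting this kind of promotion, the current alerts path can be used as sketched below, using only the standard library; `host` and `token` are placeholders for your workspace hostname and a personal access token:

```python
import json
import urllib.request

def alerts_url(host: str) -> str:
    """Current Alerts API path; replaces /api/2.0/preview/sql/alerts."""
    return f"https://{host}/api/2.0/sql/alerts"

def list_alerts(host: str, token: str):
    """Fetch alert definitions from a workspace (live network call)."""
    req = urllib.request.Request(
        alerts_url(host),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Promotion would then list alerts from the dev workspace and re-create them by POSTing to the same path on the prod workspace.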
- 853 Views
- 1 replies
- 1 kudos
System Tables
Hi, I've recently enabled the billing schema in system tables, but I'm still not able to view it in the catalog. I am both an account admin and a workspace admin, as well as an owner of the workspace in Azure. A colleague (who used to be Azure AD admin but no l...
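If the billing schema was never actually enabled on the metastore, it can be enabled through the system schemas API. The sketch below assumes the endpoint shape `PUT /api/2.0/unity-catalog/metastores/{metastore_id}/systemschemas/{schema}` (check the Unity Catalog API reference for your cloud) and requires metastore admin rights:

```python
import urllib.request

def system_schema_url(host: str, metastore_id: str, schema: str) -> str:
    """Endpoint for enabling a system schema such as 'billing'
    (assumed endpoint shape; verify against the API reference)."""
    return (f"https://{host}/api/2.0/unity-catalog/metastores/"
            f"{metastore_id}/systemschemas/{schema}")

def enable_billing(host: str, token: str, metastore_id: str) -> int:
    req = urllib.request.Request(
        system_schema_url(host, metastore_id, "billing"),
        method="PUT",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:  # live network call
        return resp.status
```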
- 649 Views
- 1 replies
- 1 kudos
How to list my groups in Unity Catalog via an API
I'm trying to list my groups in Unity Catalog via an API call, but it keeps giving the error "Bad Target". My GET request is: https://adb-3676685197829242.2.azuredatabricks.net/api/2.1/accounts/cfb89269-0d9e-4235-b7985-455481b501a/scim/v2/Groups I'm abl...
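The "Bad Target" error is consistent with sending an account-level SCIM path to a workspace URL. As a sketch (Azure hostnames shown; adjust for your cloud), the two group-listing endpoints live on different hosts:

```python
def workspace_groups_url(workspace_host: str) -> str:
    """Workspace-level SCIM: lists the groups of one workspace,
    served from the workspace's own URL."""
    return f"https://{workspace_host}/api/2.0/preview/scim/v2/Groups"

def account_groups_url(account_id: str) -> str:
    """Account-level SCIM: must target the accounts console host;
    sending this path to a workspace host fails (e.g. "Bad Target")."""
    return (f"https://accounts.azuredatabricks.net/api/2.1/accounts/"
            f"{account_id}/scim/v2/Groups")
```

So the request in the question should either drop the `accounts/...` segment and use the workspace-level path, or be sent to the accounts host instead of the workspace hostname.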
- 1001 Views
- 1 replies
- 1 kudos
Databricks + Apache Iceberg: advantageous, or wasted effort due to duplicate functionality?
Trying to design a lakehouse. Spark is at the base layer. Now wondering whether adding Apache Iceberg below Spark will help or not. Preferring Iceberg for its auto-indexing and ACID query facilities over big heterogeneous datasets. Wonder if i...
Hello, if you're planning on building your own open-source stack of Spark + Iceberg, it can be a good choice. If you're on Databricks, however, you're going to miss out on a *lot* of Delta features that are baked into the platform. Specifically compute +...
- 3444 Views
- 4 replies
- 3 kudos
Resolved! [VNET injection] Container and container subnet
Hi, I was researching everywhere and could not find the answer. I understand that when a workspace is created, it has two subnets, host and container. The VM, which runs the Databricks container, is in the host subnet, which logically means the container is a...
Hi @data_bricklayer, Public subnet (host): the public subnet is typically used for resources that need to communicate with the internet or other Azure services. In Azure Databricks, this subnet is used for driver nodes of the clusters that require ou...
- 4869 Views
- 3 replies
- 2 kudos
Connecting external location from a different tenant in Azure
Hi, we have a setup with two different Azure tenants. In tenant A we have a storage account that we want to connect as an external location to a Databricks workspace in tenant B. For that we have established a private endpoint from the storage accou...
It would be superb to be able to connect between two tenants with the Azure Databricks Access Connector.
- 3873 Views
- 1 replies
- 0 kudos
Resolved! Prevent service principal UUID from appearing on job name
Hello! I am using a service principal ID to authenticate my Databricks bundle. But when the job runs, this ID is automatically appended to both the name and the tags column on the job runs page. In my databricks.yml file I have name: "[${var.environment}]" ...
Hi! Sounds like "development" mode. DAB will automatically prefix your job name with <env> <user name> if you set "mode" to "development" in the databricks.yml file. The name lookup for service principals apparently doesn't work nicely, and you get ...
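A minimal sketch of the relevant databricks.yml settings (bundle and target names are illustrative):

```yaml
bundle:
  name: my_bundle        # illustrative name

targets:
  dev:
    mode: development    # prefixes resource names with "[dev <user>]";
                         # for a service principal the "user" shows as its UUID
  prod:
    mode: production     # deploys resources under their plain names
```

So switching the affected target to production mode (or removing development mode) avoids the UUID prefix on the job name.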
- 1761 Views
- 4 replies
- 5 kudos
Why all workspace users can see my user folder
Hi, I am a Databricks account admin user with admin access to our workspace. My user folder for some reason is visible to all workspace users. I have checked permissions settings where possible and cannot see anything that would indicate fully shared...
The workspace is visible to all by default; you have to make changes in the Admin console, where you will find the setting to disable it.
- 1598 Views
- 1 replies
- 1 kudos
Resolved! How to Pass Azure variable to databricks.yml file
Hello, I would like to find a way to pass a variable from my Azure variables to my databricks.yml file. For example, I would like to pass the variable BUNDLE_TARGET to the location in this databricks.yml file. Is there a way to do something like this?...
Hi @TheManOfSteele, here are examples of how to achieve that. I think the simplest way would be to set environment variables: azure devops - How can I pass parameters to databricks.yml in Databricks Asset Bundles? - Stack Overflow; Databricks Asset ...
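One common pattern (a sketch; the variable name is illustrative) is to declare a bundle variable in databricks.yml and set it from the pipeline environment. The bundle CLI reads variables from `BUNDLE_VAR_<name>` environment variables or a `--var` flag:

```yaml
# databricks.yml (sketch)
variables:
  bundle_target:
    description: Deployment root, supplied by the Azure pipeline
    default: /Workspace/dev

workspace:
  root_path: ${var.bundle_target}
```

In the Azure DevOps step you would then export `BUNDLE_VAR_bundle_target: $(BUNDLE_TARGET)` before running `databricks bundle deploy`.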
Labels:
- Access control (1)
- Apache spark (1)
- AWS (5)
- Azure (7)
- Azure databricks (5)
- Billing (2)
- Cluster (1)
- Compliance (1)
- Data Ingestion & connectivity (5)
- Databricks Runtime (1)
- Databricks SQL (2)
- DBFS (1)
- Dbt (1)
- Delta (4)
- Delta Sharing (1)
- DLT Pipeline (1)
- GA (1)
- Gdpr (1)
- Github (1)
- Partner (16)
- Public Preview (1)
- Service Principals (1)
- Unity Catalog (1)
- Workspace (2)