- 1593 Views
- 1 reply
- 0 kudos
Implementing Governance on DLT pipelines using compute policy
I am implementing governance over compute creation in the workspaces by implementing custom compute policies for all-purpose, job and dlt pipelines. I was successfully able to create compute policies for all-purpose and jobs where I could restrict th...
Hi @DeepankarB, To enforce compute policies for DLT pipelines, make sure your policy JSON includes policy_family_id: dlt and set apply_policy_default_values: true in the pipeline cluster settings. This helps apply the instance restrictions correctly ...
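For anyone landing here later, a rough sketch of one way to scope such a policy to pipeline clusters, using the cluster_type virtual attribute from the cluster policies docs and the Python SDK (the policy name and instance allowlist are made-up placeholders):

```python
# Sketch: create a cluster policy that applies to DLT pipeline clusters and
# restricts the instance types they may request. Assumes the databricks-sdk
# package is installed and workspace auth is configured; the policy name and
# node type allowlist are hypothetical placeholders.
import json
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

definition = {
    # Virtual attribute that scopes the policy to DLT pipeline clusters
    "cluster_type": {"type": "fixed", "value": "dlt"},
    # Only these instance types may be requested by pipeline clusters
    "node_type_id": {
        "type": "allowlist",
        "values": ["Standard_DS3_v2", "Standard_DS4_v2"],
    },
}

policy = w.cluster_policies.create(
    name="dlt-governance-policy",  # hypothetical name
    definition=json.dumps(definition),
)
print(policy.policy_id)
```

The returned policy_id is what you would reference in the pipeline's cluster settings, alongside apply_policy_default_values: true as mentioned above.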
- 2071 Views
- 4 replies
- 0 kudos
Databricks Predictive optimization
If we want to enable Databricks Predictive Optimization, is it also mandatory to enable serverless Job/Notebook compute in our account? We already have Serverless SQL Warehouse available in our workspaces.
The documentation states this: "Predictive optimization identifies tables that would benefit from ANALYZE, OPTIMIZE, and VACUUM operations and queues them to run using serverless compute for jobs." If I don't have serverless workloads enabled, how does pr...
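(For reference, enablement itself is declarative. A rough notebook sketch at the schema level follows, with placeholder names, using the ALTER ... PREDICTIVE OPTIMIZATION syntax from the docs; per the quote above, the queued maintenance then runs on Databricks-managed serverless jobs compute.)

```python
# Sketch: enable predictive optimization on one schema from a notebook.
# Catalog/schema names are hypothetical placeholders; `spark` is the
# session provided by the notebook runtime.
spark.sql("ALTER SCHEMA main.sales ENABLE PREDICTIVE OPTIMIZATION")

# The docs describe DESCRIBE ... EXTENDED as the way to check whether the
# flag is set explicitly or inherited from the catalog/metastore.
display(spark.sql("DESCRIBE SCHEMA EXTENDED main.sales"))
```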
- 4264 Views
- 2 replies
- 0 kudos
Resolved! Migrate to a new account
Hey Team, we're looking into migrating our current Databricks solution from one AWS account (us-east-1 region) to another (eu-central-1 region). I have no documentation left on how the current solution was provisioned, but I can see CloudFormation...
I ended up using the terraform-databricks-provider tool to perform an export and import of the old workspace into the new one. All that was needed was a PAT in each: export from the old, sed the region, account, and PAT, and apply. This got me about 7...
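For anyone repeating this, the "sed" step amounts to rewriting the exported .tf files before applying them against the new workspace; a rough Python equivalent (the directory and the old/new value pairs are placeholders for your own export):

```python
# Sketch: rewrite region/account references in Terraform files exported by
# the provider's exporter before applying them to the new workspace.
# Directory name and replacement values are hypothetical placeholders.
from pathlib import Path

REPLACEMENTS = {
    "us-east-1": "eu-central-1",     # region
    "111111111111": "222222222222",  # AWS account id
}

for tf_file in Path("exported-workspace").glob("*.tf"):
    text = tf_file.read_text()
    for old, new in REPLACEMENTS.items():
        text = text.replace(old, new)
    tf_file.write_text(text)
```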
- 739 Views
- 1 reply
- 0 kudos
Does using SDK API calls cost money?
When using the Databricks SDK to retrieve metadata, such as catalogs, schemas, or tables, through its built-in API endpoints, does this incur any cost similar to running SQL queries? Specifically, executing SQL queries via the API spins up a compute clu...
Hi there @Skully, you are right: since you are just fetching metadata information from catalogs, tables, etc. instead of directly running any SQL queries, it doesn't cost the same as creating a compute. When we retrieve the metadata inform...
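As an illustration, a metadata walk like this is plain REST traffic against the Unity Catalog APIs, with no cluster or SQL warehouse involved (a sketch assuming the databricks-sdk package is installed and auth is configured):

```python
# Sketch: list catalogs, schemas, and tables via the Databricks SDK.
# These calls hit the Unity Catalog REST endpoints directly; no compute
# cluster or SQL warehouse is spun up.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up auth from env vars or a config profile

for catalog in w.catalogs.list():
    for schema in w.schemas.list(catalog_name=catalog.name):
        for table in w.tables.list(catalog_name=catalog.name,
                                   schema_name=schema.name):
            print(table.full_name)
```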
- 2146 Views
- 4 replies
- 1 kudos
Resolved! Enable Databricks system error
Hi, we want to enable some system tables in our Databricks workspace using this command: curl -v -X PUT -H "Authorization: Bearer <PAT token>" "https://adb-0000000000.azuredatabricks.net/api/2.0/unity-catalog/metastores/<metastore-id>/systemsche...
While disabling some system schemas we disabled the billing system schema, and now we cannot enable it again due to this error: "billing system schema can only be enabled by Databricks." How can I re-enable the billing schema?
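For the other system schemas, the curl in the question maps to a simple PUT; a Python sketch is below, with the systemschemas path segment completed from the Unity Catalog docs and placeholders for host, token, and ids. The billing schema itself, per that error message, can only be re-enabled by Databricks, so that one presumably needs a support ticket:

```python
# Sketch: enable a Unity Catalog system schema via the REST API, mirroring
# the curl command in the question. Host, PAT, metastore id, and schema
# name are placeholders; the endpoint path is completed from the UC docs.
import requests

HOST = "https://adb-0000000000.azuredatabricks.net"
TOKEN = "<PAT token>"
METASTORE_ID = "<metastore-id>"
SCHEMA = "access"  # e.g. access, compute, storage

resp = requests.put(
    f"{HOST}/api/2.0/unity-catalog/metastores/{METASTORE_ID}"
    f"/systemschemas/{SCHEMA}",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()
```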
- 1394 Views
- 1 reply
- 0 kudos
Collation problem with df.first() when different from UTF8_BINARY
I'm getting an error when I want to select first() from a dataframe when using a collation different than UTF8_BINARY. This works: df_result = spark.sql(f""" SELECT 'en-us' AS ETLLanguageCode""") display(df_result) print(df_resu...
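A note for anyone hitting this: one possible workaround is to cast the collated value back to a plain STRING (which carries the default UTF8_BINARY collation) before collecting rows to the driver; a sketch, with UTF8_LCASE standing in for the failing collation:

```python
# Sketch: first()/collect() reportedly fails on columns carrying a
# non-default collation. Casting back to STRING (default UTF8_BINARY
# collation) before collecting is one possible workaround.
df_collated = spark.sql(
    "SELECT 'en-us' COLLATE UTF8_LCASE AS ETLLanguageCode"
)

df_binary = spark.sql(
    "SELECT CAST('en-us' COLLATE UTF8_LCASE AS STRING) AS ETLLanguageCode"
)
print(df_binary.first()["ETLLanguageCode"])  # 'en-us'
```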
- 1725 Views
- 1 reply
- 0 kudos
Resolved! Can a SQL Warehouse Pro be shared across multiple workspaces
I'm currently using a SQL Warehouse Pro in one of my Databricks workspaces, and I'm trying to optimize costs. Since the Pro Warehouse can be quite expensive to run, I'd prefer not to spin up additional instances in each workspace. Is there any way to ...
Hi @jfid, a SQL Warehouse Pro instance cannot be shared directly across multiple Databricks workspaces. Each workspace requires its own SQL Warehouse instance, even if the compute and data access needs are similar. This is because compute resources li...
- 3732 Views
- 2 replies
- 0 kudos
Convert Account to Self-managed
I am in the process of setting up a new Databricks account for AWS commercial. I mistakenly set up the account with the email databricks-external-nonprod-account-owner@slingshotaerospace.com to not be self-managed, and I would like for this new accoun...
Or better yet, if we could delete it so I can re-create the account.
- 2211 Views
- 3 replies
- 0 kudos
Resolved! Unable to submit new case on Databricks (AWS)
Hi, I wanted to submit a case, but when I try to submit one, I see this: "You do not have access to submit a case. You can just view your organization's cases. In Case of any query Please contact your admin." I looked into settings for support, but could...
Hello @Sangamswadik! Try raising a case by visiting the site (https://help.databricks.com/s/contact-us?ReqType=training) and filling out the form there.
- 4817 Views
- 5 replies
- 2 kudos
Resolved! How to get logged in user name/email in the databricks streamlit app?
I have created a Databricks App using Streamlit and was able to deploy and use it successfully. I need to get the user name/email address of the logged-in user and display it in the Streamlit app. Is this possible? If not possible at the moment, any roadmap f...
I have also tried to deploy a Streamlit app; however, I was not able to.
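On the original question: Databricks Apps forward the signed-in user's identity to the app as HTTP request headers, which Streamlit can read via st.context.headers (a sketch; the header names follow the Databricks Apps docs at the time of writing and may evolve):

```python
# Sketch: read the logged-in user's identity inside a Databricks App.
# Databricks Apps inject identity headers on each request; the header
# names below are taken from the Apps documentation.
import streamlit as st

email = st.context.headers.get("X-Forwarded-Email")
username = st.context.headers.get("X-Forwarded-Preferred-Username")

st.write(f"Signed in as: {username} ({email})")
```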
- 1384 Views
- 2 replies
- 0 kudos
Enforcing developers to use something like a single user cluster
Dear all, we have a challenge. Developers create/recreate tables/views in the PRD environment by running notebooks on all-purpose clusters, whereas the same notebooks already exist as jobs. Not sure why the developers feel comfortable using all-purpose...
Hi Stefan, exactly, we have the same: the CI/CD process invokes jobs that run as a service principal. So far, so good. But please note that not all situations fall under this ideal case. There will be cases wherein I have to recreate 50 views ou...
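On the policy side of the original question, a cluster policy can at least pin all-purpose compute to single-user access mode, so ad-hoc runs in PRD stay attributable; a rough sketch via the Python SDK (names are placeholders; the cluster_type and data_security_mode attributes come from the cluster policies docs):

```python
# Sketch: a cluster policy that fixes all-purpose clusters to single-user
# access mode and caps auto-termination. Names are hypothetical; pair this
# with workspace permissions that deny unrestricted cluster creation.
import json
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

definition = {
    "cluster_type": {"type": "fixed", "value": "all-purpose"},
    "data_security_mode": {"type": "fixed", "value": "SINGLE_USER"},
    "autotermination_minutes": {"type": "range", "maxValue": 60},
}

w.cluster_policies.create(
    name="prd-all-purpose-single-user",  # hypothetical name
    definition=json.dumps(definition),
)
```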
- 7653 Views
- 3 replies
- 0 kudos
Leverage Azure PIM with Databricks with Contributor role privilege
We are trying to leverage Azure PIM. This works great for most things, however, we've run into a snag. We want to limit the Contributor role to a group and only at the resource group level, not the subscription. We wish to elevate via PIM. This will ...
We never did, so we scrapped PIM with Databricks for now.
- 1933 Views
- 7 replies
- 1 kudos
Resolved! Unable to Pinpoint where network traffic originates from in GCP
Hi everyone, I have a question regarding networking. A bit of background first: for security reasons, the current allow-policy from GCP to our on-prem infrastructure is being replaced by a deny-policy for traffic originating from GCP. Therefore access...
Hi @KLin, happy to help! The reason why traffic originates from the pods subnet for clusters/SQL warehouses without the x-databricks-nextgen-cluster tag (still using GKE), and from the node subnet for clusters with the GCE tag, is due to the underly...
- 4270 Views
- 1 reply
- 0 kudos
Resolved! Custom VPC Subranges for New GCP Databricks Deployment
What Pods and Services subranges would you recommend for a /22 subnet for a custom VPC for a new GCP Databricks deployment in the GCE era?
The secondary ranges are there to support legacy GKE clusters. While required in the UI, they can be empty in Terraform (per a source) for new deployments, as clusters are GCE now. (There is a green GCE label next to the cluster name.) When observing the ...
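As a sanity check on sizing, the raw /22 address math is easy to verify with the standard library (a sketch; it models only GCP's own per-subnet reservations, not any Databricks-specific overhead):

```python
# Sketch: raw address math for a /22 node subnet. In the GCE era each
# cluster node consumes one IP from this range; GCP reserves 4 addresses
# per subnet. Databricks-specific reservations are not modeled here.
import ipaddress

subnet = ipaddress.ip_network("10.0.0.0/22")  # placeholder range
total = subnet.num_addresses       # 1024 for a /22
usable = total - 4                 # minus GCP-reserved addresses
print(f"{subnet}: {total} addresses, ~{usable} usable for nodes")
```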
- 1118 Views
- 2 replies
- 0 kudos
Workflow job runs are disabled
I'm not totally clear on the financial details, but from what I've been told: a few months ago our contract with Databricks expired and changed into a per-month subscription. In those months there was a problem with payments due to bills being sent to a wr...
We contacted them, but were told that we could only use community support unless we got a premium support subscription (not sure about the exact term; somebody else asked them). Our account ID is ddcb191f-aff5-4ba5-be46-41adf1705e03. If the workspace...