- 1507 Views
- 1 replies
- 0 kudos
Resolved! Can a SQL Warehouse Pro be shared across multiple workspaces
I'm currently using a SQL Warehouse Pro in one of my Databricks workspaces, and I'm trying to optimize costs. Since the Pro warehouse can be quite expensive to run, I'd prefer not to spin up additional instances in each workspace. Is there any way to ...
Hi @jfid, a SQL Warehouse Pro instance cannot be shared directly across multiple Databricks workspaces. Each workspace requires its own SQL Warehouse instance, even if the compute and data access needs are similar. This is because compute resources li...
- 3306 Views
- 2 replies
- 0 kudos
Convert Account to Self-managed
I am in the process of setting up a new Databricks account for AWS commercial. I mistakenly set up the account with the email databricks-external-nonprod-account-owner@slingshotaerospace.com to not be self-managed, and I would like for this new accoun...
Or better yet, if we could delete it so I can re-create the account.
- 3225 Views
- 0 replies
- 1 kudos
Salesforce Marketing Cloud integration
What is the best way to get Salesforce Marketing Cloud data into Databricks? Lakeflow / Federation connectors are limited to Salesforce and Salesforce Data Cloud right now. Are there plans to add Salesforce Marketing Cloud? The only current option w...
- 3112 Views
- 0 replies
- 1 kudos
Jobs API 2.2 No Longer Enabled for Azure Government
Hello, my team deploys jobs in the Azure Government environment. We have been using the updated CLI (> .205) to do so. Sometime within the last month and a half, our Azure US Gov environment stopped working with the Jobs API 2.2. It was working before ...
- 2010 Views
- 3 replies
- 0 kudos
Resolved! Unable to submit new case on Databricks (AWS)
Hi, I wanted to submit a case, but when I try to, I see this: "You do not have access to submit a case. You can just view your organization's cases. In case of any query, please contact your admin." I looked into the settings for support, but could...
Hello @Sangamswadik! Try raising a case by visiting https://help.databricks.com/s/contact-us?ReqType=training and filling out the form there.
- 3019 Views
- 0 replies
- 0 kudos
DLT constantly failing with timeout errors
DLT was working, but then started getting timeouts frequently: com.databricks.pipelines.common.errors.deployment.DeploymentException: Failed to launch pipeline cluster xxxxxxxxxxxx: Self-bootstrap timed out during launch. Please try again later and con...
- 3054 Views
- 0 replies
- 0 kudos
Unable to query using multi-node clusters but works with serverless warehouse & single-node clusters
We have a schema with 10 tables, and currently all 4 users have ALL access. When I (or any other user) spin up a serverless SQL warehouse, I am able to query one of the tables (million rows) in the SQL Editor and get a response within seconds. `select co...
- 4033 Views
- 5 replies
- 2 kudos
Resolved! How to get logged in user name/email in the databricks streamlit app?
I have created a Databricks App using Streamlit and am able to deploy and use it successfully. I need to get the user name/email address of the logged-in user and display it in the Streamlit app. Is this possible? If not possible at the moment, any roadmap f...
I have also tried to deploy a Streamlit app; however, I was not able to deploy it.
- 1251 Views
- 2 replies
- 0 kudos
Enforcing developers to use something like a single-user cluster
Dear all, we have a challenge. Developers create/recreate tables/views in the PRD environment by running notebooks on all-purpose clusters, whereas the same notebooks already exist as jobs. Not sure why the developers feel comfortable in using all-purpose...
Hi Stefan, exactly, we have the same. The CI/CD process invokes jobs that run as a service principal. So far, so good. But please note that not all situations fall under this ideal case. There will be cases wherein I have to recreate 50 views ou...
- 7448 Views
- 3 replies
- 0 kudos
Leverage Azure PIM with Databricks with Contributor role privilege
We are trying to leverage Azure PIM. This works great for most things; however, we've run into a snag. We want to limit the Contributor role to a group and only at the resource group level, not the subscription. We wish to elevate via PIM. This will ...
Never did, so we scrapped PIM with Databricks for now.
- 3145 Views
- 1 replies
- 0 kudos
RStudio on Dedicated Cluster: Invalid Access Token
Hello! Currently I have RStudio installed on a dedicated cluster on Azure Databricks; here are the specs. I must emphasize the access mode: manual and dedicated to a group. Here, we install RStudio using a notebook with the following...
Hello! It's me again. I'm also getting the following error after testing a connection to Databricks using sparklyr: Error: ! java.lang.IllegalStateException: No Unity API token found in Unity Scope. Run `sparklyr::spark_last_error()` to see the full ...
- 1689 Views
- 7 replies
- 1 kudos
Resolved! Unable to Pinpoint where network traffic originates from in GCP
Hi everyone, I have a question regarding networking. A bit of background first: for security reasons, the current allow-policy from GCP to our on-prem infrastructure is being replaced by a deny-policy for traffic originating from GCP. Therefore, access...
Hi @KLin, happy to help! The reason why traffic originates from the pods subnet for clusters/SQL warehouses without the x-databricks-nextgen-cluster tag (still using GKE) and from the node subnet for clusters with the GCE tag is due to the underly...
- 2755 Views
- 1 replies
- 0 kudos
Asset Bundle: inject job start_time parameter
Hey! I'm deploying a job with Databricks Asset Bundles. When the PySpark task is started on a job cluster, I want the Python code to read the job start_time and select the right data sources based on that parameter. Ideally, I would read the parameter f...
The Databricks CLI version is v0.239.1.
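One possible approach (a sketch only; resource names and paths are illustrative, and it assumes job-level parameters plus the `{{job.start_time.iso_datetime}}` dynamic value reference are available in the bundle/CLI version in use):

```yaml
# Hypothetical fragment of a bundle job definition (e.g. resources/nightly_load.yml).
# The job parameter "start_time" is resolved at run time from the job's start timestamp.
resources:
  jobs:
    nightly_load:
      name: nightly_load
      parameters:
        - name: start_time
          default: "{{job.start_time.iso_datetime}}"
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ../src/main_notebook.py
```

A notebook task could then read the resolved value with `dbutils.widgets.get("start_time")` and pick its data sources from it; whether this fits depends on the task type and bundle version.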
- 3822 Views
- 1 replies
- 0 kudos
Resolved! Custom VPC Subranges for New GCP Databricks Deployment
What Pods and Services subranges would you recommend for a /22 subnet for a custom VPC for a new GCP Databricks deployment in the GCE era?
The secondary ranges are there to support legacy GKE clusters. While required in the UI, they can be empty in Terraform (per a source) for new deployments, as clusters are GCE now. (There is a green GCE badge next to the cluster name.) When observing the ...
- 2721 Views
- 0 replies
- 0 kudos
Unable to create workspace using API
Hi all, I'm trying to automate the deployment of Databricks into GCP. In order to streamline the process, I created a standalone project to hold the service accounts SA1 and SA2, with the second one then being manually populated into the Databricks ac...