- 1384 Views
- 1 reply
- 0 kudos
Collation problem with df.first() when different from UTF8_BINARY
I'm getting an error when I want to select first() from a DataFrame when using a collation different from UTF8_BINARY. This works: df_result = spark.sql(f""" SELECT 'en-us' AS ETLLanguageCode""") display(df_result) print(df_resu...
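The post above concerns `first()` failing once a non-default collation is involved. A minimal sketch of the scenario, assuming a Spark/Databricks runtime with collation support; the `COLLATE UTF8_LCASE` clause is an illustrative choice and not taken from the original post:

```python
# Minimal repro sketch (assumes a runtime with string collation support).
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Default UTF8_BINARY collation: first() works as expected.
df_default = spark.sql("SELECT 'en-us' AS ETLLanguageCode")
print(df_default.first()["ETLLanguageCode"])

# Explicitly collated column (UTF8_LCASE chosen for illustration):
# this is the shape of query where the poster reports first() erroring.
df_collated = spark.sql("SELECT 'en-us' COLLATE UTF8_LCASE AS ETLLanguageCode")
print(df_collated.first()["ETLLanguageCode"])
```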
- 1684 Views
- 1 reply
- 0 kudos
Resolved! Can a SQL Warehouse Pro be shared across multiple workspaces
I'm currently using a SQL Warehouse Pro in one of my Databricks workspaces, and I’m trying to optimize costs. Since the Pro Warehouse can be quite expensive to run, I’d prefer not to spin up additional instances in each workspace. Is there any way to ...
Hi @jfid, a SQL Warehouse Pro instance cannot be shared directly across multiple Databricks workspaces. Each workspace requires its own SQL Warehouse instance, even if the compute and data access needs are similar. This is because compute resources li...
- 3706 Views
- 2 replies
- 0 kudos
Convert Account to Self-managed
I am in the process of setting up a new Databricks account for AWS commercial. I mistakenly set up the account with the email databricks-external-nonprod-account-owner@slingshotaerospace.com to not be self-managed, and I would like for this new accoun...
Or better yet, if we could delete it so I can re-create the account.
- 2164 Views
- 3 replies
- 0 kudos
Resolved! Unable to submit new case on Databricks (AWS)
Hi, I wanted to submit a case, but when I try to submit one, I see this: "You do not have access to submit a case. You can just view your organization's cases. In Case of any query Please contact your admin." I looked into settings for support but could...
Hello @Sangamswadik! Try raising a case by visiting https://help.databricks.com/s/contact-us?ReqType=training and filling out the form on that page.
- 4718 Views
- 5 replies
- 2 kudos
Resolved! How to get logged in user name/email in the databricks streamlit app?
I have created a Databricks App using Streamlit and am able to deploy and use it successfully. I need to get the user name/email address of the logged-in user and display it in the Streamlit app. Is this possible? If not possible at the moment, any roadmap f...
I have also tried to deploy a Streamlit app; however, I was not able to deploy it.
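For the question above, one approach is to read the identity headers that Databricks Apps forward to the app process. A minimal sketch, assuming Streamlit exposes request headers via `st.context.headers` and that the `X-Forwarded-Email` / `X-Forwarded-Preferred-Username` headers are present in this deployment:

```python
# Hedged sketch: header names and availability depend on the Databricks Apps
# runtime; treat them as assumptions and verify in your workspace.
import streamlit as st

headers = st.context.headers  # request headers exposed by recent Streamlit versions

user_email = headers.get("X-Forwarded-Email", "unknown")
user_name = headers.get("X-Forwarded-Preferred-Username", user_email)

st.write(f"Logged in as: {user_name} ({user_email})")
```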
- 1375 Views
- 2 replies
- 0 kudos
Enforcing developers to use something like a single-user cluster
Dear all, we have a challenge. Developers create/recreate tables/views in the PRD environment by running notebooks on all-purpose clusters, whereas the same notebooks already exist as jobs. Not sure why the developers feel comfortable using all-purpose...
Hi Stefan, exactly, we have the same. The CI/CD process invokes jobs that run as a service principal. So far, so good. But please note that not all situations fall under this ideal case. There will be cases wherein I have to recreate 50 views ou...
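One possible way to discourage the all-purpose-cluster workflow described above is a cluster policy that pins the access mode and caps interactive cluster size. A minimal sketch using the Databricks SDK for Python; the policy name and attribute values are illustrative assumptions, not the poster's actual setup:

```python
# Hedged sketch: adjust the policy definition to your own governance rules.
import json
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up default auth (e.g. DATABRICKS_HOST / DATABRICKS_TOKEN)

policy_definition = {
    # Pin single-user (dedicated) access mode for interactive clusters.
    "data_security_mode": {"type": "fixed", "value": "SINGLE_USER"},
    # Cap worker count to discourage heavy ad-hoc workloads on all-purpose compute.
    "num_workers": {"type": "range", "maxValue": 2},
}

policy = w.cluster_policies.create(
    name="restricted-interactive-clusters",  # hypothetical policy name
    definition=json.dumps(policy_definition),
)
print(f"Created policy {policy.policy_id}")
```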
- 7612 Views
- 3 replies
- 0 kudos
Leverage Azure PIM with Databricks with Contributor role privilege
We are trying to leverage Azure PIM. This works great for most things; however, we've run into a snag. We want to limit the Contributor role to a group and only at the resource group level, not the subscription. We wish to elevate via PIM. This will ...
Never did, so we scrapped PIM with Databricks for now.
- 3791 Views
- 1 reply
- 0 kudos
RStudio on Dedicated Cluster: Invalid Access Token
Hello! Currently I have RStudio installed on a Dedicated Cluster on Azure Databricks; here are the specs. I must emphasize the Access mode: Manual and Dedicated to a Group. Here, we install RStudio using a notebook with the following...
Hello! It's me again. I'm also getting the following error after testing a connection to Databricks using sparklyr: Error: ! java.lang.IllegalStateException: No Unity API token found in Unity Scope. Run `sparklyr::spark_last_error()` to see the full ...
- 1882 Views
- 7 replies
- 1 kudos
Resolved! Unable to Pinpoint where network traffic originates from in GCP
Hi everyone, I have a question regarding networking. A bit of background first: for security reasons, the current allow-policy from GCP to our on-prem infrastructure is being replaced by a deny-policy for traffic originating from GCP. Therefore, access...
Hi @KLin, happy to help! The reason traffic originates from the pods subnet for clusters/SQL warehouses without the x-databricks-nextgen-cluster tag (still using GKE), and from the node subnet for clusters with the GCE tag, is due to the underly...
- 4252 Views
- 1 reply
- 0 kudos
Resolved! Custom VPC Subranges for New GCP Databricks Deployment
What Pods and Services subranges would you recommend for a /22 subnet for a custom VPC for a new GCP Databricks deployment in the GCE era?
The secondary ranges are there to support legacy GKE clusters. While required in the UI, they can be empty in Terraform (per a source) for new deployments, since clusters now run on GCE. (There is a green GCE label next to the cluster name.) When observing the ...
- 1080 Views
- 2 replies
- 0 kudos
Workflow job runs are disabled
I'm not totally clear on the financial details, but from what I've been told: a few months ago our contract with Databricks expired and changed into a per-month subscription. In those months there was a problem with payments due to bills being sent to a wr...
We contacted them, but were told that we could only use community support unless we got a premium support subscription (not sure about the exact term, somebody else asked them). Our account ID is ddcb191f-aff5-4ba5-be46-41adf1705e03. If the workspace...
- 631 Views
- 1 reply
- 0 kudos
How to assign a static IP to a cluster
Is there a way to assign a static IP to a cluster on the Databricks instance? I'm trying to establish a connection with a service outside AWS, and it seems the only way to allow inbound connections is by adding the IP to a set of rules. Thanks! I couldn’t f...
Hi @Georgi, Databricks clusters on AWS don’t have a built-in way to assign a static IP address. Instead, the typical workaround is to route all outbound traffic from your clusters through a NAT Gateway (or similar solution) that has an Elastic IP ass...
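To make the NAT Gateway workaround above concrete, here is a minimal boto3 sketch; the region and subnet ID are placeholders, and you still need to point the cluster subnets' route tables at the NAT gateway:

```python
# Hedged sketch: allocate an Elastic IP and attach it to a new NAT gateway,
# giving clusters a stable egress address for the external service's allow-list.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # placeholder region

# Allocate an Elastic IP; this becomes the stable outbound address.
eip = ec2.allocate_address(Domain="vpc")

# Create a NAT gateway in a public subnet using that Elastic IP.
nat = ec2.create_nat_gateway(
    SubnetId="subnet-0123456789abcdef0",  # placeholder public subnet
    AllocationId=eip["AllocationId"],
)

print("NAT gateway:", nat["NatGateway"]["NatGatewayId"])
print("Static egress IP:", eip["PublicIp"])
```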
- 3330 Views
- 1 reply
- 1 kudos
Resolved! Understanding Azure frontend private link endpoints
Hi, I've been reading up on private link (https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/private-link) and have some questions: In the standard deployment, do the transit VNet (frontend private endpoint) and Databricks work...
Below are the answers to your questions: 1) No, they don’t have to be in the same subscription. You can have the transit VNet (with the front-end Private Endpoint) in one subscription and the Databricks workspace in another, as long as you set up the...
- 5168 Views
- 2 replies
- 2 kudos
Using a proxy server to install packages from PyPI in Azure Databricks
Hi, I'm setting up a workspace in Azure and would like to put some restrictions in place on outbound Internet access to reduce the risk of data exfiltration from notebooks and jobs. I plan to use VNet Injection and SCC + back-end private link for comp...
Thanks Isi, this is great info. I'll update once I've tried it.
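One way to route pip's PyPI traffic through an internal proxy, in the spirit of the question above, is to set proxy environment variables on the cluster. A minimal sketch using the Databricks SDK for Python; the proxy URL, runtime version, and node type are placeholder assumptions:

```python
# Hedged sketch: the cluster inherits HTTP(S)_PROXY, so pip/library installs
# egress via the internal proxy instead of the open Internet.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

cluster = w.clusters.create(
    cluster_name="proxy-egress-test",
    spark_version="15.4.x-scala2.12",   # placeholder runtime version
    node_type_id="Standard_D4ds_v5",    # placeholder Azure node type
    num_workers=1,
    spark_env_vars={
        "HTTP_PROXY": "http://proxy.internal.example:3128",
        "HTTPS_PROXY": "http://proxy.internal.example:3128",
        "NO_PROXY": "127.0.0.1,localhost",
    },
).result()  # blocks until the cluster is running

print("Cluster:", cluster.cluster_id)
```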
- 1470 Views
- 4 replies
- 1 kudos
Help understanding RAM utilization graph
I am trying to understand the following graph Databricks is showing me, and failing: what is that constant, lightly shaded area close to 138 GB? It is not explained in the "Usage type" legend. The job is running completely on the driver node, not utilizi...
Hi @meshko, the lightly shaded area represents the total available RAM. The tooltip shows it when you hover the mouse over it.
| Label | Count |
|---|---|
| Access control | 1 |
| Apache spark | 1 |
| Azure | 7 |
| Azure databricks | 5 |
| Billing | 2 |
| Cluster | 1 |
| Compliance | 1 |
| Data Ingestion & connectivity | 5 |
| Databricks Runtime | 1 |
| Databricks SQL | 2 |
| DBFS | 1 |
| Dbt | 1 |
| Delta Sharing | 1 |
| DLT Pipeline | 1 |
| GA | 1 |
| Gdpr | 1 |
| Github | 1 |
| Partner | 47 |
| Public Preview | 1 |
| Service Principals | 1 |
| Unity Catalog | 1 |
| Workspace | 2 |