- 1373 Views
- 3 replies
- 3 kudos
Resolved! AWS Databricks workspace attached to an NCC doesn't generate Egress Stable IPs
I am facing an issue when configuring a Databricks workspace on AWS with a Network Connectivity Configuration (NCC). Even after attaching the NCC, the workspace does not generate Egress Stable IPs as expected. In the workspace configuration tab, under ...
Hi @ricelso, sorry to hear you are still facing this issue. This behaviour isn't expected - I would suggest you kindly raise this with your Databricks Account Executive, and they can raise a support request to get this investigated further. Please let ...
- 7390 Views
- 24 replies
- 5 kudos
Resolved! Unable to enable Serverless Notebooks
Hello there, I have a Databricks Premium subscription but am not able to enable Serverless Notebooks (as that option does not seem to exist). I have gone through DB documentation and have Unity Catalog enabled. I even opened a ticket (00591635) but it...
We have the exact same thing happening. The /config endpoint shows it is enabled, but we cannot select it. The account is several months old. @Walter_C, could you help out here as well?
- 1435 Views
- 2 replies
- 2 kudos
Resolved! Databricks One
Hello, I cannot find where I enable Databricks One in my workspace. Can someone help me understand where this is located or who can grant me access to this feature? I checked the "Previews" in my account and it is not there. Thanks in advance. Best, J...
Hi @jp_allard1, Databricks One is now in Public Preview. It is a workspace-level feature, so a user with the Workspace Admin role should be able to enable it from the workspace Previews settings page, as shown in this screenshot.
- 1973 Views
- 7 replies
- 0 kudos
Not able to connect to GCP Secret Manager except when using "No isolation shared" Cluster
Hey everyone, We're trying to access secrets stored in GCP Secret Manager using its Python package from Databricks on GCP. However, we can only reach the Secret Manager when using "No Isolation Shared" clusters, which is not an option for us. Currentl...
This is a huge issue. We are seeing the same thing. Is Google auth broken for Databricks on GCP? Only with no isolation enabled is it able to access the metadata service and get credentials. Why is the metadata service not reachable? I would be shocke...
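To narrow down whether this is the same failure mode, a quick check is whether the GCE metadata server answers from the cluster at all, since Application Default Credentials (which the Secret Manager client uses) fetch tokens from it. A minimal probe sketch, assuming the standard `metadata.google.internal` token endpoint (the timeout value is arbitrary):

```python
import urllib.request
import urllib.error

# Standard GCE metadata endpoint that google-auth queries for a service
# account token; requires the Metadata-Flavor header.
METADATA_URL = (
    "http://metadata.google.internal/computeMetadata/v1/"
    "instance/service-accounts/default/token"
)

def metadata_reachable(timeout: float = 2.0) -> bool:
    """Return True if the metadata server answers the token endpoint.

    If this returns False on your cluster's isolation mode, the Secret
    Manager client will fail too, since it cannot obtain credentials.
    """
    req = urllib.request.Request(METADATA_URL, headers={"Metadata-Flavor": "Google"})
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

Running this in a notebook on each cluster type should show whether it is purely a network-path difference between isolation modes.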
- 494 Views
- 1 replies
- 2 kudos
Resolved! Serverless Workspace Observability
I'm setting up observability for a Databricks serverless workspace on AWS and need some guidance. I know we can configure audit logs for S3 delivery, but I'm unsure if that alone is sufficient. For a complete observability setup, especially when integra...
Hey @APJESK - thanks for reaching out! For comprehensive observability in a Databricks serverless workspace on AWS, particularly when integrating with tools like CloudWatch, Splunk, or Kibana, enabling audit log delivery to S3 is a crucial first st...
- 575 Views
- 1 replies
- 1 kudos
Databricks Oauth errors with terraform deployment in gcp
Trying to deploy a Databricks workspace in GCP using Terraform with a customer-managed VPC. The only difference from the Terraform provider configuration is that I have a pre-created shared VPC in a host project, and a dedicated workspace project with the ...
Hey @snazkx, Thank you for your question. We have a set of modules built by the Databricks field team, which help you create your workspace with a new VPC or use a pre-existing VPC (BYOVPC). You can find more information on our Git repo. Could you pl...
- 7305 Views
- 3 replies
- 0 kudos
Enable automatic schema evolution for Delta Lake merge for an SQL warehouse
Hello! We tried to update our integration scripts and use SQL warehouses instead of general compute clusters to fetch and update data, but we faced a problem. We use automatic schema evolution when we merge tables, but with a SQL warehouse, when we try...
This feature has been released. Please see the docs for more details: https://docs.databricks.com/aws/en/sql/language-manual/delta-merge-into#with-schema-evolution
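The linked docs describe the `MERGE WITH SCHEMA EVOLUTION` clause, which enables schema evolution per statement rather than via a cluster Spark conf (which a SQL warehouse does not let you set). A small sketch that builds such a statement; the table names and key column are hypothetical placeholders:

```python
def merge_with_schema_evolution_sql(target: str, source: str, key: str) -> str:
    """Build a Databricks SQL MERGE that opts in to automatic schema
    evolution: new columns in the source are added to the target table."""
    return (
        f"MERGE WITH SCHEMA EVOLUTION INTO {target} AS t\n"
        f"USING {source} AS s\n"
        f"ON t.{key} = s.{key}\n"
        "WHEN MATCHED THEN UPDATE SET *\n"
        "WHEN NOT MATCHED THEN INSERT *"
    )

# Hypothetical example tables; run the result in the SQL editor or via spark.sql(sql).
sql = merge_with_schema_evolution_sql("main.sales.orders", "staging_orders", "order_id")
print(sql)
```

Because the clause is part of the statement itself, it works the same on a SQL warehouse as on a general-purpose cluster.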
- 319 Views
- 1 replies
- 0 kudos
Community account & Cluster activation
Hi all, I am facing trouble accessing the Community version: when I log in, the process states that I do not have a workspace, but I do, as I have an account on the free version running well. The point is that I am doing an exercise for a master'...
Hi @mariadelmar, If I understood correctly, you have an account in Databricks Free Edition and you would also like to log in to the Community Edition. Now, if you did not previously have a registered account in the Community Edition (before the Free E...
- 483 Views
- 1 replies
- 1 kudos
Resolved! Community log in
Hi all, I am facing trouble accessing the Community version: when I log in, the process states that I do not have a workspace, but I do, as I have an account on the free version running well. The point is that I am doing an exercise for a master's...
Hi @mariadelmar, If I understood correctly, you have an account in Databricks Free Edition and you would also like to log in to the Community Edition. Now, if you did not previously have a registered account in the Community Edition (before the Free E...
- 798 Views
- 3 replies
- 1 kudos
Resolved! Regarding Traditional workspace - Classic and Serverless Architecture
Why does Databricks require creating AWS resources on our AWS account (IAM role, VPC, subnets, security groups) when deploying a Traditional workspace, even if we plan to use only serverless compute, which runs fully in the Databricks account and onl...
Hey @APJESK, Databricks requires AWS resources such as IAM roles, VPCs, subnets, and security groups when deploying a Traditional workspace—even if you plan to use only serverless compute—because of how the platform distinguishes between workspace t...
- 874 Views
- 3 replies
- 0 kudos
Terraform - Assign permissions to Entra Id group
Dear all, Using a Terraform workspace-level provider, I am trying to add an Entra ID group to the account, and then assign permissions to the group. The Terraform provider runs in the context of an Entra ID user account with workspace admin permissi...
@SørenBrandt2 Here are a few quick checks you can do before rerunning. 1. Please make sure the service principal running the Terraform code has the Group Manager role on the specific account group. With that role, it can read that group at the account level and retrie...
- 404 Views
- 1 replies
- 0 kudos
Resolved! Lakebridge Access
Shortly after Bladebridge was acquired I requested access to the Converter, but was told that it won't be available to partners as a tool, only as a professional service. Is this still the case, or can I get my team access to the Converter? ...
Hello @Paul_Headey! For the most accurate and up-to-date information on this, please reach out to your Databricks representative or contact help@databricks.com.
- 654 Views
- 3 replies
- 2 kudos
Can I subscribe to Databricks in AWS Marketplace via Terraform/automation? How to handle this?
How to subscribe to the Databricks product from AWS Marketplace using Terraform or another automation tool
For now, we got clarification that the Databricks subscription on AWS Marketplace can't be automated. I will be in touch; more questions are on the way.
- 4396 Views
- 3 replies
- 0 kudos
How to upload a file to Unity catalog volume using databricks asset bundles
Hi, I am working on a CI/CD blueprint for developers, using which developers can create their bundle for jobs/workflows and then create a volume to which they will upload a wheel file or a jar file which will be used as a dependency in their noteboo...
With this setup, users who are entitled to access the catalog will have access to use the volume, if permissions are set this way. And users will be able to utilize the notebook, and we need to provide documentation either to clone the noteboo...
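Outside of the bundle definition itself, the upload step of such a pipeline can also be scripted. A minimal sketch, assuming the `databricks-sdk` Python package with credentials resolved from the environment; the catalog, schema, volume, and wheel names below are hypothetical:

```python
def volume_file_path(catalog: str, schema: str, volume: str, filename: str) -> str:
    """Unity Catalog volumes are addressed as /Volumes/<catalog>/<schema>/<volume>/<file>."""
    return f"/Volumes/{catalog}/{schema}/{volume}/{filename}"

# Hypothetical destination for a build artifact.
dest = volume_file_path("main", "ci", "artifacts", "mylib-0.1.0-py3-none-any.whl")

# With the databricks-sdk installed and configured, the upload itself
# would look roughly like this (left commented out here):
#   from databricks.sdk import WorkspaceClient
#   w = WorkspaceClient()
#   with open("dist/mylib-0.1.0-py3-none-any.whl", "rb") as f:
#       w.files.upload(dest, f, overwrite=True)
print(dest)
```

Cluster library or notebook `%pip install` references can then point at that same `/Volumes/...` path, provided the job's principal has READ VOLUME on it.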
- 1690 Views
- 6 replies
- 5 kudos
Resolved! Event-driven Architecture with Lake Monitoring without "Trigger on Arrival" on DABs
AWS Databricks. I want to create data quality monitoring and an event-driven architecture without a trigger on file arrival, but once at deploy. I plan to create a job which triggers once at deploy. The job runs these tasks sequentially: 1. run a script to create ex...
@tana_sakakimiya ah, I think I see the difference. My screenshot says that "external tables" backed by Delta Lake will work. This means you'll need to have the table already created in Databricks from your external location, i.e. make an external ta...