- 615 Views
- 3 replies
- 0 kudos
How to set up a JDBC connection to Snowflake
Hi, I'm on a free-tier Databricks account and need help understanding how to create a JDBC connection to Snowflake, as I'm getting an error: Catalog > (+) > Create a connection: Connection name = jdbc_snowflake, Connection type = JDBC [next], Url = jdbc//org-ac...
Hi @emanueol, The error you are seeing ("Connection type JDBC requires environment settings") is because you selected "JDBC" as the connection type, which is a generic connector intended for databases that do not have a dedicated connection type in D...
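To make the URL issue concrete, here is a minimal sketch of what a well-formed Snowflake JDBC URL and option set look like. All values (account host, database, warehouse, user, table) are placeholders, not real endpoints:

```python
# Placeholder Snowflake account host -- substitute your own.
account_host = "my-org-my_account.snowflakecomputing.com"

# Note the "jdbc:snowflake://" scheme: a URL beginning "jdbc//..."
# (missing the colon and the snowflake subprotocol) will be rejected.
jdbc_url = f"jdbc:snowflake://{account_host}/?db=MY_DB&warehouse=MY_WH"

# Typical option set for a generic JDBC read against Snowflake.
jdbc_options = {
    "url": jdbc_url,
    "driver": "net.snowflake.client.jdbc.SnowflakeDriver",
    "user": "MY_USER",           # placeholder
    "password": "MY_PASSWORD",   # placeholder; prefer a secret scope
    "dbtable": "MY_SCHEMA.MY_TABLE",
}

print(jdbc_url)
```

The key point is the scheme prefix: the driver only recognizes `jdbc:snowflake://`, so a URL pasted without the colon will fail before any network call is made.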
- 152 Views
- 1 replies
- 0 kudos
Payment with Netsuite Customer Payment Portal
I was using a trial account and the credit ran out. Now I have completed my first month and my card has been rejected. I received a warning email saying I should pay on a platform named NetSuite Customer Payment Portal, but I can't log in...
Hi @vino2000, Thanks for reaching out. I understand the frustration -- your trial credits ran out, your card was declined, and now you have received an email pointing you to the NetSuite Customer Payment Portal but you cannot log in or reset your pas...
- 276 Views
- 3 replies
- 1 kudos
External connectivity to VNET Injected & SCC enabled workspace
Hi all, we have a Databricks installation in our internal Azure VNets (using VNet injection), and secure cluster connectivity is enabled as well. We have a couple of external partners who want to connect to our workspaces using APIs and JDBC. W...
Hi mat723, That approach is sound and is actually a fairly common pattern. Standing up a public-facing proxy/gateway that is locked down to only the partner's IP ranges, with the backend connecting to the Databricks workspace over Private Link, gives...
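As an illustration of the allowlist half of that pattern, here is a minimal sketch of the check a gateway would apply to each client connection. The CIDR ranges are hypothetical stand-ins for the partner's published IP ranges:

```python
import ipaddress

# Hypothetical partner CIDR ranges the gateway would allow.
ALLOWED_RANGES = [
    ipaddress.ip_network(cidr)
    for cidr in ("203.0.113.0/24", "198.51.100.0/24")
]

def is_allowed(client_ip: str) -> bool:
    """Return True if client_ip falls inside any allowed partner range."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in ALLOWED_RANGES)

print(is_allowed("203.0.113.42"))  # inside the first range -> True
print(is_allowed("192.0.2.1"))     # outside both ranges -> False
```

In practice you would express the same allowlist in the proxy's own configuration (or an Azure NSG / Front Door rule) rather than application code, but the matching logic is the same.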
- 824 Views
- 6 replies
- 4 kudos
Databricks Apps Processes and Pain Points
I'm really interested in learning how people are currently creating and using Databricks Apps in production. Anyone who has done so: I'd love it if you could provide info on some of your pain points during these pro...
Hi @odrobek, Great question -- Databricks Apps is still maturing, and sharing real-world workflows helps everyone. I have been working with Apps across several projects, so here is a rundown of what the production development cycle typically looks li...
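For anyone new to the workflow, an app's entry point is declared in a small `app.yaml` at the root of the app's source directory. The sketch below assumes a Streamlit entry point named `app.py`; the command and environment values are illustrative only, so check the current Databricks Apps docs for the exact schema:

```yaml
# Hypothetical app.yaml for a Streamlit-based Databricks App.
command: ["streamlit", "run", "app.py"]
env:
  - name: "STREAMLIT_SERVER_HEADLESS"
    value: "true"
```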
- 313 Views
- 2 replies
- 1 kudos
Streamlit alternative
Hi all, I have a simple app that contains an editable grid and displays some graphs. The Streamlit app is slow, and end users need a faster solution. We are using a SQL warehouse (serverless). The app will show data from the existing Delta tables, and user w...
I have had good results using Dash for apps that need charts and interactive UIs. It’s a solid option if you’re moving beyond Streamlit. Why Dash works well for your use case: Charts: Plotly (built into Dash) gives you interactive charts and good ...
- 5357 Views
- 6 replies
- 3 kudos
Resolved! How to create Storage Credential using Service Principal [Azure]
As the document indicates, an Azure Databricks access connector is a first-party Azure resource that lets you connect managed identities to an Azure Databricks account. You must have the Contributor role or higher on the access connector resource in ...
Late reply. The Databricks-managed Access Connector is used by Databricks to manage its own "managed storage account" (the one with the randomized characters in the name), which is used "for its own business" when spinning up and managing compute cl...
- 695 Views
- 3 replies
- 0 kudos
Lakebase not accessible in Private Network
We have a VNet-injected workspace in Azure. There are multiple SQL warehouses which are easily accessible from the private network, both directly from a VM and via VPN on a client's machine. We deployed Lakebase. Inside the workspace, connectivity is working...
Summary Lakebase operates in the Databricks Serverless compute plane, meaning it does not reside inside your injected VNET. Connection timeouts from external tools are typically caused by blocked outbound ports or missing front-end Private Link confi...
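A quick way to tell a blocked outbound port from a DNS or auth problem is a plain TCP connect test. Lakebase speaks the Postgres wire protocol, so the endpoint must be reachable on its Postgres port from the client network; the hostname below is a hypothetical placeholder:

```python
import socket

def port_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Attempt a plain TCP connect; a timeout or refusal from the client
    network suggests the outbound path to that port is blocked."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (hostname is hypothetical -- use your instance's endpoint):
# port_reachable("my-instance.database.azuredatabricks.net", 5432)
```

If this returns False from the VPN client but True from inside the workspace, that points at the firewall/Private Link configuration rather than at Lakebase itself.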
- 858 Views
- 2 replies
- 1 kudos
Using Terraform to GRANT SELECT ON ANY FILE securable
I have a use case where service principals will read .csv files from an Azure Storage Account and create views from them. This used to work in our legacy environment, but we are currently migrating to Unity Catalog, and when we tested our existing jobs we...
How about... (not tried it myself)

```hcl
resource "databricks_sql_permissions" "grant_select_any_file" {
  any_file = true

  privilege_assignments {
    principal  = "your-role"
    privileges = ["SELECT"]
  }
}
```
- 2408 Views
- 5 replies
- 5 kudos
Resolved! Azure Databricks Control Plane connectivity issue after migrating to vWAN
Hello everyone, recently I received a client request to migrate our Azure Databricks environment from a Hub-and-Spoke architecture to a vWAN Hub architecture with an NVA (Network Virtual Appliance). Here’s a quick overview of the setup: The Databricks ...
@nodeb Can you please mark your reply as the solution? It will help other users find the resolution faster.
- 293 Views
- 2 replies
- 1 kudos
Resolved! Guidance Needed on Databricks Project Lifecycle & Best Practices
Hello Community, our company is new to Databricks implementations, and we are starting our initial projects. We would like to understand the typical project lifecycle and best practices followed by experienced teams. Could you please share insights on:...
Hey @vamsi_simbus, I work in the training delivery organization as a trainer. My best advice is to create a Databricks Academy account and take the free self-paced training. Two courses in particular come to mind: DevOps Essentials for Data Enginee...
- 489 Views
- 5 replies
- 1 kudos
Resolved! Hive Metastore - Disable Legacy Access option not found
Hi, I've just provisioned a new Databricks workspace and I would like to enable the Hive metastore for some testing. The problem is I cannot find the "Disable legacy access" setting. I went several times to Workspace Settings -> Security but it is jus...
Yes — that’s correct. In a new account, legacy features are disabled by design. That includes the Databricks-hosted workspace Hive metastore. You cannot enable or use Hive the way it was historically used inside a workspace. That path is closed. If y...
- 327 Views
- 1 replies
- 0 kudos
Resolved! I would like to find my account manager
Hey, I'm the technical manager at Playstudios. I'm trying to reach our account manager in Israel; she's not answering my emails/messages. Do we have a different account manager? Please reach out to me ASAP. Thanks, Liraz Nahmias
Hello @liraznahmias!Hope the issue has been resolved and that you’ve received a response from your Account Executive.
- 2034 Views
- 4 replies
- 3 kudos
Resolved! AWS-Databricks' workspace attached to a NCC doesn't generate Egress Stable IPs
I am facing an issue when configuring a Databricks workspace on AWS with a Network Connectivity Configuration (NCC). Even after attaching the NCC, the workspace does not generate Egress Stable IPs as expected. In the workspace configuration tab, under ...
Hello @Sai_Ponugoti, I'm facing the exact same issue. The NCC doesn't generate the egress IPs. I'm on the premium plan, but in a trial period. I have added a payment method to the account, but the configuration still shows: "Egress Stable IPs: ." Could ...
- 485 Views
- 5 replies
- 1 kudos
Resolved! Disable Lakebase and Model Serving (Foundation Models) at Account and/or Workspace level
Hello Databricks Support Team, we would like to understand whether it is possible to disable specific Databricks product capabilities in our environment, both at the Account level and at the Workspace level. 1) Lakebase: We would like to confirm if Databricks L...
Got it. Since Lakebase went GA recently, it's enabled by default now. You will need to open a support ticket to disable it: https://docs.databricks.com/aws/en/resources/support
- 484 Views
- 2 replies
- 1 kudos
Resolved! Asset Bundles + GitHub Actions: why does bundle deploy re-create UC schema and volume every run?
Hi everyone, I’m working with Databricks Asset Bundles and deploying via GitHub Actions (CI/CD). I’m seeing behavior I don’t fully understand. On every pipeline run (fresh git checkout/pull and then databricks bundle deploy to the same target environme...
Hi @Ale_Armillotta , To answer each of your questions: Is this expected behavior for Asset Bundles Yes, deploy is declarative and will attempt “create” whenever the bundle’s tracked state doesn’t already include that resource (names aren’t used to co...
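For context, declaring the schema and volume in the bundle itself is what lets deploy track them in state instead of re-creating them. A minimal sketch of such a `databricks.yml` fragment is below; the catalog, schema, and volume names are placeholders, so adapt them to your target:

```yaml
# Hypothetical bundle fragment: once declared here, the schema and volume
# are tracked in the bundle's deployment state, so subsequent deploys to
# the same target update them in place rather than attempting a re-create.
resources:
  schemas:
    raw_schema:
      catalog_name: main        # placeholder catalog
      name: raw
  volumes:
    landing_volume:
      catalog_name: main
      schema_name: raw
      name: landing
```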
- Access control: 1
- Apache spark: 1
- Azure: 7
- Azure databricks: 5
- Billing: 2
- Cluster: 1
- Compliance: 1
- Data Ingestion & connectivity: 5
- Databricks Runtime: 1
- Databricks SQL: 2
- DBFS: 1
- Dbt: 1
- Delta Sharing: 1
- DLT Pipeline: 1
- GA: 1
- Gdpr: 1
- Github: 1
- Partner: 73
- Public Preview: 1
- Service Principals: 1
- Unity Catalog: 1
- Workspace: 2