- 134 Views
- 1 replies
- 0 kudos
AiGatewayConfig backward-compatibility issue from 16.3 to 16.4
We're moving from version 16.3 to version 16.4 LTS, and it looks like there is a backward-compatibility issue. This is the import I have in my code: from databricks.sdk.service.serving import ( # type: ignore # noqa ServedModelInput, # type:...
The error indicates that AiGatewayConfig cannot be imported from databricks.sdk.service.serving after upgrading from version 16.3 to 16.4 LTS, signaling a breaking change or removal in the SDK. Why this happens: with minor version updates, Databricks ...
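One defensive pattern while the class's new home is confirmed is to try several import paths and pin the SDK version in requirements as the real fix. The sketch below is a generic helper, not Databricks-specific; the alternate module path in the commented usage is an assumption, not a documented location.

```python
import importlib


def import_from_first(name, module_paths):
    """Return attribute `name` from the first module in `module_paths`
    that can be imported and defines it; raise ImportError otherwise."""
    for path in module_paths:
        try:
            module = importlib.import_module(path)
        except ImportError:
            continue
        if hasattr(module, name):
            return getattr(module, name)
    raise ImportError(f"{name} not found in any of {module_paths}")


# Hypothetical usage: try the old serving path first, then a guessed new home.
# AiGatewayConfig = import_from_first(
#     "AiGatewayConfig",
#     ["databricks.sdk.service.serving", "databricks.sdk.service.serving_v2"],
# )
```

Pinning `databricks-sdk` to a known-good version in your requirements file is the more durable remedy; the shim only cushions the upgrade.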
- 3858 Views
- 1 replies
- 0 kudos
It would be great if the job workflow supported running Docker-based tasks
The current workflow function in Databricks gives a series of options such as DLT, dbt, Python scripts, Python files, JAR, etc. It would be good to add Docker to that and simplify the development process a lot, especially on the unit and integ...
Hi @mo_moattar ! Is this still functionality you're interested in? If so, can you explain a bit more about the use case you're thinking of? I'm happy to add this to our feature requests internally, but I know that the Product team will likely request ...
- 4545 Views
- 1 replies
- 0 kudos
Resolved! Databricks Community - Cannot See Reply To My Posts
Databricks Community - Cannot See Reply To My Posts. Am I the only one facing this issue, or are others facing the same?
@clentin , I know this is a response to an older post, but I'm wondering if you ever got this resolved or not? I am able to view the responses to your initial post, so I took the liberty of adding them as screenshots for you. Hope this helps!
- 4163 Views
- 2 replies
- 1 kudos
Automating Version Control for Databricks Workflows
I am currently using Databricks Asset Bundles to manage and deploy workflows. While I have successfully automated the version control for notebooks, I am facing challenges with workflows. Specifically, I am looking to automate the process of fetching...
Automating the reverse synchronization of Databricks workflow (Job) changes made in the Databricks UI back to a GitHub repository is a significant challenge, mainly due to the intentional directionality and guardrails imposed by Databricks Asset Bund...
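While Asset Bundles intentionally push in one direction, a common workaround is a scheduled task that fetches each job's settings via the Jobs API, ignores volatile metadata, and commits only when something real changed. This is a minimal sketch of the pure comparison step; the volatile field names and the commented SDK calls are assumptions about your setup, not a Databricks-endorsed flow.

```python
def settings_changed(old: dict, new: dict,
                     volatile: tuple = ("created_time", "run_as_user_name")) -> bool:
    """Compare two job-settings dicts while ignoring fields that differ
    on every fetch; the names in `volatile` are illustrative."""
    def strip(d: dict) -> dict:
        return {k: v for k, v in d.items() if k not in volatile}
    return strip(old) != strip(new)


# Hypothetical fetch-and-commit loop (requires a configured workspace client
# and a local git checkout; shown only as an outline):
# from databricks.sdk import WorkspaceClient
# w = WorkspaceClient()
# new_settings = w.jobs.get(job_id=123).settings.as_dict()
# if settings_changed(cached_settings, new_settings):
#     ...  # write the JSON into the repo, then commit and push
```

Diffing before committing keeps the Git history meaningful instead of producing a commit on every poll.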
- 4615 Views
- 8 replies
- 2 kudos
Issue with updating email with SCIM Provisioning
Hi all, For our setup we have configured SCIM provisioning using Entra ID; group assignment in Azure is handled by IdentityIQ SailPoint, and we have enabled SSO for Databricks. It has been working fine apart from one scenario. The original email assign...
The other option is to raise a ticket with Databricks Accounts team. Our Databricks team worked on the backend and the new email was synced.
- 363 Views
- 3 replies
- 0 kudos
Account level Rest API to list workspaces has suddenly stopped working
We use the Databricks Python SDK in one of our Azure Databricks workspaces to list all the workspaces in our tenant. The code had been working fine for 6-8 months, but yesterday it suddenly started failing with the error: Endpoint not found for /...
My bad, there was an issue with how the AccountClient was created; I was able to resolve this. One thing is still an issue: I also noticed that the Databricks REST API documentation no longer shows list workspaces as an available API for Azure, whereas it show...
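A frequent cause of "Endpoint not found" on account-level calls is pointing the client at a workspace URL instead of the accounts console host. The sketch below shows a small host check plus, in comments, the hedged shape of creating an AccountClient with the Python SDK; the account_id value is a placeholder and real credentials are required.

```python
def is_account_host(host: str) -> bool:
    """Account-level APIs must target the accounts console host,
    not a per-workspace URL (a common cause of 'Endpoint not found')."""
    return host.rstrip("/").endswith((
        "accounts.azuredatabricks.net",   # Azure
        "accounts.cloud.databricks.com",  # AWS
        "accounts.gcp.databricks.com",    # GCP
    ))


# Hypothetical client creation (needs real credentials and your account UUID):
# from databricks.sdk import AccountClient
# a = AccountClient(host="https://accounts.azuredatabricks.net",
#                   account_id="<account-uuid>")
# for ws in a.workspaces.list():
#     print(ws.workspace_name)
```

Validating the host up front turns a confusing 404 into an immediate, readable configuration error.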
- 3299 Views
- 1 replies
- 0 kudos
Customer Managed VPC: Databricks IP Address Ranges
Hello, how often does Databricks change its public IP addresses (the ones that must be whitelisted in a customer-managed VPC), and where can I find them? I found this list, but it seems to be incomplete. We moved from a managed VPC to a customer-managed ...
Greetings @tom_1, you're right to cross-check the published list; here's how the IPs and ports fit together and where to get the authoritative values. Where to find the current Databricks IPs: the official source is the Databricks "IP addresses and d...
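When verifying firewall rules against the published ranges, the standard-library `ipaddress` module can check an address against a CIDR allow list. The CIDRs below are documentation placeholders, not real Databricks ranges; always pull the current values from the official "IP addresses and domains" page.

```python
import ipaddress


def ip_allowed(ip: str, cidrs: list) -> bool:
    """Check whether an address falls inside any allow-listed CIDR block."""
    addr = ipaddress.ip_address(ip)
    return any(addr in ipaddress.ip_network(c) for c in cidrs)


# Placeholder ranges (RFC 5737 documentation blocks), NOT Databricks ranges:
allow_list = ["203.0.113.0/24", "198.51.100.0/25"]
```

Running this check against your egress rules after each quarterly review is a cheap way to catch a stale allow list before a connection fails in production.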
- 3619 Views
- 1 replies
- 0 kudos
Error on github association
Hello, I'm having an error when trying to link a GitHub account to store some scripts. My profile also seems to keep loading forever. Does anyone know how I can fix that?
Hello @EdsonDEV, thanks for the screenshot: your Linked accounts page is showing "Error fetching credentials," which blocks linking GitHub and can make the settings view spin indefinitely. What typically causes this: a broken or stale linked Git cr...
- 360 Views
- 2 replies
- 3 kudos
Resolved! Azure Databricks Cluster Pricing
Hi, I am trying to work out a rough total price for an Azure Databricks cluster using the following assumptions. I want to spin up a cluster of D13 v2 VMs with 9 executors, so 1 + 9 = 10 nodes in total. I want to use the cluster for 10 hours a day, 30 hours a...
Here is the simple calculation I use, based on dollars and assuming the infra is in EUS. Cost components: 1. Azure VM cost (D13 v2): on-demand price $0.741/hour per VM. Monthly VM cost: 10 VMs × 300 hours × $0.741 = $2,223. Yearly VM cost: 10 × 3,600 × $0.741 = $26,676. 2. Dat...
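The arithmetic above generalizes to a small helper; the DBU parameters are placeholders for the truncated second cost component, and all rates are assumptions to be checked against the current Azure and Databricks pricing pages.

```python
def cluster_cost(vms: int, hours_per_month: float, vm_rate: float,
                 dbu_per_hour: float = 0.0, dbu_rate: float = 0.0) -> dict:
    """Rough monthly/yearly cost: VM compute plus optional Databricks DBUs.
    vm_rate is $/VM-hour; dbu_per_hour is DBUs consumed per VM-hour."""
    vm_month = vms * hours_per_month * vm_rate
    dbu_month = vms * hours_per_month * dbu_per_hour * dbu_rate
    monthly = vm_month + dbu_month
    return {"monthly": round(monthly, 2), "yearly": round(monthly * 12, 2)}
```

With the figures from the reply (10 VMs, 300 hours/month, $0.741/hour), the VM-only portion reproduces $2,223/month and $26,676/year.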
- 4051 Views
- 1 replies
- 1 kudos
How to monitor serverless compute usage in real time
Hello, I'm using Databricks Connect to connect a dash app to my Databricks account. My use case is similar to this example: https://github.com/databricks-demos/dbconnect-examples/tree/main/python/PlotlyI've been able to get everything configured and ...
There is currently no direct, real-time equivalent in the Databricks UI’s “Compute” tab for monitoring serverless (SQL serverless or Data Engineering serverless) compute usage in the same way as classic clusters, where you see live memory, DBU/hr, an...
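In the absence of a live Compute-tab view, the usual near-real-time substitute is querying the Unity Catalog billing system table. This sketch only builds the SQL string; the `system.billing.usage` table and its columns follow the published system-table schema, but the serverless SKU filter is a heuristic you should verify on your workspace before relying on it.

```python
def serverless_usage_query(days: int = 7) -> str:
    """Build a query over system.billing.usage that approximates recent
    serverless DBU consumption, grouped by day and SKU."""
    return f"""
        SELECT usage_date, sku_name, SUM(usage_quantity) AS dbus
        FROM system.billing.usage
        WHERE sku_name LIKE '%SERVERLESS%'
          AND usage_date >= current_date() - INTERVAL {days} DAYS
        GROUP BY usage_date, sku_name
        ORDER BY usage_date
    """
```

Note that billing system tables lag real usage by a few hours, so this is monitoring-adjacent rather than truly live.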
- 1255 Views
- 1 replies
- 1 kudos
Databricks OAUTH(OIDC) with ORY Network
Hi, we are trying to set up OIDC auth for Databricks with our Ory Network account. So far we have been using it without any issues with all of our apps, and now we want to set it up for Databricks as well. Unfortunately, after many attempts with different...
To debug OIDC authentication issues (“oidc_generic_token_failure”) with Databricks using Ory Network as your identity provider, there are several steps and data sources you can leverage for deeper insights. Where to Find Detailed Error Information Da...
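One practical step when chasing a generic token failure is inspecting the claims of the ID token the IdP actually issued (issuer, audience, expiry) to compare against what the relying party expects. The helper below decodes a JWT payload without any signature verification, so it is strictly a debugging aid, never an authentication step.

```python
import base64
import json


def peek_jwt_claims(token: str) -> dict:
    """Decode a JWT's payload WITHOUT signature verification, purely to
    inspect claims (iss, aud, exp) while debugging an OIDC handshake."""
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))
```

Mismatched `iss` or `aud` values between the token and the Databricks OIDC configuration are among the most common causes of generic token failures.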
- 6072 Views
- 2 replies
- 8 kudos
Azure Databricks Multi Tenant Solution
Hello everyone, For the past few months, we’ve been extensively exploring the use of Databricks as the core of our data warehousing product. We provide analytics dashboards to other organizations and are particularly interested in the Column-Level Sec...
Implementing robust Row-Level Security (RLS) and Column-Level Security (CLS) in Azure Databricks for multi-tenant analytics—especially with seamless SSO from Power BI and custom apps—is a common concern for B2B SaaS providers scaling to large user ba...
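As a concrete anchor for the RLS discussion, Unity Catalog expresses row filters as a boolean SQL UDF attached to a table column. The generator below emits that two-statement pattern following the documented syntax; the catalog, schema, and function names are illustrative and must be adapted to your tenancy model.

```python
def row_filter_ddl(table: str, filter_fn: str, column: str) -> list:
    """Emit the two Unity Catalog statements that bind a row filter:
    a boolean SQL UDF and an ALTER TABLE attaching it to a column."""
    return [
        f"CREATE OR REPLACE FUNCTION {filter_fn}(tenant STRING) "
        f"RETURN is_account_group_member(tenant)",
        f"ALTER TABLE {table} SET ROW FILTER {filter_fn} ON ({column})",
    ]


# Illustrative use: gate each row on membership in a per-tenant account group.
statements = row_filter_ddl("main.sales.orders",
                            "main.security.tenant_filter",
                            "tenant_id")
```

Mapping each tenant to an account-level group keeps the filter logic declarative, which matters once SSO identities from Power BI and custom apps all resolve to the same Unity Catalog principals.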
- 4067 Views
- 1 replies
- 0 kudos
GitLab on DCS, Databricks Container Services
I would like to set up GitLab and Grafana servers using Databricks Container Services (DCS). The reason is that our development team is small, and the management costs of using EKS are not justifiable. We want to make GitLab and Grafana accessible in...
Yes, it is possible to set up GitLab and Grafana servers using Databricks Container Services (DCS) for internal accessibility. DCS supports custom Docker containers and allows you to deploy server applications such as GitLab and Grafana, making it a ...
- 3882 Views
- 1 replies
- 0 kudos
VPAT Form
How do I find a Voluntary Product Accessibility Template (VPAT) from Databricks?
To obtain a Voluntary Product Accessibility Template (VPAT) from Databricks, you must request it directly from Databricks support or your designated account manager. Databricks prepares and provides the VPAT upon request, detailing how their platform...
- 6960 Views
- 7 replies
- 4 kudos
Unable to access Databricks Volume from job triggered via API (Container Services)
Hi everyone, We’re facing a strange issue when trying to access a Databricks Volume from a job that is triggered via the Databricks REST API (not via Workflows). These jobs are executed using container services, which may be relevant, perhaps due to i...
Databricks Volumes (especially Unity Catalog (UC) volumes) often have strict execution context requirements and typically expect the workload to run in Databricks-managed clusters or notebooks where the specialized file system and security context ar...
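A quick sanity check before debugging the execution context is confirming the job is even using a well-formed UC volume path, since code migrated from DBFS often still points at `/dbfs/...`. This validator encodes only the documented path shape and is a triage aid, not a fix for missing Unity Catalog context.

```python
def is_uc_volume_path(path: str) -> bool:
    """UC volume paths look like /Volumes/<catalog>/<schema>/<volume>/...;
    jobs running outside a UC-enabled context cannot resolve them even
    when the path itself is valid."""
    parts = path.strip("/").split("/")
    return len(parts) >= 4 and parts[0] == "Volumes"
```

If the path shape is correct and access still fails from container-based runs, the investigation shifts to whether the job's compute has Unity Catalog enabled and the principal has READ VOLUME privileges.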