- 2315 Views
- 6 replies
- 1 kudos
Databricks Apps with FastAPI
I have a FastAPI Databricks App deployed to a hosted workspace. How can I call the API from external tools like Postman? P.S. I was able to call the API within the same browser.
Hi, make sure your token endpoint URL looks like this: https://<databricks-instance>/oidc/v1/token. Refer here for the workspace-level access token: https://docs.databricks.com/aws/en/dev-tools/auth/oauth-m2m. Also make sure you have granted access of the servi...
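For reference, a rough sketch of that OAuth M2M flow in Python, under the assumptions in the reply above: the workspace URL, app URL, route, and service principal credentials are all placeholders, and the service principal must already be granted access to the app.

```python
# Minimal sketch (placeholders throughout): obtain an OAuth M2M token for a
# service principal and call a Databricks App endpoint with it from outside
# the browser (e.g. the same request works in Postman).
import requests

WORKSPACE_URL = "https://<databricks-instance>"                 # placeholder
APP_URL = "https://<your-app>.databricksapps.com/api/hello"     # hypothetical app route
CLIENT_ID = "<service-principal-client-id>"
CLIENT_SECRET = "<service-principal-secret>"

# OAuth M2M: client_credentials grant against the workspace token endpoint
token_resp = requests.post(
    f"{WORKSPACE_URL}/oidc/v1/token",
    auth=(CLIENT_ID, CLIENT_SECRET),
    data={"grant_type": "client_credentials", "scope": "all-apis"},
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

# Call the app with the Bearer token (the service principal needs access to
# the app, as noted in the reply above).
resp = requests.get(APP_URL, headers={"Authorization": f"Bearer {access_token}"})
print(resp.status_code, resp.text)
```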
- 1004 Views
- 2 replies
- 0 kudos
Databricks Apps not working in Postman
I have a question regarding Databricks Apps. I have deployed my Databricks App, and it's working on my laptop, but when I try to open the same URL on my mobile it redirects to the Databricks sign-in page, and it's also not working through Postman as well...
Is this issue specifically with Databricks Apps, am I right? Are you getting an error message?
- 1375 Views
- 3 replies
- 2 kudos
Databricks Apps
I've created a Databricks App that is essentially a REST API in Flask. It returns a JSON object. The data is in a SQL warehouse. It works when called directly in the browser, but I want to access it via REST with a Bearer token. But when I log in with a servi...
Did it work? I am trying something similar. If yes, please share how you achieved it.
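For anyone trying the same setup, here is a minimal sketch (not the poster's actual app) of a Flask route that queries a SQL warehouse with the databricks-sql-connector and returns JSON. The environment variable names, route, and table are placeholders; calling it externally with a Bearer token is the flow shown in the FastAPI thread above.

```python
# Sketch of a Flask endpoint backed by a SQL warehouse. Assumes the
# databricks-sql-connector package is installed and the hostname, HTTP path,
# and token are supplied via environment variables (placeholder names).
import os
from flask import Flask, jsonify
from databricks import sql

app = Flask(__name__)

@app.route("/api/items")
def items():
    with sql.connect(
        server_hostname=os.environ["DATABRICKS_SERVER_HOSTNAME"],
        http_path=os.environ["DATABRICKS_HTTP_PATH"],   # SQL warehouse HTTP path
        access_token=os.environ["DATABRICKS_TOKEN"],
    ) as conn:
        with conn.cursor() as cur:
            cur.execute("SELECT id, name FROM main.default.items LIMIT 100")  # placeholder table
            rows = cur.fetchall()
    # Return the result set as a JSON array
    return jsonify([{"id": r[0], "name": r[1]} for r in rows])
```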
- 1446 Views
- 1 reply
- 0 kudos
Is it possible to run all Python (not just Spark) via Databricks cluster from local IDE?
Hi all, when connecting to a Databricks cluster from a local IDE like VS Code, Spark operations already run on the cluster. But is there a way to route all Python code (not just Spark) to run directly on the Databricks cluster as well? Thanks!
Hello @tasozgurcem! If you want your whole Python script to run on the cluster, you can either upload the file or run it as a job using the Databricks extension. Check these out: https://docs.databricks.com/aws/en/dev-tools/vscode-ext/run and https://docs.d...
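As a rough sketch of the "run it as a job" route using the Databricks Python SDK: submit a one-time run that executes the whole script on an existing cluster, similar to what the VS Code extension does. The cluster ID and workspace path are placeholders, and the script is assumed to already be uploaded to the workspace (e.g. via `databricks workspace import` or the extension's sync).

```python
# Sketch: run an uploaded Python file on a cluster as a one-time job run,
# so every line of the script (not just Spark calls) executes remotely.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()  # reads auth from ~/.databrickscfg or environment variables

run = w.jobs.submit(
    run_name="run-local-script-on-cluster",
    tasks=[
        jobs.SubmitTask(
            task_key="main",
            existing_cluster_id="<cluster-id>",  # placeholder
            spark_python_task=jobs.SparkPythonTask(
                python_file="/Workspace/Users/me@example.com/my_script.py"  # placeholder path
            ),
        )
    ],
).result()  # blocks until the run finishes

print(run.state)
```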
- 1453 Views
- 5 replies
- 0 kudos
Workspace Failed to Launch – Internal Error
I'm encountering an issue while trying to launch a Databricks workspace on GCP. Has anyone experienced this before or know what might be causing it? Would appreciate any help! Error message: "An internal error occurred when launching the workspace."
I have all the quotas, and the Databricks support team told me to post in the community.
- 1152 Views
- 2 replies
- 0 kudos
How to Change Created By Field in SQL Warehouses Without Recreating Them?
I’m trying to update the "Created by" field on an existing SQL warehouse without having to recreate the entire warehouse. I attempted to use the Databricks Python SDK to run an update command, but unfortunately, it didn’t change the field. Has anyone ...
Hi @KingAJ, maybe you can try the following REST API endpoint? There's a creator_name attribute that you can use: Update a warehouse | SQL Warehouses API | REST API reference | Databricks on AWS.
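A hedged, untested sketch of the reply's suggestion: call the "Edit a warehouse" REST endpoint and pass creator_name in the payload. Note that creator_name may well be read-only on this endpoint (which would explain the SDK attempt failing), so the server may ignore or reject it; the host, token, warehouse ID, and value below are all placeholders.

```python
# Sketch only: attempt to set creator_name via the SQL Warehouses edit endpoint.
# creator_name being editable is an assumption taken from the reply above.
import requests

HOST = "https://<databricks-instance>"
TOKEN = "<personal-access-token>"
WAREHOUSE_ID = "<warehouse-id>"

resp = requests.post(
    f"{HOST}/api/2.0/sql/warehouses/{WAREHOUSE_ID}/edit",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"creator_name": "new.owner@example.com"},  # hypothetical value
)
print(resp.status_code, resp.text)
```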
- 1159 Views
- 2 replies
- 1 kudos
Databricks Audit Logs Analysis
Please share the attributes related to the audit logs. How can the audit logs be utilized by the cyber security team? What insights do the audit logs provide, and how can we maintain compliance? What non-compliance items can be identified f...
The audit logs only capture the events and actions performed by users and service principals in the workspace, so no compliance actions are tracked as such. The service name is a subgroup of items you want to check, so...
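As a small sketch of how a security team might start exploring those events, assuming the system tables are enabled and this runs in a Databricks notebook where `spark` is available: summarize the last week of audit events by service and action, using columns from the documented system.access.audit schema.

```python
# Sketch: which services and actions generated the most audit events recently.
# Assumes system tables are enabled and `spark` is provided by the notebook.
events = spark.sql("""
    SELECT service_name,
           action_name,
           COUNT(*) AS event_count
    FROM system.access.audit
    WHERE event_date >= date_sub(current_date(), 7)
    GROUP BY service_name, action_name
    ORDER BY event_count DESC
""")
events.show(50, truncate=False)
```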
- 1753 Views
- 1 replies
- 0 kudos
Not able to configure Databricks with AWS
I am trying to set up Databricks with my AWS account but I am facing some issues. I followed the steps to create a new Databricks workspace using the AWS CloudFormation template and the Databricks setup guide, but the CloudFormation stack fails every time...
Hello @dipin_wantoo! Can you please check the CloudFormation Stack Events for the exact error? That should help identify why the stack is failing. If you're looking to get started with Databricks as an individual user, the Express Setup is a simple w...
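A small helper along the lines of the reply, assuming AWS credentials are already configured for boto3: pull the stack events and print only the FAILED ones, which usually carry the reason the stack rolled back. The stack name and region are placeholders.

```python
# Sketch: surface the failure reason from the CloudFormation stack events.
import boto3

cfn = boto3.client("cloudformation", region_name="us-east-1")  # placeholder region
events = cfn.describe_stack_events(StackName="<databricks-workspace-stack>")["StackEvents"]

for e in events:
    # *_FAILED events include ResourceStatusReason with the root cause
    if e.get("ResourceStatus", "").endswith("FAILED"):
        print(e["LogicalResourceId"], e["ResourceStatus"], e.get("ResourceStatusReason", ""))
```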
- 1712 Views
- 3 replies
- 1 kudos
Library installation failed for library due to user error for pypi
Hi! I get the below error when a cluster job starts up and tries to install a Python .whl file. (It is hosted on an Azure Artifacts feed, though this seems more like a problem of reading from disk/network storage.) The failure is seemingly ...
Thanks both. I think the problem is that this library installation is triggered when creating a new Job & Task via the REST endpoint, where the libraries are specified in the .json file. So, short version, I don't think I can 'get at' the pip install call...
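For context, a rough sketch of the shape of the task libraries block in that job JSON, assuming the package should come from an Azure Artifacts feed: the `repo` field points pip at the feed's index URL. Package names, the feed URL, and the volume path are placeholders, and the feed still needs to be reachable and authenticated from the cluster.

```python
# Sketch of a task fragment for the Jobs REST API, showing the libraries block.
task_fragment = {
    "task_key": "main",
    "libraries": [
        {
            "pypi": {
                "package": "my-internal-package==1.2.3",  # placeholder
                "repo": "https://pkgs.dev.azure.com/<org>/_packaging/<feed>/pypi/simple/",
            }
        },
        # Alternatively, a wheel already copied to a Unity Catalog volume:
        # {"whl": "/Volumes/<catalog>/<schema>/<volume>/my_internal_package-1.2.3-py3-none-any.whl"},
    ],
}
```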
- 2144 Views
- 2 replies
- 1 kudos
Resolved! Why am I seeing NAT Gateway in the cost? Serverless Compute.
I have an Azure Databricks Premium subscription. I have been running interactive Python notebooks in the Databricks workspace using serverless compute for the last few days. Today I received an alert in my email saying the monthly billing has already crossed...
Thanks @szymon_dybczak. I deleted the workspace and the NAT Gateway service got deleted from the VNet. I created a simple single-node cluster to run my code.
- 1455 Views
- 1 reply
- 0 kudos
Resolved! Can I add another user to Free edition
Is it possible to add another user to the Free edition? I want to test what they can see when they connect as a restricted user, i.e. only granted Browse on one catalog. Thanks
- 3848 Views
- 7 replies
- 0 kudos
Failing Power BI connection with Databricks via SQL Warehouse
I'm encountering an 'Invalid credentials' error (Session ID: 4601-b5a6-0daf792752a2, Region: us) when connecting Power BI to an Azure Databricks SQL Warehouse using an SPN. The SPN has CAN MANAGE access on the SQL warehouse, admin rights at the account and...
Hello sai, sorry for the experience. I am usually available at my desk during the EMEA time zone; apologies for the delay. Can you please try this? In the "Advanced settings" of the data source within the gateway configuration, set the "Connection Encryption se...
- 2472 Views
- 1 reply
- 0 kudos
Looking for insights on enabling Databricks Automatic Provisioning
We currently have a SCIM provisioning connector set up to synchronize identities from Entra ID to Unity Catalog. We’re now considering enabling Databricks Automatic Provisioning but want to fully understand the potential impact on our environment befo...
Hello Xaveri, good day! Here are a few links related to Databricks provisioning: https://docs.databricks.com/aws/en/admin/users-groups/scim/aad and https://www.databricks.com/blog/announcing-automatic-identity-management-azure-databricks. But do let me k...
- 2734 Views
- 4 replies
- 4 kudos
Privileged Identity Management for Databricks with Microsoft Entra ID
Privileged Identity Management (PIM) can be used to secure access to critical Databricks roles with Just-in-Time (JIT) access. This approach helps organizations enforce time-bound permissions, approval workflows, and centralized auditing for sensitiv...
- 1551 Views
- 2 replies
- 0 kudos
Delete unassigned catalogs
Hi everybody, due to some not-so-optimal infrastructure-as-code experiments with Terraform, I ended up with a lot (triple digits) of catalogs in a metastore that are not assigned to any workspace and that I want to delete. Unfortunately, there is no way to ev...
Yeah, I see those catalogs and I know that I could reattach and delete them. As I have around 100 of those catalogs, it would be nice to iterate through them by getting a list, e.g. using the CLI or the REST API, and then force delete them, as described...
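A sketch of that loop with the Databricks Python SDK, under a couple of assumptions: the workspace the client points at can actually see the leftover catalogs (e.g. after binding them), the caller owns them, and they share a naming pattern. The prefix below is hypothetical; review the printed list before deleting anything.

```python
# Sketch: list catalogs and force-delete the ones left over from the
# Terraform experiments. force=True also drops schemas/tables inside.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
PREFIX = "tf_experiment_"  # hypothetical naming pattern

doomed = [c.name for c in w.catalogs.list() if c.name.startswith(PREFIX)]
print(f"About to delete {len(doomed)} catalogs: {doomed}")

for name in doomed:
    w.catalogs.delete(name, force=True)
```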
Labels:
- Access control (1)
- Apache spark (1)
- Azure (7)
- Azure databricks (5)
- Billing (2)
- Cluster (1)
- Compliance (1)
- Data Ingestion & connectivity (5)
- Databricks Runtime (1)
- Databricks SQL (2)
- DBFS (1)
- Dbt (1)
- Delta Sharing (1)
- DLT Pipeline (1)
- GA (1)
- Gdpr (1)
- Github (1)
- Partner (61)
- Public Preview (1)
- Service Principals (1)
- Unity Catalog (1)
- Workspace (2)