- 7071 Views
- 7 replies
- 5 kudos
Resolved! How to get rid of a pesky gen AI feature in the editor?
Hi, The editor interface has that gen AI feature following empty lines with a cursor. I find that very distracting and irritating. Moreover, once a line is deleted that unsolicited thing is interfering with code (snapshots included). How to get rid of...
Summary: I was not able to solve the UI/UX artifact on my own (on the user side). The issue was resolved somewhere on the Databricks side, and the UI/UX artifact is no longer interfering with work.
- 1100 Views
- 1 replies
- 0 kudos
Resolved! Attach a databricks_instance_pool to databricks_cluster_policy via terraform
Hello Team, I am trying to create a Databricks instance pool and attach it to a cluster policy in our Terraform code, but I am having a hard time finding good documentation. Has anyone done it? Below is my sample code and I am getting an error. I keep get...
Fixed it! "instance_pool_id" : { type = "fixed" values = "databricks_instance_pool.dev_test_cluster_pool.id"}
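The accepted fix above can be sketched end to end with the Terraform provider's `databricks_instance_pool` and `databricks_cluster_policy` resources. This is a hedged sketch, not the poster's full code: the pool name, node type, and policy name are hypothetical, and note that in the cluster-policy JSON grammar a `fixed` attribute takes a single `value` key.

```terraform
# Sketch only: pool name, node type, and policy name are hypothetical.
resource "databricks_instance_pool" "dev_test_cluster_pool" {
  instance_pool_name                    = "dev-test-pool"
  min_idle_instances                    = 0
  max_capacity                          = 10
  node_type_id                          = "Standard_DS3_v2"
  idle_instance_autotermination_minutes = 15
}

resource "databricks_cluster_policy" "dev_test_pool_policy" {
  name = "dev-test-pool-policy"
  # A "fixed" policy attribute pins the attribute to a single "value".
  definition = jsonencode({
    "instance_pool_id" : {
      "type"  : "fixed",
      "value" : databricks_instance_pool.dev_test_cluster_pool.id
    }
  })
}
```

Referencing the pool's `.id` attribute (rather than a quoted string) lets Terraform resolve the dependency between the two resources.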
- 1248 Views
- 2 replies
- 0 kudos
Lakehouse federation - 'on behalf of' queries
Is it possible to achieve the following in a lakehouse federation setup using Azure Databricks? 1. Establish an external connection (EC1) to an external data source (EDS) using the credentials of user U1. 2. Create a foreign catalog (FC1) utilizing EC...
Thanks for explaining the authorization flow, @rangu. In the example mentioned, does Databricks support passing the user's credentials to an external data source? For instance, can it pass the OAuth token for the user along with the externalID crede...
- 2394 Views
- 4 replies
- 0 kudos
Libraries installation governance
Dear all, I'd like to know the best practices around library installation on Databricks compute (all-purpose and job). The need is to screen the libraries, conduct vulnerability tests, and then let them be installed through a centralized CI/CD process. How...
@filipniziol thanks again for your time. The thing is, we'd like to block access to these URLs, as we have at times found developers and data scientists downloading packages that were marked as vulnerable by Maven.
- 1323 Views
- 2 replies
- 0 kudos
Azure Databricks Serverless Compute
Hello, I'm looking for documents related to Azure Databricks Serverless Compute. What do we need to consider from a security point of view when we decide to use serverless compute?
These steps are really helpful. I especially appreciate the reminder to check my credentials and consider browser-related issues, as those are often overlooked. I'll make sure to clear my cache and cookies first, and if that doesn't work, I’ll try us...
- 950 Views
- 1 replies
- 0 kudos
How to NOT install or disable or uninstall Databricks Delta Live Tables dlt module on jobs cluster?
I need to NOT have the Databricks Delta Live Tables (DLT) Python stub installed on the job cluster because of a naming conflict with the pip library dlt (and I also don't need Delta Live Tables). There is no "simple" way of uninstalling. It's not installed via pip as...
For anyone facing a similar problem: I've addressed the dlt module conflict on my job cluster by using an init script to remove the dlt module from the cluster's Python environment. Simply by doing: %bash #!/bin/bash rm -rf /databricks/sp...
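Before and after running an init script like the one above, it can help to confirm which module the name `dlt` actually resolves to on the cluster. This standard-library check is a minimal sketch; on a Databricks job cluster it would typically point at the built-in DLT stub, while elsewhere it may show the pip package or nothing at all:

```python
import importlib.util

# Locate the module that "import dlt" would load, without importing it.
spec = importlib.util.find_spec("dlt")
origin = spec.origin if spec else None
print("dlt resolves to:", origin)
```

If the origin still points into the Databricks runtime's packages after the init script ran, the stub is still shadowing the pip library on `sys.path`.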
- 1128 Views
- 1 replies
- 0 kudos
Legacy repo: error fetching git status with files over 200 MB
Working directory contains files that exceed the allowed limit of 200 MB. How to solve this?
Hi @JessieWen, What you can do, besides removing some files from the repo, is to use "Sparse mode" and select only certain paths to be synchronized with Databricks Repos. Hope it helps.
- 6982 Views
- 8 replies
- 0 kudos
Databricks SQL connectivity in Python with Service Principals
Tried to use M2M OAuth connectivity on a Databricks SQL Warehouse in Python: from databricks.sdk.core import Config, oauth_service_principal from databricks import sql .... config = Config(host=f"https://{host}", client_...
Did anyone get this to work? I have tried the code above, but I get a slightly different error, and I don't see the same level of detail in the logs: 2024-10-04 14:59:25,508 [databricks.sdk][DEBUG] Attempting to configure auth: pat 2024-10-04 14:59:25,...
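For reference, the M2M pattern wraps `oauth_service_principal` in a credentials-provider callable that is handed to `sql.connect`. The sketch below follows that shape under assumptions: all connection values are placeholders, and the imports are deferred into the function so the sketch can be defined without the `databricks-sdk` and `databricks-sql-connector` packages installed.

```python
def query_warehouse(host, http_path, client_id, client_secret):
    """Run a trivial query on a SQL warehouse as a service principal (M2M OAuth).

    All four arguments are placeholders for your workspace hostname, the
    warehouse HTTP path, and the service principal's OAuth client id/secret.
    """
    # Imports deferred so the sketch is importable without the libraries.
    from databricks.sdk.core import Config, oauth_service_principal
    from databricks import sql

    config = Config(host=f"https://{host}",
                    client_id=client_id,
                    client_secret=client_secret)

    def credential_provider():
        # sql.connect calls this to obtain OAuth headers per request.
        return oauth_service_principal(config)

    with sql.connect(server_hostname=host,
                     http_path=http_path,
                     credentials_provider=credential_provider) as conn:
        with conn.cursor() as cur:
            cur.execute("SELECT 1")
            return cur.fetchone()
```

If the debug log shows the SDK "attempting to configure auth: pat" as in the reply above, it usually means the OAuth client id/secret were not picked up and the SDK fell back to token-based auth.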
- 13263 Views
- 1 replies
- 0 kudos
Connect Community Edition to Power BI Desktop
I have submitted this question several times to Databricks over the past few weeks, and I have gotten no response at all, not even an acknowledgement that my request was received. Please help. How can I connect a certain dataset in Databricks Community...
Hi @Retired_mod, It seems the Community Edition doesn't let us generate a personal access token any more. Could you let us know where we can get the token in the Community Edition? Thanks.
- 905 Views
- 1 replies
- 0 kudos
Cluster administrator
Is an individual cluster more cost-effective, or a shared group cluster?
This is very generic; it depends on the use case. If you have a bunch of users trying to read data from catalogs and perform data analysis or analytics, creating a common cluster will be more cost-effective and provide better performance. Also, largel...
- 1758 Views
- 1 replies
- 0 kudos
How to assign a user group for email notification in Databricks Alerts
How can I assign an Azure Databricks user group to an alert for notification? The current scenario: whenever we need to add a user for alert email notification, we manually add that user's email address to each alert we set up (more than 100), which is very ...
One option is to handle the logic inside the Python notebook and trigger alerts via email using smtplib, which works with Databricks local groups and AD groups that are synced.
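The suggestion above can be sketched with the standard library alone. The SMTP host and the group alias (a single address that fans out to the synced AD group) are hypothetical placeholders, and the actual send is commented out so the sketch stays side-effect free:

```python
import smtplib
from email.message import EmailMessage

# Hypothetical values: replace with your SMTP relay and your group's alias.
SMTP_HOST = "smtp.example.com"
GROUP_ALIAS = "data-alerts@example.com"  # one address that reaches the whole group

msg = EmailMessage()
msg["From"] = "databricks-alerts@example.com"
msg["To"] = GROUP_ALIAS
msg["Subject"] = "Databricks alert: threshold breached"
msg.set_content("Alert fired for the monitored query; see the SQL alert for details.")

# Uncomment to actually dispatch via your relay:
# with smtplib.SMTP(SMTP_HOST, 587) as server:
#     server.starttls()
#     server.send_message(msg)
print(msg["To"])
```

Addressing the group alias instead of individual users means membership changes are handled in AD/the group, not in a hundred alert definitions.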
- 1957 Views
- 1 replies
- 0 kudos
Resolved! Resource organization in a large company
Hello, We are using Azure Databricks in a single tenant. We will have many teams working in multiple (Unity-enabled) Workspaces using a variety of Catalogs, External Locations, Storage Credentials, etc. Some of those resources will be shared (e.g., an...
- 2799 Views
- 3 replies
- 0 kudos
HTTPSConnectionPool(host='sandvik.peakon.com', port=443): Max retries exceeded with url: /api/v1/seg
While connecting to an API from a Databricks notebook with the bearer token, I am getting the below error: HTTPSConnectionPool(host='sandvik.peakon.com', port=443): Max retries exceeded with url: /api/v1/segments?page=1 (Caused by SSLError(SSLCertVerifica...
Hi @SunilSamal, The error you are encountering, SSLCertVerificationError, indicates that SSL certificate verification failed because the local issuer certificate could not be obtained. This is a common issue when the SSL certificate chain is inco...
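When the cause is an incomplete chain or a corporate proxy re-signing TLS traffic, one hedged workaround is to point Python at the correct CA bundle explicitly. The bundle path below is hypothetical; verification stays enabled throughout (disabling it is not the fix):

```python
import ssl

# Default context verifies server certificates against the system trust store.
ctx = ssl.create_default_context()

# If your organisation intercepts TLS, load the corporate CA bundle instead;
# the path here is a hypothetical example:
# ctx.load_verify_locations("/dbfs/certs/corporate-ca-bundle.pem")

# Certificate verification remains required with this context.
print(ctx.verify_mode == ssl.CERT_REQUIRED)
```

The same idea applies to `requests`: pass the bundle path via the `verify=` argument or the `REQUESTS_CA_BUNDLE` environment variable rather than setting `verify=False`.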
- 1950 Views
- 4 replies
- 1 kudos
Resolved! Unable to get S3 connection working
I can't get past the error below. I've read and reread the instructions several times at the URL below and for the life of me cannot figure out what I'm missing in my AWS setup. Any tips on how to track down my issue? https://docs.databricks.com/en/c...
I got it working; there was a weird typo where the role ARN was duplicated. Thanks.
- 938 Views
- 1 replies
- 0 kudos
Samples catalog doesn't have an INFORMATION_SCHEMA
We are looking to do an integration with Databricks, and I've noticed that the samples catalog doesn't have an INFORMATION_SCHEMA. We rely on the existence of the information_schema to help us understand what views/tables exist in each catalog. W...
The "samples" catalog in Databricks does not have an INFORMATION_SCHEMA because it is designed primarily for demonstration and educational purposes, rather than for production use. This schema is typically included in catalogs created on Unity Catalo...
Labels:
- Access control (1)
- Apache spark (1)
- Azure (7)
- Azure databricks (5)
- Billing (2)
- Cluster (1)
- Compliance (1)
- Data Ingestion & connectivity (5)
- Databricks Runtime (1)
- Databricks SQL (2)
- DBFS (1)
- Dbt (1)
- Delta Sharing (1)
- DLT Pipeline (1)
- GA (1)
- Gdpr (1)
- Github (1)
- Partner (55)
- Public Preview (1)
- Service Principals (1)
- Unity Catalog (1)
- Workspace (2)