- 820 Views
- 1 replies
- 0 kudos
cluster administrator
Is an individual cluster more cost-effective, or a shared group cluster?
This is very generic; it depends on the use case. If you have a number of users reading data from catalogs and performing data analysis or analytics, creating a common cluster will be more cost-effective and provide better performance. Also, largel...
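As a rough sketch of what such a shared cluster might look like, the spec below creates an autoscaling, auto-terminating cluster through the Clusters REST API; the workspace URL, token, cluster name, runtime version, and node type are placeholders, not values from the original post.

```python
import requests

# Hypothetical workspace URL and token; replace with your own values.
WORKSPACE_URL = "https://<your-workspace>.azuredatabricks.net"
TOKEN = "<personal-access-token>"

# A shared, autoscaling cluster spec: autoscaling plus auto-termination keeps
# idle cost down while still serving many concurrent analysts.
cluster_spec = {
    "cluster_name": "shared-analytics",        # assumed name
    "spark_version": "14.3.x-scala2.12",       # example LTS runtime
    "node_type_id": "Standard_DS3_v2",         # example Azure node type
    "autoscale": {"min_workers": 2, "max_workers": 8},
    "autotermination_minutes": 30,
    "data_security_mode": "USER_ISOLATION",    # shared-access mode for Unity Catalog
}

resp = requests.post(
    f"{WORKSPACE_URL}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=cluster_spec,
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["cluster_id"])
```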
- 1547 Views
- 1 replies
- 0 kudos
How to assign a user group for email notification in Databricks Alerts
How can I assign an Azure Databricks user group to an alert for notification? Currently, whenever we need to add a user for alert email notifications, we manually add that user's email address to each alert we set up (more than 100), which is very ...
One option is to handle the logic inside a Python notebook and trigger alerts using the email and smtplib libraries, which can work with Databricks local groups and AD groups that are synced.
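A minimal sketch of that approach is shown below, using Python's standard email and smtplib modules from a notebook. The SMTP host, sender, and recipient addresses are placeholders, assuming your AD group is exposed as a mail-enabled distribution list.

```python
import smtplib
from email.message import EmailMessage

# Hypothetical SMTP relay and group address; adjust to your environment.
SMTP_HOST = "smtp.example.com"
ALERT_RECIPIENTS = ["data-alerts@example.com"]  # e.g., the AD group's distribution-list address

def send_alert(subject: str, body: str) -> None:
    """Send an alert email to a distribution list instead of individual users."""
    msg = EmailMessage()
    msg["Subject"] = subject
    msg["From"] = "databricks-alerts@example.com"
    msg["To"] = ", ".join(ALERT_RECIPIENTS)
    msg.set_content(body)
    with smtplib.SMTP(SMTP_HOST, 587) as server:
        server.starttls()
        # server.login("user", "password")  # uncomment if your relay requires auth
        server.send_message(msg)

# Example: trigger from notebook logic when a threshold is breached.
row_count = 0  # result of some check, e.g., spark.sql(...).count()
if row_count == 0:
    send_alert("Table load failed", "No rows were ingested in the last run.")
```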
- 1754 Views
- 1 replies
- 0 kudos
Resolved! Resource organization in a large company
Hello, we are using Azure Databricks in a single tenant. We will have many teams working in multiple (Unity Enabled) Workspaces using a variety of Catalogs, External Locations, Storage Credentials, etc. Some of those resources will be shared (e.g., an...
- 2263 Views
- 3 replies
- 0 kudos
HTTPSConnectionPool(host='sandvik.peakon.com', port=443): Max retries exceeded with url: /api/v1/seg
While connecting to an API from a Databricks notebook with a bearer token, I am getting the error below: HTTPSConnectionPool(host='sandvik.peakon.com', port=443): Max retries exceeded with url: /api/v1/segments?page=1 (Caused by SSLError(SSLCertVerifica...
Hi @SunilSamal, the error you are encountering, SSLCertVerificationError, indicates that SSL certificate verification failed because the local issuer certificate could not be obtained. This is a common issue when the SSL certificate chain is inco...
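A common workaround, sketched below, is to point requests at a CA bundle that actually contains the issuing chain: either the certifi bundle or a PEM file supplied by your security team. The bearer token and the PEM path here are placeholders.

```python
import certifi
import requests

# Endpoint from the original post; token is a placeholder.
url = "https://sandvik.peakon.com/api/v1/segments?page=1"
headers = {"Authorization": "Bearer <token>"}

# Option 1: verify against the default certifi bundle
# (still fails if the issuer is not a public CA).
resp = requests.get(url, headers=headers, verify=certifi.where())

# Option 2: point `verify` at a PEM file containing the full chain, e.g. a
# corporate root/intermediate CA bundle (the path below is an assumption).
resp = requests.get(url, headers=headers, verify="/dbfs/FileStore/certs/corp-ca-bundle.pem")

print(resp.status_code)
```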
- 3702 Views
- 0 replies
- 0 kudos
Unity Catalog Volume mounting broken by cluster environment variables (http proxy)
Hello all, I have a slightly niche issue here, albeit one that others are likely to run into. Using Databricks on Azure, my organisation has extended our WAN into the cloud, so that all compute clusters are granted a private IP address that ca...
- 2939 Views
- 0 replies
- 0 kudos
Delete Users that are Maintenance Readers
I am an Account Admin at Databricks (Azure), and I am trying to delete users that are being offboarded. I have managed to delete most users. However, for a couple, I get the following message (see screenshot): ABORTED: Account <account> is read-only during ...
- 3178 Views
- 0 replies
- 0 kudos
VS Code Databricks Connect Cluster Configuration
I am currently setting up the VSCode extension for Databricks Connect, and it’s working fine so far. However, I have a question about cluster configurations. I want to access Unity Catalog from VSCode through the extension, and I’ve noticed that I ca...
- 1681 Views
- 4 replies
- 1 kudos
Resolved! Unable to get S3 connection working
I can't get past the error below. I've read and reread the instructions several times at the URL below and for the life of me cannot figure out what I'm missing in my AWS setup. Any tips on how to track down my issue? https://docs.databricks.com/en/c...
I got it working; there was a weird typo where the role ARN was duplicated. Thanks.
- 2904 Views
- 0 replies
- 0 kudos
Getting "Data too long for column session_data'" creating a CACHE table
Hi, I'm trying to leverage CACHE TABLE to create temporary tables that are cleaned up at the end of the session. In creating one of these, I'm getting Data too long for column 'session_data'. The query I'm using isn't referencing a session_data colu...
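For reference, a minimal sketch of the pattern being described here (caching a query result under a session-scoped name with CACHE TABLE ... AS SELECT) looks like this; the table and column names are made up for illustration and are not from the original post.

```python
# Cache a query result under a session-scoped name; it is dropped when the
# session ends or when UNCACHE TABLE is run.
spark.sql("""
  CACHE TABLE tmp_active_orders AS
  SELECT order_id, customer_id, order_total
  FROM sales.orders
  WHERE status = 'ACTIVE'
""")

# The cached table behaves like a temporary table for the rest of the session.
spark.sql("SELECT COUNT(*) FROM tmp_active_orders").show()
```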
- 835 Views
- 1 replies
- 0 kudos
samples catalog doesn't have an information schema
We are looking to do an integration with Databricks, and I've noticed that the samples database doesn't have an INFORMATION_SCHEMA. We rely on the existence of the information_schema to help us understand what views/tables exist in each catalog. W...
The "samples" catalog in Databricks does not have an INFORMATION_SCHEMA because it is designed primarily for demonstration and educational purposes, rather than for production use. This schema is typically included in catalogs created on Unity Catalo...
- 1770 Views
- 1 replies
- 0 kudos
Issue when connecting to Databricks using OAuth 2.0 token connection string
Following the instructions in "Authentication settings for the Databricks ODBC Driver | Databricks on AWS", I have created both Personal access token and OAuth 2.0 token connection options in an application. However, I realized that, when I use OAuth 2....
This will require further analysis; based on the information provided, it seems there is no direct way to restrict this. Are you able to submit a support case so this can be analyzed?
- 2590 Views
- 1 replies
- 0 kudos
Resolved! Databricks connect - SQL Server - Login error with all purpose cluster
Hello everyone, I'm using the Databricks connect feature to connect to a SQL Server in the cloud. I created a foreign catalog based on the connection, but whenever I try to access the tables, I get a login error. I have tried with a serverless cluster and...
Solved. It turns out it was a networking issue; once the subnets from Databricks were allowed by the cloud SQL Server, we managed to connect. The error message is misleading because the credentials were correct.
- 5648 Views
- 3 replies
- 1 kudos
Resolved! PERMISSION_DENIED: User is not an owner of Table/Schema
Hi, we have recently added a service principal for running and managing all of our jobs. The service principal has ALL PRIVILEGES on our catalogs, schemas, and tables. But we're still seeing the error message `PERMISSION_DENIED: User is not an owner of T...
I think the feedback button is the right place. At least I don't know of another way.
- 8387 Views
- 1 replies
- 0 kudos
Resolved! [DATA_SOURCE_NOT_FOUND] Failed to find data source
Context: Hello, I was using a workflow for a periodic process. With my team we were using a Job Compute, but the libraries were not working (even though we had a PIP_EXTRA_INDEX_URL defined in the environment variables of the cluster), so we now use a ...
I installed this library on the cluster: spark_mssql_connector_2_12_1_4_0_BETA.jar. A colleague passed me this .jar file; it seems it can be obtained from here: https://github.com/microsoft/sql-spark-connector/releases. This allows the task to end succ...
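For anyone else wiring this up, a minimal read using the data source format that .jar provides might look like the sketch below; the server, database, table, and secret scope/key names are placeholders, not values from the original thread.

```python
# Hypothetical connection details for an Azure SQL / SQL Server instance.
jdbc_url = "jdbc:sqlserver://<server>.database.windows.net:1433;databaseName=<db>"

df = (
    spark.read
    .format("com.microsoft.sqlserver.jdbc.spark")   # format provided by the connector .jar
    .option("url", jdbc_url)
    .option("dbtable", "dbo.my_table")               # placeholder table
    .option("user", dbutils.secrets.get("scope", "sql-user"))
    .option("password", dbutils.secrets.get("scope", "sql-password"))
    .load()
)
df.show(5)
```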
- 1500 Views
- 1 replies
- 0 kudos
Exam for Databricks Certified Data Engineer Associate
My Databricks Professional Data Engineer certification exam got suspended. My exam only ran for half an hour; it was showing me an error for eye movement while I was reading a question. The exam was suspended on the 11th of July 2024 and is still showing an in-progress assess...
I'm sorry to hear your exam was suspended. Please file a ticket with our support team and allow them 24-48 hours for a resolution. You should also review this documentation: Room requirements, Behavioral considerations.