- 420 Views
- 1 replies
- 0 kudos
legacy repo error fetching git status files over 200MB
Working directory contains files that exceed the allowed limit of 200 MB. How can I solve this?
- 0 kudos
Hi @JessieWen, what you can do, besides removing some files from the repo, is to use "Sparse mode" and select only certain paths to be synchronized with Databricks Repos. Hope it helps.
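The sparse-mode suggestion can be sketched as a Repos API call that recreates the repo with only selected paths synchronized, keeping the oversized files out of the working directory. This is a hedged sketch: the payload shape follows the public `POST /api/2.0/repos` endpoint, and the workspace host, token, Git URL, and patterns are all placeholders.

```python
# Sketch: create a repo in sparse checkout mode so only chosen paths sync.
# Payload shape follows the public Repos API; all values are placeholders.
import json
from urllib import request

def sparse_repo_payload(git_url: str, provider: str, repo_path: str,
                        patterns: list[str]) -> dict:
    # patterns are directory prefixes to synchronize, e.g. ["src", "notebooks"]
    return {
        "url": git_url,
        "provider": provider,
        "path": repo_path,
        "sparse_checkout": {"patterns": patterns},
    }

# Usage sketch (workspace host and token are assumptions):
# payload = sparse_repo_payload("https://github.com/org/repo.git", "gitHub",
#                               "/Repos/me/repo", ["notebooks", "src"])
# req = request.Request(
#     "https://<workspace-host>/api/2.0/repos",
#     data=json.dumps(payload).encode(),
#     headers={"Authorization": "Bearer <token>",
#              "Content-Type": "application/json"},
# )
# request.urlopen(req)
```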
- 951 Views
- 0 replies
- 0 kudos
Databricks - Cost difference between Job Clusters and DLT
Wanted to know about the cost comparison and certain specific feature details between job clusters and DLT. Per the Pricing page (based on both the Azure Pricing and Databricks Pricing pages), the following is the understanding - Region: US East, Provisioned Jobs ...
- 4586 Views
- 8 replies
- 0 kudos
Databricks SQL connectivity in Python with Service Principals
Tried to use M2M OAuth connectivity on Databricks SQL Warehouse in Python:from databricks.sdk.core import Config, oauth_service_principal from databricks import sql .... config = Config(host=f"https://{host}", client_...
- 0 kudos
Did anyone get this to work? I have tried the code above, but I get a slightly different error, and I don't see the same level of detail in the logs: 2024-10-04 14:59:25,508 [databricks.sdk][DEBUG] Attempting to configure auth: pat 2024-10-04 14:59:25,...
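The debug line "Attempting to configure auth: pat" suggests the SDK's auth auto-detection is falling back to a PAT rather than using the service principal. A common remedy is to pin the auth type explicitly. The sketch below is an assumption built from the documented `databricks.sdk.core.Config` fields, not taken from the thread; the databricks imports and all host/credential values are placeholders shown in comments.

```python
# Hedged sketch, assuming databricks-sdk and databricks-sql-connector are
# installed; host and credential values are placeholders.
#
# from databricks.sdk.core import Config, oauth_service_principal
# from databricks import sql

def m2m_config_kwargs(host: str, client_id: str, client_secret: str) -> dict:
    """Kwargs for databricks.sdk.core.Config with M2M OAuth pinned.

    Setting auth_type explicitly stops the SDK's auth auto-detection from
    falling back to a stray PAT ("Attempting to configure auth: pat").
    """
    return {
        "host": f"https://{host}",
        "client_id": client_id,
        "client_secret": client_secret,
        "auth_type": "oauth-m2m",
    }

# Usage sketch:
# config = Config(**m2m_config_kwargs(host, client_id, client_secret))
# connection = sql.connect(
#     server_hostname=host,
#     http_path=warehouse_http_path,
#     credentials_provider=lambda: oauth_service_principal(config),
# )
```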
- 12407 Views
- 1 replies
- 0 kudos
Connect Community Edition to Power BI Desktop
I have submitted this question several times to Databricks over the past few weeks, and I have gotten no response at all, not even an acknowledgement that my request was received. Please help. How can I connect a certain dataset in Databricks Community...
- 0 kudos
Hi @Retired_mod, it seems the Community Edition doesn't let us generate a personal access token any more. Could you let us know where we can get the token in the Community Edition? Thanks.
- 528 Views
- 1 replies
- 0 kudos
cluster administrator
Is an individual cluster or a shared group cluster more cost effective?
- 0 kudos
This is very generic; it depends on the use case. If you have a bunch of users trying to read data from catalogs and perform data analysis or analytics, creating a common cluster will be more cost effective and provide better performance. Also, largel...
- 768 Views
- 1 replies
- 0 kudos
How to assign user group for email notification in databricks Alerts
How can I assign an Azure Databricks user group to an alert for notification? The current scenario: whenever we need to add a user for alert email notification, we manually add that user's email address to each alert we set up (more than 100), which is very ...
- 0 kudos
One option is to handle the logic inside a Python notebook and trigger the alert emails with smtplib, resolving Databricks local groups and synced AD groups to recipient addresses yourself.
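The smtplib approach can be sketched as: resolve the group to a list of addresses in the notebook, then send one message. Only the message building below is concrete stdlib; the group-lookup helper, SMTP host, and addresses are hypothetical.

```python
# Hedged sketch of the smtplib workaround for group alert notifications.
import smtplib
from email.message import EmailMessage

def build_alert(sender: str, recipients: list[str], subject: str,
                body: str) -> EmailMessage:
    # Assemble a plain-text alert addressed to the whole resolved group.
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = ", ".join(recipients)
    msg["Subject"] = subject
    msg.set_content(body)
    return msg

# Usage sketch (SMTP host and the group-lookup helper are assumptions):
# recipients = [m.email for m in get_group_members("data-alerts")]  # hypothetical helper
# with smtplib.SMTP("smtp.example.com", 587) as server:
#     server.starttls()
#     server.send_message(build_alert("alerts@example.com", recipients,
#                                     "Databricks alert", "Threshold breached"))
```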
- 381 Views
- 0 replies
- 0 kudos
Creating Group in Terraform using external_id
The documentation here doesn't give much information about how to use `external_id` when creating a new group. If I reference the object_id for an Azure AD Group, the databricks group gets created but the members from the AD group are not added, nor ...
- 795 Views
- 1 replies
- 0 kudos
Resolved! Resource organization in a large company
Hello, we are using Azure Databricks in a single tenant. We will have many teams working in multiple (Unity enabled) Workspaces using a variety of Catalogs, External Locations, Storage Credentials, etc. Some of those resources will be shared (e.g., an...
- 967 Views
- 3 replies
- 0 kudos
HTTPSConnectionPool(host='sandvik.peakon.com', port=443): Max retries exceeded with url: /api/v1/seg
While connecting to an API from a Databricks notebook with the bearer token, I am getting the error below: HTTPSConnectionPool(host='sandvik.peakon.com', port=443): Max retries exceeded with url: /api/v1/segments?page=1 (Caused by SSLError(SSLCertVerifica...
- 0 kudos
Hi @SunilSamal, the error you are encountering, SSLCertVerificationError, indicates that SSL certificate verification failed because the local issuer certificate could not be obtained. This is a common issue when the SSL certificate chain is inco...
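For an incomplete chain, the usual fixes are to point the HTTP client at an explicit CA bundle or to build an SSL context that trusts it. This is a hedged sketch: the bundle path is a placeholder for your corporate CA chain, and the `requests` call is shown only in comments.

```python
# Hedged sketch: supply the missing issuer certificate explicitly.
import os
import ssl

CA_BUNDLE = "/dbfs/certs/corp-ca-chain.pem"  # hypothetical path to the CA chain

# For requests-based code, one env var covers the whole process:
os.environ["REQUESTS_CA_BUNDLE"] = CA_BUNDLE

def make_context(ca_path: str) -> ssl.SSLContext:
    """Default-verifying context, with the extra CA loaded if present."""
    ctx = ssl.create_default_context()
    if os.path.exists(ca_path):  # guard so the sketch runs without the file
        ctx.load_verify_locations(cafile=ca_path)
    return ctx

# Usage sketch with requests (token is a placeholder):
# requests.get("https://sandvik.peakon.com/api/v1/segments?page=1",
#              headers={"Authorization": f"Bearer {token}"}, verify=CA_BUNDLE)
```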
- 678 Views
- 0 replies
- 0 kudos
Unity Catalog Volume mounting broken by cluster environment variables (http proxy)
Hello all, I have a slightly niche issue here, albeit one that others are likely to run into. Using Databricks on Azure, my organisation has extended our WAN into the cloud, so that all compute clusters are granted a private IP address that ca...
- 195 Views
- 0 replies
- 0 kudos
Delete Users that are Maintenance Readers
I am an Account Admin at Databricks (Azure), and trying to delete users that are being offboarded.I have managed to delete most users. However, for a couple, I get the following message (see screenshot):ABORTED: Account <account> is read-only during ...
- 348 Views
- 0 replies
- 0 kudos
VS Code Databricks Connect Cluster Configuration
I am currently setting up the VSCode extension for Databricks Connect, and it’s working fine so far. However, I have a question about cluster configurations. I want to access Unity Catalog from VSCode through the extension, and I’ve noticed that I ca...
- 724 Views
- 4 replies
- 1 kudos
Resolved! Unable to get S3 connection working
I can't get past the error below. I've read and reread the instructions several times at the URL below and for the life of me cannot figure out what I'm missing in my AWS setup. Any tips on how to track down my issue? https://docs.databricks.com/en/c...
- 1 kudos
I got it working; there was a weird typo where the role ARN was duplicated. Thanks.
- 206 Views
- 0 replies
- 0 kudos
Getting "Data too long for column 'session_data'" creating a CACHE table
Hi, I'm trying to leverage CACHE TABLE to create temporary tables that are cleaned up at the end of the session. In creating one of these, I'm getting "Data too long for column 'session_data'". The query I'm using isn't referencing a session_data colu...
- 425 Views
- 1 replies
- 0 kudos
samples catalog doesn't have an INFORMATION_SCHEMA
We are looking to do an integration with Databricks, and I've noticed that the samples database doesn't have an INFORMATION_SCHEMA. We rely on the existence of the information_schema to help us understand what views / tables exist in each catalog. W...
- 0 kudos
The "samples" catalog in Databricks does not have an INFORMATION_SCHEMA because it is designed primarily for demonstration and educational purposes, rather than for production use. This schema is typically included in catalogs created on Unity Catalo...
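Without INFORMATION_SCHEMA, `SHOW` commands can enumerate the same objects in the `samples` catalog. The helper below only builds the SQL statements; running them in a notebook via the assumed `spark` session is sketched in comments.

```python
# Hedged sketch: enumerate schemas, tables, and views via SHOW commands
# as a fallback where INFORMATION_SCHEMA is absent.
def discovery_statements(catalog: str, schema: str = "") -> list[str]:
    stmts = [f"SHOW SCHEMAS IN {catalog}"]
    if schema:
        stmts.append(f"SHOW TABLES IN {catalog}.{schema}")
        stmts.append(f"SHOW VIEWS IN {catalog}.{schema}")
    return stmts

# Usage sketch (assumes a notebook `spark` session):
# for stmt in discovery_statements("samples", "tpch"):
#     spark.sql(stmt).show()
```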