- 540 Views
- 3 replies
- 0 kudos
Proxy (Zscaler) & Databricks/Spark Connect "Cannot check peer: missing selected ALPN property"
Summary: We use Zscaler and are trying to use Databricks Connect to develop PySpark code locally. At first, we received SSL HTTP errors, which we resolved by ensuring Python's requests library could find Zscaler's CA cert (setting the REQUESTS_CA_BUNDLE en...
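A minimal sketch of the environment setup this implies, assuming the Zscaler root CA has been exported to a PEM bundle (the path below is hypothetical). REQUESTS_CA_BUNDLE only covers plain HTTP calls; Databricks Connect also talks to the workspace over a Spark Connect gRPC channel, which reads GRPC_DEFAULT_SSL_ROOTS_FILE_PATH and is often where a TLS-intercepting proxy surfaces the ALPN error:

```python
# Hedged sketch: point both the requests library and the gRPC channel used by
# Databricks Connect at the Zscaler root CA. The bundle path is hypothetical;
# the env vars must be set before the session is created.
import os

ZSCALER_BUNDLE = "/etc/ssl/certs/zscaler-ca-bundle.pem"  # hypothetical path

os.environ["REQUESTS_CA_BUNDLE"] = ZSCALER_BUNDLE                # HTTP (REST) calls
os.environ["SSL_CERT_FILE"] = ZSCALER_BUNDLE                     # OpenSSL-based clients
os.environ["GRPC_DEFAULT_SSL_ROOTS_FILE_PATH"] = ZSCALER_BUNDLE  # Spark Connect gRPC channel

from databricks.connect import DatabricksSession

spark = DatabricksSession.builder.getOrCreate()
print(spark.range(5).collect())
```

Note that if the proxy itself strips the ALPN extension during TLS interception, the trust store alone may not be enough and the Databricks Connect host may need to be exempted from inspection.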
Hello Stevenayers-bge, checking if you came across any solution to the above-mentioned issue? If yes, could you please post it here? I'd really appreciate it.
- 347 Views
- 3 replies
- 0 kudos
Generate a Workflow that Waits for Library Installation
I have a process in DBX/DAB and I am using a Service Principal to generate a token for reaching the artifacts feed; for security, this token lasts 1 hour. import requests YOUR_AZURE_TENANT_ID = ... YOUR_SERVICE_PRINCIPAL_CLIENT_ID = ... YOUR_SECRET_S...
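For context, a hedged sketch of the client-credentials call the truncated snippet appears to begin. All IDs are placeholders, and the Azure DevOps resource scope is an assumption about the feed type; confirm the right scope for your artifacts feed:

```python
# Minimal sketch of the standard OAuth2 client_credentials grant against Azure AD.
# Tenant ID, client ID, and secret are placeholders.
import requests

YOUR_AZURE_TENANT_ID = "..."
YOUR_SERVICE_PRINCIPAL_CLIENT_ID = "..."
YOUR_SECRET = "..."  # e.g. read from a Databricks secret scope

token_url = f"https://login.microsoftonline.com/{YOUR_AZURE_TENANT_ID}/oauth2/v2.0/token"
payload = {
    "grant_type": "client_credentials",
    "client_id": YOUR_SERVICE_PRINCIPAL_CLIENT_ID,
    "client_secret": YOUR_SECRET,
    # Assumption: an Azure DevOps artifacts feed, whose well-known resource
    # app ID is 499b84ac-1321-427f-aa17-267ca6975798; verify for your feed.
    "scope": "499b84ac-1321-427f-aa17-267ca6975798/.default",
}
resp = requests.post(token_url, data=payload, timeout=30)
resp.raise_for_status()
access_token = resp.json()["access_token"]  # valid for ~1 hour
```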
Hi @PabloCSD, here are some refined solutions that keep costs low and ensure the main workflow waits until the token is generated: instead of separating the token generation and main tasks, consider generating the token directly within the initializati...
- 840 Views
- 7 replies
- 2 kudos
Resolved! Could not access Databricks on iPhone in any web browser
Hi, for around the past 2 weeks I have tried to access Databricks via Safari and Chrome on my iOS (17.6) device. Neither browser could access it. I already tried clearing the cache and logging in with a Global Administrator in Azure AD; it's still the same blank page. Could you pleas...
Hi, I can confirm it works as well. According to my info, there was a release to fix some Chrome issues, which had a positive impact on iOS devices. Glad it works again.
- 430 Views
- 3 replies
- 0 kudos
Resolved! Terraform Destroy not able to prune Databricks Provisioned GKE Cluster on GCP
Hi there, newbie here with Databricks on GCP. I provisioned my Databricks workspace with Terraform and all worked well. Now, when I want to target-destroy my workspace, issues occur: when I do terraform destroy -target module.workspace, the workspac...
Ha, that's true, too. I forget how long it takes things to delete, but I've run into it many times. Best of luck to you!
- 4813 Views
- 3 replies
- 0 kudos
Resolved! I have questions about "Premium Automated Serverless Compute - Promo DBU."
"Premium Automated Serverless Compute - Promo DBU" expenses arise from what, how can I disable it, and why are the costs so high? In the picture, I am using AzureThank you in advance for the advice
Hello everyone! I had the same problem for two months. I went crazy looking for what was consuming my entire subscription amount: Premium Automated Serverless Compute - Promo DBU > €8,236. After searching everywhere for information about it and reading ...
- 9874 Views
- 2 replies
- 1 kudos
Notebook and folder owner
Hi all, we can use this API https://docs.databricks.com/api/workspace/dbsqlpermissions/transferownership to transfer the ownership of a query. Is there anything similar for notebooks and folders?
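For reference, a hedged sketch of what a call to the linked endpoint looks like for a query. Host, token, and the query ID are placeholders, and this is a preview endpoint, so verify the exact path against the linked docs:

```python
# Sketch of the DBSQL transfer-ownership call referenced above.
import requests

HOST = "https://<workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                  # placeholder
QUERY_ID = "<query-id>"                            # placeholder

resp = requests.post(
    f"{HOST}/api/2.0/preview/sql/permissions/queries/{QUERY_ID}/transfer",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"new_owner": "new.owner@example.com"},
    timeout=30,
)
resp.raise_for_status()
```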
This API only allows setting permissions based on permission level, which doesn't include changing OWNER. Any suggestions on this particular request?
- 204 Views
- 3 replies
- 0 kudos
How to get the cost per job for jobs that run on ALL_PURPOSE_COMPUTE?
With the system.billing.usage table I could get the cost per job for jobs that run on JOB_COMPUTE, but not for jobs that run on ALL_PURPOSE_COMPUTE.
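For context, a sketch of where job-level DBUs show up in that table (run from a Databricks notebook where `spark` is ambient). The key point is that usage_metadata.job_id is generally only populated for JOB_COMPUTE; on an all-purpose cluster the usage rows are attributed to the cluster, which is exactly the attribution gap being described:

```python
# Hedged sketch: group all-purpose billing rows by cluster and (usually null) job_id.
usage = spark.sql("""
    SELECT
        usage_metadata.cluster_id AS cluster_id,
        usage_metadata.job_id     AS job_id,   -- typically null for all-purpose usage
        sku_name,
        SUM(usage_quantity)       AS dbus
    FROM system.billing.usage
    WHERE sku_name LIKE '%ALL_PURPOSE%'
    GROUP BY ALL
""")
usage.show(truncate=False)
```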
If DBU usage is not captured anywhere for jobs under ALL_PURPOSE_COMPUTE, then a cost breakdown based on cluster events is very difficult, as more than two jobs can run in parallel. So the mapping to break down the cost for a specific job is very difficult. Let me know if I am missing an...
- 126 Views
- 1 replies
- 2 kudos
Slow cluster start-up time (up to 30 min) on GCP
instance type: e2-highmem-2
Please use a higher-powered instance type (e.g. n2-highmem-4). The instance type you are currently using (i.e. e2-highmem-2) is significantly underpowered and will result in slower cluster launch times.
- 348 Views
- 4 replies
- 0 kudos
Unity Catalog hive_metastore schemas
Hi all, apologies if this is the wrong group, but I was looking in Unity Catalog and noticed that you have different schemas in the hive_metastore depending on whether you select a cluster or a warehouse. Could someone please explain what the ...
No schemas are directly attached to compute resources, whether it's an all-purpose cluster or a SQL warehouse in serverless mode.
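A quick hedged way to check this yourself: run the same listing from a cluster notebook and from a SQL warehouse. Since the schemas live in the metastore rather than on the compute, any difference should come from permissions or session defaults, not from the compute type:

```python
# Run identically on a cluster and on a SQL warehouse and compare.
spark.sql("SHOW SCHEMAS IN hive_metastore").show(truncate=False)
spark.sql("SELECT current_catalog(), current_database()").show()
```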
- 304 Views
- 1 replies
- 0 kudos
Databricks Workflow/Jobs View Log Permission
If we don't want to expose admin rights to a user group, what should we do to allow a specific user group to have permission to view all of the job logs in a Databricks account? We don't want to grant job-level permissions either. Thanks, VC
Hi, I guess you can use the Databricks API to list jobs and set the Can View permission on all jobs. Sample code below:

import requests
from databricks_cli.sdk import ApiClient, JobsService, PermissionsService

# Initialize the API client
api_client = ApiClient( ...
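Since the snippet above is cut off, here is a fuller hedged sketch of the same idea against the REST API directly (workspace host, token, and group name are placeholders):

```python
# Sketch: list all jobs via /api/2.1/jobs/list (paginated), then PATCH a
# CAN_VIEW grant for a group onto each job's ACL.
import requests

HOST = "https://<workspace>.cloud.databricks.com"              # placeholder
HEADERS = {"Authorization": "Bearer <personal-access-token>"}  # placeholder
GROUP = "log-viewers"                                          # placeholder group name

job_ids, page_token = [], None
while True:
    params = {"limit": 100}
    if page_token:
        params["page_token"] = page_token
    page = requests.get(f"{HOST}/api/2.1/jobs/list",
                        headers=HEADERS, params=params, timeout=30).json()
    job_ids += [j["job_id"] for j in page.get("jobs", [])]
    page_token = page.get("next_page_token")
    if not page_token:
        break

for job_id in job_ids:
    # PATCH adds to the existing ACL instead of replacing it (PUT would replace).
    requests.patch(
        f"{HOST}/api/2.0/permissions/jobs/{job_id}",
        headers=HEADERS,
        json={"access_control_list": [
            {"group_name": GROUP, "permission_level": "CAN_VIEW"}
        ]},
        timeout=30,
    ).raise_for_status()
```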
- 257 Views
- 1 replies
- 0 kudos
Data ingestion from external system - auth via client certificate
Hi Community, we have the requirement to ingest data into Azure Databricks from external systems. Our customer asks us to use a client certificate as the authentication method. Requests - https://requests.readthedocs.io/en/latest/user/advanced/ Aiohttp - https://...
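For reference, the pattern from the linked requests docs is to pass the client certificate and private key as a (cert, key) tuple; a minimal sketch with placeholder paths and endpoint:

```python
# Sketch of client-certificate auth with requests. Paths and URL are
# placeholders; requests reads the key unencrypted from disk (it does not
# accept an in-memory cert or a PFX directly).
import requests

resp = requests.get(
    "https://external-system.example.com/api/data",  # placeholder endpoint
    cert=("/secure/path/client-cert.pem", "/secure/path/client-key.pem"),
    verify=True,
    timeout=60,
)
resp.raise_for_status()
data = resp.json()
```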
Hi @cuhlmann, as I understand it, you need to ingest data into Azure Databricks from external systems, and your customer requires using client certificate authentication. The challenge is that the client certificate is stored in Azure Key Vault, but the ...
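One possible way to bridge that gap, sketched under assumptions (the certificate's key is exportable in Key Vault; vault URL, secret name, and endpoint are placeholders): pull the PFX through the Key Vault secrets endpoint, split it into PEM files, and hand those to requests:

```python
# Hedged sketch: Key Vault returns an exportable certificate as base64 PKCS#12
# via the secrets client; convert it to PEM cert/key files for requests.
import base64
import tempfile

import requests
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient
from cryptography.hazmat.primitives.serialization import (
    Encoding, NoEncryption, PrivateFormat, pkcs12,
)

client = SecretClient("https://<vault>.vault.azure.net", DefaultAzureCredential())
pfx_bytes = base64.b64decode(client.get_secret("<certificate-name>").value)
key, cert, _chain = pkcs12.load_key_and_certificates(pfx_bytes, password=None)

with tempfile.NamedTemporaryFile(suffix=".pem", delete=False) as cert_file, \
     tempfile.NamedTemporaryFile(suffix=".pem", delete=False) as key_file:
    cert_file.write(cert.public_bytes(Encoding.PEM))
    key_file.write(key.private_bytes(Encoding.PEM, PrivateFormat.PKCS8, NoEncryption()))

resp = requests.get(
    "https://external-system.example.com/api/data",  # placeholder endpoint
    cert=(cert_file.name, key_file.name),
    timeout=60,
)
resp.raise_for_status()
```

Writing the decrypted key to a temp file is a trade-off; scrub or shred the files after use if your security policy requires it.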
- 139 Views
- 2 replies
- 1 kudos
Bill for Premium subscription
Hi there, I have subscribed to the Premium plan of Databricks. How can I get the bills for this subscription? I couldn't find them in the account settings. Can anyone help?
- AWS: https://docs.databricks.com/en/admin/account-settings/account.html
- Azure: https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/account
- GCP: https://docs.gcp.databricks.com/en/admin/account-settings/account.html
- 2787 Views
- 4 replies
- 0 kudos
Override default Personal Compute policy using terraform / disable Personal Compute policy
I want to programmatically make some adjustments to the default Personal Compute resource, or preferably create my own custom one based on the same configuration or policy family (to which all users can gain access), when deploying a new workspace usi...
The only way I got it working was by importing the pre-existing policy into Terraform and doing an overwrite, as already mentioned by @jsimonovic. The full code example looks like this:

import {
  id = "001BF0AC280610B4" # Policy ID of the pre-existing person...
- 390 Views
- 1 replies
- 1 kudos
Resolved! Retention for hive_metastore tables
Hi, I have a notebook that creates tables in the hive_metastore with the following code: df.write.format("delta").mode("overwrite").saveAsTable(output_table_name) What is the retention for the data saved in the Hive metastore? Is there any configurati...
Hi mattiags, as long as you do not delete the data via a notebook or in the data lake, it will not be deleted in any other way. This means that there is no retention time in this sense; or, conversely, it is infinite until you deliberately delete the data...
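To make that concrete with a hedged sketch (table name is a placeholder): the only retention-like setting in play is Delta's VACUUM window, which prunes files belonging to old table versions, never the live data:

```python
# Old versions accumulate until you VACUUM; the current data is untouched.
spark.sql("DESCRIBE HISTORY my_schema.my_table").show()  # versions kept so far
spark.sql("VACUUM my_schema.my_table RETAIN 168 HOURS")  # prune old versions only
```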
- 154 Views
- 0 replies
- 0 kudos
Configuration of NCC for Serverless to access SQL Server running in an Azure VM
Hi Team, I am following this link to configure an NCC for serverless compute to access a SQL Server running in an Azure VM: https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/ This references adding privat...