- 249 Views
- 3 replies
- 0 kudos
How to get cost per job that runs on ALL_PURPOSE_COMPUTE?
With the system.billing.usage table I could get the cost per job for jobs that run on JOB_COMPUTE, but not for jobs that run on ALL_PURPOSE_COMPUTE.
If DBU usage is not captured anywhere for jobs under ALL_PURPOSE_COMPUTE, then a cost breakdown based on cluster events is very difficult, as two or more jobs can run in parallel. So mapping is very difficult when breaking down the cost for a specific job. Let me know if I am missing an...
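One workaround sometimes used in this situation (a hedged sketch, not an official Databricks method; the job names and numbers below are invented) is to apportion the all-purpose cluster's total cost across the jobs that ran on it, weighted by each job's runtime:

```python
# Hypothetical sketch: split an all-purpose cluster's cost across the jobs
# that ran on it, proportionally to each job's runtime in seconds.

def apportion_cost(cluster_cost, job_runtimes):
    """Return {job_name: share_of_cost}, weighted by runtime seconds."""
    total = sum(job_runtimes.values())
    if total == 0:
        return {job: 0.0 for job in job_runtimes}
    return {job: cluster_cost * secs / total
            for job, secs in job_runtimes.items()}

# Example: $12 of usage on one cluster, two jobs overlapping on it
shares = apportion_cost(12.0, {"job_a": 3600, "job_b": 1800})
print(shares)  # job_a carries 2/3 of the cost, job_b 1/3
```

This ignores idle time and genuine resource contention between concurrent jobs, so treat it as an approximation at best.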
- 188 Views
- 1 replies
- 2 kudos
Slow cluster start-up time (up to 30 min) on GCP
Instance type: e2-highmem-2
Please use a higher-powered instance type (e.g. n2-highmem-4). The instance type you are currently using (e2-highmem-2) is significantly underpowered and will result in slower cluster launch times.
- 436 Views
- 4 replies
- 0 kudos
Unity Catalog hive_metastore schemas
Hi all, apologies if this is the wrong group, but I was looking in Unity Catalog and noticed that you see different schemas in the hive_metastore depending on whether you select a cluster or a warehouse. Could someone please explain what the ...
No schemas are directly attached to compute resources, whether it's an all-purpose cluster or a SQL warehouse in serverless mode.
- 353 Views
- 1 replies
- 0 kudos
Databricks Workflow/Jobs View Log Permission
If we don't want to expose admin rights to a user group, what should we do to allow a specific user group to view all of the job logs in a Databricks account? We don't want to grant job-level permissions either. Thanks, VC
Hi, I guess you can use the Databricks API to list jobs and set the "Can view" permission on all jobs. Sample code below:

```python
import requests
from databricks_cli.sdk import ApiClient, JobsService, PermissionsService

# Initialize the API client
api_client = ApiClient( ...
```
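A fuller sketch of the same idea, using the REST API directly (hedged: the workspace URL, token, and group name are placeholders, and real workspaces need pagination on jobs/list):

```python
import requests

HOST = "https://<workspace-url>"    # placeholder
TOKEN = "<personal-access-token>"   # placeholder
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def can_view_acl(group_name):
    """Build the Permissions API body granting CAN_VIEW to a group."""
    return {"access_control_list": [
        {"group_name": group_name, "permission_level": "CAN_VIEW"}
    ]}

def grant_view_on_all_jobs(group_name):
    # List jobs, then PATCH the CAN_VIEW permission onto each one.
    resp = requests.get(f"{HOST}/api/2.1/jobs/list", headers=HEADERS)
    for job in resp.json().get("jobs", []):
        requests.patch(
            f"{HOST}/api/2.0/permissions/jobs/{job['job_id']}",
            headers=HEADERS,
            json=can_view_acl(group_name),
        )
```

Note that a PATCH merges with existing permissions, so it will not remove grants the jobs already have.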
- 296 Views
- 1 replies
- 0 kudos
data ingestion from external system - auth via client certificate
Hi Community, we have a requirement to ingest data into Azure Databricks from external systems. Our customer asks us to use a client certificate as the authentication method. Requests - https://requests.readthedocs.io/en/latest/user/advanced/ Aiohttp - https://...
Hi @cuhlmann, as I understand it, you need to ingest data into Azure Databricks from external systems, and your customer requires using client certificate authentication. The challenge is that the client certificate is stored in Azure Key Vault, but the ...
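With the requests library mentioned in the question, client-certificate authentication looks roughly like this (a hedged sketch: the file paths and URL are placeholders, and in Databricks you would typically first read the certificate and key from a secret scope or Key Vault and write them to temporary files):

```python
import requests

def make_mtls_session(cert_path, key_path):
    """Return a requests session that presents a client certificate (mTLS)."""
    session = requests.Session()
    session.cert = (cert_path, key_path)  # presented during the TLS handshake
    return session

session = make_mtls_session("/tmp/client.crt", "/tmp/client.key")
# response = session.get("https://external-system.example.com/data")
```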
- 172 Views
- 2 replies
- 1 kudos
Bill for Premium subscription
Hi there, I have subscribed to the Premium plan of Databricks. How can I get the bills for this subscription? I couldn't find them in the account settings. Can anyone help?
- AWS: https://docs.databricks.com/en/admin/account-settings/account.html
- Azure: https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/account
- GCP: https://docs.gcp.databricks.com/en/admin/account-settings/account.html
- 2940 Views
- 4 replies
- 0 kudos
Override default Personal Compute policy using terraform / disable Personal Compute policy
I want to programmatically make some adjustments to the default Personal Compute policy, or preferably create my own custom one based on the same configuration or policy family (to which all users can gain access), when deploying a new workspace usi...
The only way I got it working was by importing the pre-existing policy into Terraform and doing an overwrite, as already mentioned by @jsimonovic. The full code example looks like this:

```
import {
  id = "001BF0AC280610B4" # Policy ID of the pre-existing person...
```
- 439 Views
- 1 replies
- 1 kudos
Resolved! Retention for hive_metastore tables
Hi, I have a notebook that creates tables in the hive_metastore with the following code:

```python
df.write.format("delta").mode("overwrite").saveAsTable(output_table_name)
```

What is the retention for the data saved in the hive metastore? Is there any configurati...
Hi mattiags, as long as you do not delete the data via a notebook or in the data lake, it will not be deleted in any other way. This means that there is no retention time in this sense; or, conversely, it is infinite until you deliberately delete the data...
- 230 Views
- 0 replies
- 0 kudos
Configuration of NCC for Serverless to access SQL server running in a Azure VM
Hi Team, I am following this link to configure an NCC for serverless compute to access a SQL Server running in an Azure VM: https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/ This references adding privat...
- 1368 Views
- 3 replies
- 0 kudos
Resolved! Databricks on AWS - Changes to your Unity Catalog storage credentials
Hi. Context: On June 30, 2023, AWS updated its IAM role trust policy, which requires updating Unity Catalog storage credentials. Databricks previously sent an email communication to customers in March 2023 on this topic and updated the documentation a...
Thank you for the response @MoJaMa - we will try it out tomorrow and post an update here.
- 258 Views
- 0 replies
- 0 kudos
Pre-loading docker images to cluster pool instances still requires docker URL at cluster creation
I am trying to pre-load a Docker image to a Databricks cluster pool instance. As per this article, I used the REST API to create the cluster pool and defined a custom Azure container registry as the source for the Docker images: https://learn.microsoft....
- 1191 Views
- 4 replies
- 1 kudos
Resolved! How to use Databricks CLI as a service principal?
Hi all, I have a question about how to use the Databricks CLI in my local environment as a service principal. I have installed the Databricks CLI and configured the file `.databrickscfg` as shown below.

```
[DEFAULT]
host = https://adb-123123123.1.azuredatabr...
```
Got you. I found a working solution, try this one:

```
[devsp]
azure_workspace_resource_id = /subscriptions/bc0cd1..././.../Databricks/workspaces/my-workspace
azure_tenant_id             = bc0cd1...
azure_client_id             = fa0cd1...
azure_client_secr...
```
- 265 Views
- 1 replies
- 1 kudos
Terraform - Azure Databricks workspace without NAT gateway
Hi all, I have experienced an increase in costs, even when not using Databricks compute. It is due to the NAT gateway that is (suddenly) automatically deployed. When creating Azure Databricks workspaces using Terraform, a NAT gateway is created. When ...
Try adding more properties. Also, ensure that the subnets used by Azure Databricks do not have settings that require a NAT gateway, and consider using private endpoints for Azure Databricks to avoid the need for a NAT gateway. infrastructure_encryptio...
- 812 Views
- 4 replies
- 0 kudos
Thank you so much! I solved this by reinstalling the Chrome browser. I got this issue last week and could not solve it even by waiting, clearing the cache, restarting, etc., but it worked in another browser. So I reinstalled Chrome and it worked. Thank you.
- 963 Views
- 6 replies
- 0 kudos
Exact cost for job execution calculation
Hi everybody, I want to calculate the exact cost of a single job execution. In all the examples I can find on the internet, the tables system.billing.usage and system.billing.list_prices are used. It makes sense to calculate the sum of DBUs consumed and multi...
@radothede, I've clarified this with Databricks and my assumption was correct. The formula sum(usage_quantity * list_prices.pricing.default) is only right if the time window in the usage table is 1 hour. For every window that is not 1 hour, the fract...
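For the simple 1-hour-window case, the formula from this thread can be sketched in plain Python (a hedged illustration: the rows and prices below are invented, and the fractional-window correction mentioned above is truncated in the original, so it is not reproduced here):

```python
# Toy rows mimicking system.billing.usage joined to system.billing.list_prices.
# Each row: DBUs consumed in a 1-hour window, and the list price per DBU.
usage_rows = [
    {"usage_quantity": 4.0, "pricing_default": 0.55},  # 1-hour window
    {"usage_quantity": 2.5, "pricing_default": 0.55},  # 1-hour window
]

# sum(usage_quantity * list_prices.pricing.default)
job_cost = sum(r["usage_quantity"] * r["pricing_default"] for r in usage_rows)
print(round(job_cost, 4))  # 3.575
```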