- 1864 Views
- 3 replies
- 2 kudos
Databricks runtime and Java Runtime
The Databricks Runtime ships with two Java runtimes: JRE 8 and JRE 17. While the first is used by default, you can use the environment variable JNAME to select the other JRE: JNAME: zulu17-ca-amd64. FWIW, AFAIK JNAME has been available since DBR 1...
@AlexeyEgorov This post is a bit outdated, as, starting from Databricks Runtime 16, JDK 17 is the new default.
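For readers who want to try this, here is a minimal sketch (assuming the Databricks Python SDK; the cluster name, node type, and DBR version below are placeholders) of passing JNAME as a cluster environment variable:

```python
# Hypothetical sketch: create a cluster that runs on the bundled JRE 17 by
# setting the JNAME environment variable mentioned in the thread.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up auth from the environment / .databrickscfg

cluster = w.clusters.create_and_wait(
    cluster_name="jre17-demo",                    # placeholder name
    spark_version="15.4.x-scala2.12",             # placeholder DBR version
    node_type_id="Standard_DS3_v2",               # placeholder node type
    num_workers=1,
    spark_env_vars={"JNAME": "zulu17-ca-amd64"},  # select the JRE 17 runtime
)
print(cluster.cluster_id)
```

On DBR 16 and later this is unnecessary, since JDK 17 is already the default, as the reply above notes.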
- 348 Views
- 3 replies
- 0 kudos
Resolved! Control plane set-up
Dear all, in this video from Databricks, Azure Databricks Security Best Practices - https://www.youtube.com/watch?v=R1X8ydIR_Bc&t=623s - during 13:25 - 14:35 the presenter talks about the benefits of private endpoints. He makes the ...
Hi @noorbasha534, "Does this control plane then contain management services for several customers?" - Yes, the Control Plane has management services that are used across customers in the region, which is why the presenter says traffic can be isolated fro...
- 292 Views
- 2 replies
- 0 kudos
Timeout on docker pull in Databricks Container Services
Hello, there is a timeout that limits the size of images used in Databricks Container Services. When using images containing large ML libraries, the size often exceeds the limit that can be pulled. Is there any plan to add parametrization of this timeout...
@Walter_C Thank you for your response. Unfortunately, this doesn't resolve my issue. Using the GPU version of Torch even without additional dependencies comes very close to the limit. Once I add the necessary components for training or using the ML m...
- 1049 Views
- 5 replies
- 0 kudos
policy_id in databricks asset bundle workflow
We are using Databricks Asset Bundles for code deployment, and the biggest issue I am facing is that policy_id is different in each environment. I tried environment variables in Azure DevOps and also declaring the variables in databricks.yaml and ...
Solved by the lookup function: https://docs.databricks.com/en/dev-tools/bundles/variables.html#retrieve-an-objects-id-value
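The fix above is the bundle-level lookup variable described in the linked docs (configured in the bundle's YAML). As a rough Python illustration of the same idea, here is a hedged sketch of resolving a policy ID from its name with the Databricks SDK; the policy name below is a placeholder, and this is not the bundle mechanism itself:

```python
# Hypothetical sketch: resolve a cluster policy ID from its (environment-stable)
# name, e.g. to export it as a CI/CD variable. The bundle `lookup` variable in
# the linked docs does this resolution for you at deploy time.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

policy_name = "my-job-policy"  # placeholder: same name in every environment
policy_id = next(
    p.policy_id for p in w.cluster_policies.list() if p.name == policy_name
)
print(policy_id)
```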
- 493 Views
- 4 replies
- 1 kudos
Resolved! How to get logged in user name/email in the databricks streamlit app?
I have created a Databricks App using Streamlit and am able to deploy and use it successfully. I need to get the user name/email address of the logged-in user and display it in the Streamlit app. Is this possible? If not possible at the moment, any roadmap f...
Thanks @BigRoux and @Shannon_O for your responses. I tried the st.experimental_user.get() API, but it gives me a hardcoded response of test@example.com. The suggestion to use the X-Forwarded-Email header worked, and I got the correct email address. Thanks for your help. ...
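For anyone landing here, a minimal sketch of the header approach that worked (assuming a recent Streamlit release where st.context.headers is available; the fallback string is a placeholder):

```python
# Hedged sketch: read the user identity that Databricks Apps forwards in the
# X-Forwarded-Email request header.
import streamlit as st

user_email = st.context.headers.get("X-Forwarded-Email", "unknown")
st.write(f"Logged in as: {user_email}")
```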
- 242 Views
- 1 reply
- 0 kudos
Is it possible to disable file download in Volumes interface?
The workspace security administration panel offers options to disable downloads in notebook folders and workspaces. However, it seems that even if all those downloads are disabled, the "Volumes" panel of Unity Catalog still offers a file download button. Is it p...
Hi @staskh, unfortunately, I don't think it is possible to disable it via the UI currently. But volumes are governed by UC permissions, so maybe you can try to grant read/write permissions to an approved group of users and remove the permissions of users who should...
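A minimal sketch of the permission-based workaround suggested above (run in a Databricks notebook where spark is predefined; the catalog, schema, volume, and group names are placeholders):

```python
# Hedged sketch: instead of a UI download toggle, limit who can read the volume
# at all via Unity Catalog grants.
spark.sql("GRANT READ VOLUME ON VOLUME main.landing.reports TO `approved-analysts`")
# ...and revoke it from groups that should not be able to download files:
spark.sql("REVOKE READ VOLUME ON VOLUME main.landing.reports FROM `other-users`")
```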
- 218 Views
- 1 reply
- 0 kudos
Issues with Delta Sharing API when using Service Principal Token
Hello, I am currently working with the Delta Sharing API and have encountered an issue when using a Service Principal token for authentication. The API call returns the following error: [CANNOT_INFER_EMPTY_SCHEMA] Can not infer schema from empty datase...
Please find the response below:
1) The Delta Sharing API supports both personal access tokens and service principal tokens for authentication.
2) Service principals need to be granted specific roles and permissions to access data. This includes assi...
- 360 Views
- 1 reply
- 0 kudos
The documentation below shows how to install libraries on a cluster: https://docs.databricks.com/en/libraries/cluster-libraries.html#install-a-library-on-a-cluster
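As one programmatic alternative to the UI steps in the linked docs, here is a hedged sketch of installing a PyPI library on an existing cluster via the Databricks Python SDK; the cluster ID and package are placeholders:

```python
# Hedged sketch: attach a PyPI library to a running cluster programmatically.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.compute import Library, PythonPyPiLibrary

w = WorkspaceClient()
w.libraries.install(
    cluster_id="0123-456789-abcdefgh",  # placeholder cluster ID
    libraries=[Library(pypi=PythonPyPiLibrary(package="pandas==2.2.2"))],
)
```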
- 206 Views
- 3 replies
- 2 kudos
GCP Databricks GKE cluster with 4 nodes
I am working on setting up GCP Databricks and successfully created my first GCP Databricks workspace, but what I observed is that it is incurring additional charges even though I am using the 14-day free trial. It is a GKE cluster with 4 nodes which are spun up as part o...
Thank you @BigRoux, just want to dig more into this: is there any way to reduce these nodes using the CLI or by creating a customer-managed network?
- 282 Views
- 2 replies
- 2 kudos
Resolved! Databricks All-purpose compute Pricing
Hello, I am now struggling with how to calculate the cost of my job cluster. My configuration is as below: If I have to run the above cluster 18 hours per day, in the Standard Tier and East Asia region, what will the pricing of the cluster be? Any help provi...
@karen_c Let me make a small correction. It seems that you have checked the option for Spot Instances, which should make the cost slightly lower. Please refer to the far-right column of the attached pricing table for more details. Additionally, you hav...
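As a back-of-the-envelope aid, here is a hedged sketch of the cost arithmetic; every rate below is a placeholder rather than an actual East Asia / Standard tier price, so substitute the values from the pricing table mentioned above:

```python
# Hypothetical cost estimate: (DBUs/hour * $/DBU + VM $/hour) * hours.
hours_per_day = 18
days_per_month = 30

dbu_per_hour = 2.0           # placeholder: total DBUs for driver + workers
dbu_rate_usd = 0.40          # placeholder: $/DBU, all-purpose, Standard tier
vm_rate_usd_per_hour = 0.50  # placeholder: VM cost (lower with spot instances)

monthly_cost = hours_per_day * days_per_month * (
    dbu_per_hour * dbu_rate_usd + vm_rate_usd_per_hour
)
print(f"Estimated monthly compute cost: ${monthly_cost:,.2f}")
```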
- 723 Views
- 7 replies
- 0 kudos
How do I simply disable someone's user account
I'm trying to do something seemingly very simple - disable someone's user account. I don't even want to delete the user, just disable it for the time being. How do I go about doing that?
Hello! I tried this call. I hid the sensitive information with "HIDDEN" in the example:
curl --request PATCH 'https://HIDDEN.cloud.databricks.com/api/2.0/preview/scim/v2/Users/HIDDEN' \
  --header 'Accept: application/scim+json' \
  --header 'Content-Ty...
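For completeness, a hedged sketch of the same call in Python, with the SCIM PatchOp body that sets active to false to disable (not delete) the account; host, token, and user ID are placeholders, as in the curl above:

```python
# Hedged sketch: disable a user by setting `active` to false via the SCIM Users API.
import requests

host = "https://HIDDEN.cloud.databricks.com"   # placeholder workspace URL
user_id = "HIDDEN"                             # placeholder SCIM user ID
token = "HIDDEN"                               # placeholder access token

resp = requests.patch(
    f"{host}/api/2.0/preview/scim/v2/Users/{user_id}",
    headers={
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/scim+json",
    },
    json={
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": [{"op": "replace", "path": "active", "value": False}],
    },
)
resp.raise_for_status()
```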
- 1626 Views
- 5 replies
- 2 kudos
Resolved! Azure Databricks Unity Catalog - Cannot access Managed Volume in notebook
The problem: after setting up Unity Catalog and a managed volume, I can upload/download files to/from the volume in the Databricks workspace UI. However, I cannot access the volume from a notebook. I created an all-purpose compute cluster and ran dbutils.fs.ls("/Vo...
I found the reason and a solution, but I feel this is a bug, and I wonder what the best practice is. When I enable the ADLS Gen2 account's public network access from all networks as shown below, I can access the volume from a notebook. However, if I enable the...
- 417 Views
- 2 replies
- 1 kudos
Access to system.billing.usage tables
I have the Account, Marketplace, and Billing Admin roles. I have visibility to the system.billing.list_prices table only. How do I get access to the system.billing.usage tables? The Databricks instance is on AWS. Thanks
Hi @Alberto_Umana, thanks for your response. I needed Metastore Admin permissions too. In the account console, I changed the Metastore Admin to be a group and became a part of the group. With this, the other tables were visible. With this permission, using the gr...
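For reference, a minimal sketch of the grants involved once the required admin rights are in place (run in a Databricks notebook, where spark is predefined; the group name is a placeholder):

```python
# Hedged sketch: give a group read access to the billing system tables.
spark.sql("GRANT USE CATALOG ON CATALOG system TO `billing-readers`")
spark.sql("GRANT USE SCHEMA ON SCHEMA system.billing TO `billing-readers`")
spark.sql("GRANT SELECT ON TABLE system.billing.usage TO `billing-readers`")
```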
- 376 Views
- 3 replies
- 0 kudos
Best Practices for Daily Source-to-Bronze Data Ingestion in Databricks
How can we effectively manage source-to-bronze data ingestion from a project perspective, particularly when considering daily scheduling strategies using either Auto Loader or Serverless Warehouse COPY INTO commands?
No, it is not a strict requirement. You can have a single node job cluster run the job if the job is small.
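Below is a minimal Auto Loader sketch for the source-to-bronze question above (paths, file format, and table names are placeholders; spark is predefined in a Databricks notebook). The availableNow trigger lets a daily job process only the new files and then stop:

```python
# Hedged sketch: daily incremental ingestion from a landing path into a bronze
# Delta table using Auto Loader.
(
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")  # placeholder source format
    .option("cloudFiles.schemaLocation", "/Volumes/main/bronze/_schemas/orders")
    .load("/Volumes/main/landing/orders")  # placeholder source path
    .writeStream
    .option("checkpointLocation", "/Volumes/main/bronze/_checkpoints/orders")
    .trigger(availableNow=True)
    .toTable("main.bronze.orders")  # placeholder target table
)
```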
- 497 Views
- 5 replies
- 0 kudos
Any Databricks system tables contain info of the saved/pre-defined queries
How can I find the saved/pre-defined queries in Databricks system tables? system.query.history does not seem to have that info, like query ID or query name.
Hi Bryan, Databricks system tables do not store saved queries. The query history table captures query execution details, including: statement ID, execution status, user who ran the query, statement text (if not encrypted), statement type, execution duration, res...
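Since the system tables don't hold them, here is a hedged sketch of listing saved queries via the workspace Queries API with the Databricks Python SDK (field names can vary by SDK version, hence the as_dict lookup):

```python
# Hedged sketch: enumerate saved SQL queries in the workspace.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
for q in w.queries.list():
    info = q.as_dict()
    print(info.get("id"), info.get("display_name") or info.get("name"))
```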