- 1103 Views
- 1 replies
- 0 kudos
Resolved! Set Default Cluster
Hello, is there a way to set one of the clusters in my workspace to be the default cluster when choosing from the list of available clusters when running my code? Thanks in advance.
Hi @Krubug, thanks for bringing up your concerns, always happy to help. Upon checking internally, it seems that we do not currently support setting a default cluster for the workspace. But we do have a feature request titled "Workspace - Settings - S...
- 2449 Views
- 4 replies
- 1 kudos
Resolved! Azure Databricks Assistant and Private Link
Hi, I was wondering if anyone is using the Databricks Assistant with the standard Private Link configuration of their network. Kind of curious how data is passed from the Databricks workspace, in a Private Link configuration, to the Azure OpenAI servi...
I asked our SA at Databricks. AWS is different, if anyone is looking for that answer. Azure OpenAI connects to the control plane. It does not use private endpoints, though since you are an Azure customer, the traffic stays on the Azure global network (no ...
- 2185 Views
- 1 replies
- 1 kudos
Feature request: Run cluster in a different region
We are one of those who unfortunately decided (earlier) to set up our infrastructure in one of the sub-par Azure regions, and we happen not to have any GPU clusters available. We would like to test out some GPU-enhanced ML in Databricks, but at the ...
@Erik I think it would be really useful if we could create a cluster in another region from our normal workspace. This does not seem possible to me, as the control plane resources and the data plane resources are bound to that region. If...
- 2441 Views
- 4 replies
- 0 kudos
Language preference changes every day
For a few days now, the language in the Databricks workspace keeps changing to German. I have to change the language back to English in the settings every day. My PC has German set as the default language, but I want to keep t...
Hi @stefankoch-OLD, I had the same issue from a French perspective. To fix it, I had to configure the Language + Region settings in the Azure portal and set the Regional format to English (Europe); it was previously set to France. After a ...
- 8912 Views
- 3 replies
- 4 kudos
Databricks Runtime JDK 17 upgrade
Hey, I'm using Databricks Runtime 13.2 (https://docs.databricks.com/en/release-notes/runtime/13.2.html#system-environment). It uses JDK 8. Question: is it possible to upgrade the Java version to JDK 17? Thanks!
@orlik @sroeena34 JDK 17 is in Public Preview for Databricks Runtime versions 13.1 and above. Please refer to https://docs.databricks.com/en/dev-tools/sdk-java.html#create-a-cluster-that-uses-jdk-17
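For anyone who wants to script this, the linked page describes selecting the JDK 17 preview through the JNAME environment variable on the cluster. A minimal sketch against the Clusters REST API; the workspace URL, token, cluster name, and node type below are placeholders:

```python
import requests

# Placeholders: substitute your workspace URL and a personal access token.
DATABRICKS_HOST = "https://<databricks-instance>"
TOKEN = "<personal-access-token>"

cluster_spec = {
    "cluster_name": "jdk17-preview",          # placeholder name
    "spark_version": "13.2.x-scala2.12",      # DBR 13.1+ is required for the JDK 17 preview
    "node_type_id": "Standard_DS3_v2",        # placeholder node type
    "num_workers": 1,
    # Per the linked docs, the preview JDK is selected with the JNAME environment variable.
    "spark_env_vars": {"JNAME": "zulu17-ca-amd64"},
}

resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=cluster_spec,
)
resp.raise_for_status()
print(resp.json()["cluster_id"])
```

The same spark_env_vars entry can also be set in the cluster UI under Advanced options instead of going through the API.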
- 2046 Views
- 2 replies
- 0 kudos
Connecting to Azure Databricks deployed using VNet injection over the public internet
I'm trying to connect to Azure Databricks (deployed using the VNet injection method) from a 3rd-party service running on Azure in the same region. When I try to connect using the Databricks hostname directly in my connection, the hostname always resolves...
@arpit thanks for the link, and I did review the doc. But it doesn't really say whether it is always required to use Azure Private Link to connect to Databricks deployed with the VNet injection method from a 3rd-party service also deployed on Azure...
- 1192 Views
- 1 replies
- 0 kudos
AWS S3 Object Lock Timeline
Hello, I see that Object Lock is not currently supported by Databricks: https://kb.databricks.com/en_US/delta/object-lock-error-write-delta-s3. Is there any timeline / roadmap for support of this feature?
@timlopez We already have a feature request created for this. Our product team is currently reviewing it and will decide how it fits into the product roadmap.
- 1238 Views
- 1 replies
- 0 kudos
@User16752244127 Please elaborate on what you mean by "features". I am tagging generic documentation, but if you are looking for something specific, please let me know.
- 1134 Views
- 1 replies
- 0 kudos
@mnziza We expect to roll out in February in the Azure Australia East region.
- 1626 Views
- 1 replies
- 0 kudos
Azure SP OAuth won't generate a token
Hi all, hopefully someone has had this issue and fixed it before. No matter how many times I try, no token is generated (yes, I am on a Premium tier). Any ideas? I tried using the API but I got this error that apparently occurs because I am on Azure. for ...
Hello @_rafa, currently we do not have an API similar to "https://<databricks-instance>/api/2.0/token-management/on-behalf-of/tokens" in Azure, as Azure supports service principal authentication by default. Please authenticate to normal PAT ...
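For reference, a common pattern on Azure is to have the service principal request a Microsoft Entra ID (Azure AD) token for the Azure Databricks resource and pass it as the bearer token; a rough sketch with placeholder tenant, client, and workspace values:

```python
import requests

# Placeholders: substitute your own tenant, service principal, and workspace values.
TENANT_ID = "<tenant-id>"
CLIENT_ID = "<sp-application-id>"
CLIENT_SECRET = "<sp-secret>"
WORKSPACE_URL = "https://<databricks-instance>"

# Request an Azure AD token for the Azure Databricks resource
# (2ff814a6-3304-4ab8-85cb-cd0e6f879c1d is the global Azure Databricks application ID).
token_resp = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default",
    },
)
token_resp.raise_for_status()
aad_token = token_resp.json()["access_token"]

# Use the AAD token directly as a bearer token against the workspace REST API.
me = requests.get(
    f"{WORKSPACE_URL}/api/2.0/preview/scim/v2/Me",
    headers={"Authorization": f"Bearer {aad_token}"},
)
print(me.status_code, me.json())
```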
- 2917 Views
- 1 replies
- 0 kudos
Azure Databricks SQL Warehouse tags not being populated
In Azure cost monitor I am trying to get the list of my cluster names and the cost associated with each. It works fine for job clusters and all-purpose clusters, but not for SQL warehouses. When you look at the example below, where I am querying only for my SQL warehous...
@Wojciech_BUK I would recommend you get in touch with the Azure pricing team to check this in greater detail.
- 1171 Views
- 1 replies
- 0 kudos
How to delete or update cluster-scoped init scripts
We are working on deprecating the DBFS-based cluster-scoped init scripts and replacing them with workspace-based ones, and are therefore looking for a way to delete the DBFS cluster-scoped init scripts from all the clusters that are running in our environment using RE...
@dbx_8451 There is no direct way of doing it; you may need to automate it using the Clusters list API, as in the sketch below.
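A rough sketch of that automation, using the Clusters API to find clusters that still reference DBFS init scripts; the host, token, and replacement workspace path are placeholders:

```python
import requests

# Placeholders: substitute your workspace URL and a personal access token.
HOST = "https://<databricks-instance>"
HEADERS = {"Authorization": "Bearer <personal-access-token>"}

resp = requests.get(f"{HOST}/api/2.0/clusters/list", headers=HEADERS)
resp.raise_for_status()

for cluster in resp.json().get("clusters", []):
    # Each init_scripts entry is a dict keyed by its location type, e.g. {"dbfs": {...}}.
    dbfs_scripts = [s for s in cluster.get("init_scripts", []) if "dbfs" in s]
    if not dbfs_scripts:
        continue
    print(cluster["cluster_id"], cluster.get("cluster_name"), dbfs_scripts)
    # From here, rebuild the cluster's editable spec with each DBFS entry replaced by a
    # workspace-file entry such as {"workspace": {"destination": "/Shared/init/install.sh"}}
    # (hypothetical path) and submit it with POST /api/2.0/clusters/edit.
```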
- 1966 Views
- 1 replies
- 0 kudos
No enable option for Customize containers with Databricks Container Services on Azure and GCP
Customize containers with Databricks Container Services: I want to create a Spark cluster with the Hail container [1], but there is no Enable Container Services option in the workspace configuration for Azure Databricks and GCP Databricks. The API does not ret...
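For context, and hedged since the thread reports the API not behaving as expected on Azure and GCP, the UI toggle normally maps to the enableDcs key of the workspace-conf API; a sketch of checking and setting it (host and token are placeholders, and an admin token is required):

```python
import requests

# Placeholders: substitute your workspace URL and an admin personal access token.
HOST = "https://<databricks-instance>"
HEADERS = {"Authorization": "Bearer <admin-personal-access-token>"}

# Read the current value of the Container Services setting.
current = requests.get(
    f"{HOST}/api/2.0/workspace-conf",
    headers=HEADERS,
    params={"keys": "enableDcs"},
)
print(current.status_code, current.text)

# Attempt to enable it (admin only; may be rejected if the feature is not
# available for the workspace, which is what this thread is asking about).
update = requests.patch(
    f"{HOST}/api/2.0/workspace-conf",
    headers=HEADERS,
    json={"enableDcs": "true"},
)
print(update.status_code, update.text)
```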
- 1697 Views
- 1 replies
- 0 kudos
Change user status in account console
One of our users has status = Inactive in the admin settings. I tried to set it to Active using the REST API and the documentation, but it seems the API is not changing the status. I tested the API and I can list users, so the API settings are OK. API request: import requests i...
@alesventus Can you test the same using the CLI, or maybe try to set the status to inactive for a dummy user? Just want to validate whether the API is actually working for you or not.
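For comparison, a minimal sketch of flipping a user's active flag through the workspace SCIM Users API; the host, token, and user ID are placeholders, and the account-level SCIM endpoint takes the same PATCH shape:

```python
import requests

# Placeholders: substitute your workspace URL, token, and the target user's SCIM ID.
HOST = "https://<databricks-instance>"
HEADERS = {"Authorization": "Bearer <personal-access-token>"}
USER_ID = "<scim-user-id>"

# SCIM 2.0 PatchOp that sets the user's active flag back to true; the exact value
# encoding accepted can vary, so check the SCIM API docs if the request is rejected.
payload = {
    "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
    "Operations": [{"op": "replace", "path": "active", "value": True}],
}

resp = requests.patch(
    f"{HOST}/api/2.0/preview/scim/v2/Users/{USER_ID}",
    headers=HEADERS,
    json=payload,
)
print(resp.status_code, resp.text)
```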
- 1059 Views
- 1 replies
- 0 kudos
spark.databricks documentation
I cannot find any documentation related to spark.databricks.*. I was able to find the Spark-related documentation, but it does not contain any information on possible properties or arguments for spark.databricks in particular. Thank you!
Hello @ianc, Databricks has documentation for each product feature and its underlying Spark properties (as applicable). If you are looking for one-place documentation for all of spark.databricks.*, it is not available. However, we can help to find pr...
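One practical way to see which spark.databricks.* properties are in effect on a given cluster is to dump the Spark configuration from a notebook; note this only lists properties that have been explicitly set, not every property the runtime supports:

```python
# Run in a Databricks notebook (where `spark` is predefined): list the
# spark.databricks.* properties that are explicitly set on the current cluster.
conf_entries = spark.sparkContext.getConf().getAll()

for key, value in sorted(conf_entries):
    if key.startswith("spark.databricks."):
        print(f"{key} = {value}")
```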