- 57 Views
- 2 replies
- 1 kudos
No possibility to schedule DLT once per minute
Hello, I wanted to set up DLT to run every minute. Previously this was possible by setting "schedule": "1 * * * *" in the JSON, but now I see that it is not accepted. Is there any other workaround to make it work?
In that case we can attach the DLT pipeline to a job ... but in the JSON for DLT itself, I was still not able to do that.
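The workaround suggested above, attaching the pipeline to a job, can be sketched as a Jobs API 2.1 create-job payload. This is a minimal sketch, assuming a placeholder pipeline ID; Databricks job schedules use Quartz cron syntax with a leading seconds field, so "0 * * * * ?" fires once per minute:

```python
import json

# Sketch of a Jobs API 2.1 payload that runs a DLT pipeline every minute.
# "<your-pipeline-id>" is a hypothetical placeholder. Job schedules use
# Quartz cron expressions (seconds field first), so "0 * * * * ?" means
# "at second 0 of every minute".
job_payload = {
    "name": "dlt-every-minute",
    "tasks": [
        {
            "task_key": "run_pipeline",
            "pipeline_task": {"pipeline_id": "<your-pipeline-id>"},
        }
    ],
    "schedule": {
        "quartz_cron_expression": "0 * * * * ?",
        "timezone_id": "UTC",
    },
}

print(json.dumps(job_payload, indent=2))
```

This payload could then be POSTed to the workspace's /api/2.1/jobs/create endpoint with your usual authentication.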
- 88 Views
- 3 replies
- 2 kudos
Notebook is stuck and cluster goes into waiting state while using spark libraries
Hey, we have installed the com.databricks:spark-xml_2.12:0.18.0 library in our VNET-injected Databricks workspace to read XML files from a storage account. The notebook runs successfully for text files when the cluster is started without the library i...
Since it's a Maven dependency, it should simply be HTTP on ports 80/443. Besides, are you aware that native XML support has been included since runtime 14.3? This replaces the spark-xml library.
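As the reply notes, Databricks Runtime 14.3+ ships a native XML reader, so the spark-xml Maven library is no longer needed. A minimal sketch, assuming a notebook where a SparkSession named `spark` exists and a hypothetical rowTag of "record":

```python
# Sketch: reading XML with the native reader available in Databricks
# Runtime 14.3+, instead of installing com.databricks:spark-xml.
# "rowTag" names the XML element treated as one row; the storage path
# in the usage example below is a hypothetical placeholder.
def read_xml_native(spark, path, row_tag="record"):
    """Return a DataFrame from XML files using the built-in reader."""
    return (
        spark.read.format("xml")
        .option("rowTag", row_tag)
        .load(path)
    )

# Usage (inside a Databricks notebook, where `spark` already exists):
# df = read_xml_native(spark, "abfss://container@account.dfs.core.windows.net/xml/")
```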
- 165 Views
- 1 replies
- 1 kudos
Does Databricks Update the Default Python Libraries in Cluster Runtimes?
Hi all, I've been trying to find information about whether Databricks regularly updates the default Python libraries in their cluster runtimes. I checked two different sources but didn't find clear details. Default Python libraries in runtime 11.3 LTS ...
Yes, Databricks does that when releasing new versions of the runtime. Just compare the library lists of the other runtimes.
- 264 Views
- 3 replies
- 1 kudos
Can't login to new AWS databricks workspace
Just got our first AWS Databricks workspace created by following the instructions: https://docs.databricks.com/en/admin/workspace/create-uc-workspace.html. We adopted the customer-managed VPC and private link for the setup. Upon reviewing the network conf...
The problem was resolved after the networking team set the DNS to point to the endpoint. We can log in through the workspace link now.
- 239 Views
- 2 replies
- 0 kudos
Terraform: "Failed to get oauth access token. Please retry after logout and login again." with GCP
Hi, I'm having trouble creating a databricks_mws_vpc_endpoint with Terraform. I already created 2 Private Service Connect (PSC) endpoints and I'm trying to create the VPC endpoint for Databricks, but I'm getting this error: BAD_REQUEST: Failed to get oauth access ...
Hi @NelsonE, please try databricks auth login again. Also, could you please share some error stacks? Additionally, please share the workspace ID (send an email to community@databricks.com) and the GCP region so that we can investigate further.
- 184 Views
- 0 replies
- 0 kudos
Custom Runtime marketplace
Hi! Is it possible to share a solution accelerator on a custom runtime via the Databricks Marketplace?
- 262 Views
- 2 replies
- 1 kudos
Resolved! Delta sharing gold layer data within organisation but outside of Vnet
Hi all, The organisation I'm working for has a data engineering team using the medallion architecture and wants to share materialised views in the gold layer with members of the organisation who do not have access to the VNet, to be delivered in Power BI rep...
Thanks for that information, I will take it to the infrastructure team.
- 732 Views
- 3 replies
- 1 kudos
Session expired error
Within two minutes of logging into Databricks Community Edition, I am receiving the following error message: "Session expired. The session has ended; please log in again." I've attempted to log in again, but the message "Authenticate the session is ...
Hi @Rajatkumar002, Please review the response and let us know if it answers your question. Your feedback is valuable to us and the community. If the response resolves your issue, kindly mark it as the accepted solution. This will help close the threa...
- 276 Views
- 2 replies
- 0 kudos
Terraform databricks new feature
I recently saw a new feature in Databricks that allows you to set an upper limit on run time for jobs, so the run fails when the limit is breached. See image below: Are we able to write Terraform code for this new feature?
Hi @Rjdudley, Thanks for reaching out! Please review the response and let us know which best addresses your question. Your feedback is valuable to us and the community. If the response resolves your issue, kindly mark it as the accepted solution. T...
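Assuming the feature in question is the job-level run-time limit, it appears to map to the timeout_seconds argument of the databricks_job resource in the Databricks Terraform provider. A hedged sketch; the resource name and notebook path are hypothetical:

```hcl
# Sketch, assuming the Databricks Terraform provider's databricks_job
# resource: the job-level run-time limit maps to "timeout_seconds".
# Runs that exceed it are cancelled and marked as timed out.
resource "databricks_job" "example" {
  name            = "nightly-etl"
  timeout_seconds = 3600 # fail the run if it exceeds one hour

  task {
    task_key = "main"
    notebook_task {
      notebook_path = "/Workspace/etl/main"
    }
  }
}
```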
- 228 Views
- 0 replies
- 0 kudos
How to assign a user group for email notifications in Databricks Alerts
How can I assign an Azure Databricks user group to an alert for notifications? The current scenario is that whenever we need to add a user for alert email notifications, we manually add that user's email address to each alert we set up (more than 100), which is very ...
- 215 Views
- 1 replies
- 0 kudos
Account Access token
Can you please help me with how to authenticate account-level REST APIs? curl --request GET https://accounts.cloud.databricks.com/api/2.0/accounts/{account_id}/usage I use a Permanent Access Token to authenticate workspace APIs, but it's not working for w...
Hi @shubhdhanuka, to get access to the account API you cannot use a personal access token. Instead, you must use the Microsoft Entra ID tokens of Azure Databricks account admins. Azure Databricks account admins can be users or service principals. Below you ...
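Following the reply above: for an Azure Databricks account, the account-level endpoint is accounts.azuredatabricks.net and requires a Microsoft Entra ID token for an account admin rather than a workspace personal access token. A minimal sketch; the account ID and token are placeholders (the token could come from, e.g., az account get-access-token with the well-known Azure Databricks resource ID 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d):

```python
import urllib.request

# Sketch: calling an Azure Databricks account-level API with a
# Microsoft Entra ID bearer token. Account ID, path, and token are
# placeholders supplied by the caller.
def account_api_url(account_id, path):
    """Build the account-level API URL for an Azure Databricks account."""
    return f"https://accounts.azuredatabricks.net/api/2.0/accounts/{account_id}{path}"

def account_api_request(account_id, entra_token, path):
    """Issue a GET against the account API, authenticating with the token."""
    req = urllib.request.Request(
        account_api_url(account_id, path),
        headers={"Authorization": f"Bearer {entra_token}"},
    )
    return urllib.request.urlopen(req)

# Usage (not executed here, requires a real account ID and admin token):
# resp = account_api_request(account_id, token, "/usage")
```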
- 272 Views
- 2 replies
- 1 kudos
Databricks + Apache Iceberg = advantageous or wasted effort due to duplicate functionality ?
Trying to design a Lakehouse. Spark is at the base layer. Now wondering whether adding Apache Iceberg below Spark will be of help or not. Preferring Iceberg for its auto indexing and ACID query facilities over big heterogeneous datasets. Wonder if i...
Hello, if you're planning on building your own open-source stack of Spark + Iceberg, it can be a good choice. If you're on Databricks, however, you're going to miss out a *lot* on Delta features that are baked into the platform. Specifically, compute +...
- 299 Views
- 1 replies
- 0 kudos
Resolved! Prevent service principal UUID from appearing on job name
Hello! I am using a service principal ID to authenticate my Databricks bundle. But when the job runs, this ID is automatically appended to both the name and the tags column on the job runs page. In my databricks.yml file I have name: "[${var.environment}]" ...
Hi! Sounds like "development" mode. DAB will automatically prefix your job name with <env> <user name> if you set "mode" to "development" in the databricks.yml file. The name lookup for service principals apparently doesn't work nicely, and you get ...
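The behaviour described above can be controlled per target in databricks.yml. A sketch, assuming Databricks Asset Bundles: in "development" mode deployed resource names get a "[<target> <deploying identity>]" prefix, while "production" mode deploys names as written, avoiding the service-principal UUID prefix. The host URL is a placeholder:

```yaml
# Sketch of a databricks.yml target block (Databricks Asset Bundles).
# mode: development -> names are prefixed with "[<target> <identity>]"
# mode: production  -> names are deployed exactly as written
targets:
  prod:
    mode: production
    workspace:
      host: https://<your-workspace>.azuredatabricks.net # placeholder
```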
- 246 Views
- 2 replies
- 2 kudos
List deployed Databricks asset bundles (DABs)?
Is there a Databricks CLI command or REST API to list all the DABs that have been deployed to a workspace?
Hi @Schofield, thank you for reaching out to our community! We're here to help you. To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your fee...
- 2049 Views
- 2 replies
- 3 kudos
Unable to read resources - Unsupported Protocol Scheme (Terraform AWS)
Hello everyone! Over the last few weeks my company has been trying to deploy a Databricks workspace on AWS adapted to the customer's needs, using Terraform. To do this, we started from base code in Databricks' own GitHub (https://github.com/databrick...
What's the solution for this? Facing the same issue.
- Access control (1)
- ActiveDirectory (1)
- AmazonKMS (1)
- Apache spark (1)
- App (1)
- Availability (1)
- Availability Zone (1)
- AWS (5)
- Aws databricks (1)
- AZ (1)
- Azure (8)
- Azure Data Lake Storage (1)
- Azure databricks (6)
- Azure databricks workspace (1)
- Best practice (1)
- Best Practices (2)
- Billing (2)
- Bucket (1)
- Cache (1)
- Change (1)
- Checkpoint (1)
- Checkpoint Path (1)
- Cluster (1)
- Cluster Pools (1)
- Clusters (1)
- ClustersJob (1)
- Compliance (1)
- Compute Instances (1)
- Cost (1)
- Credential passthrough (1)
- Data (1)
- Data Ingestion & connectivity (6)
- Data Plane (1)
- Databricks Account (1)
- Databricks Control Plane (1)
- Databricks Error Message (2)
- Databricks Partner (1)
- Databricks Repos (1)
- Databricks Runtime (1)
- Databricks SQL (3)
- Databricks SQL Dashboard (1)
- Databricks workspace (1)
- DatabricksJobs (1)
- DatabricksLTS (1)
- DBFS (1)
- DBR (3)
- Dbt (1)
- Dbu (3)
- Deep learning (1)
- DeleteTags Permissions (1)
- Delta (4)
- Delta Sharing (1)
- Delta table (1)
- Dev (1)
- Different Instance Types (1)
- Disaster recovery (1)
- DisasterRecoveryPlan (1)
- DLT Pipeline (1)
- EBS (1)
- Email (2)
- External Data Sources (1)
- Feature (1)
- GA (1)
- Ganglia (3)
- Ganglia Metrics (2)
- GangliaMetrics (1)
- GCP (1)
- GCP Support (1)
- Gdpr (1)
- Gpu (2)
- Group Entitlements (1)
- HIPAA (1)
- Hyperopt (1)
- Init script (1)
- InstanceType (1)
- Integrations (1)
- IP Addresses (1)
- IPRange (1)
- Job (1)
- Job Cluster (1)
- Job clusters (1)
- Job Run (1)
- JOBS (1)
- Key (1)
- KMS (1)
- KMSKey (1)
- Lakehouse (1)
- Limit (1)
- Live Table (1)
- Log (2)
- LTS (3)
- Metrics (1)
- MFA (1)
- ML (1)
- Model Serving (1)
- Multiple workspaces (1)
- Notebook Results (1)
- Okta (1)
- On-premises (1)
- Partner (36)
- Pools (1)
- Premium Workspace (1)
- Public Preview (1)
- Redis (1)
- Repos (1)
- Rest API (1)
- Root Bucket (2)
- SCIM API (1)
- Security (1)
- Security Group (1)
- Security Patch (1)
- Service principal (1)
- Service Principals (1)
- Single User Access Permission (1)
- Sns (1)
- Spark (1)
- Spark-submit (1)
- Spot instances (1)
- SQL (1)
- Sql Warehouse (1)
- Sql Warehouse Endpoints (1)
- Ssh (1)
- Sso (2)
- Streaming Data (1)
- Subnet (1)
- Sync Users (1)
- Tags (1)
- Team Members (1)
- Thrift (1)
- TODAY (1)
- Track Costs (1)
- Unity Catalog (1)
- Use (1)
- User (1)
- Version (1)
- Vulnerability Issue (1)
- Welcome Email (1)
- Workspace (2)
- Workspace Access (1)