- 2612 Views
- 1 replies
- 1 kudos
Resolved! Run tasks conditionally "Always" condition missing?
Does the new 'Run If' feature, which lets you run tasks conditionally, lack an 'ALWAYS' option, i.e. a way to execute the task both when the dependencies succeed and when they fail?
- 1 kudos
You can choose the All Done option to run the task in both scenarios.
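For reference, this conditional execution maps to the run_if field on a task, and ALL_DONE runs the task once all of its dependencies have finished, whether they succeeded or failed. A minimal sketch using the Databricks Python SDK, with placeholder notebook paths and cluster ID:

```python
# Minimal sketch using the Databricks Python SDK (databricks-sdk); the notebook
# paths and cluster ID are placeholders, not values from this thread.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()

w.jobs.create(
    name="run-if-all-done-example",
    tasks=[
        jobs.Task(
            task_key="ingest",
            existing_cluster_id="<cluster-id>",
            notebook_task=jobs.NotebookTask(notebook_path="/Workspace/example/ingest"),
        ),
        jobs.Task(
            task_key="cleanup",
            existing_cluster_id="<cluster-id>",
            notebook_task=jobs.NotebookTask(notebook_path="/Workspace/example/cleanup"),
            depends_on=[jobs.TaskDependency(task_key="ingest")],
            # ALL_DONE runs this task once "ingest" finishes, regardless of its result.
            run_if=jobs.RunIf.ALL_DONE,
        ),
    ],
)
```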
- 6656 Views
- 2 replies
- 3 kudos
How to send alert when cluster is running for too long
Hello, our team recently experienced an issue where a teammate started a new workflow job and then went on vacation. This job ended up running continuously without failing for 4.5 days. The usage of the cluster did not seem out of place during the workday...
- 3 kudos
@Retired_mod, I ended up creating a job leveraging the Databricks Python SDK to check cluster and active job run times. The script will raise an error and notify the team if the cluster hasn't terminated or restarted in the past 24 hours or if a job h...
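The reply above describes the script without showing it; a minimal sketch of that kind of check with the Databricks Python SDK follows. The 24-hour threshold comes from the reply, while the exact cluster timestamp fields and the raise-to-notify pattern are assumptions rather than the author's actual code:

```python
# Sketch only: assumes databricks-sdk is installed and authentication is configured.
import time

from databricks.sdk import WorkspaceClient

MAX_AGE_MS = 24 * 60 * 60 * 1000  # 24 hours, in milliseconds
now_ms = int(time.time() * 1000)
w = WorkspaceClient()

offenders = []

# Clusters that have been up (without a restart) for more than 24 hours.
for cluster in w.clusters.list():
    started = getattr(cluster, "last_restarted_time", None) or cluster.start_time
    if cluster.state and cluster.state.name == "RUNNING" and started and now_ms - started > MAX_AGE_MS:
        offenders.append(f"cluster {cluster.cluster_name}")

# Job runs that have been active for more than 24 hours.
for run in w.jobs.list_runs(active_only=True):
    if run.start_time and now_ms - run.start_time > MAX_AGE_MS:
        offenders.append(f"job run {run.run_id}")

if offenders:
    # Failing the monitoring job triggers its own failure notifications to the team.
    raise RuntimeError("Running longer than 24 hours: " + ", ".join(offenders))
```

Scheduling this as its own Databricks job with failure notifications is one way to get the alerting behavior the reply describes.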
- 1361 Views
- 0 replies
- 0 kudos
Unable to connect to the Databricks compute server when making the first request
Hello all, I am facing an issue when making the first request to the Databricks compute server. It takes a very long time, and the response is a 504 Gateway Timeout error. Could anyone please suggest the best way to handle this?
- 989 Views
- 0 replies
- 0 kudos
DBX Sync Command --unmatched-behaviour=unspecified-delete-unmatched not working
We are using the dbx command to sync objects from the local machine to the Databricks workspace, using the command below: dbx sync workspace --unmatched-behaviour=unspecified-delete-unmatched -s /tmp -d /tmp. We have deleted some files loca...
- 679 Views
- 0 replies
- 0 kudos
Analyze data methodology
Hello, I have an ETL process that ingests data into bronze tables, transforms the data, and then ingests it into silver tables before finally populating the gold tables. This workflow is executed every 5 minutes. When I want to analyze the data or app...
- 929 Views
- 0 replies
- 0 kudos
Unable to login using company email - help migrate account please
Hi, originally I accidentally made a Customer Academy account with my company, which is a Databricks partner. Then I made an account using my personal email and listed my company email as the partner email for the Partner Academy account. That account ...
- 1073 Views
- 1 replies
- 0 kudos
I can't access my account
Hi, I can't access my account and need to book an exam. I completed my registration at https://www.webassessor.com/form/createAccount.do, and when I try to log in I get this error: "Login or Password is incorrect". Please help me with this issue. ...
- 0 kudos
Hi @DavidValdez, it looks like you were able to schedule your exam. If you experience any other issues you can request support here. We also have a new FAQ: https://www.databricks.com/learn/certification/faq
- 12374 Views
- 1 replies
- 0 kudos
Accessing TenantId via secret to connect to Azure Data Lake Storage Gen2 doesn't work
Hello, I'm following the instructions in this article to connect to ADLS Gen2 using an Azure service principal. I can access the service principal's app ID and secret via a Databricks Key Vault-backed secret scope. However, this doesn't work for the directory-id and I...
- 0 kudos
Hi @Retired_mod, thanks for the prompt reply. As per the document, the syntax for accessing keys from a secret scope in the Spark config is the text highlighted in red below. I used the same for the app ID too, and that works. But if I use the same syntax for ...
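One workaround for the directory-id case discussed in this thread is to read the values with dbutils.secrets.get in notebook code and build the OAuth endpoint URL yourself, since the {{secrets/scope/key}} substitution applies to a whole Spark config value rather than to part of one. A sketch for a Databricks notebook (where spark and dbutils are available); the scope, key names, and storage account are placeholders:

```python
# Sketch only: scope/key names and the storage account are placeholders.
storage_account = "<storage-account>"
app_id = dbutils.secrets.get(scope="kv-scope", key="sp-app-id")
client_secret = dbutils.secrets.get(scope="kv-scope", key="sp-secret")
tenant_id = dbutils.secrets.get(scope="kv-scope", key="sp-tenant-id")  # directory (tenant) ID

spark.conf.set(f"fs.azure.account.auth.type.{storage_account}.dfs.core.windows.net", "OAuth")
spark.conf.set(
    f"fs.azure.account.oauth.provider.type.{storage_account}.dfs.core.windows.net",
    "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
)
spark.conf.set(f"fs.azure.account.oauth2.client.id.{storage_account}.dfs.core.windows.net", app_id)
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{storage_account}.dfs.core.windows.net", client_secret)
spark.conf.set(
    f"fs.azure.account.oauth2.client.endpoint.{storage_account}.dfs.core.windows.net",
    # The directory ID only appears inside this URL, so it is read as a plain secret here.
    f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
)
```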
- 1268 Views
- 1 replies
- 0 kudos
Snowflake Data Formatting Issue
I'm loading Snowflake data into Delta tables in Databricks. A few columns in the Snowflake data have the data type NUMBER(20,7); after loading into the Delta table they come through as DECIMAL(20,7). For example, if the value is 0.0000000 in Snowflake then it is showing...
- 0 kudos
Explicit casting seems like the way to go. First try with one column to see if that solves your issue. If so, you can write a function that casts all decimal columns to a certain precision, something like this: def convert_decimal_precision_scale(df, p...
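The snippet above is cut off. As a rough sketch (not the author's original code, and the parameter names after df are guesses), such a helper in PySpark might look like this:

```python
from pyspark.sql.functions import col
from pyspark.sql.types import DecimalType

def convert_decimal_precision_scale(df, precision=20, scale=7):
    """Cast every decimal column in df to DecimalType(precision, scale)."""
    for field in df.schema.fields:
        if isinstance(field.dataType, DecimalType):
            df = df.withColumn(field.name, col(field.name).cast(DecimalType(precision, scale)))
    return df
```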
- 2698 Views
- 4 replies
- 1 kudos
Why is importing Python code supported in Repos but not in Workspaces?
Hi, we currently use a one-repo approach which does not require a local development environment (we utilize Azure DevOps and Nutter for automated tests). We also have shared code across pipelines and started with %run-style modularization and have ...
- 1 kudos
The "why" is most probably because of different development tracks/teams between the workspace and Repos. Whether they will consolidate in functionality? Can't tell, only Databricks knows that; but it seems reasonable to assume the files will also be added to w...
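To illustrate the two modularization styles mentioned in this thread: %run executes another notebook inline, while in a Repo plain .py files can be imported as modules because the repository root is added to sys.path automatically. A small, runnable check; the shared/utils.py layout named in the comments is hypothetical:

```python
import sys

# In a notebook inside a Repo, the repo root (a path under /Workspace/Repos/...)
# is prepended to sys.path automatically, which is what makes imports such as
# `from shared.utils import load_config` work when shared/utils.py exists in the
# repo (hypothetical layout, not taken from this thread).
print([p for p in sys.path if "/Workspace/Repos/" in p])

# The workspace-only alternative discussed in the thread is notebook inclusion:
#   %run ./shared_utils_notebook
# which runs the other notebook inline rather than importing it as a module.
```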
- 1701 Views
- 3 replies
- 0 kudos
hive_metastore Access Control by different cluster type
Hello Databricks Community, I'm reaching out with a query regarding access control in the hive_metastore. I've encountered behavior that I'd like to understand better and potentially address. To illustrate the situation: I've set up three users for test...
- 0 kudos
Hi @Debayan, thank you for your reply. With hive_metastore I still cannot get that level of isolation, which means that if anyone starts a Single Node cluster, they can see every catalog, schema, and table. However, with Unity Catalog appli...
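For context on the behavior described above: legacy hive_metastore table ACLs are only enforced on clusters that have table access control enabled (for example, Shared access mode); on single-user or no-isolation clusters the grants are simply not enforced, which is the gap Unity Catalog closes. A small illustration of the legacy GRANT syntax, run from a Databricks notebook; the database, table, and principal are made up:

```python
# Legacy hive_metastore table ACLs; these grants are only enforced on clusters
# with table access control enabled. Database, table, and principal are
# illustrative placeholders.
spark.sql("GRANT USAGE ON DATABASE finance TO `analyst@example.com`")
spark.sql("GRANT SELECT ON TABLE finance.transactions TO `analyst@example.com`")
```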
- 1624 Views
- 0 replies
- 0 kudos
Workflows pricing
Hi there, I checked the Databricks page on the pricing of Databricks Workflows (https://www.databricks.com/product/pricing/jobs) and have a question regarding the cost components: the pricing page only mentions compute costs (depending on whether it's ...
- 2154 Views
- 0 replies
- 0 kudos
Databricks community edition and cloud platform
I have been using Databricks at work for a few years now and absolutely love it. I have been wanting to use it at home, but I don't have a ton of money to spend. Can any community members advise a good cloud option to run Databricks on that's relatively ...
- 4354 Views
- 0 replies
- 0 kudos
How to resolve the "Error pushing changes: Remote ref update was rejected" issue
How do I resolve the "Error pushing changes: Remote ref update was rejected" issue, even after having full edit access on the remote ADO repo?
- 881 Views
- 0 replies
- 0 kudos
Thinking of Building a new generative AI speech model for personal use
I have been using a voice-cloning AI for quite some time now; it works pretty well, and that made me wonder whether I could use Databricks to build and train a machine learning model for the speech technology industry. At first I'm trying to apply it ...