- 1136 Views
- 5 replies
- 0 kudos
Does serverless compute require a cloud account (AWS, Google, Azure)?
I am a Databricks beginner, and I would like to ask: if compute is created in a Databricks account, does it also exist in the cloud account (e.g., AWS)? If the AWS account is deactivated, will the existing compute become unusable? This is what I h...
- 0 kudos
@FanMichelleTW No, Databricks recommends using serverless compute, and you can use serverless compute as well. To do so, open a notebook and check the top-right corner to see if a serverless compute option is in a Ready state. If it is, simply select ...
- 4228 Views
- 4 replies
- 4 kudos
Creating a hierarchy without recursive statements
I am looking to build a hierarchy from a parent child relationship table, which I would typically use a recursive statement for in SQL Server / Azure SQL. This would mean setting an anchor, most commonly the top record of the tree, and then join back...
- 4 kudos
Thank you, I will give this a try. I'll let you know how it goes.
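Since the suggested approach is truncated above, here is a minimal sketch of the usual non-recursive workaround in PySpark: iteratively self-join the parent-child table, adding one level per pass, until no new rows appear. The table and column names (edges, id, parent_id) are assumptions for illustration.

```python
# Hypothetical parent-child table: one row per node, parent_id is NULL for roots.
from pyspark.sql import functions as F

edges = spark.table("edges").select("id", "parent_id")

# Level 0: the anchor, i.e. the roots of the tree
hierarchy = (edges.filter(F.col("parent_id").isNull())
                  .select("id", F.lit(0).alias("level")))

frontier = hierarchy
while True:
    # Nodes whose parent sits in the current frontier are one level deeper
    next_level = (edges.alias("e")
                  .join(frontier.alias("f"), F.col("e.parent_id") == F.col("f.id"))
                  .select(F.col("e.id").alias("id"),
                          (F.col("f.level") + 1).alias("level")))
    if next_level.isEmpty():   # fixed point reached: no deeper nodes
        break
    hierarchy = hierarchy.unionByName(next_level)
    frontier = next_level
```

This assumes the hierarchy is acyclic; a cycle would loop forever, so cap the number of iterations if that is not guaranteed.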
- 1515 Views
- 4 replies
- 0 kudos
Error reading XML data using the spark-xml library
Hi, I would appreciate any help with an error when loading an XML file with the spark-xml library. My environment: 14.3 LTS (includes Apache Spark 3.5.0, Scala 2.12); library: com.databricks:spark-xml_2.12:0.15.0; running in a Databricks notebook. When running this scrip...
- 0 kudos
UPDATE: It is now possible to read XML files natively: https://docs.databricks.com/en/query/formats/xml.html. Make sure to update your Databricks Runtime to 14.3 or above, and remove the spark-xml Maven library from your cluster.
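For reference, a minimal sketch of the native reader mentioned in the update; the file path and rowTag value are placeholders for illustration.

```python
# Native XML ingestion on Databricks Runtime 14.3+; no Maven library required.
df = (spark.read
      .format("xml")
      .option("rowTag", "record")   # XML element that becomes one DataFrame row
      .load("/Volumes/main/default/raw/data.xml"))
df.printSchema()
```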
- 560 Views
- 1 replies
- 0 kudos
Handling Over-Usage of Capacity in Databricks Jobs/Processes
Hi all, is there a tool or method in Databricks to ensure data integrity and stability when a job or process exceeds the allocated capacity? Specifically, I'm looking for ways to: prevent failures or data loss due to resource overuse; automatically scal...
- 0 kudos
Hello @smanda88 - For point 1, please see: https://docs.databricks.com/en/lakehouse-architecture/reliability/best-practices.html For point 2, you can use auto-scaling; please refer to: https://docs.databricks.com/en/lakehouse-architecture/cost-optimization...
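For the auto-scaling point, the lever is the autoscale block in the cluster spec, which lets Databricks grow the cluster under load instead of failing at fixed capacity. A minimal sketch; the node type and worker counts are placeholders.

```python
# Hypothetical job-cluster spec with autoscaling (usable in a Jobs API payload).
new_cluster = {
    "spark_version": "14.3.x-scala2.12",
    "node_type_id": "i3.xlarge",      # pick a type available in your cloud/region
    "autoscale": {
        "min_workers": 2,             # baseline capacity
        "max_workers": 8,             # ceiling the cluster may scale up to
    },
}
```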
- 547 Views
- 1 replies
- 0 kudos
Where to find Jupyter Notebook course materials for Get Started with Databricks for Generative AI
Hello, I can't seem to find any way to gain access to the Jupyter Notebook demo source for the "Get Started with Databricks for Generative AI" course. Please help. Thank you kindly in advance.
- 0 kudos
Hello @nathanmle! We are sorry to inform you that we are no longer offering notebooks or the DBC files for the self-paced courses due to recent changes. If you're interested in working on labs in a provided Databricks environment, you can purchase the...
- 2958 Views
- 0 replies
- 0 kudos
Databricks CE - Where is the quickstart tutorial?
Hello! I was looking through Databricks tutorials online, but my interface looks different from many of the videos I'm seeing. What happened to the Quickstart tutorials on the home page? Are they no longer available on the dashboard?
- 1927 Views
- 1 replies
- 2 kudos
Databricks asset bundles dependencies
Is anyone aware of a way to include a requirements.txt within the job definition of a Databricks Asset Bundle? The documentation mentions how to declare dependencies in workspace files or Unity Catalog volumes, but I wanted to ask if it is possible to decl...
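One possibility, sketched below under the assumption that your Databricks CLI version supports the requirements library entry in job task libraries (a relatively recent addition to the Jobs library spec); all paths and names are placeholders. If your version predates it, listing pypi entries per package is the fallback.

```yaml
# databricks.yml fragment (hypothetical names/paths): attach a requirements.txt
# to a job task through the task's libraries list.
resources:
  jobs:
    my_job:
      name: my-job
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ../src/main_notebook.py
          libraries:
            - requirements: /Workspace/Shared/my_project/requirements.txt
```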
- 3289 Views
- 1 replies
- 0 kudos
Error uploading files to a Unity Catalog volume in Databricks
Hi everyone, I'm developing a Flask API that interacts with Databricks to upload files to a Unity Catalog volume, but I'm encountering the following error: {"error_code": "ENDPOINT_NOT_FOUND", "message": "No API was found for 'POST /unity-catalo...
- 0 kudos
Hello @EngHol, this endpoint, /api/2.0/unity-catalog/volumes/upload, is not a valid one, hence the error. Looking at the Volumes API, it has no upload endpoint: https://docs.databricks.com/api/workspace/volumes. File uploads to a volume go through the Files API instead.
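A minimal sketch of uploading a local file to a volume with the Files API; the host, token, and paths are placeholders for illustration.

```python
# Upload a local file to a Unity Catalog volume via the Files API:
# PUT /api/2.0/fs/files/{absolute_volume_path}
import requests

host = "https://<workspace-host>"            # placeholder
token = "<personal-access-token>"            # placeholder
volume_path = "/Volumes/main/default/landing/report.csv"  # placeholder

with open("report.csv", "rb") as f:
    resp = requests.put(
        f"{host}/api/2.0/fs/files{volume_path}",
        headers={"Authorization": f"Bearer {token}"},
        params={"overwrite": "true"},
        data=f,                               # raw bytes as the request body
    )
resp.raise_for_status()
```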
- 801 Views
- 2 replies
- 0 kudos
Hide function definition in Unity Catalog
Hi, I have created a function that anonymizes user IDs using a secret. I want to give other users access to this function so they can execute it without having access to the secret. Is this possible in Databricks? I have tested it and see the user is not able ...
- 0 kudos
Hi @NehaR, I am afraid it might not be possible without giving secret access to the users. Another approach would be to use a Service Principal.
- 3115 Views
- 0 replies
- 0 kudos
CloudFormation Stack Failure: Custom::CreateWorkspace in CREATE_FAILED State
I am trying to create a workspace using AWS CloudFormation, but the stack fails with the following error:"The resource CreateWorkspace is in a CREATE_FAILED state. This Custom::CreateWorkspace resource is in a CREATE_FAILED state. Received response s...
- 930 Views
- 1 replies
- 1 kudos
Resolved! Container lifetime?
When launching a job via "Create and trigger a one-time run" (docs), when using a custom image (docs), what's the lifetime of the container? Does it create the cluster, start the container, run the job, then terminate the container? Or does the runni...
- 1 kudos
Hi @mrstevegross. Cluster creation: when you submit a job using the "Create and trigger a one-time run" API, a new cluster is created if an existing one is not specified. Container start: the custom Docker image specified in the cluster configuration is us...
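A minimal sketch of such a one-time run with a custom image, to make the sequence above concrete; the host, token, image URL, and notebook path are placeholders. The cluster, and therefore its container, exists only for the duration of the run.

```python
# One-time run (POST /api/2.1/jobs/runs/submit) on a fresh cluster that boots
# from a custom Docker image; everything is torn down when the run finishes.
import requests

host = "https://<workspace-host>"        # placeholder
token = "<personal-access-token>"        # placeholder

payload = {
    "run_name": "one-time-custom-image-run",
    "tasks": [{
        "task_key": "main",
        "notebook_task": {"notebook_path": "/Users/me@example.com/my_notebook"},
        "new_cluster": {
            "spark_version": "14.3.x-scala2.12",
            "node_type_id": "i3.xlarge",
            "num_workers": 1,
            "docker_image": {"url": "myregistry/myimage:latest"},
        },
    }],
}
resp = requests.post(f"{host}/api/2.1/jobs/runs/submit",
                     headers={"Authorization": f"Bearer {token}"},
                     json=payload)
resp.raise_for_status()
print(resp.json()["run_id"])
```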
- 3211 Views
- 0 replies
- 0 kudos
DLT detecting changes but not applying them
We have three source tables used for a streaming dimension table in silver. Around 50K records are changed in one of the source tables, and the DLT pipeline shows that it has updated those 50K records, but they remain unchanged. The only way to pick ...
- 853 Views
- 3 replies
- 0 kudos
No confirmation mail received after scheduling the exam
Hi team, I have scheduled my Databricks Data Engineer Associate exam for 12th Feb 2025 using the email ID below, but I still have not received any confirmation email. I have checked the spam folder too. Could you please resend it to barnitac@kpmg.com ...
- 0 kudos
Hi team, I have cleared my exam today. Unfortunately, I have not received a single email either confirming my exam or confirming test completion and my result. @Cert-Team
- 896 Views
- 2 replies
- 1 kudos
Partnership with Databricks
Hello, what are the prerequisites to become a Databricks partner?
- 1 kudos
Hello @Bala_K! For information on becoming a Databricks partner, please email partnerops@Databricks.com. They can guide you through the prerequisites and next steps.
- 3044 Views
- 0 replies
- 0 kudos
How to Define Constants at Bundle Level in Databricks Asset Bundles for Use in Notebooks?
I'm working with Databricks Asset Bundles and need to define constants at the bundle level based on the target environment. These constants will be used inside Databricks notebooks. For example, I want a constant gold_catalog to take different values ...
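Although the thread has no replies, one common pattern is a bundle variable overridden per target and handed to the notebook as a base parameter. A minimal sketch; every name and value below is a placeholder.

```yaml
# databricks.yml (hypothetical names): gold_catalog varies by target and is
# passed to the notebook as a job parameter.
bundle:
  name: my_bundle

variables:
  gold_catalog:
    description: Catalog for gold tables
    default: gold_dev

targets:
  dev:
    variables:
      gold_catalog: gold_dev
  prod:
    variables:
      gold_catalog: gold_prod

resources:
  jobs:
    etl:
      name: etl-${bundle.target}
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./src/etl_notebook.py
            base_parameters:
              gold_catalog: ${var.gold_catalog}
```

Inside the notebook, the value then arrives as a widget: dbutils.widgets.get("gold_catalog").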
Labels:
- .CSV (1)
- Access Data (2)
- Access Databricks (1)
- Access Delta Tables (2)
- Account reset (1)
- ADF Pipeline (1)
- ADLS Gen2 With ABFSS (1)
- Advanced Data Engineering (1)
- AI (1)
- Analytics (1)
- Apache spark (1)
- Apache Spark 3.0 (1)
- API Documentation (3)
- Architecture (1)
- asset bundle (1)
- Asset Bundles (3)
- Auto-loader (1)
- Autoloader (4)
- AWS (3)
- AWS security token (1)
- AWSDatabricksCluster (1)
- Azure (5)
- Azure data disk (1)
- Azure databricks (14)
- Azure Databricks SQL (6)
- Azure databricks workspace (1)
- Azure Unity Catalog (5)
- Azure-databricks (1)
- AzureDatabricks (1)
- AzureDevopsRepo (1)
- Big Data Solutions (1)
- Billing (1)
- Billing and Cost Management (1)
- Blackduck (1)
- Bronze Layer (1)
- Certification (3)
- Certification Exam (1)
- Certification Voucher (3)
- CICDForDatabricksWorkflows (1)
- Cloud_files_state (1)
- CloudFiles (1)
- Cluster (3)
- Cluster Init Script (1)
- Community Edition (3)
- Community Event (1)
- Community Group (2)
- Community Members (1)
- Compute (3)
- Compute Instances (1)
- conditional tasks (1)
- Connection (1)
- Contest (1)
- Credentials (1)
- Custom Python (1)
- CustomLibrary (1)
- Data (1)
- Data + AI Summit (1)
- Data Engineering (3)
- Data Explorer (1)
- Data Ingestion & connectivity (1)
- Databrick add-on for Splunk (1)
- databricks (2)
- Databricks Academy (1)
- Databricks AI + Data Summit (1)
- Databricks Alerts (1)
- Databricks Assistant (1)
- Databricks Certification (1)
- Databricks Cluster (2)
- Databricks Clusters (1)
- Databricks Community (10)
- Databricks community edition (3)
- Databricks Community Edition Account (1)
- Databricks Community Rewards Store (3)
- Databricks connect (1)
- Databricks Dashboard (3)
- Databricks delta (2)
- Databricks Delta Table (2)
- Databricks Demo Center (1)
- Databricks Documentation (3)
- Databricks genAI associate (1)
- Databricks JDBC Driver (1)
- Databricks Job (1)
- Databricks Lakehouse Platform (6)
- Databricks Migration (1)
- Databricks Model (1)
- Databricks notebook (2)
- Databricks Notebooks (3)
- Databricks Platform (2)
- Databricks Pyspark (1)
- Databricks Python Notebook (1)
- Databricks Repo (1)
- Databricks Runtime (1)
- Databricks SQL (5)
- Databricks SQL Alerts (1)
- Databricks SQL Warehouse (1)
- Databricks Terraform (1)
- Databricks UI (1)
- Databricks Unity Catalog (4)
- Databricks Workflow (2)
- Databricks Workflows (2)
- Databricks workspace (3)
- Databricks-connect (1)
- databricks_cluster_policy (1)
- DatabricksJobCluster (1)
- DataCleanroom (1)
- DataDays (1)
- Datagrip (1)
- DataMasking (2)
- DataVersioning (1)
- dbdemos (2)
- DBFS (1)
- DBRuntime (1)
- DBSQL (1)
- DDL (1)
- Dear Community (1)
- deduplication (1)
- Delt Lake (1)
- Delta Live Pipeline (3)
- Delta Live Table (5)
- Delta Live Table Pipeline (5)
- Delta Live Table Pipelines (4)
- Delta Live Tables (7)
- Delta Sharing (2)
- deltaSharing (1)
- Deny assignment (1)
- Development (1)
- Devops (1)
- DLT (10)
- DLT Pipeline (7)
- DLT Pipelines (5)
- Dolly (1)
- Download files (1)
- Dynamic Variables (1)
- Engineering With Databricks (1)
- env (1)
- ETL Pipelines (1)
- External Sources (1)
- External Storage (2)
- FAQ for Databricks Learning Festival (2)
- Feature Store (2)
- Filenotfoundexception (1)
- Free trial (1)
- GCP Databricks (1)
- GenAI (1)
- Getting started (2)
- Google Bigquery (1)
- HIPAA (1)
- import (1)
- Integration (1)
- JDBC Connections (1)
- JDBC Connector (1)
- Job Task (1)
- Lineage (1)
- LLM (1)
- Login (1)
- Login Account (1)
- Machine Learning (2)
- MachineLearning (1)
- Materialized Tables (2)
- Medallion Architecture (1)
- Migration (1)
- ML Model (2)
- MlFlow (2)
- Model Training (1)
- Module (1)
- Networking (1)
- Notebook (1)
- Onboarding Trainings (1)
- Pandas udf (1)
- Permissions (1)
- personalcompute (1)
- Pipeline (2)
- Plotly (1)
- PostgresSQL (1)
- Pricing (1)
- Pyspark (1)
- Python (5)
- Python Code (1)
- Python Wheel (1)
- Quickstart (1)
- Read data (1)
- Repos Support (1)
- Reset (1)
- Rewards Store (2)
- Schedule (1)
- Serverless (3)
- serving endpoint (1)
- Session (1)
- Sign Up Issues (2)
- Spark Connect (1)
- sparkui (2)
- Splunk (2)
- SQL (8)
- Summit23 (7)
- Support Tickets (1)
- Sydney (2)
- Table Download (1)
- Tags (1)
- terraform (1)
- Training (2)
- Troubleshooting (1)
- Unity Catalog (4)
- Unity Catalog Metastore (2)
- Update (1)
- user groups (1)
- Venicold (3)
- Voucher Not Recieved (1)
- Watermark (1)
- Weekly Documentation Update (1)
- Weekly Release Notes (2)
- Women (1)
- Workflow (2)
- Workspace (3)