- 1746 Views
- 5 replies
- 0 kudos
Building a Custom Usage Dashboard using APIs for Job-Level Cost Insights
Since Databricks does not provide individual cost breakdowns for components like Jobs or Compute, we aim to create a custom usage dashboard leveraging APIs to display the cost of each job run across Databricks, Azure Data Factory (ADF), or serverless...
- 0 kudos
Hey, yes. I am not an Azure expert, but the Databricks REST API can help you extract usage data for serverless resources, allowing you to integrate this information into custom dashboards or external tools like Grafana. On the Azure side, costs related to wil...
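The reply above points at the REST API; an often simpler route for a custom job-cost dashboard is to query the billing system tables from a SQL warehouse. A minimal sketch, assuming the system tables (system.billing.usage, system.billing.list_prices) are enabled, the databricks-sql-connector package is installed, and the hostname, HTTP path, and token below are placeholders:

```python
# Minimal sketch: per-job DBU usage and list-price cost for the last 30 days.
# Assumptions: system tables enabled, databricks-sql-connector installed,
# placeholder hostname / warehouse HTTP path / personal access token.
from databricks import sql  # pip install databricks-sql-connector

JOB_COST_QUERY = """
SELECT
  u.usage_metadata.job_id                   AS job_id,
  u.usage_metadata.job_run_id               AS job_run_id,
  u.sku_name,
  SUM(u.usage_quantity)                     AS dbus,
  SUM(u.usage_quantity * p.pricing.default) AS list_cost
FROM system.billing.usage u
JOIN system.billing.list_prices p
  ON u.sku_name = p.sku_name
 AND u.usage_start_time >= p.price_start_time
 AND (p.price_end_time IS NULL OR u.usage_start_time < p.price_end_time)
WHERE u.usage_metadata.job_id IS NOT NULL
  AND u.usage_date >= current_date() - INTERVAL 30 DAYS
GROUP BY 1, 2, 3
ORDER BY list_cost DESC
"""

with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",  # placeholder
    http_path="/sql/1.0/warehouses/<warehouse-id>",                # placeholder
    access_token="<personal-access-token>",                        # placeholder
) as conn:
    with conn.cursor() as cur:
        cur.execute(JOB_COST_QUERY)
        for row in cur.fetchall():
            print(row)
```

The result set can then be pushed into Grafana or any other dashboarding tool alongside ADF costs pulled from Azure Cost Management.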
- 3226 Views
- 4 replies
- 1 kudos
Resolved! Permission denied during write
Hey everyone, I have a pipeline that fetches data from S3 and stores it under the Databricks .tmp/ folder. The pipeline is always able to write around 200,000 files before I get a Permission Denied error. This happens in the following code block: os....
- 1 kudos
Thanks for your reply, Walter! The filenames are already unique, retries produce the same result, and I have the necessary permissions, as I was able to write the other 200,000 files (with the same program, which runs continuously). It does make sense...
- 15692 Views
- 12 replies
- 0 kudos
Resolved! databricks data engineer associate exam
Hello Team, I had a pathetic experience while attempting my 1st Databricks certification. I was taking the exam and, abruptly, the proctor asked me to show my desk. I showed everything, every corner of my bed. It was neat and clean with no suspiciou...
- 0 kudos
Hi @gokul2, the badge was issued on Dec 2. We just resent the email. Please check your spam folder. If you continue to have issues, please file a ticket with our support team: https://help.databricks.com/s/contact-us?ReqType=training
- 1471 Views
- 3 replies
- 0 kudos
Resolved! How to grant custom container AWS credentials for reading init script?
I'm using a custom container *and* init scripts. At runtime, I get this error: Cluster '...' was terminated. Reason: INIT_SCRIPT_FAILURE (CLIENT_ERROR). Parameters: instance_id:i-0440ddd3a2d5cce79, databricks_error_message:Cluster scoped init script...
- 0 kudos
Followup: I got the AWS creds working by amending our AWS role to permit read/write access to our S3 bucket. Woohoo!
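For anyone hitting the same INIT_SCRIPT_FAILURE, the fix described above amounts to giving the cluster's instance-profile role access to the S3 bucket that holds the init scripts. A minimal sketch using boto3; the role, policy, and bucket names are placeholders for your own resources:

```python
# Minimal sketch: attach an inline S3 policy to the role behind the cluster's
# instance profile so cluster-scoped init scripts can be read from the bucket.
# Assumptions: boto3 with IAM write permissions; placeholder role/bucket/policy names.
import json
import boto3

iam = boto3.client("iam")

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::my-init-script-bucket",    # bucket-level (ListBucket)
                "arn:aws:s3:::my-init-script-bucket/*",  # object-level (Get/PutObject)
            ],
        }
    ],
}

iam.put_role_policy(
    RoleName="my-databricks-cluster-role",            # placeholder role name
    PolicyName="databricks-init-script-s3-access",    # placeholder policy name
    PolicyDocument=json.dumps(policy),
)
```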
- 1765 Views
- 3 replies
- 0 kudos
Resolved! Format when specifying docker_image url?
I am providing a custom Docker image to my Databricks/Spark job. I've created the image and uploaded it to our private ECR registry (the URL is `472542229217.dkr.ecr.us-west-2.amazonaws.com/tectonai/mrstevegross-testing:latest`). Based on the docs (h...
- 0 kudos
Thanks, that's pretty much what I did; a lot of terraform configuration to get the AWS account set up properly, and now I'm able to tell DBR to load the container. (FWIW, I'm encountering *new* access issues; I started a thread here (https://communit...
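For reference, the format that ended up working in this thread is the plain registry URL (no docker:// scheme) inside the cluster spec's docker_image block. A minimal sketch via the Clusters API; the DBR version, node type, and instance-profile ARN are placeholders, and ECR authentication is assumed to come from the instance profile (so no basic_auth block is needed):

```python
# Minimal sketch: create a cluster that pulls a custom image from a private ECR repo.
# Assumptions: Databricks Container Services enabled, DATABRICKS_HOST/TOKEN set,
# placeholder spark_version / node type / instance-profile ARN; ECR auth via instance profile.
import os
import requests

host = os.environ["DATABRICKS_HOST"]   # e.g. https://<workspace>.cloud.databricks.com
token = os.environ["DATABRICKS_TOKEN"]

cluster_spec = {
    "cluster_name": "custom-container-test",
    "spark_version": "15.4.x-scala2.12",   # placeholder DBR version
    "node_type_id": "i3.xlarge",           # placeholder node type
    "num_workers": 1,
    "aws_attributes": {
        "instance_profile_arn": "arn:aws:iam::472542229217:instance-profile/<profile>"  # placeholder
    },
    "docker_image": {
        # Plain registry URL, no scheme prefix.
        "url": "472542229217.dkr.ecr.us-west-2.amazonaws.com/tectonai/mrstevegross-testing:latest"
    },
}

resp = requests.post(
    f"{host}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {token}"},
    json=cluster_spec,
)
resp.raise_for_status()
print(resp.json()["cluster_id"])
```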
- 1002 Views
- 1 reply
- 1 kudos
Unable to Register Models After Uploading Artifacts to DBFS in Databricks
Hi everyone, I'm currently working on a project where I'm migrating models and artifacts from a source Databricks workspace to a target one. I've written a script to upload the model artifacts from my local system to DBFS in the target workspace (usi...
- 1 kudos
Hi @Sudheer2, Does it give you any error while trying to register the model?
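Without the exact error it is hard to say more, but registering artifacts that were copied to DBFS usually comes down to pointing mlflow.register_model at the directory containing the MLmodel file. A minimal sketch; the DBFS path and model name below are placeholders:

```python
# Minimal sketch: register a model whose artifacts were uploaded to DBFS.
# Assumptions: running against the target workspace, the placeholder DBFS folder
# contains a valid MLmodel file, and the model name is a placeholder.
import mlflow

mlflow.set_registry_uri("databricks")  # target workspace's model registry

model_uri = "dbfs:/FileStore/migrated_models/my_model"  # placeholder artifact folder
mv = mlflow.register_model(model_uri=model_uri, name="my_migrated_model")
print(mv.name, mv.version)
```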
- 837 Views
- 2 replies
- 0 kudos
Scheduling multiple jobs (workflows) in DABs
Hello, I'm wondering how I can schedule multiple jobs (workflows). I'd like to do something like this, but at the workflow level: tasks: - task_key: task_1 sql_task: warehouse_id: ${var.warehouse_id} paramet...
- 0 kudos
Hi @Greg_c, you can try this structure. In the main databricks.yml: # databricks.yml bundle: name: master-bundle include: - resources/*.yml # Other bundle configurations... In the resources directory, create a YAML file for each job: # resources/job1.yml re...
- 1922 Views
- 4 replies
- 1 kudos
ALIAS Not accepted 42601
I am unable to run the following query, generated by my backend, on the Databricks side. Query: SELECT "A".`cut` AS "Cut", "A".`color` AS "Color", "A".`carat` AS "Carat", "A".`clarity` AS "Clarity" FROM databricksconnect.default.diamonds "A" Error logs...
- 1 kudos
Hi @malhm, double quotes are not supported in column aliases. In Databricks SQL/Spark SQL, backticks are used instead of double quotes as in PostgreSQL. Check the docs: https://spark.apache.org/docs/3.5.1/sql-ref-identifier.html
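Applied to the query from the question, the identifier quoting looks like this; a minimal sketch run through spark.sql, assuming access to the databricksconnect.default.diamonds table:

```python
# Minimal sketch: the same query rewritten with backtick identifiers instead of the
# double quotes that Databricks SQL / Spark SQL rejects for aliases.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

query = """
SELECT
  A.`cut`     AS `Cut`,
  A.`color`   AS `Color`,
  A.`carat`   AS `Carat`,
  A.`clarity` AS `Clarity`
FROM databricksconnect.default.diamonds AS A
"""
spark.sql(query).show(5)
```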
- 3721 Views
- 3 replies
- 0 kudos
Resolved! Databricks x Query Folding Power BI
I ran a native Power BI query against Databricks in import mode, and query folding was not enabled. Is there no query folding?
- 0 kudos
Hi @Iguinrj11, the trick is to use Databricks.Query instead of Databricks.Catalogs. Check this article and let us know if it helps: https://www.linkedin.com/pulse/query-folding-azure-databricks-tushar-desai/
- 1363 Views
- 2 replies
- 0 kudos
Dynamic Bloom Filters for Inner Joins
I have a question regarding combining the use of Bloom filters with Liquid Clustering to further reduce the data read during a join/merge on top of dynamic file pruning. Testing both combined worked extremely well together for point queries. However ...
- 0 kudos
We do not recommend Bloom filter indexes on Delta tables, as they have to be maintained manually. If you use Photon, please try predictive I/O with Liquid Clustering.
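For anyone following along, moving from a Bloom filter index to liquid clustering is a DDL-level change; predictive I/O then applies automatically on Photon-enabled compute. A minimal sketch, with placeholder catalog/schema/table and clustering columns:

```python
# Minimal sketch: define (or retrofit) liquid clustering on a Delta table instead of
# maintaining a Bloom filter index. Assumptions: Unity Catalog table, a recent DBR,
# and placeholder table/column names.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# New table clustered by the join/filter keys.
spark.sql("""
  CREATE TABLE IF NOT EXISTS main.sales.orders (
    order_id    BIGINT,
    customer_id BIGINT,
    order_ts    TIMESTAMP
  )
  CLUSTER BY (customer_id, order_ts)
""")

# Existing tables can be switched over; OPTIMIZE incrementally re-clusters the data.
spark.sql("ALTER TABLE main.sales.orders CLUSTER BY (customer_id)")
spark.sql("OPTIMIZE main.sales.orders")
```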
- 1610 Views
- 4 replies
- 0 kudos
Automate run as workflow parameter to default to current user
I am trying to run a workflow within Databricks. I have 2 workflows: workflow 1, which always runs as the service principal, since all data gets accessed and wrangled within this workflow, and workflow 2, which always defaults to the account that last ran it. I...
- 0 kudos
Hi, how are you expecting to achieve this? Do you want users who manually trigger this workflow to first update run_as to themselves, or do you want to make this happen programmatically?
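If the programmatic route is the goal, one hedged option is to have the triggering code set the job's run_as before launching it. A rough sketch against the Jobs 2.1 API; the job ID and user are placeholders, and the exact placement of the run_as block should be verified against the Jobs API reference for your workspace:

```python
# Rough sketch (verify against your Jobs API reference): set a job's run_as to the
# current user before triggering it. Assumptions: DATABRICKS_HOST/TOKEN set,
# placeholder job ID, and run_as accepted inside new_settings on jobs/update.
import os
import requests

host = os.environ["DATABRICKS_HOST"]
token = os.environ["DATABRICKS_TOKEN"]
headers = {"Authorization": f"Bearer {token}"}

job_id = 123456789  # placeholder job ID
me = requests.get(f"{host}/api/2.0/preview/scim/v2/Me", headers=headers).json()
current_user = me["userName"]

resp = requests.post(
    f"{host}/api/2.1/jobs/update",
    headers=headers,
    json={"job_id": job_id, "new_settings": {"run_as": {"user_name": current_user}}},
)
resp.raise_for_status()
```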
- 1037 Views
- 2 replies
- 0 kudos
Create CSV and upload to Azure
Can someone write a SQL query which queries a table (e.g. select * from stages.benefit), creates a CSV, and uploads it to Azure?
- 0 kudos
Hi @subhadeep, you can achieve this in SQL, similarly to how you write a DataFrame to a table or blob path: create an external table pointing to the blob path or a mounted blob path. Note that this table does not support ACID transactions and ...
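Building on the reply above, one way to sketch this end to end is to write the query result as CSV to the blob path and then lay an external table over it. A minimal sketch, assuming the abfss location is already configured (external location or mount) and that the path and export table name below are placeholders:

```python
# Minimal sketch: export a table to CSV on Azure storage and expose it as an external table.
# Assumptions: write access to the placeholder abfss:// path and a placeholder export table name.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

path = "abfss://exports@mystorageaccount.dfs.core.windows.net/benefit_csv/"  # placeholder

# 1) Write the query result as a single CSV file with a header row.
(
    spark.sql("SELECT * FROM stages.benefit")
    .coalesce(1)
    .write.mode("overwrite")
    .option("header", "true")
    .csv(path)
)

# 2) Optionally register an external CSV table pointing at that location.
spark.sql(f"""
  CREATE TABLE IF NOT EXISTS stages.benefit_csv_export
  USING CSV
  OPTIONS (header 'true')
  LOCATION '{path}'
""")
```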
- 2143 Views
- 10 replies
- 2 kudos
01_demo_setup error
Hello, I was following "Demo: Creating and Working with a Delta Table" with a Community Edition user. The first command in the notebook is: %run ./setup/01_demo_setup But I got the following error: Notebook not found: Users/<my-email-was-here..>/s...
- 2 kudos
Hey! Sad news, guys... if you go to the Course Logistics Review you can read: "We are pleased to offer a version of this course that also contains hands-on practice via a Databricks Academy Labs subscription. With a Databricks Academy Labs subscription, you...
- 1102 Views
- 1 reply
- 0 kudos
Databricks app giving 'upstream request timeout'
Hello all, we are developing a Flask-based app that is used to download logs from a Databricks DBFS location. For this use case we are using the Databricks built-in Apps feature to deploy our app. When we pass a smaller file, it is getting do...
- 0 kudos
Hey! It looks like the issue you're facing might be related to the proxy timeout when downloading large files from DBFS. Since modifying the proxy settings might not be an option, there are a couple of alternative approaches you could consider to miti...
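One of the workarounds hinted at above is to stream the file back in chunks instead of returning it in a single response, so the app's proxy never waits on one large payload. A minimal Flask sketch using the DBFS read API (which returns at most 1 MB of base64 data per call); the host, token, and /download route are placeholders:

```python
# Minimal sketch: stream a DBFS file to the browser in 1 MB chunks from a Flask app.
# Assumptions: DATABRICKS_HOST/TOKEN environment variables, a placeholder route name,
# and the caller passes a DBFS path (e.g. a log file path) as the `path` query parameter.
import base64
import os

import requests
from flask import Flask, Response, request

app = Flask(__name__)
HOST = os.environ["DATABRICKS_HOST"]
TOKEN = os.environ["DATABRICKS_TOKEN"]
CHUNK = 1024 * 1024  # the DBFS read API returns at most 1 MB per call


def dbfs_chunks(path: str):
    """Yield the file's bytes one API call at a time."""
    offset = 0
    while True:
        resp = requests.get(
            f"{HOST}/api/2.0/dbfs/read",
            headers={"Authorization": f"Bearer {TOKEN}"},
            params={"path": path, "offset": offset, "length": CHUNK},
        )
        resp.raise_for_status()
        body = resp.json()
        if body["bytes_read"] == 0:
            break
        yield base64.b64decode(body["data"])
        offset += body["bytes_read"]


@app.route("/download")
def download():
    path = request.args["path"]
    return Response(
        dbfs_chunks(path),
        mimetype="application/octet-stream",
        headers={"Content-Disposition": f"attachment; filename={os.path.basename(path)}"},
    )
```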
- 6209 Views
- 3 replies
- 0 kudos
50%-off Databricks certification voucher
Hello Databricks Community Team, I am reaching out to inquire about the Databricks certification voucher promotion for completing the Databricks Learning Festival (Virtual) courses. I completed one of the Databricks Learning Festival courses in July 2024...
- 0 kudos
I have already finished the course, how do I get the discount?