- 1354 Views
- 1 replies
- 0 kudos
Is it possible to view Databricks cluster metrics using REST API
I am looking for some help on getting Databricks cluster metrics such as memory utilization, CPU utilization, memory swap utilization, and free file system using the REST API. I am trying it in Postman using a Databricks token and with my Service Principal bear...
There is currently no option to get these metrics through the API, but it is coming soon.
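While utilization metrics are not exposed, existing Clusters API endpoints can be called with the same bearer-token setup the question describes. Below is a minimal sketch of calling `/api/2.0/clusters/get` (which returns configuration and state, not CPU/memory metrics); the host and cluster ID values are placeholders.

```python
import json
import urllib.request

def clusters_get_url(host: str, cluster_id: str) -> str:
    # /api/2.0/clusters/get returns cluster configuration and state,
    # not CPU/memory utilization metrics.
    return f"{host.rstrip('/')}/api/2.0/clusters/get?cluster_id={cluster_id}"

def get_cluster_info(host: str, token: str, cluster_id: str) -> dict:
    # Authenticate with a bearer token (PAT or service-principal token),
    # as one would in Postman.
    req = urllib.request.Request(
        clusters_get_url(host, cluster_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```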
- 2315 Views
- 1 replies
- 1 kudos
Unable to reconstruct state at version ? as the transaction log has been truncated
We have a small table that undergoes a merge operation on a daily basis. As a result, the table currently has 83 versions. When trying to query this table, we receive the following error: DeltaFileNotFoundException: dbfs:/mnt/XXXXX/warehous...
It is difficult to say exactly what caused the issue, as there are multiple possible underlying reasons. But as general advice, you can try setting "delta.checkpointRetentionDuration = 7 days", and you can also try switching to a different DBR version to se...
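The suggested table property can be applied with an `ALTER TABLE` statement. A minimal sketch, run from a notebook; the table name is hypothetical:

```python
def set_checkpoint_retention_sql(table_name: str, duration: str = "7 days") -> str:
    # delta.checkpointRetentionDuration controls how long Delta checkpoint
    # files are retained before becoming eligible for cleanup.
    return (
        f"ALTER TABLE {table_name} SET TBLPROPERTIES "
        f"('delta.checkpointRetentionDuration' = '{duration}')"
    )

# In a notebook: spark.sql(set_checkpoint_retention_sql("my_schema.my_table"))
```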
- 1884 Views
- 4 replies
- 0 kudos
Need reattempt to Certified Data Engineer Associate
Hello Team/ @Cert-Team, I had a frustrating experience while attempting my 1st Databricks certification. My exam got suspended within 10 minutes of starting. Abruptly, the proctor asked me to show my desk, walls, room, wallet, ID card, adapter,...
- 3470 Views
- 2 replies
- 2 kudos
Resolved! DLT Unity catalog schema no storage location mention details
Hi Team, as part of an earlier discussion I had with the Databricks team, I was told that if one wants to write data to a Unity Catalog schema from a DLT pipeline, the specific schema's storage location must not be specified; otherwise the DLT pipeline wil...
Thanks @Walter_C for the explanation and for confirming the understanding. Really appreciate it.
- 8344 Views
- 0 replies
- 1 kudos
5 tips to get the most out of your Databricks Assistant
Back in July, we released the public preview of the new Databricks Assistant, a context-aware AI assistant available in Databricks Notebooks, SQL editor and the file editor that makes you more productive within Databricks, including: Generate SQL or ...
- 1959 Views
- 0 replies
- 0 kudos
Bringing breakthrough data intelligence to industries
Gen AI for All: Empowering Every Role Across Industries The new frontier of data intelligence is here. As more companies pursue industry-changing transformations, they face the same monumental challenge: how to democratize data and AI. In this new r...
- 1881 Views
- 1 replies
- 0 kudos
Spark doesn't register executors when new workers are allocated
Our pipelines sometimes get stuck (example). Some workers get decommissioned due to spot termination, and then new workers are added. However, after (1) Spark doesn't notice the new executors, and I don't know why. I don't understand how to debug this,...
@ivanychev - Firstly, new workers are added and Spark does notice them; hence there is an entry in the event log stating the init script ran on the newly added workers. For debugging, please check the Spark UI's Executors tab. Secondly, fo...
- 3190 Views
- 1 replies
- 0 kudos
cannot create external location: invalid Databricks Workspace configuration
Hi all, I am trying to create Databricks storage credentials, an external location, and a catalog with Terraform. Cloud: Azure. My storage credentials code is working correctly, but the external location code throws the below error when executing the Terraf...
Hi @Retired_mod, thanks for the reply. Even after correcting my Databricks workspace provider configuration, I am not able to create the 3 external locations in the Databricks workspace. I am using the below code in Terraform. Provider.tf: provider "databricks" {...
- 916 Views
- 1 replies
- 0 kudos
Databricks Widget
Hi, I was previously working on Databricks Runtime 10.0 and have just upgraded to Runtime 13.0. I was using a dashboard to display the widgets. Before, it showed just the widget label, but now it shows the widget name below it as well. Also it shows the ...
Hi @aman_yadav007, which widget type did you use? Can you please try a different widget type or check the widget type and its arguments from this example: https://docs.databricks.com/en/notebooks/widgets.html#databricks-widgets
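For reference, the label is the third argument to the widget-creation call; a minimal sketch using a text widget (the widget name, default, and label here are hypothetical):

```python
def create_labeled_widget(dbutils, name: str, default: str, label: str) -> None:
    # dbutils.widgets.text(name, defaultValue, label): the third argument is
    # the display label shown on the dashboard; the first is the key used by
    # dbutils.widgets.get(name).
    dbutils.widgets.text(name, default, label)

# In a notebook: create_labeled_widget(dbutils, "env", "dev", "Environment")
```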
- 2714 Views
- 2 replies
- 0 kudos
DLT pipeline unity catalog error
Hi everyone, I'm getting this error while running a DLT pipeline in UC: Failed to execute python command for notebook 'sample/delta_live_table_rules.py' with id RunnableCommandId(5596174851701821390) and error AnsiResult(,None, Map(), Map(),List(),List(...
I get a similar error when there is a mistake in the @dlt.table() definition for a table. In my case the culprit is usually the path.
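For illustration, a minimal sketch of a table definition. The `dlt` module only imports inside a running DLT pipeline, so a no-op fallback decorator is used here to show the shape; the table name and body are hypothetical. In Unity Catalog pipelines the storage location is derived from the target catalog/schema, so no explicit `path` argument is passed:

```python
# `dlt` is only importable inside a Delta Live Tables pipeline; fall back to
# a no-op decorator so the definition's shape can be shown anywhere.
try:
    import dlt
    dlt_table = dlt.table
except ImportError:
    def dlt_table(**kwargs):
        def _wrap(fn):
            return fn
        return _wrap

@dlt_table(name="sample_table", comment="Illustrative UC-managed DLT table")
def sample_table():
    # `spark` is the pipeline's SparkSession, provided by the DLT runtime.
    return spark.range(10)
```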
- 1703 Views
- 2 replies
- 0 kudos
Unable to migrate an empty parquet table to delta lake in Databricks
I'm trying to convert my Databricks tables from Parquet to Delta. While most of the tables have data and are successfully converted to Delta, some of the empty Parquet tables fail with an error message as below: CONVERT TO DELTA <schema-name>.parquet_...
Hello Bharathi, ideally the ETL job should not generate empty parquet files in the respective location, as reading empty files is overhead and not a best practice. Assuming this can be easily fixed in the ETL job by checking the row count...
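For reference, the conversion statement itself can be issued from a notebook. A minimal sketch; the table name and partition spec are placeholders:

```python
def convert_to_delta_sql(table_name: str, partition_spec: str = "") -> str:
    # CONVERT TO DELTA rewrites the Parquet table's metadata in place; stray
    # zero-row parquet files in the table location can make the conversion
    # fail and may need to be cleaned up first.
    sql = f"CONVERT TO DELTA {table_name}"
    if partition_spec:
        sql += f" PARTITIONED BY ({partition_spec})"
    return sql

# In a notebook: spark.sql(convert_to_delta_sql("schema_name.parquet_table"))
```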
- 452 Views
- 0 replies
- 0 kudos
The importance of Databricks in SEO
SEO is a dynamic and complex field that constantly evolves with search technologies and algorithms. The use of Databricks, a cloud-based analytics platform, has revolutionized the way SEO specialists...
- 6311 Views
- 2 replies
- 0 kudos
Read CSV files in Azure Databricks notebook, how to read data when columns in CSV files are in the w
I have a task to revise CSV ingestion in Azure Databricks. The current implementation uses the below settings: source_query = ( spark.readStream.format("cloudFiles") .option("cloudFiles.format", "csv") .schema(defined_schema) .option(...
Also, I am looking for a solution that works with both correct files and malformed files using PySpark.
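One common approach for handling both correct and malformed CSV files in an Auto Loader stream is PERMISSIVE mode with a corrupt-record column. A minimal sketch, assuming the defined schema includes `_corrupt_record` as a StringType field; the option values and path are placeholders:

```python
# PERMISSIVE mode loads well-formed rows normally and captures the raw text
# of malformed rows in the corrupt-record column instead of failing the
# stream; the defined schema must include that column as a StringType field.
AUTOLOADER_CSV_OPTIONS = {
    "cloudFiles.format": "csv",
    "header": "true",
    "mode": "PERMISSIVE",
    "columnNameOfCorruptRecord": "_corrupt_record",
}

def build_csv_stream(spark, defined_schema, source_path: str):
    reader = spark.readStream.format("cloudFiles").schema(defined_schema)
    for key, value in AUTOLOADER_CSV_OPTIONS.items():
        reader = reader.option(key, value)
    return reader.load(source_path)
```

Malformed rows can then be filtered out (or routed to a quarantine table) by checking whether `_corrupt_record` is non-null.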
- 10315 Views
- 2 replies
- 1 kudos
Creating High Quality RAG Applications with Databricks
Retrieval-Augmented-Generation (RAG) has quickly emerged as a powerful way to incorporate proprietary, real-time data into Large Language Model (LLM) applications. Today we are excited to launch a suite of RAG tools to help Databricks users build hig...
It seems like you're sharing an announcement or promotional content related to Databricks and their launch of a suite of tools for Retrieval-Augmented-Generation (RAG) applications. These tools are aimed at helping Databricks users build high-quality...
- 4396 Views
- 2 replies
- 1 kudos
Power BI Import Model Refresh from Databricks SQL Whse - Query has been timed out due to inactivity
We have an intermittent issue where occasionally a partition in our Power BI Import dataset times out at 5 hours. When I look at Query History in Databricks SQL, I see a query that failed with the following error message: "Query has been timed out ...
The only solution we have been able to come up with was to create a Notebook in Databricks that uses the Power BI API to check the status of a Refresh. We schedule it a bit after we expect the Refresh to complete. If it is still running, we kill th...
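The refresh-status check described above can be sketched with the Power BI REST API's refresh-history endpoint. A minimal illustration; the workspace (group) ID, dataset ID, and token are placeholders, and acquiring the AAD token is out of scope here:

```python
import json
import urllib.request

def refreshes_url(group_id: str, dataset_id: str) -> str:
    # Power BI REST API: refresh history for a dataset, newest entry first.
    return (
        "https://api.powerbi.com/v1.0/myorg/groups/"
        f"{group_id}/datasets/{dataset_id}/refreshes?$top=1"
    )

def latest_refresh_status(group_id: str, dataset_id: str, token: str) -> str:
    req = urllib.request.Request(
        refreshes_url(group_id, dataset_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        history = json.load(resp)
    # "Unknown" while a refresh is still in progress; "Completed" or
    # "Failed" once it has finished.
    return history["value"][0]["status"]
```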