- 2130 Views
- 5 replies
- 0 kudos
DLT pipeline Unity Catalog error
Hi Everyone, I'm getting this error while running a DLT pipeline in UC: Failed to execute python command for notebook 'sample/delta_live_table_rules.py' with id RunnableCommandId(5596174851701821390) and error AnsiResult(,None, Map(), Map(),List(),List(...
Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...
- 909 Views
- 3 replies
- 1 kudos
Failed to create Cluster on GCP
I am getting the following error while trying to create a cluster for my workspace. Cluster creation failed: Constraint constraints/compute.disableSerialPortLogging violated for project. Cloud ENV is GCP and we can't turn off the constraint mentioned above....
Hi, I haven't found any solution so far. What I'm hoping for is to create a cluster in a way that doesn't require SerialPortLogging, so that the policy constraint we have, e.g. disableSerialPortLogging, doesn't get in the way. Not sure how we can do that. Ma...
- 1309 Views
- 4 replies
- 0 kudos
Unable to migrate an empty parquet table to delta lake in Databricks
I'm trying to convert my Databricks tables from Parquet to Delta. While most of the tables have data and are successfully converted to Delta, some of the empty Parquet tables fail with an error message as below: CONVERT TO DELTA <schema-name>.parquet_...
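For reference, a minimal sketch of the conversion being attempted, plus one workaround sometimes used for empty tables. Table names are placeholders, not the poster's actual (truncated) names, and the workaround is an assumption, not a confirmed fix:

```sql
-- Placeholder names; a sketch only.
CONVERT TO DELTA schema_name.parquet_table;

-- Hypothetical workaround for an empty Parquet table that fails to convert:
-- recreate it as a Delta table with the same schema instead of converting in place.
CREATE OR REPLACE TABLE schema_name.parquet_table_delta
USING DELTA
AS SELECT * FROM schema_name.parquet_table;
```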
- 871 Views
- 2 replies
- 0 kudos
Databricks Data Engineer Associate exam: I missed the retake exam
Hi Team, I had a written exam on Jan 2nd, 2024, but I failed the exam with 65%. I misunderstood: I needed to take the exam again after 14 days, and I have missed the chance. Could you please give me a chance to write the exam? email: tudururavikiran@gmail.com Thanks & Reg...
- 5932 Views
- 9 replies
- 1 kudos
Resolved! Terraform databricks_storage_credential has wrong External ID
We create storage credentials using Terraform. I don't see any way to specify a given External ID (DBR Account ID) when creating the credentials via Terraform or in the web UI console. However, today when I tried creating a new set of credentials usi...
I tried the proposed solution using an account provider like this for creating the storage credential:

```hcl
provider "databricks" {
  account_id = "ACCOUNT_ID"
  host       = "https://accounts.cloud.databricks.com"
}
```

However, that did not work. I got an e...
- 2640 Views
- 1 reply
- 2 kudos
Built-In Governance for Your Databricks Workspace
Databricks Unity Catalog simplifies data and AI governance by providing a unified solution for organizations to securely discover, access, monitor, and collaborate on a range of data and AI assets. This includes tables, ML models, files and functions...
- 1300 Views
- 2 replies
- 3 kudos
DLT Medallion Incremental Ingestion Pattern Approach
Hi there, I have a question regarding what would be the "recommended" incremental ingestion approach using DLT to pull raw landing data into bronze and then silver? The original approach I've been considering is to have raw CSV files arrive in a land...
Hi @ChristianRRL, Your original approach of using a bronze streaming table to ingest raw CSV files and a silver streaming table to de-duplicate the data and enforce data types is a common pattern. This approach is beneficial when dealing with large d...
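The bronze/silver pattern described above can be sketched in DLT SQL roughly as follows. This is a sketch, assuming Auto Loader (cloud_files) for CSV ingestion; the landing path and column names are invented for illustration and do not come from the thread:

```sql
-- Bronze: incrementally ingest raw CSV files from a landing path with Auto Loader.
CREATE OR REFRESH STREAMING TABLE bronze_raw
AS SELECT * FROM cloud_files('/landing/raw_csv/', 'csv');

-- Silver: de-duplicate and enforce data types on top of bronze.
CREATE OR REFRESH STREAMING TABLE silver_clean
AS SELECT DISTINCT
       CAST(id AS INT)             AS id,
       CAST(event_ts AS TIMESTAMP) AS event_ts,
       value
FROM STREAM(LIVE.bronze_raw);
```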
- 1571 Views
- 1 reply
- 0 kudos
Databricks SQL Identifier Variables
Hi all. Just trying to implement ADB SQL scripts using the IDENTIFIER clause, but I get errors like this using an example: DECLARE mytab = 'tab1'; CREATE TABLE IDENTIFIER(mytab) (c1 INT); The feature is not supported: Temporary variables are not yet support...
@RobsonNLPT - The feature development is still in progress. The docs were just released prior to the feature's availability, which is a usual process. The feature will be released on the preview channel with a tentative ETA of Feb 20 as of now. Alternatively...
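Once session variables are available on your channel, the pattern from the question should look roughly like this. A sketch only, assuming the documented IDENTIFIER-clause behavior on a runtime that supports variables:

```sql
-- Sketch only: requires a runtime where session variables are supported.
DECLARE OR REPLACE VARIABLE mytab STRING DEFAULT 'tab1';
CREATE TABLE IDENTIFIER(mytab) (c1 INT);
INSERT INTO IDENTIFIER(mytab) VALUES (1);
SELECT c1 FROM IDENTIFIER(mytab);
```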
- 1925 Views
- 1 reply
- 0 kudos
How to overwrite the existing file using the Databricks CLI
If I use databricks fs cp, it does not overwrite the existing file; it just skips copying the file. Any suggestion on how to overwrite the file using the Databricks CLI?
Hi @vijaykumar99535, To overwrite an existing file using the Databricks CLI, you can use the --overwrite option with the cp command. Here's an example:

```
databricks fs cp <source_path> <destination_path> --overwrite
```

The --overwrite option ensu...
- 817 Views
- 1 reply
- 0 kudos
Can only connect from Tableau Cloud using a Compute Cluster
We are trying to connect Tableau Cloud to Databricks. We have a serverless SQL warehouse and a pro warehouse; both of those warehouses are not able to connect. "Can't connect to Databricks." Detailed Error Message: There was an unknown connection error to...
@questions - It seems the current catalog is set to empty. Can you please change the default catalog name to hive_metastore?
- 565 Views
- 1 reply
- 0 kudos
Data view in Side Panel
Does anyone know why I cannot see the Data view in the side panel under Workspace? I see Catalog instead of Data. Is this something that has been upgraded?
@DB_Keith - Data Explorer has been renamed to Catalog Explorer. Please refer to the release notes: https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2023/september#data-explorer-is-now-catalog-explorer
- 540 Views
- 1 reply
- 0 kudos
Parsed Logical Plan reports UnresolvedHint RANGE_JOIN
I'm new to RANGE_JOIN, so this may be completely normal, but I'd like confirmation. Whenever I put a RANGE_JOIN hint in my query: SELECT /*+ RANGE_JOIN(pr2, 3600) */ event.FirstIP4Record FROM SCHEMA_NAME_HERE.dnsrequest event INNER JOIN SC...
@hukel - The query above does not have a range join; the range filter is not a join condition, so it is evaluated as a regular filter. Please refer to the criteria for range join optimization: have a condition that can be interpreted as ...
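To illustrate the criterion in the reply, a sketch of a query where the range predicate is part of the join condition itself, so the RANGE_JOIN hint can apply. The second table's name, the equality key, and the timestamp columns are assumptions invented for illustration; only the hint, dnsrequest, and pr2 come from the thread:

```sql
SELECT /*+ RANGE_JOIN(pr2, 3600) */
       event.FirstIP4Record
FROM   SCHEMA_NAME_HERE.dnsrequest AS event
INNER JOIN SCHEMA_NAME_HERE.processrollup2 AS pr2          -- table name assumed
  ON   event.aid = pr2.aid                                 -- equality key assumed
 AND   event.event_time BETWEEN pr2.start_time AND pr2.end_time  -- range condition inside the join
```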
- 651 Views
- 1 reply
- 0 kudos
Unable to create a Unity Catalog
I am unable to access the link https://accounts.azuredatabricks.net/data/create to get started with Unity Catalog.
Are you an account admin? When you access https://accounts.azuredatabricks.net/, are you able to see the console, or just the workspaces that are currently available to you?
- 2237 Views
- 1 reply
- 0 kudos
Resolved! Have code stay hidden even when the notebook is copied
When I save a certain Python notebook where I have selected Hide Code and Hide Results on certain cells, those conditions persist. For example, when I come back the next day in a new session, the hidden material is still hidden. When the notebook is ...
In Databricks, the 'Hide Code' and 'Hide Results' actions are part of the interactive notebook UI and are not saved as part of the notebook source code. Therefore, these settings won't persist when the notebook is copied or moved to a new location th...
- 1068 Views
- 1 reply
- 0 kudos
Is it possible to view Databricks cluster metrics using REST API
I am looking for some help on getting Databricks cluster metrics such as memory utilization, CPU utilization, memory swap utilization, and free file system using the REST API. I am trying it in Postman using a Databricks token and with my Service Principal bear...
There is currently no option available to get these metrics through the API, but it is coming soon.