Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

NP7
by New Contributor II
  • 2130 Views
  • 5 replies
  • 0 kudos

DLT pipeline unity catalog error

Hi Everyone, I'm getting this error while running a DLT pipeline in UC: Failed to execute python command for notebook 'sample/delta_live_table_rules.py' with id RunnableCommandId(5596174851701821390) and error AnsiResult(,None, Map(), Map(),List(),List(...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...

4 More Replies
salib
by New Contributor II
  • 909 Views
  • 3 replies
  • 1 kudos

Failed to create Cluster on GCP

I am getting the following error while trying to create a cluster for my workspace: Cluster creation failed: Constraint constraints/compute.disableSerialPortLogging violated for project. Cloud ENV is GCP and we can't turn off the constraint mentioned above....

Latest Reply
salib
New Contributor II
  • 1 kudos

Hi, I haven't found any solution so far. What I'm hoping for is to create a cluster in a way that doesn't require SerialPortLogging, so that the policy constraint we have, e.g. disableSerialPortLogging, doesn't get in the way. Not sure how we can do that. Ma...
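For what it's worth, a hedged sketch of how to inspect the policy on the project (the project ID is a placeholder; this only shows how the constraint is applied, it does not bypass it):

    # Show the org policy as applied to the project
    gcloud resource-manager org-policies describe \
        compute.disableSerialPortLogging \
        --project=my-gcp-project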

2 More Replies
Bharathi-Rajen
by New Contributor II
  • 1309 Views
  • 4 replies
  • 0 kudos

Unable to migrate an empty parquet table to delta lake in Databricks

I'm trying to convert my Databricks tables from Parquet to Delta. While most of the tables have data and are successfully converted to Delta, some of the empty Parquet tables fail with an error message as below: CONVERT TO DELTA <schema-name>.parquet_...
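A hedged workaround sketch for the empty-table case (table names are illustrative): recreating the table as Delta with CTAS preserves the schema even when the source has no rows, sidestepping the conversion of a Parquet table with no data files:

    CREATE TABLE my_schema.my_table_delta
    USING DELTA
    AS SELECT * FROM my_schema.empty_parquet_table;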

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...

3 More Replies
ravituduru
by New Contributor
  • 871 Views
  • 2 replies
  • 0 kudos

databricks data engineer associate exam I missed retake exam

Hi Team, I had a written exam on Jan 2nd, 2024, but I failed the exam with 65%. I misunderstood when I needed to retake the exam after 14 days, and I have missed the chance. Could you please give me a chance to retake the exam? email: tudururavikiran@gmail.com Thanks & Reg...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...

1 More Reply
dvmentalmadess
by Valued Contributor
  • 5932 Views
  • 9 replies
  • 1 kudos

Resolved! Terraform databricks_storage_credential has wrong External ID

We create storage credentials using Terraform. I don't see any way to specify a given External ID (DBR Account ID) when creating the credentials via Terraform or in the web UI console. However, today when I tried creating a new set of credentials usi...
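For reference, a minimal sketch of the resource in question (the resource name and IAM role ARN are illustrative; as the post notes, there is no argument for supplying your own external ID):

    resource "databricks_storage_credential" "external" {
      name    = "external-creds"
      comment = "Managed by Terraform"
      aws_iam_role {
        role_arn = "arn:aws:iam::123456789012:role/uc-storage-role"
      }
    }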

Latest Reply
Mathias_Peters
Contributor
  • 1 kudos

I tried the proposed solution using an account provider like this:

provider "databricks" {
  account_id = "ACCOUNT_ID"
  host       = "https://accounts.cloud.databricks.com"
}

for creating the storage credential. However, that did not work. I got an e...

8 More Replies
Sujitha
by Community Manager
  • 2640 Views
  • 1 reply
  • 2 kudos

Built-In Governance for Your Databricks Workspace

Databricks Unity Catalog simplifies data and AI governance by providing a unified solution for organizations to securely discover, access, monitor, and collaborate on a range of data and AI assets. This includes tables, ML models, files and functions...

Latest Reply
jose_gonzalez
Moderator
  • 2 kudos

Thank you for sharing this @Sujitha 

ChristianRRL
by Contributor III
  • 1300 Views
  • 2 replies
  • 3 kudos

DLT Medallion Incremental Ingestion Pattern Approach

Hi there, I have a question regarding what would be the "recommended" incremental ingestion approach using DLT to pull raw landing data into bronze and then silver? The original approach I've been considering is to have raw CSV files arrive in a land...

Latest Reply
Kaniz_Fatma
Community Manager
  • 3 kudos

Hi @ChristianRRL, Your original approach of using a bronze streaming table to ingest raw CSV files and a silver streaming table to de-duplicate the data and enforce data types is a common pattern. This approach is beneficial when dealing with large d...
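A minimal sketch of that bronze/silver pattern in a Python DLT notebook (the landing path, table names, and the key/timestamp columns are illustrative assumptions):

    import dlt
    from pyspark.sql.functions import col

    @dlt.table(comment="Raw CSV files ingested incrementally with Auto Loader")
    def bronze_events():
        return (
            spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "csv")
            .option("header", "true")
            .load("/landing/events/")
        )

    @dlt.table(comment="De-duplicated records with enforced types")
    def silver_events():
        return (
            dlt.read_stream("bronze_events")
            .dropDuplicates(["event_id"])
            .withColumn("event_ts", col("event_ts").cast("timestamp"))
        )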

1 More Reply
RobsonNLPT
by Contributor
  • 1571 Views
  • 1 reply
  • 0 kudos

Databricks SQL Identifier Variables

Hi all. Just trying to implement ADB SQL scripts using the IDENTIFIER clause, but I get errors with an example like this:

DECLARE mytab = 'tab1';
CREATE TABLE IDENTIFIER(mytab) (c1 INT);

The feature is not supported: Temporary variables are not yet support...

Latest Reply
shan_chandra
Esteemed Contributor
  • 0 kudos

@RobsonNLPT - The feature is still in development; the docs were just released ahead of the feature's availability, which is a usual process. The feature will be released on the preview channel with a tentative ETA of Feb 20 as of now. Alternatively...
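For reference, a sketch of the variable-based syntax the post is attempting, which should work once temporary variables land on a supported runtime (per the ETA above; the explicit VARIABLE/STRING form is one documented spelling):

    DECLARE VARIABLE mytab STRING DEFAULT 'tab1';
    CREATE TABLE IDENTIFIER(mytab) (c1 INT);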

vijaykumar99535
by New Contributor III
  • 1925 Views
  • 1 reply
  • 0 kudos

How to overwrite the existing file using databricks cli

If I use databricks fs cp, it does not overwrite the existing file; it just skips copying the file. Any suggestion on how to overwrite the file using the Databricks CLI?

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @vijaykumar99535, To overwrite an existing file using the Databricks CLI, you can use the --overwrite option with the cp command. Here's an example:

databricks fs cp <source_path> <destination_path> --overwrite

The --overwrite option ensu...
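A concrete usage sketch with illustrative paths (flag name as given in the reply above):

    databricks fs cp ./report.csv dbfs:/tmp/report.csv --overwrite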

questions
by New Contributor
  • 817 Views
  • 1 reply
  • 0 kudos

Can only connect from tableau cloud using Compute Cluster

We are trying to connect Tableau Cloud to Databricks. We have a serverless SQL warehouse and a pro warehouse; neither warehouse is able to connect. Can't connect to Databricks. Detailed Error Message: There was an unknown connection error to...

Latest Reply
shan_chandra
Esteemed Contributor
  • 0 kudos

@questions - It seems the current catalog is set to empty. Can you please change the default catalog name to hive_metastore?
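If it helps, one hedged way to pin the catalog from the client side is to run a USE CATALOG statement at connection time (for example, as Initial SQL in Tableau; the catalog name follows the reply above):

    USE CATALOG hive_metastore;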

DB_Keith
by New Contributor
  • 565 Views
  • 1 reply
  • 0 kudos

Data view in Side Panel

Does anyone know why I cannot see the Data view in the side panel under Workspace? I see Catalog instead of Data. Is this something that has been upgraded?

Latest Reply
shan_chandra
Esteemed Contributor
  • 0 kudos

@DB_Keith - Data Explorer has been renamed to Catalog Explorer. Please refer to the release notes: https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2023/september#data-explorer-is-now-catalog-explorer

hukel
by Contributor
  • 540 Views
  • 1 reply
  • 0 kudos

Parsed Logical Plan report UnresolvedHint RANGE_JOIN

I'm new to RANGE_JOIN so this may be completely normal, but I'd like confirmation. Whenever I put a RANGE_JOIN hint in my query:

SELECT /*+ RANGE_JOIN(pr2, 3600) */ event.FirstIP4Record
FROM SCHEMA_NAME_HERE.dnsrequest event
INNER JOIN SC...

Latest Reply
shan_chandra
Esteemed Contributor
  • 0 kudos

@hukel - The query above does not have a range join; the range filter is not a join condition, so it is evaluated as a regular filter. Please refer to the criteria for range join optimization: the join must have a condition that can be interpreted as ...
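For illustration, a hedged sketch of a query shape that would satisfy the range-join criteria (the second table and all column names are invented placeholders; the range predicate sits in the join condition and the hint's bin size matches the interval):

    SELECT /*+ RANGE_JOIN(pr2, 3600) */ event.FirstIP4Record
    FROM SCHEMA_NAME_HERE.dnsrequest event
    INNER JOIN SCHEMA_NAME_HERE.some_rollup pr2
      ON event.aid = pr2.aid
     AND event.event_time BETWEEN pr2.window_start
                               AND pr2.window_start + INTERVAL 3600 SECONDS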

Amarjit
by New Contributor
  • 651 Views
  • 1 reply
  • 0 kudos

Unable to create a Unity Catalog

I am unable to access https://accounts.azuredatabricks.net/data/create to get started with Unity Catalog.

Latest Reply
Walter_C
Honored Contributor
  • 0 kudos

Are you an account admin? When you access https://accounts.azuredatabricks.net/, are you able to see the console, or just the workspaces that are currently available to you?

DavidKxx
by Contributor
  • 2237 Views
  • 1 reply
  • 0 kudos

Resolved! Have code stay hidden even when the notebook is copied

When I save a certain Python notebook where I have selected Hide Code and Hide Results on certain cells, those conditions persist. For example, when I come back the next day in a new session, the hidden material is still hidden. When the notebook is ...

Latest Reply
Walter_C
Honored Contributor
  • 0 kudos

In Databricks, the 'Hide Code' and 'Hide Results' actions are part of the interactive notebook UI and are not saved as part of the notebook source code. Therefore, these settings won't persist when the notebook is copied or moved to a new location th...

Prashanthkumar
by New Contributor III
  • 1068 Views
  • 1 reply
  • 0 kudos

Is it possible to view Databricks cluster metrics using REST API

I am looking for some help on getting Databricks cluster metrics such as memory utilization, CPU utilization, memory swap utilization, and free file system via the REST API. I am trying it in Postman using a Databricks token and with my Service Principal bear...

Latest Reply
Walter_C
Honored Contributor
  • 0 kudos

There is currently no option to get these metrics through the API, but it is coming soon.


Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group