Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

inagar
by New Contributor
  • 705 Views
  • 2 replies
  • 0 kudos

Unable to capture the Query result via JDBC client execution

As shown in below screenshots MERGE INTO command produces information about the result (num_affected_rows, num_updated_rows, num_deleted_rows, num_inserted_rows).Unable to get this information when the same query is being executed via JDBC client. Is...

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

Delta API can help you get these details.  Reference - https://docs.databricks.com/en/delta/history.html#history-schema

1 More Replies
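NandiniN's pointer can be sketched as follows. DESCRIBE HISTORY exposes the counts the JDBC client does not surface, in the string-valued operationMetrics map; the table name below is a placeholder, and the metric keys follow the documented Delta history schema.

```python
# Sketch, assuming a placeholder table name and the documented Delta history
# schema: run DESCRIBE HISTORY over JDBC/SQL and read the MERGE counts out of
# the string-valued operationMetrics map.
#
#   DESCRIBE HISTORY main.sales.orders LIMIT 1
#
# Helper that converts the relevant MERGE metrics to integers:

def merge_counts(operation_metrics: dict) -> dict:
    """Pick the MERGE row counts out of an operationMetrics map.

    DESCRIBE HISTORY returns every metric value as a string."""
    keys = ("numTargetRowsInserted", "numTargetRowsUpdated",
            "numTargetRowsDeleted", "numOutputRows")
    return {k: int(operation_metrics[k]) for k in keys if k in operation_metrics}

# Shaped like a real operationMetrics entry for a MERGE commit:
example = {"numTargetRowsInserted": "10", "numTargetRowsUpdated": "3",
           "numTargetRowsDeleted": "0", "numOutputRows": "13"}
print(merge_counts(example))
```

On a cluster the same map is also available programmatically, e.g. via DeltaTable.forName(spark, name).history(1).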
Demudu
by New Contributor II
  • 630 Views
  • 3 replies
  • 4 kudos

Resolved! How to read Databricks UniForm format tables present in ADLS

We have Databricks UniForm format (Iceberg) tables in Azure Data Lake Storage (ADLS), which is already integrated with Databricks Unity Catalog. How can we read UniForm format tables using Databricks as a query engine?

Latest Reply
fmadeiro
Contributor
  • 4 kudos

Query using Unity Catalog. SQL: SELECT * FROM catalog_name.schema_name.table_name; PySpark: df = spark.sql("SELECT * FROM catalog_name.schema_name.table_name") followed by df.display(). Direct access by path: If not using Unity Catal...

2 More Replies
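fmadeiro's reply boils down to querying the table through Unity Catalog by its three-level name. A minimal sketch with placeholder catalog/schema/table names; the quoting helper is hypothetical but useful when identifiers contain special characters.

```python
# Sketch: reading a UniForm (Iceberg) table that is registered in Unity
# Catalog. On a Databricks cluster this is just a three-level-name query:
#
#   df = spark.sql("SELECT * FROM main.sales.orders")
#   df.display()
#
# Hypothetical helper that backtick-quotes each identifier part:

def qualified_name(catalog: str, schema: str, table: str) -> str:
    def quote(part: str) -> str:
        return "`" + part.replace("`", "``") + "`"
    return ".".join(quote(p) for p in (catalog, schema, table))

print(qualified_name("main", "sales", "orders"))  # `main`.`sales`.`orders`
```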
egndz
by New Contributor II
  • 7184 Views
  • 2 replies
  • 0 kudos

Cluster Memory Issue (Termination)

Hi, I have a single-node personal cluster with 56GB memory (Node type: Standard_DS5_v2, runtime: 14.3 LTS ML). The same configuration is used for the job cluster as well, and the following problem applies to both clusters. To start with: once I start my ...

raktim_das
by New Contributor II
  • 1249 Views
  • 2 replies
  • 1 kudos

Creating table in Unity Catalog with file scheme dbfs is not supported

Code:
# Define the path for the staging Delta table
staging_table_path = "dbfs:/user/hive/warehouse/staging_order_tracking"
spark.sql(f"CREATE TABLE IF NOT EXISTS staging_order_tracking USING DELTA LOCATION '{staging_table_path}'")
Error: Creating table in U...

Latest Reply
SaiPrakash_653
New Contributor II
  • 1 kudos

I believe we can only connect to our storage account containers using a mount point. If Databricks considers this an anti-pattern, what is the recommended way? Can you please explain what a UC external location is? Is it our local system folders or something n...

1 More Replies
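The error in this thread comes from pointing a Unity Catalog table at a dbfs:/ path. A sketch of the two usual fixes, with placeholder names and storage paths; the guard function is hypothetical.

```python
# Unity Catalog rejects CREATE TABLE ... LOCATION on dbfs:/. Two usual fixes
# (catalog, schema, and storage paths below are placeholders):
#
#   -- managed table: drop the LOCATION clause entirely
#   CREATE TABLE main.staging.staging_order_tracking USING DELTA AS SELECT ...
#
#   -- external table: use cloud storage covered by a UC external location
#   CREATE TABLE main.staging.staging_order_tracking USING DELTA
#   LOCATION 'abfss://container@account.dfs.core.windows.net/staging';
#
# Hypothetical guard that rejects unsupported schemes before issuing DDL:

def assert_uc_location(path: str) -> str:
    scheme = path.split(":", 1)[0].lower()
    if scheme in ("dbfs", "file"):
        raise ValueError(
            f"Unity Catalog does not support the {scheme}:/ scheme for table "
            f"locations; use cloud storage such as abfss://, s3://, or gs://")
    return path

print(assert_uc_location("abfss://container@account.dfs.core.windows.net/staging"))
```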
hiepntp
by New Contributor III
  • 917 Views
  • 2 replies
  • 2 kudos

Resolved! Cannot find "Databricks Apps"

Hi, I saw a demo about "Databricks Apps" 2 months ago. I haven't used Databricks for about 3 months, and I recently recreated a Premium workspace to try something out (I use Azure); however, I can't find "Apps" when I click "New". How can I enable and...

Latest Reply
hiepntp
New Contributor III
  • 2 kudos

Thank you, I found it.

1 More Replies
GandinDaniel1
by New Contributor II
  • 348 Views
  • 4 replies
  • 0 kudos

Issue Querying Registered Tables on Glue Catalog via Databricks

I'm having an issue querying registered tables on Glue Catalog through Databricks, with the following error: AnalysisException: [TABLE_OR_VIEW_NOT_FOUND] The table or view looker.ccc_data cannot be found. Verify the spelling and correctness of the schema a...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Can you specify the full catalog.schema.table? Also check the current schema with SELECT current_schema();

3 More Replies
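A sketch of the checks behind Alberto_Umana's reply. The catalog name glue_catalog is a placeholder for whatever name the Glue metastore was mounted under in this workspace.

```python
# TABLE_OR_VIEW_NOT_FOUND for a Glue-backed table usually means the session's
# current catalog/schema don't match where the table is registered. The
# catalog name `glue_catalog` below is a placeholder. Typical checks:

checks = [
    "SELECT current_catalog(), current_schema()",   # where is the session pointed?
    "SHOW SCHEMAS",                                 # is `looker` visible at all?
    "SHOW TABLES IN looker",                        # is `ccc_data` registered there?
    "SELECT * FROM glue_catalog.looker.ccc_data LIMIT 10",  # fully qualified
]
for stmt in checks:
    print(stmt)  # on a cluster: spark.sql(stmt).show()
```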
gabrielsantana
by New Contributor II
  • 255 Views
  • 2 replies
  • 1 kudos

Delete non-community Databricks account

Hi everyone! I have mistakenly created a non-community account using my personal email address. I would like to delete it in order to create a new account using my business email. How should I proceed? I tried to find this option on the console, with no...

Latest Reply
Advika_
Databricks Employee
  • 1 kudos

Hello @gabrielsantana! Could you please try raising a ticket with the Databricks support team?

1 More Replies
vijaykumar99535
by New Contributor III
  • 3862 Views
  • 1 reply
  • 1 kudos

How to overwrite the existing file using databricks cli

If I use databricks fs cp, it does not overwrite the existing file; it just skips copying it. Any suggestions on how to overwrite the file using the Databricks CLI?

Latest Reply
Swap
New Contributor II
  • 1 kudos

You can use the --overwrite option to overwrite your file: https://docs.databricks.com/en/dev-tools/cli/fs-commands.html

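Swap's answer in command form; the local and DBFS paths are placeholders.

```shell
# Overwrite the destination instead of skipping it (paths are placeholders).
databricks fs cp --overwrite ./report.csv dbfs:/tmp/report.csv

# Same for a directory tree:
databricks fs cp --recursive --overwrite ./data dbfs:/tmp/data
```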
gf
by New Contributor II
  • 385 Views
  • 1 reply
  • 1 kudos

Resolved! The databricks jdbc driver has a memory leak

https://community.databricks.com/t5/community-platform-discussions/memory-leak/td-p/80756 My question is the same as the one above. Unable to upload pictures, I had to dictate. The question is about the ResultFileDownloadMonitor m_requestList parameters. Because is ResultFi...

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

Hello @gf, thanks for your question. It seems this has been reported with Simba, but no fix has been provided yet. As a temporary workaround, you can consider using reflection to periodically clean up the m_requestList by removing KV pairs whose ...

hari-prasad
by Valued Contributor II
  • 385 Views
  • 3 replies
  • 0 kudos

Databricks Champions Sign-Up Page Failing or Throwing Error

The following link throws an error when trying to sign up: https://advocates.databricks.com/users/sign_up?join-code=community

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Let me check on this internally; I am not sure if this actually requires an invitation.

2 More Replies
Sudheer2
by New Contributor III
  • 404 Views
  • 5 replies
  • 0 kudos

Issue with Adding New Members to Existing Groups During User Group / Service Principal Migration

Hi all, I have implemented a migration process to move groups from a source workspace to a target workspace using the following code. The code successfully migrates groups and their members to the target system, but I am facing an issue when it comes...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

I have provided a response in https://community.databricks.com/t5/get-started-discussions/migrating-service-principals-from-non-unity-to-unity-enabled/m-p/103017#M4679

4 More Replies
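For the specific failure of adding members to groups that already exist in the target workspace, one hedged sketch: send a SCIM PatchOp to the workspace Groups API rather than re-creating the group. The host, token, group id, and user ids are placeholders; the payload shape follows the SCIM 2.0 PatchOp message schema.

```python
# Sketch: append members to an existing group via the SCIM Groups API,
# instead of failing on group creation. Host, token, group id, and user ids
# are placeholders; the body follows the SCIM 2.0 PatchOp message schema.
import json

def add_members_patch(user_ids) -> dict:
    """Build a SCIM PatchOp body that adds members to a group."""
    return {
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": [{
            "op": "add",
            "path": "members",
            "value": [{"value": str(uid)} for uid in user_ids],
        }],
    }

# You would PATCH this to the target workspace, e.g. with `requests`:
#   PATCH https://<workspace-host>/api/2.0/preview/scim/v2/Groups/<group_id>
#   Authorization: Bearer <token>
print(json.dumps(add_members_patch(["1234", "5678"]), indent=2))
```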
Tanay
by New Contributor II
  • 414 Views
  • 1 reply
  • 1 kudos

Resolved! Why does a join on (df1.id == df2.id) result in duplicate columns while on="id" does not?

Why does a join with on (df1.id == df2.id) result in duplicate columns, but on="id" does not? I encountered an interesting behavior while performing a join on two DataFrames. Here's the scenario: df1 = spark.createDataFrame([(1, "Alice"), (2, "Bob"),...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @Tanay , Your intuition is correct here. In Apache Spark, the difference in behavior between on (df1.id == df2.id) and on="id" in a join stems from how Spark resolves and handles column naming during the join operation.When you use the first synta...

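szymon_dybczak's explanation mirrors standard SQL: USING merges the join key into a single column, while an ON expression keeps both sides' columns. Spark's on="id" behaves like USING, and on=(df1.id == df2.id) like ON. A self-contained illustration with sqlite3 so it runs without a Spark cluster:

```python
# Standard-SQL illustration of the Spark behavior: JOIN ... USING (id)
# yields a single id column (like on="id"), while ON t1.id = t2.id keeps
# both id columns (like on=(df1.id == df2.id)).
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE t1(id INT, name TEXT);
    CREATE TABLE t2(id INT, age INT);
    INSERT INTO t1 VALUES (1, 'Alice'), (2, 'Bob');
    INSERT INTO t2 VALUES (1, 30), (2, 40);
""")

using_cols = [d[0] for d in con.execute(
    "SELECT * FROM t1 JOIN t2 USING (id)").description]
on_cols = [d[0] for d in con.execute(
    "SELECT * FROM t1 JOIN t2 ON t1.id = t2.id").description]

print(using_cols)  # ['id', 'name', 'age']
print(on_cols)     # ['id', 'name', 'id', 'age']
```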
Roig
by New Contributor II
  • 252 Views
  • 2 replies
  • 0 kudos

Create multiple dashboard subscription with filters

Hi Databricks community, we developed a dashboard that surfaces several important KPIs for each project we have. In the top filter, we select the project name and the time frame, and the dashboard presents the relevant KPIs and charts. I can eas...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

You can achieve this by setting up different schedules for each project and specifying the default filter values accordingly.
Create the Dashboard: Ensure your dashboard is set up with the necessary filters, including the project filter.
Set Defau...

1 More Replies
Cloud_Architect
by New Contributor III
  • 2226 Views
  • 4 replies
  • 0 kudos

How to get the Usage/DBU Consumption report without using system tables

Is there a way to get the usage/DBU consumption report without using system tables?

Latest Reply
TracyJackson
New Contributor II
  • 0 kudos

You can get DBU consumption reports using the Azure Portal (for Azure SQL), through Metrics under your database's "Usage" section, or via Dynamic Management Views (DMVs) like sys.dm_db_resource_stats in SSMS. Third-party tools like SQL Sentry also of...

3 More Replies
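One non-system-tables route on Databricks itself is the account-level billable usage download, which returns a CSV of DBU consumption. A hedged sketch: the host shown is the AWS account console, the account id and month range are placeholders, and on Azure the equivalent data is more commonly pulled from Azure Cost Management.

```python
# Sketch: build the URL for the account-level billable usage CSV download.
# The account id and month range are placeholders; the host shown is the
# AWS account console (Azure/GCP accounts use different hosts, and Azure
# cost data is more commonly pulled from Azure Cost Management).
from urllib.parse import urlencode

def usage_download_url(account_id: str, start_month: str, end_month: str) -> str:
    base = (f"https://accounts.cloud.databricks.com/api/2.0/"
            f"accounts/{account_id}/usage/download")
    query = urlencode({"start_month": start_month, "end_month": end_month})
    return f"{base}?{query}"

# The request itself needs account-admin credentials, e.g.:
#   GET <url>  with  Authorization: Bearer <token>
print(usage_download_url("my-account-id", "2024-01", "2024-06"))
```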
Nicolas_Izidoro
by New Contributor II
  • 365 Views
  • 5 replies
  • 1 kudos

Unable to log in to my Databricks Community account

Hey everyone, I can't log in to my Databricks Community account. It says my email has nothing created under it, but I've had this account for a long time and this has never happened to me before. I've even tried to create another account with the same email, but I can't crea...

Latest Reply
Nicolas_Izidoro
New Contributor II
  • 1 kudos

Unfortunately, I don't have any URL.

4 More Replies
