Data Engineering

Forum Posts

Trung
by Contributor
  • 731 Views
  • 2 replies
  • 1 kudos

Resolved! Databricks best practices for managing resources of a deleted user

Currently I have a problem with my Databricks workspace: when a user was deleted, it caused some issues. Applications or scripts that use the tokens generated by the user will no longer be able to access the Databricks API. Jobs owned by the user wi...

Latest Reply
Trung
Contributor

@Vivian Wilfred it is really useful for my case, many thanks!
1 More Replies
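A common pre-deletion step for the scenario above is to find the jobs a user owns so ownership can be transferred first. A minimal sketch, assuming the `creator_user_name` field shape of the Databricks Jobs API list response; the sample data here is made up.

```python
# Sketch: find jobs owned by a user before that user is deleted, so job
# ownership can be transferred first. "creator_user_name" follows the
# Databricks Jobs API list response; the sample records are hypothetical.

def jobs_owned_by(jobs, user_name):
    """Return the jobs whose creator matches the given user name."""
    return [j for j in jobs if j.get("creator_user_name") == user_name]

sample_jobs = [
    {"job_id": 101, "creator_user_name": "alice@example.com"},
    {"job_id": 102, "creator_user_name": "bob@example.com"},
    {"job_id": 103, "creator_user_name": "alice@example.com"},
]

orphan_candidates = jobs_owned_by(sample_jobs, "alice@example.com")
```

In practice the job list would come from the workspace API rather than a literal, and each returned job would get its owner reassigned before the user account is removed.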
Nath
by New Contributor II
  • 1267 Views
  • 3 replies
  • 2 kudos

Resolved! Error with multiple FeatureLookup calls outside databricks

I access the Databricks Feature Store outside Databricks with databricks-connect in my IDE, PyCharm. The problem occurs only outside Databricks, not in a notebook inside Databricks. I use the FeatureLookup mechanism to pull data from Feature Store tables in my cus...

Latest Reply
shan_chandra
Honored Contributor III

Also, please refer to the KB below for additional resolution: https://learn.microsoft.com/en-us/azure/databricks/kb/dev-tools/dbconnect-protoserializer-stackoverflow
2 More Replies
elgeo
by Valued Contributor II
  • 574 Views
  • 0 replies
  • 3 kudos

Number of Parquet files per Delta table

Hello. We would like to understand how many Parquet files are created per Delta table. To be more specific, we refer to the current snapshot of the table. For example, we noticed that while we performed initial inserts to a table, one parquet file was...

ncouture
by Contributor
  • 1622 Views
  • 6 replies
  • 4 kudos

How to include visualizations returned from %run in the caller notebook's dashboard?

I have a notebook (nb1) that calls another one (nb2) via the %run command. This returns some visualizations that I want to add to a dashboard of the caller notebook (nb1-db). When I select the visualization drop-down, then select Add to dashboard, th...

Latest Reply
Kaniz
Community Manager

Hi @Nicholas Couture, we haven't heard from you since the last response from @Debayan Mukherjee, and I was checking back to see if you have a resolution yet. If you have any solution, please share it with the community as it can be helpful to othe...
5 More Replies
dsura
by Contributor
  • 3628 Views
  • 7 replies
  • 19 kudos

Resolved! Azure AAD token with Databricks for a user-assigned managed identity inside a Docker container

Hi, we are currently using an Azure AAD token in order to authenticate with Databricks instead of generating personal access tokens from Databricks. We have a multi-tenant architecture, and so we are using Azure Container Instances to run multiple trans...

Latest Reply
Kaniz
Community Manager

Hi @Dharit Sura, we haven't heard from you since the last response from @Debayan Mukherjee, and I was checking back to see if you have a resolution yet. If you have any solution, please share it with the community as it can be helpful to others....
6 More Replies
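For context on the thread above: on an Azure VM or container with a managed identity, an AAD token for Databricks is obtained from the Instance Metadata Service (IMDS), passing the well-known AzureDatabricks application ID as the resource and, for a user-assigned identity, its client ID. A minimal sketch that only builds the request URL; the client ID value is hypothetical, and a real caller would GET this URL with the header `Metadata: true`.

```python
# Sketch: build the Azure IMDS request used to obtain an AAD token for
# Databricks with a user-assigned managed identity. The resource GUID is
# the well-known AzureDatabricks first-party application ID; the client_id
# passed in is hypothetical.

DATABRICKS_RESOURCE_ID = "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d"
IMDS_ENDPOINT = "http://169.254.169.254/metadata/identity/oauth2/token"

def imds_token_url(client_id):
    """Return the IMDS URL that yields an AAD token for Databricks."""
    return (
        f"{IMDS_ENDPOINT}?api-version=2018-02-01"
        f"&resource={DATABRICKS_RESOURCE_ID}"
        f"&client_id={client_id}"
    )
```

The returned access token can then be sent to Databricks REST APIs as a Bearer token in place of a personal access token.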
AbhishekBreeks
by New Contributor II
  • 12531 Views
  • 9 replies
  • 4 kudos

Referential Integrity (Primary Key / Foreign Key Constraint) - Azure Databricks SQL

Hello, please suggest how we can implement referential integrity (primary key / foreign key constraints) between different tables defined in an Azure Databricks database. Basically, the syntax to add primary and foreign key constraints in the table defi...

Latest Reply
elgeo
Valued Contributor II

Is there any alternate way you could suggest to implement and enforce a primary key constraint?
8 More Replies
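Since Databricks primary/foreign key constraints are informational rather than enforced, a common workaround for the follow-up question above is to validate key uniqueness yourself before writing. A minimal pure-Python sketch of that check; in Spark the equivalent is comparing `df.count()` against `df.select(key).distinct().count()`.

```python
def violates_primary_key(rows, key):
    """Return True if any key value repeats — the check a pre-write
    validation would run, given that Databricks PK constraints are
    informational. `rows` are plain dicts here for illustration; in Spark
    you would compare df.count() with df.select(key).distinct().count().
    """
    keys = [r[key] for r in rows]
    return len(keys) != len(set(keys))
```

Wiring this into the write path (reject or deduplicate the batch when the check fails) approximates an enforced constraint at the application level.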
Raymond_Garcia
by Contributor II
  • 1818 Views
  • 4 replies
  • 4 kudos

Resolved! Issue with Databricks and DRIVER_LIBRARY_INSTALLATION_FAILURE?

I have about 5 Maven libraries, and with all of them I have the same issue in jobs and notebooks. How long do I have to wait? Is there another solution? Thank you very much!

Latest Reply
Debayan
Esteemed Contributor III

@Raymond Garcia, could you please open a support case with Databricks for the same? We will triage the issue and provide a solution.
3 More Replies
mmlime
by New Contributor III
  • 1209 Views
  • 4 replies
  • 0 kudos

Resolved! Can I use VMs from a pool for my workflow cluster?

Hi, is there no option to take VMs from a pool for a new workflow (Azure cloud)? Default schema for a new cluster: { "num_workers": 0, "spark_version": "10.4.x-scala2.12", "spark_conf": { "spark.master": "local[*, 4]", "spark...

Latest Reply
Vivian_Wilfred
Honored Contributor

@Michal Mlaka I just checked the UI and I could find the pools listed under worker type in a job cluster configuration. It should work.
3 More Replies
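To make the answer above concrete: in the Clusters API, a job cluster draws VMs from a pool by setting `instance_pool_id` instead of `node_type_id` (the pool determines the VM size). A minimal sketch of such a spec as a Python dict; the pool ID value is hypothetical.

```python
# Sketch of a job-cluster spec that draws workers from an instance pool.
# "instance_pool_id" is the Clusters API field that replaces node_type_id;
# the pool ID string below is hypothetical.
job_cluster_spec = {
    "spark_version": "10.4.x-scala2.12",
    "num_workers": 2,
    "instance_pool_id": "0101-120000-pool-abcdefgh",  # hypothetical pool ID
}
# When a pool is referenced, node_type_id is omitted — the pool fixes the VM size.
assert "node_type_id" not in job_cluster_spec
```

The same dict shape can be passed as `new_cluster` in a Jobs API job definition, which is what the workflow UI generates behind the scenes.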
kthneighbor
by New Contributor II
  • 1507 Views
  • 5 replies
  • 2 kudos

Resolved! What will be the next LTS version after 10.4?

What will be the next LTS version after 10.4?

Latest Reply
youssefmrini
Honored Contributor III

Hello, 11.3 LTS is now available: https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/11.3
4 More Replies
HenriqueMoniz
by New Contributor II
  • 980 Views
  • 1 reply
  • 2 kudos

How to access Delta Live Tables feature?

Hi, I tried following the Delta Live Tables quickstart (https://docs.databricks.com/data-engineering/delta-live-tables/delta-live-tables-quickstart.html), but I don't see the Pipelines tab under the Jobs page in my workspace. The same guide mentions...

Latest Reply
virbickt
New Contributor III

Hi, you need a Premium workspace for the Pipelines tab to show up. This is what I see on my workspace with the Standard pricing tier selected, and this is what I see with the Premium pricing tier:
Soma
by Valued Contributor
  • 1791 Views
  • 5 replies
  • 3 kudos

Resolved! Unable to create Key Vault secrets scope with NPIP Workspace

Hi team, for a secure connection we created a secure cluster with NPIP (https://learn.microsoft.com/en-us/azure/databricks/security/secure-cluster-connectivity), with the workspace hosted in a private VNet. We had a hub VNet with a private endpoint for Key Vault. We pe...

Latest Reply
Kaniz
Community Manager

Hi @somanath Sankaran, we haven't heard from you since the last response from @Hubert Dudek, and I was checking back to see if you have a resolution yet. If you have any solution, please share it with the community as it can be helpful to others....
4 More Replies
db-avengers2rul
by Contributor II
  • 5023 Views
  • 2 replies
  • 6 kudos

Resolved! AttributeError: 'list' object has no attribute 'columns' - PySpark

Hi all, I am getting the below error when ingesting data from a source file (the source file is also attached). I have tried in both Community Edition and Azure Databricks and get the same error. Can anyone suggest a solution? # ...

Latest Reply
Kaniz
Community Manager

Hi @Rakesh Reddy Gopidi, we haven't heard from you since my last response, and I was checking back to see if you have a resolution yet. If you have any solution, please share it with the community as it can be helpful to others. Otherwise...
1 More Replies
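The `AttributeError` in the thread above typically means a plain Python list (for example raw rows never passed to `spark.createDataFrame`, or the result of `collect()`) is being used where a DataFrame is expected. A minimal pure-Python reproduction of the failure, outside Spark:

```python
# Minimal reproduction of the error outside Spark: a plain Python list has
# no .columns attribute. The usual fix is to build a DataFrame from the raw
# rows first (e.g. spark.createDataFrame(rows, schema)) before using .columns.
rows = [("a", 1), ("b", 2)]  # raw rows, not a DataFrame

try:
    rows.columns  # what code hitting this bug effectively attempts
except AttributeError as exc:
    message = str(exc)  # "'list' object has no attribute 'columns'"
```

Checking `type(df)` right before the failing line usually pinpoints where the DataFrame silently became a list.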
parulpaul
by New Contributor III
  • 1354 Views
  • 5 replies
  • 7 kudos
Latest Reply
parulpaul
New Contributor III

No solution found
4 More Replies