Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

NandiniN
by Databricks Employee
  • 3579 Views
  • 1 reply
  • 2 kudos

How to collect a thread dump from the Databricks Spark UI

If you observe a hung job, thread dumps are crucial for determining the root cause, so it is a good idea to collect them before cancelling the hung job. Here are the instructions to collect a Spark driver/executor thread dump: ...
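The post's steps are truncated above. As a minimal sketch of one command-line way to capture a driver-side dump from a notebook cell (not the UI route the post describes, and assuming pgrep and the JDK's jstack are available on the driver node):

```python
# Minimal sketch, assuming pgrep and jstack exist on the driver node; for
# executor dumps, the Spark UI route described in the post is the practical path.
import subprocess

# Find the PIDs of JVM processes running on the driver.
pids = subprocess.run(
    ["pgrep", "-f", "java"], capture_output=True, text=True
).stdout.split()

for pid in pids:
    dump = subprocess.run(["jstack", pid], capture_output=True, text=True)
    if dump.returncode == 0:
        print(f"=== Thread dump for PID {pid} ===")
        print(dump.stdout)
```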

Latest Reply
jose_gonzalez
Databricks Employee
  • 2 kudos

Thank you for sharing @NandiniN

traillog
by New Contributor
  • 1220 Views
  • 0 replies
  • 0 kudos

Response code 400 received when using VSCode on Windows 10 but no issue while using Ubuntu

I use VSCode on Windows 10 for building and deploying a workflow from my system and always encounter response code 400 when trying to deploy it. I am able to deploy the workflows via Ubuntu, but not via Windows. Has anyone encountered this issue befo...

zgreen
by New Contributor
  • 681 Views
  • 0 replies
  • 0 kudos

jobs.python_wheel_task.entry_point can't find entry points defined in dependency packages

Let's say I have packageA with no entry points; packageA depends on the dependencyA package, which has entry points. In order to be able to use those entry points, i.e.
```yaml
python_wheel_task:
  package_name: packageA
  entry_point: dependencyA_entry
```
I ...

Verr
by New Contributor II
  • 1596 Views
  • 2 replies
  • 0 kudos

child notebook is not displaying output.

I have built a pipeline to execute a Databricks notebook containing SQL scripts. It executes the notebook, but I am not able to see the output of each cell. I am executing the child notebook through a driver notebook.
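One point worth keeping in mind alongside the reply below: how the child notebook is invoked determines whether its cell output shows up in the driver notebook. A minimal sketch, with a placeholder notebook path:

```python
# Minimal sketch; "/Shared/child_sql" and the timeout value are placeholders.

# dbutils.notebook.run launches the child as a separate ephemeral run, so its
# cell-by-cell output is not rendered in the parent; only the value passed to
# dbutils.notebook.exit is returned (plus a link to the child run).
result = dbutils.notebook.run("/Shared/child_sql", 600)
print(result)

# %run, in its own cell, inlines the child notebook, so its cell output is
# displayed in the parent:
# %run /Shared/child_sql
```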

Latest Reply
koushiknpvs
New Contributor III
  • 0 kudos

Hi Verr, in short it depends on how your child notebook is configured, but I would start with the following points. Output logging settings: check the logging settings for your notebook cells and ensure that the cells are configured to display output. In...

1 More Reply
GeKo
by New Contributor III
  • 4167 Views
  • 6 replies
  • 2 kudos

Resolved! column "storage_sub_directory" is now always NULL in system.information_schema.tables

Hello, I am running a job that depends on the information provided in the column storage_sub_directory in system.information_schema.tables ... and it worked until 1-2 weeks ago. Now I discovered in the doc that this column is deprecated and always null, ...
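The resolution itself is not visible in this listing. As one hedged workaround idea (not taken from the thread), a table's storage location can still be read from DESCRIBE TABLE EXTENDED; the table name below is a placeholder:

```python
# Minimal sketch; "main.default.my_table" is a placeholder table name.
rows = spark.sql("DESCRIBE TABLE EXTENDED main.default.my_table").collect()

# DESCRIBE TABLE EXTENDED returns (col_name, data_type, comment) rows; the
# "Location" row holds the table's storage path.
location = next((r.data_type for r in rows if r.col_name == "Location"), None)
print(location)
```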

Community Platform Discussions
Unity Catalog
unitycatalog
Latest Reply
GeKo
New Contributor III
  • 2 kudos

Many thanks for the update @NandiniN 

5 More Replies
Lucifer
by New Contributor
  • 1147 Views
  • 1 reply
  • 0 kudos

Displaying Unity Catalog metadata and other information in SharePoint

Are there any connectors or APIs we can use to display metadata information stored in Unity Catalog to business users in SharePoint?

Latest Reply
Ajay-Pandey
Esteemed Contributor III
  • 0 kudos

Hi @Lucifer, as the metadata is stored as tables in the system schema, you can use Databricks to extract the data and display it in SharePoint. Docs: Statement Execution API: Run SQL on warehouses | Databricks on AWS
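A minimal sketch of calling the Statement Execution API mentioned in the reply, assuming a personal access token and a running SQL warehouse; the host, token, and warehouse ID below are placeholders:

```python
# Minimal sketch; host, token and warehouse_id are placeholders.
import requests

host = "https://<your-workspace>.cloud.databricks.com"
token = "<personal-access-token>"

resp = requests.post(
    f"{host}/api/2.0/sql/statements",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "warehouse_id": "<warehouse-id>",
        "statement": (
            "SELECT table_catalog, table_schema, table_name, comment "
            "FROM system.information_schema.tables LIMIT 100"
        ),
        "wait_timeout": "30s",
    },
    timeout=60,
)
resp.raise_for_status()

# With the default inline disposition, rows come back in result.data_array.
for row in resp.json().get("result", {}).get("data_array", []):
    print(row)
```

The rows could then be pushed to SharePoint through whatever integration the business side already uses, such as a SharePoint list or an embedded page.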

_databreaks
by New Contributor II
  • 1093 Views
  • 0 replies
  • 0 kudos

DLT to push data instead of a pull

I am relatively new to Databricks, and from my recent experience it appears that at every step in a DLT pipeline, we define each LIVE TABLE (be it streaming or not) to pull data from upstream. I have yet to see an implementation where data from upstream woul...

NarenderKumar
by New Contributor III
  • 3716 Views
  • 1 reply
  • 1 kudos

Resolved! Unable to provide access in unity catalog using SQL commands

I am trying to provide access in Unity Catalog using SQL commands. I am following the documentation below: https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-privileges/ It suggests creating SQL commands in belo...

Latest Reply
mhiltner
Databricks Employee
  • 1 kudos

Try this. For some reason the quotes are crazy when using these commands:
GRANT USAGE ON CATALOG `uda_dev` TO `your-group`
GRANT SELECT ON SCHEMA uda_dev.default TO `your-group` (without quotes for the schema)

Kuchnhi
by New Contributor
  • 807 Views
  • 1 reply
  • 0 kudos

Databricks workflow jobs run is taking Double time in EU Region

We have a scheduled job in a Databricks workflow. This job run is now taking around 5 hours; a month ago it was taking about 2.5 hours. Can anyone tell me what may be the reason behind this? Note: there is no change that has been made in this period of ...

Latest Reply
Wojciech_BUK
Valued Contributor III
  • 0 kudos

You can check if you are using spot instances on your job cluster. Btw, if you are using Azure, West Europe is in very high demand and sometimes it takes time to provision compute, but it should be a matter of minutes, not hours. Check maybe if your data v...

ducng
by New Contributor II
  • 8945 Views
  • 3 replies
  • 0 kudos

Resolved! How to pass variables to a python file job

Hi everyone, it's relatively straightforward to pass a value to a key-value pair in a notebook job, but for a Python file job I couldn't figure out how to do it. Does anyone have any idea? I have been trying out different variations for a job wi...
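The thread is marked resolved but the answer is not shown in this listing. As a hedged sketch of one common pattern, a Python-file task receives the job's "parameters" list as ordinary command-line arguments, which the script can read with argparse or sys.argv; the --env flag below is a made-up example:

```python
# Minimal sketch; --env is a made-up example parameter passed via the task's
# "parameters" field, e.g. ["--env", "prod"].
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--env", default="dev")
args = parser.parse_args()

print(f"Running with env={args.env}")
```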

Latest Reply
Bruno-Castro
New Contributor II
  • 0 kudos

Thanks so much for this! By the way, is there a way to do it with the JSON interface? I am struggling to get the parameters if entered in this way  

2 More Replies
Nagrjuna
by New Contributor II
  • 1076 Views
  • 1 reply
  • 0 kudos

Chat Bot with Azure blob and databricks

Hi Team, I am thinking of starting a chat bot application for Teams to query data from Azure Blob and Databricks tables, in the Python programming language. Please help me out on how I can start and which tools I can use for this requirement. Thanks in advanc...

Latest Reply
Yeshwanth
Databricks Employee
  • 0 kudos

@Nagrjuna , that's a great idea! Although we do not know about your use case completely, I am sure you would definitely fall in love with our AI/ML Products. To create a Python chat bot application that can pull data from Azure Blob Storage and Datab...

benitoski
by New Contributor II
  • 1706 Views
  • 1 reply
  • 1 kudos

Resolved! Workspace FileNotFoundException

I have a model created with CatBoost and exported in ONNX format in the workspace, and I want to download that model to my local machine. I tried to use the Export option in the three-dot menu to the right of the model, but the model is larger than 10 MB ...

Latest Reply
feiyun0112
Honored Contributor
  • 1 kudos

You need to put the file in FileStore: https://docs.databricks.com/en/dbfs/filestore.html#save-a-file-to-filestore
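A minimal sketch of that FileStore route, with placeholder paths for wherever the exported ONNX model actually lives:

```python
# Minimal sketch; both paths are placeholders. A workspace file may need the
# file:/Workspace/... scheme as the source.
dbutils.fs.cp(
    "dbfs:/tmp/model.onnx",                  # placeholder source path
    "dbfs:/FileStore/models/model.onnx",     # destination under /FileStore
)

# Files under /FileStore can then be downloaded in a browser at:
#   https://<your-workspace>/files/models/model.onnx
```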

Benedetta
by New Contributor III
  • 2084 Views
  • 1 reply
  • 0 kudos

What happened to the ephemeral notebook links????? and the job ids????

Hey Databricks,      Why did you remove the ephemeral notebook links and job Ids from the parallel runs? This has created a huge gap for us. We can no longer view the ephemeral notebooks, and also the Jobids are missing from the output. Waccha doing?...

Latest Reply
Benedetta
New Contributor III
  • 0 kudos

Hi Kaniz,    It's funny you mention these things - we are doing some of those - the problem now is that the JobId is obscured from the output meaning we can't tell which ephemeral notebook goes with which JobId.  It looks like the ephemeral notebook ...

Sudheer2
by New Contributor III
  • 3557 Views
  • 0 replies
  • 0 kudos

Updating Databricks SQL Warehouse using Terraform

We can update a SQL Warehouse manually in Databricks: click SQL Warehouses in the sidebar, and under Advanced options we can find the Unity Catalog toggle there. While updating an existing SQL Warehouse in Azure to enable Unity Catalog using Terraform, I couldn'...


Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group