Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

齐木木
by New Contributor III
  • 861 Views
  • 1 reply
  • 3 kudos

Resolved! The case class reports an error when running in the notebook

As shown in the figure, the case class and the JSON string are converted through fasterxml.jackson, but an unexpected error occurred while running the code. I think this problem may be related to how the notebook loads classes. Because...

Latest Reply
齐木木
New Contributor III
  • 3 kudos

code:
var str = "{\"app_type\":\"installed-app\"}"
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule
val mapper = new ObjectMapper()
mapper.registerModule(DefaultScalaModule)
...

TT1
by New Contributor III
  • 1363 Views
  • 3 replies
  • 8 kudos
Latest Reply
Kaniz
Community Manager
  • 8 kudos

Hi @Thao Ton, we haven't heard from you since the last response from @Hubert Dudek and @Aman Sehgal, and I was checking back to see if their suggestions helped you. Otherwise, if you have a solution, please share it with the community, as it can be he...

2 More Replies
yopbibo
by Contributor II
  • 1221 Views
  • 2 replies
  • 0 kudos

Resolved! Cluster configuration / notebook panel

Hi, is it possible to let regular users see all running notebooks (in the notebook panel of the cluster) on a specific cluster they can use (attach and restart)? By default, admins can see all running notebooks and users can see only their own notebo...

Latest Reply
Prabakar
Esteemed Contributor III
  • 0 kudos

Hi @Philippe CRAVE, a user can see a notebook only if they have permission on that notebook; otherwise they won't be able to see it. Unfortunately, there is no way for a normal user to see the notebooks attached to a cluster if they do not have per...

1 More Replies
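Since visibility follows notebook permissions, an admin can grant read access explicitly. A minimal sketch, assuming the Databricks Permissions REST API shape (`/api/2.0/permissions/notebooks/{id}` with an `access_control_list` payload); verify the endpoint and permission levels against your workspace's API docs before use, and note the host, notebook ID, and user below are placeholders:

```python
import json

# Sketch: build the request that grants CAN_READ on a notebook so a regular
# user can open it. Only constructs (url, body); sending it requires an HTTP
# client and a workspace token.
def build_notebook_permission_request(host, notebook_id, user_name,
                                      level="CAN_READ"):
    """Return (url, body) for a PATCH to the Permissions API (assumed shape)."""
    url = f"{host}/api/2.0/permissions/notebooks/{notebook_id}"
    body = json.dumps({
        "access_control_list": [
            {"user_name": user_name, "permission_level": level}
        ]
    })
    return url, body

url, body = build_notebook_permission_request(
    "https://example.cloud.databricks.com", "12345", "someone@example.com")
```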
lei_armstrong
by New Contributor II
  • 6039 Views
  • 7 replies
  • 5 kudos

Executing Notebooks - Run All Cells vs Run All Below

Due to dependencies, if one of our cells errors, we want the notebook to stop executing. We've noticed some odd behaviour when executing notebooks depending on whether "Run all cells in this notebook" is selected from the header versus "Run All Below"....

Latest Reply
pinecone
New Contributor II
  • 5 kudos

I second this request. It's odd that the behaviour is different when running all vs. running all below. Please make it consistent and document properly.

6 More Replies
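Until the two run modes behave consistently, a common workaround is to fail fast explicitly: raise an exception in a guard cell so no later cell runs regardless of which "Run all" variant was used. A minimal sketch; the `check_or_stop` helper name is our own:

```python
# Sketch: stop a notebook at the first failed check by raising, instead of
# relying on the run mode's stop-on-error behaviour.
class NotebookHalt(Exception):
    """Raised to keep downstream cells from executing on bad state."""

def check_or_stop(condition, message):
    """Raise NotebookHalt when the precondition for later cells fails."""
    if not condition:
        raise NotebookHalt(message)

check_or_stop(1 + 1 == 2, "arithmetic is broken")  # passes silently
```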
arda_123
by New Contributor III
  • 441 Views
  • 0 replies
  • 0 kudos

Databricks Notebook Dashboard

I want to update one widget based on another widget. It gets updated, but the dropdown still shows the last selection in the dashboard view; if I go from the dashboard view to the notebook view, it updates. Any help? Is it a bug?

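One way to keep dependent widgets manageable is to compute the dependent choices in pure Python and re-register the widget from them. A sketch of the choice-computation part only, with an illustrative region/city mapping; the `dbutils.widgets` calls in the comment require a Databricks runtime:

```python
# Sketch: compute the choices for a dependent dropdown. In a notebook you
# would follow this with something like:
#   dbutils.widgets.remove("city")
#   dbutils.widgets.dropdown("city", cities[0], cities)
# (dbutils is only available inside Databricks.)
REGION_CITIES = {
    "emea": ["London", "Paris"],
    "apac": ["Tokyo", "Sydney"],
}

def choices_for(region):
    """Return dropdown choices for the dependent widget."""
    return REGION_CITIES.get(region, ["<none>"])
```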
Atul_Sharan
by New Contributor II
  • 3430 Views
  • 3 replies
  • 3 kudos

Resolved! Error Code: 3206 - Processed HTTP request failed.

The ADF (Azure Data Factory) pipelines executing several Databricks Notebook activities in parallel have been failing regularly with the following error: "Error Code: 3206 - Processed HTTP request failed." The issue gets resolved on its own upon re...

Latest Reply
willjoe
New Contributor III
  • 3 kudos

Method 1 - Close conflicting programs: when you get a runtime error, keep in mind that it is happening due to programs that are conflicting with each other. The first thing you can do to resolve the problem is to stop these conflicting program...

2 More Replies
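Since the failure clears on rerun, it looks transient, and ADF's built-in activity retry settings are usually the simplest fix. If you need retries in code instead, a minimal retry-with-backoff sketch (the `flaky` function below is a stand-in for the failing call, not a real ADF API):

```python
import time

# Sketch: retry a transient operation with exponential backoff.
def retry(fn, attempts=3, base_delay=0.0):
    last = None
    for i in range(attempts):
        try:
            return fn()
        except Exception as exc:  # narrow the exception type in real code
            last = exc
            time.sleep(base_delay * (2 ** i))
    raise last

# Stand-in for a call that fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("Processed HTTP request failed.")
    return "ok"
```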
Vibhor
by Contributor
  • 1858 Views
  • 5 replies
  • 1 kudos

Resolved! Notebook level automated pipeline monitoring or failure notif

Hi, is there any way, other than ADF monitoring, to get notebook-level execution details in an automated way, without having to go into each pipeline and check?

Latest Reply
Anonymous
Not applicable
  • 1 kudos

@Vibhor Sethi​ - Would you be happy to mark @Werner Stinckens​' answer as best if it resolved your question?

4 More Replies
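One automated alternative to the ADF monitor is polling the Databricks Jobs API for run results. A sketch that only builds the request URL, assuming the Jobs API 2.1 `runs/list` endpoint; confirm the API version and parameters your workspace exposes before relying on this:

```python
from urllib.parse import urlencode

# Sketch: construct a Jobs API runs/list URL for polling completed runs of
# one job. Sending the request still needs an HTTP client and a token.
def runs_list_url(host, job_id, completed_only=True, limit=25):
    params = urlencode({
        "job_id": job_id,
        "completed_only": str(completed_only).lower(),
        "limit": limit,
    })
    return f"{host}/api/2.1/jobs/runs/list?{params}"
```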
Jan_A
by New Contributor III
  • 3212 Views
  • 3 replies
  • 3 kudos

Resolved! How to include notebook dashboards in repos (github)?

Goal: I would like dashboards in notebooks to be added to repos (GitHub). When I commit and push changes to GitHub, the dashboard part is not included. Is there a way to include the dashboard in the repo? When I later pull, only the notebook code is...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 3 kudos

There is an API to get dashboards, so you would need to build a custom CI/CD deployment with a step that fetches the dashboard and dashboard elements through the API and then saves the returned JSON to git. You could also deploy a script to an Azure Function or AWS Lambda to d...

2 More Replies
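The "save the returned JSON to git" step from the reply can be a small helper. A sketch, assuming you have already fetched the dashboard payload as a dict; it writes pretty-printed, key-sorted JSON so commits produce stable, reviewable diffs:

```python
import json
from pathlib import Path

# Sketch: persist an API response as formatted JSON for version control.
def save_dashboard_json(payload, out_path):
    """Write payload to out_path as pretty-printed JSON; return the Path."""
    path = Path(out_path)
    path.write_text(json.dumps(payload, indent=2, sort_keys=True))
    return path
```

A CI job would call this once per dashboard, then `git add`/`git commit` the output directory.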
test_data
by New Contributor III
  • 3305 Views
  • 4 replies
  • 2 kudos

need to move notebook file from workspace to dbfs.

Hi team, I need to move a notebook file from the workspace to DBFS. I have tried, but I am getting an error that there is no file?

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 2 kudos

Not sure what command you are using and what the error is. Access to the workspace is managed per user on Databricks, so %sh magic commands will not work, as you could otherwise see other users' files. DBFS is data storage. Notebook code from the workspace can be moved to R...

3 More Replies
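Because notebooks live in the workspace rather than on DBFS, the usual route is the Workspace API export endpoint (`GET /api/2.0/workspace/export`), which returns the notebook source base64-encoded. A sketch of the decode step only, with a toy response; verify the endpoint and response fields against the Workspace API docs:

```python
import base64

# Sketch: decode the "content" field of a Workspace API export response
# before writing the source to DBFS or local disk.
def decode_export(response_json):
    """response_json is the parsed export response, e.g. {"content": "..."}."""
    return base64.b64decode(response_json["content"]).decode("utf-8")

# Toy response standing in for the real API reply:
sample = {"content": base64.b64encode(b"print('hi')").decode("ascii")}
```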
DamienSicard
by New Contributor III
  • 6003 Views
  • 4 replies
  • 2 kudos

Resolved! Notebooks font size

Hi, is there a way to increase the cells' font size and set it as a default setting? Thanks. Best, Damien

Latest Reply
Kaniz
Community Manager
  • 2 kudos

Hi @Damien Sicard, as @werners has stated, you can zoom your browser.

3 More Replies
SailajaB
by Valued Contributor III
  • 10793 Views
  • 9 replies
  • 6 kudos

How to send a list as parameter in databricks notebook task

Hi, how can we pass a list as a parameter to a Databricks notebook, to run the notebook in parallel for a list of values? Thank you

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 6 kudos

Another way (in Databricks you can achieve everything in many ways) is to encode the list using the json library: import json; print(type(json.dumps([1, 2, 3]))) # >> <class 'str'>

8 More Replies
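Notebook parameters and widgets are strings, so the json approach above amounts to serializing on the caller side and parsing in the child notebook. A minimal sketch; the widget name "values" and the child notebook path in the comments are illustrative, and the `dbutils` calls require a Databricks runtime:

```python
import json

# Sketch: pass a list through a string-valued notebook parameter.
def encode_param(values):
    """Serialize a list for use as a notebook parameter value."""
    return json.dumps(values)

def decode_param(raw):
    """Parse the list back inside the child notebook."""
    return json.loads(raw)

# Caller (inside Databricks):
#   dbutils.notebook.run("/path/child", 600, {"values": encode_param([1, 2, 3])})
# Child notebook:
#   values = decode_param(dbutils.widgets.get("values"))
```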
MadelynM
by New Contributor III
  • 1349 Views
  • 2 replies
  • 4 kudos

Resolved! Why isn't my notebook search function working?

My search function is broken. I can't search for notebook contents.

Latest Reply
lizou
Contributor II
  • 4 kudos

Here is a tool available: elsevierlabs-os/NotebookDiscovery: Notebook Discovery Tool for Databricks notebooks (github.com). See also: How to Catalog and Discover Your Databricks Notebooks Faster - The Databricks Blog

1 More Replies
User16826992666
by Valued Contributor
  • 1840 Views
  • 1 reply
  • 0 kudos

Resolved! Is there a limit to the number of data points displayed in notebook visualizations?

I know that when you display the results of queries in notebooks there is a limit to the number of rows that are shown. Is there a similar limit to the results that are displayed in visuals within notebooks?

Latest Reply
sean_owen
Honored Contributor II
  • 0 kudos

Yes, still limited to 1000 rows / data points. However, when your visualization involves things like sums or averages of a Spark DataFrame's result, those will be performed on the cluster, so would involve maybe many more than 1000 data points, even ...

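Given the ~1000-point display cap, one way to keep a chart faithful is to pre-aggregate before calling display(). A pure-Python sketch of bucketed averaging; on a real cluster you would do the equivalent aggregation in Spark first, as the reply notes:

```python
# Sketch: reduce a numeric series to at most max_points values by averaging
# fixed-size buckets, so the display limit does not silently truncate it.
def downsample(points, max_points=1000):
    if len(points) <= max_points:
        return points
    bucket = -(-len(points) // max_points)  # ceiling division
    return [
        sum(points[i:i + bucket]) / len(points[i:i + bucket])
        for i in range(0, len(points), bucket)
    ]
```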