02-18-2015 01:26 PM
02-09-2023 11:12 AM
You can use %run followed by the exact path of the notebook you want to call. Because %run executes the other notebook inline in the same session, any variables you define beforehand are visible to it. In PySpark, just assign them first:
Variable1 = 'Valuea'
Variable2 = 'Valueb'
For Scala, use:
val Vara = "Valuea"
val Varb = "Valueb"
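A minimal sketch of the pattern above, as two Databricks notebook cells (the notebook path is hypothetical, and %run is a Databricks magic that must sit on its own line, so this is not runnable outside a Databricks notebook):

```python
# Cell 1 – define the variables the called notebook will use;
# they live in the shared session namespace
Variable1 = 'Valuea'
Variable2 = 'Valueb'
```

```python
# Cell 2 – run the other notebook inline; it can read
# Variable1 and Variable2 directly (path is an example)
%run /Shared/child_notebook
```

Note that %run shares state in both directions: definitions made inside the called notebook also become available in the caller afterwards.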
04-27-2023 01:08 PM
You can use a JSON file to temporarily store the arguments you want to pass to your notebook. Define the argument list and write it out as JSON with json.dumps(). Once the argument file is created, open it inside the notebook and recover the arguments by reading its contents and converting the JSON back into a dictionary with json.loads().
https://kb.databricks.com/jobs/pass-arguments-to-a-notebook-as-a-list
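The file-based round trip described above can be sketched like this (the file path and argument names are made up for illustration):

```python
import json

# Hypothetical arguments the caller wants the notebook to receive
args = {"run_date": "2023-04-27", "env": "dev", "retries": 3}

# Caller side: serialize the arguments to a temporary JSON file
with open("/tmp/notebook_args.json", "w") as f:
    f.write(json.dumps(args))

# Notebook side: read the file and turn the JSON back into a dict
with open("/tmp/notebook_args.json") as f:
    loaded = json.loads(f.read())

print(loaded["env"])  # -> dev
```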
04-28-2023 08:50 AM
To pass arguments/variables to a notebook, you can bundle them into a single JSON payload. Define the argument list, serialize it with json.dumps(), and pass the resulting string (or file) to the notebook as one argument. Inside the notebook, parse it back into a dictionary with json.loads().
PS: Check #DAIS2023 talks
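The single-argument variant can be sketched as below; the notebook path and parameter key in the commented dbutils call are hypothetical, and dbutils itself only exists inside Databricks, so that line is shown as a comment:

```python
import json

# Hypothetical parameters to hand to the child notebook
params = {"table": "sales", "limit": 100}

# Collapse everything into one JSON string argument
payload = json.dumps(params)

# Inside Databricks you might pass it along like this (not runnable locally):
# dbutils.notebook.run("/Shared/child_notebook", 60, {"args": payload})

# Notebook side: rebuild the dictionary from the single argument
parsed = json.loads(payload)

print(parsed["table"])  # -> sales
```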