- 1770 Views
- 1 replies
- 2 kudos
Hello, I cloned a repo my_repo into the Databricks Repos space. Inside my_repo, I created a notebook new_experiment from which I can import functions from my_repo, which is really handy. When I want to modify a function in my_repo, I open my local IDE, do the...
Latest Reply
Use
%reload_ext autoreload
instead; it will give you the behavior you expect. You only need to run it once, for example:
%load_ext autoreload
%autoreload 2
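The magics above automate what `importlib.reload` does by hand: re-execute a module's source so edits made in an external IDE take effect without restarting the kernel. A minimal stand-alone sketch of that mechanism (the module name `mymod` and its contents are made up for illustration):

```python
import importlib
import pathlib
import sys
import tempfile

# Write a throwaway module to disk, standing in for a file in the repo.
tmp = tempfile.mkdtemp()
mod_path = pathlib.Path(tmp) / "mymod.py"
mod_path.write_text("def answer():\n    return 1\n")
sys.path.insert(0, tmp)

import mymod
assert mymod.answer() == 1

# Edit the source on disk, as you would from a local IDE...
mod_path.write_text("def answer():\n    return 2  # edited\n")

# ...then reload. `%autoreload 2` performs this step automatically
# for all imported modules before each cell runs.
mymod = importlib.reload(mymod)
assert mymod.answer() == 2
```

`%autoreload 2` simply hooks this reload step into cell execution, which is why running the magic once per session is enough.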
by
jcoggs
• New Contributor II
- 3235 Views
- 2 replies
- 1 kudos
I have a notebook that calls dbutils.fs.ls() for some derived file path in Azure. Occasionally this path may not exist, and in general I can't always guarantee that the path exists. When the path doesn't exist it throws an "ExecutionError" which app...
Latest Reply
Hey @jcoggs, the problem looks legitimate, though it has never occurred to me because I keep my mounts manually fed to the pipeline using parameters or a variable. Doing this gives you more control over your pipelines; see if you could do the same in your...
1 More Replies
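A common workaround for the question above is to treat the listing call itself as the existence check, catching the error it raises for missing paths. Since `dbutils` only exists inside a Databricks runtime, the sketch below wraps a generic listing function and demonstrates it with `os.listdir` as a local stand-in; on Databricks you would pass `dbutils.fs.ls` instead (the helper name `path_exists` is hypothetical):

```python
import os

def path_exists(list_fn, path):
    """Return True if `list_fn(path)` succeeds, False if it raises.

    On Databricks, pass `dbutils.fs.ls` as `list_fn`; there, a missing
    path surfaces as an ExecutionError wrapping a FileNotFoundException,
    so catching broadly keeps the check simple.
    """
    try:
        list_fn(path)
        return True
    except Exception:
        return False

print(path_exists(os.listdir, "."))             # True
print(path_exists(os.listdir, "/no/such/dir"))  # False
```

Guarding the call this way lets the notebook branch on the result (skip, create the directory, or log) instead of failing the whole run.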
by
Maaax
• New Contributor
- 919 Views
- 0 replies
- 0 kudos
Hello dear community, I use the following command to register the provider, including shares, in Unity Catalog: databricks unity-catalog create-provider --name company --recipient-profile-json-file ~/Develop/profile.json. Once it is registered, I could see...
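For readability, the command from the post formatted on its own lines, plus a hedged follow-up: the legacy databricks CLI used kebab-case unity-catalog subcommands, so a `list-providers` call (an assumption here, not taken from the post) is the likely way to confirm the registration:

```shell
# Register the Delta Sharing provider in Unity Catalog
# (command and paths as given in the original post).
databricks unity-catalog create-provider \
  --name company \
  --recipient-profile-json-file ~/Develop/profile.json

# Assumed verification step in the same legacy CLI style:
databricks unity-catalog list-providers
```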