Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

by mjbobak, New Contributor III
  • 23873 Views
  • 5 replies
  • 9 kudos

Resolved! How to import a helper module that uses Databricks-specific modules (dbutils)

I have a main Databricks notebook that runs a handful of functions. In this notebook, I import a helper.py file that lives in the same repo, and when I execute the import everything looks fine. Inside my helper.py there's a function that leverages built-i...

Latest Reply
amitca71
Contributor II
  • 9 kudos

Hi, I'm facing a similar issue when deploying via dbx. I have a helper notebook that works fine when executed via jobs (without any includes), but when I deploy it via dbx (to the same cluster), the helper notebook fails on dbutils.fs.ls(path) with a NameEr...

4 More Replies
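For readers hitting the same NameError, a minimal sketch of one common workaround (assuming a Databricks runtime; the helper and function names here are hypothetical, not taken from the thread) is to have the helper obtain its own DBUtils handle from the Spark session instead of relying on the notebook-global dbutils:

# helper.py -- illustrative only
from pyspark.sql import SparkSession

def get_dbutils(spark: SparkSession):
    try:
        # Works on Databricks clusters, where pyspark ships a DBUtils wrapper
        from pyspark.dbutils import DBUtils
        return DBUtils(spark)
    except ImportError:
        # Fallback when running inside a notebook, where dbutils is a global
        import IPython
        return IPython.get_ipython().user_ns["dbutils"]

def list_files(path: str):
    # Resolve dbutils lazily so the module imports cleanly outside a notebook
    spark = SparkSession.builder.getOrCreate()
    return get_dbutils(spark).fs.ls(path)

Because the helper no longer depends on the notebook's globals, calling list_files from a notebook, a job, or a dbx deployment should resolve dbutils the same way in each case.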
by Tsar, New Contributor III
  • 12050 Views
  • 10 replies
  • 12 kudos

Limitations with UDFs wrapping modules imported via Repos files?

We have been importing custom module wheel files from our AzDevOps repository. We are pushing to use Databricks Repos arbitrary files to simplify this, but it breaks our Spark UDF that wraps one of the functions in the library with a ModuleNo...

Latest Reply
Scott_B
New Contributor III
  • 12 kudos

If your notebook is in the same Repo as the module, this should work without any modifications to the sys path. If your notebook is not in the same Repo as the module, you may need to ensure that the sys path is correct on all nodes in your cluster th...

9 More Replies
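As a rough sketch of the sys-path approach Scott_B describes (the repo path and module names below are placeholders, and shipping the file to executors via addPyFile is an assumption on top of the thread, not its confirmed fix):

import sys
from pyspark.sql import SparkSession
from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

spark = SparkSession.builder.getOrCreate()

# Make the repo importable on the driver (placeholder path)
sys.path.append("/Workspace/Repos/my_user/my_repo")
from my_lib import my_func

# A UDF executes on the workers, so the module must be importable there too;
# one option is to ship the file to the executors explicitly:
spark.sparkContext.addPyFile("/Workspace/Repos/my_user/my_repo/my_lib.py")

# Wrap the library function in a UDF and apply it
my_udf = udf(my_func, StringType())
df = spark.range(3).withColumn("out", my_udf("id"))

If the notebook and the module live in the same Repo, the sys.path addition should not be needed; the ModuleNotFoundError typically only appears once the UDF has to import the module on a worker.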
by kkawka1, New Contributor III
  • 10074 Views
  • 7 replies
  • 10 kudos

Resolved! Removing files saved in the root FileStore

We have just started working with Databricks in one of my university modules, and the lecturers gave us a set of commands to practice saving data in the FileStore. One of the commands was the following: dbutils.fs.cp("/databricks-datasets/weathh...

Latest Reply
-werners-
Esteemed Contributor III
  • 10 kudos

You can delete files using the Data Explorer in the Databricks web UI. Another option is to use %fs or %sh in a notebook.

6 More Replies
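For completeness, a minimal sketch of the notebook cleanup options mentioned in the reply (the FileStore path below is illustrative, not the one from the course material):

# Python, in a notebook cell: remove a copied file or folder recursively
dbutils.fs.rm("/FileStore/my_copied_data", recurse=True)

# Magic-command equivalents, each in its own cell:
# %fs rm -r /FileStore/my_copied_data
# %sh rm -r /dbfs/FileStore/my_copied_data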