- 26078 Views
- 5 replies
- 9 kudos
I have a main Databricks notebook that runs a handful of functions. In this notebook, I import a helper.py file that lives in the same repo, and when I execute the import everything looks fine. Inside my helper.py there's a function that leverages built-i...
Latest Reply
Hi, I'm facing a similar issue when deploying via dbx. I have a helper notebook that works fine when executed via Jobs (without any includes), but when I deploy it via dbx (to the same cluster), the helper notebook fails on dbutils.fs.ls(path) with a NameEr...
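For context, a common workaround in this situation is to construct dbutils explicitly inside the imported module instead of relying on the global that Databricks injects only into notebooks. A minimal sketch, with illustrative names and a fallback that assumes an interactive IPython session:

from pyspark.sql import SparkSession

def get_dbutils(spark: SparkSession):
    # On a Databricks cluster, DBUtils can be constructed from the SparkSession
    try:
        from pyspark.dbutils import DBUtils
        return DBUtils(spark)
    except ImportError:
        # Fallback: pull the notebook-injected global from the IPython namespace
        import IPython
        return IPython.get_ipython().user_ns["dbutils"]

def list_path(path: str):
    spark = SparkSession.builder.getOrCreate()
    dbutils = get_dbutils(spark)
    return dbutils.fs.ls(path)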
4 More Replies
- 29572 Views
- 4 replies
- 1 kudos
Hi, I am considering creating libraries for my Databricks notebooks, and found that it is possible to import functions from modules saved in repos. Is it possible to move the .py files with the functions to Workspace/Shared and still import functions ...
Latest Reply
Hi @Christine Pedersen, hope everything is going great. Just wanted to check in on whether you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell ...
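For reference, the pattern being asked about usually comes down to making the shared folder importable before the import statement runs. A minimal sketch, assuming the helper file has been moved to a hypothetical /Workspace/Shared/libs/my_helpers.py:

import sys

# Make the shared Workspace folder importable (path is illustrative)
sys.path.append("/Workspace/Shared/libs")

from my_helpers import some_function  # hypothetical module and function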
3 More Replies
by kll • New Contributor III
- 7928 Views
- 2 replies
- 0 kudos
`AttributeError` when attempting to transfer files from the `dbfs` filestore in Databricks to a local directory.
import pyspark.dbutils as pdbutils
pdbutils.fs.cp("/dbfs/Data/file1.csv", "/Users/Downloads/")
Traceback (most recent call last):
...
Latest Reply
Hi @Keval Shah, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers yo...
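As a side note, the AttributeError is consistent with pyspark.dbutils being a module rather than a ready-to-use handle. A sketch of how this copy is usually written, with illustrative destination paths that are not from the thread:

# Inside a notebook, using the injected dbutils handle; "file:/" targets the driver's local disk
dbutils.fs.cp("dbfs:/Data/file1.csv", "file:/tmp/file1.csv")

# Or copy via the /dbfs FUSE mount with the standard library
import shutil
shutil.copy("/dbfs/Data/file1.csv", "/tmp/file1.csv")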
1 More Replies
- 9179 Views
- 5 replies
- 2 kudos
I have been following the Terraform Databricks provider documentation in order to provision account-level resources on AWS. I can create the workspace fine, add users, etc... However, when I go to use the provider in non-mws mode, I am re...
Latest Reply
So the answer to this was that you need to explicitly pass the provider argument to each of the data resource blocks. The docs should be updated to reflect that, i.e.:
data "databricks_spark_version" "latest" {
  provider = databricks.workspace
  ...
4 More Replies
by tariq • New Contributor III
- 8262 Views
- 4 replies
- 0 kudos
I'm not sure how a simple thing like importing a module in Python can be so broken in such a product. First, I was able to make it work using the following:
import sys
sys.path.append("/Workspace/Repos/Github Repo/sparkling-to-databricks/src")
from ut...
Latest Reply
I too wonder the same thing. How can importing a Python module be so difficult and not even documented, lol. No need for libraries. Here's what worked for me. Step 1: Upload the module by first opening a notebook >> File >> Upload Data >> drag and drop ...
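To make the upload approach above concrete, a rough sketch (the upload location and module name are illustrative; the actual path depends on where the Upload Data dialog placed the file):

import sys

# Point Python at the location the file was uploaded to (path is hypothetical)
sys.path.append("/dbfs/FileStore/shared_uploads/some.user@example.com")

import my_module  # hypothetical module name
my_module.some_function()  # hypothetical function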
3 More Replies
- 15490 Views
- 6 replies
- 5 kudos
I am running a notebook on the Coursera platform. My configuration file, Classroom-Setup, looks like this:
%python
spark.conf.set("com.databricks.training.module-name", "deep-learning")
spark.conf.set("com.databricks.training.expected-dbr", "6.4")
...
Latest Reply
Hi @Maria Bruevich, from the error description it looks like the mlflow library is not present. You can use an ML cluster, as that type of cluster already has the mlflow library. Please check the document below: https://docs.databricks.com/release-notes/r...
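If switching to an ML runtime cluster isn't an option, a possible alternative (a sketch, not from the thread) is to install mlflow into the notebook's own environment and verify the import:

# Run in its own notebook cell first
%pip install mlflow

# Then, in a following cell
import mlflow
print(mlflow.__version__)  # confirm the library is importable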
5 More Replies