Yes, you can:
https://docs.databricks.com/user-guide/notebooks/notebook-workflows.html#example
You will get the return value, just as you would with a function.
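As a minimal sketch (the notebook path and timeout value are assumptions, and this only runs inside a Databricks workspace):

```python
# Caller notebook: run a child notebook and capture its return value.
# "/Shared/child_notebook" and the 60-second timeout are illustrative assumptions.
result = dbutils.notebook.run("/Shared/child_notebook", 60)
print(result)

# Child notebook: return the value by ending with
# dbutils.notebook.exit("some value")
```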
Hi @Emiliano Parizzi,
You could parse the timestamp after loading the file by using withColumn (cf. https://stackoverflow.com/questions/39088473/pyspark-dataframe-convert-unusual-string-format-to-timestamp).
from pyspark.sql import Row from pysp...
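A small sketch of the parsing idea, using a hypothetical input format (the real pattern depends on your file); in PySpark the equivalent Java-style pattern string goes into to_timestamp inside withColumn:

```python
from datetime import datetime

# Hypothetical raw value; the actual format depends on your file.
raw = "31-AUG-2016 14:05:09"

# Parse the unusual string format into a timestamp (Python pattern syntax).
parsed = datetime.strptime(raw, "%d-%b-%Y %H:%M:%S")
print(parsed)  # 2016-08-31 14:05:09

# The PySpark equivalent inside withColumn uses a Java-style pattern:
# df = df.withColumn("ts", F.to_timestamp("raw", "dd-MMM-yyyy HH:mm:ss"))
```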
Hi @rr_5454,
You will find the answer here https://forums.databricks.com/questions/10648/upload-local-files-into-dbfs-1.html
You will have to:
- get the file to local file storage
- move the file to DBFS
- load the file in a DataFrame
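The steps above can be sketched as follows (the paths are assumptions, and this relies on the Databricks environment, so it is not runnable locally):

```python
# 1. The file sits on local (driver) storage, e.g. /tmp/data.csv.
# 2. Copy it into DBFS (both paths are illustrative assumptions).
dbutils.fs.cp("file:/tmp/data.csv", "dbfs:/tmp/data.csv")

# 3. Load the file into a DataFrame.
df = spark.read.csv("dbfs:/tmp/data.csv", header=True, inferSchema=True)
```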
This is one of the p...
Hi,
I finally managed to do what you want.
You just have to write at the end of your notebook:
dbutils.notebook.exit(<json or string content>)
Then you set up a Notebook activity in Data Factory. And in the Azure Function activity, you pass a string like thi...
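As a hedged sketch of the payload side (the keys and values here are assumptions); in Data Factory, the string returned by the notebook is then available through the activity output:

```python
import json

# Build the JSON payload to return from the notebook
# (the keys and values are illustrative assumptions).
result = {"status": "ok", "rows_written": 1250}
payload = json.dumps(result)
print(payload)

# In Databricks you would then end the notebook with:
# dbutils.notebook.exit(payload)
# In Data Factory, the returned string is available as
# @activity('Notebook1').output.runOutput  (activity name is an assumption)
```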
Databricks File System (DBFS) can handle some files locally, or you can mount a point to a Blob Storage or a Data Lake. If you are using Data Lake gen2, there is not yet an SDK for Azure Functions.
First, you will write the content of a datafr...
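As a sketch of the mount-then-write flow for Blob Storage (the account, container, mount point, and secret names are all assumptions, and this needs a Databricks environment):

```python
# Mount a Blob Storage container into DBFS (all names are assumptions).
dbutils.fs.mount(
    source="wasbs://mycontainer@myaccount.blob.core.windows.net",
    mount_point="/mnt/mydata",
    extra_configs={
        "fs.azure.account.key.myaccount.blob.core.windows.net":
            dbutils.secrets.get(scope="my-scope", key="storage-key")
    },
)

# Write the content of a DataFrame to the mounted path.
df.write.mode("overwrite").csv("/mnt/mydata/output", header=True)
```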