06-09-2022 04:20 AM
Hi,
I am using the logger to log some parameters in my code, and I want to save the log file under DBFS. But for some reason the file is not created under DBFS while the notebook is running; if I clear the state of the notebook and then check the DBFS directory, the file is present. Please let me know if anyone has any idea about this. I have already tried all the solutions available on the internet, but none of them fixes the issue.
I have pasted the code below to reproduce the issue.
import logging

log_file = "e.log"

logging.basicConfig(
    filename="/dbfs/FileStore/" + log_file,
    format="[%(filename)s:%(lineno)s %(asctime)s] %(message)s",
    level=logging.INFO,
    force=True,
)

__LOG__ = logging.getLogger(__name__)
__LOG__.info("Starting ETL staging pipeline only!")
__LOG__.info("Starting ETL staging pipeline only!")

display(dbutils.fs.ls("dbfs:/FileStore/"))
Labels: Creating
Accepted Solutions
07-13-2022 12:34 AM
Hi @Kaniz Fatma,
The issue can be solved by replacing the file handler with a stream handler backed by a StringIO object. Also, please reply to our question when you have some free time, and when someone points out a bug in Databricks, it would be good to notify the development team.
Thanks,
Chandan
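
For reference, a minimal sketch of that workaround. The handler wiring and the final dbutils.fs.put call are illustrative rather than code from the original post, and dbutils is only available inside a Databricks notebook:

import io
import logging

# Buffer log records in memory instead of holding an open file handle on /dbfs
log_stream = io.StringIO()
handler = logging.StreamHandler(log_stream)
handler.setFormatter(
    logging.Formatter("[%(filename)s:%(lineno)s %(asctime)s] %(message)s")
)

logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)
logger.addHandler(handler)

logger.info("Starting ETL staging pipeline only!")

# Write the buffered log to DBFS in one shot; the file is visible immediately
dbutils.fs.put("dbfs:/FileStore/e.log", log_stream.getvalue(), overwrite=True)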
06-09-2022 04:21 AM
@Kaniz Fatma,
Please let me know if you have any idea regarding this.
06-10-2022 02:55 AM
@Kaniz Fatma,
Did you find any solution? Please let me know.
06-10-2022 12:56 PM
Hi @Kaniz Fatma,
I am using 10.4 LTS.
06-14-2022 07:12 AM
Hi @Kaniz Fatma,
As I mentioned in my problem description, the file is not created while the notebook is running; it only appears in the DBFS directory after I clear the state of the notebook. Since I create the log file in the notebook and need to upload it to an AWS S3 location, it has to behave the way a normal Python environment does. I would recommend using the code I posted to reproduce the issue and fixing it on Databricks' side, as other people might face the same issue in the future.
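
If the root cause is that logging.FileHandler keeps the file open (the DBFS FUSE mount only exposes the contents once the handle is flushed and closed), one possible workaround is to close the handlers before reading or uploading the file. A sketch under that assumption; the S3 destination below is a made-up placeholder:

import logging

# Flush and close all logging handlers so the file materializes under /dbfs
# (no further logging should happen through these handlers afterwards)
logging.shutdown()

# The log file should now be visible and can be copied onward, e.g. to S3
display(dbutils.fs.ls("dbfs:/FileStore/"))
dbutils.fs.cp("dbfs:/FileStore/e.log", "s3://my-bucket/logs/e.log")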
06-16-2022 11:39 PM
Hi @Kaniz Fatma,
Any update on my request? Please fix this issue from your end as soon as possible.
Thanks,
Chandan
06-10-2022 03:26 AM
Perhaps PyCharm sets a different working directory, meaning the file ends up in another place. Try providing a full path.
06-10-2022 03:45 AM
Hi @Halen15 Noyes,
I am executing the notebook in Databricks, and there we save the file under DBFS.

