logging.basicConfig not creating a file in Databricks

chandan_a_v
Valued Contributor

Hi,

I am using the Python logger to log some parameters in my code, and I want to save the log file under DBFS. For some reason the file is not getting created under DBFS while the notebook is running; only after I clear the notebook's state and check the DBFS directory does the file appear. Please let me know if anyone has any idea about this. I have already tried all the solutions I could find online, but none of them fixed the issue.

I have pasted the code below to reproduce the issue.

import logging

log_file = "e.log"

logging.basicConfig(
    filename="/dbfs/FileStore/" + log_file,  # write through the DBFS FUSE mount
    format="[%(filename)s:%(lineno)s %(asctime)s] %(message)s",
    level=logging.INFO,
    force=True,  # replace any handlers already attached to the root logger
)

__LOG__ = logging.getLogger(__name__)

__LOG__.info("Starting ETL staging pipeline only!")

# The file does not show up in this listing while the notebook is running.
display(dbutils.fs.ls("dbfs:/FileStore/"))
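One likely explanation, though not confirmed in this thread, is that the /dbfs FUSE mount buffers writes and only makes them visible once the file handle is closed; clearing the notebook state closes the logging handler, which is why the file appears afterwards. A minimal sketch to test this, assuming the repro code above has already run:

import logging

# Flush and close all handlers on the root logger; this should force any
# buffered writes through the /dbfs FUSE mount.
logging.shutdown()

# If buffering is the cause, the file is now visible without clearing
# the notebook state.
display(dbutils.fs.ls("dbfs:/FileStore/"))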


11 REPLIES

chandan_a_v
Valued Contributor

@Kaniz Fatma​ ,

Please let me know if you have any idea regarding this.

@Kaniz Fatma​,

Did you find any solution? Please let me know.

Kaniz
Community Manager

Hi @Chandan Angadi, can you specify which DBR runtime you're using?

Hi @Kaniz Fatma​ ,

I am using 10.4 LTS.

Kaniz
Community Manager

Hi @Chandan Angadi, did you get a chance to look at this Stack Overflow thread? Please let us know if it helps.

Hi @Kaniz Fatma​,

As I mentioned in the problem description, the file is not created at all while the notebook is running; only after I clear the notebook's state and check the DBFS directory is the file present. Since I create the log file in the notebook and then need to upload it to an AWS S3 location, logging has to behave the way it does in a normal Python environment. I would recommend using the code I posted to reproduce the issue and fixing it on Databricks' side, as other people might face the same issue in the future.
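For the S3 step, a minimal sketch, assuming the log file already exists on DBFS and the cluster has access to the bucket (the bucket name and path here are hypothetical):

# Copy the finished log file from DBFS to S3.
# "s3://my-bucket/logs/" is a hypothetical destination; the cluster needs
# credentials for the bucket, e.g. via an instance profile.
dbutils.fs.cp("dbfs:/FileStore/e.log", "s3://my-bucket/logs/e.log")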

Hi @Kaniz Fatma​,

Any update on my request? Please fix this issue on your end as soon as possible.

Thanks,

Chandan

Accepted Solution

Hi @Kaniz Fatma,

The issue can be solved by replacing the file handler with a StreamHandler that wraps a StringIO object. Also, please reply to our questions when you have some free time, and when someone points out a bug in Databricks, it would be good to notify the development team.

Thanks,

Chandan
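A minimal sketch of that workaround, assuming the goal is to accumulate the log text in memory and then write it to DBFS in one shot (the handler setup and the target path are illustrative, not the exact code from the thread):

import io
import logging

# Accumulate log records in an in-memory buffer instead of writing
# through an open DBFS file handle.
log_buffer = io.StringIO()

handler = logging.StreamHandler(log_buffer)
handler.setFormatter(
    logging.Formatter("[%(filename)s:%(lineno)s %(asctime)s] %(message)s")
)

logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)
logger.addHandler(handler)

logger.info("Starting ETL staging pipeline only!")

# Persist the accumulated text to DBFS in a single operation;
# the third argument overwrites an existing file.
dbutils.fs.put("dbfs:/FileStore/e.log", log_buffer.getvalue(), True)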

Kaniz
Community Manager
Community Manager

Hi @Chandan Angadi​, Awesome! Thanks for the update. I will look into it and get back to you.

Would you mind selecting the best answer for us?

Anonymous
Not applicable

Perhaps PyCharm sets a different working directory, meaning the file ends up in another place. Try providing a full path.

Hi @Halen15 Noyes​ ,

I am executing the notebook in Databricks, and there the file is saved under DBFS.
