
logging.basicConfig not creating a file in Databricks

chandan_a_v
Valued Contributor

Hi,

I am using the Python logger to record some parameters in my code, and I want the log file saved under DBFS. For some reason the file is not getting created under DBFS while the notebook is running; only after I clear the notebook state and check the DBFS directory does the file appear. Please let me know if anyone has an idea about this. I have already tried every solution I could find on the internet, but none of them fixes the issue.

I have pasted the code below to reproduce the issue.

import logging

log_file = "e.log"

logging.basicConfig(
    filename="/dbfs/FileStore/" + log_file,
    format="[%(filename)s:%(lineno)s %(asctime)s] %(message)s",
    level=logging.INFO,
    force=True,
)

__LOG__ = logging.getLogger(__name__)

__LOG__.info("Starting ETL staging pipeline only!")

display(dbutils.fs.ls("dbfs:/FileStore/"))

ACCEPTED SOLUTION

Hi @Kaniz Fatma,

The issue can be solved by replacing the file handler with a StreamHandler backed by an io.StringIO object and then writing the buffer's contents out to DBFS yourself. Also, please reply to our questions when you have some free time; when someone points out a bug in Databricks, it is good to notify the development team.
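A minimal sketch of that approach, assuming the same /dbfs/FileStore target as the original code (the buffer and handler names here are illustrative):

import io
import logging

# Collect log records in an in-memory buffer instead of holding an
# open file handle on /dbfs.
log_buffer = io.StringIO()
handler = logging.StreamHandler(log_buffer)
handler.setFormatter(
    logging.Formatter("[%(filename)s:%(lineno)s %(asctime)s] %(message)s")
)

logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)
logger.addHandler(handler)

logger.info("Starting ETL staging pipeline only!")

# Write the buffered log to DBFS in one shot (dbutils is available in
# Databricks notebooks); the file is visible immediately because no
# file handle is left open.
dbutils.fs.put("dbfs:/FileStore/e.log", log_buffer.getvalue(), True)

The same getvalue() string can later be uploaded to an S3 location instead of, or in addition to, DBFS.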

Thanks,

Chandan


8 REPLIES

chandan_a_v
Valued Contributor

@Kaniz Fatma,

Please let me know if you have any idea regarding this.

@Kaniz Fatma,

Did you find any solution? Please let me know.

Hi @Kaniz Fatma,

I am using 10.4 LTS.

Hi @Kaniz Fatma,

As I mentioned in my problem description, the file is not getting created at all while the notebook runs; only if I clear the notebook state and check the DBFS directory is the file present.

Since I am creating the log file in the notebook and need to upload it to an AWS S3 location, logging has to behave the way it does in a normal Python environment. I would recommend using the code I posted to reproduce the issue and fixing it on Databricks' side, as other people might face the same issue in the future.
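For what it's worth, the symptom is consistent with the logging FileHandler keeping the /dbfs file handle open: the write only becomes visible once the handle is closed, which clearing the notebook state forces. A possible workaround sketch, if you stay with basicConfig (untested against this exact setup):

import logging

# ... run the pipeline, logging through the basicConfig file handler ...

# logging.shutdown() flushes and closes every registered handler,
# releasing the open /dbfs file handle so the file should appear
# without clearing the notebook state.
logging.shutdown()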

Hi @Kaniz Fatma,

Any update on my request? Please fix this issue on your end as soon as possible.

Thanks,

Chandan


Anonymous
Not applicable

Perhaps PyCharm sets a different working directory, meaning the file ends up in another place. Try providing a full path.
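A quick way to check that, sketched below (the path handling is illustrative):

import logging
import os

# Print the working directory to see where a relative filename would land.
print(os.getcwd())

# Configure logging with an explicit absolute path instead of a bare name.
logging.basicConfig(
    filename=os.path.abspath("e.log"),
    level=logging.INFO,
)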

Hi @Halen15 Noyes,

I am executing the notebook in Databricks, and there we save the file under DBFS.
