<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Python logging: 'Operation not supported' after upgrading to DBRT 6.1 in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/python-logging-operation-not-supported-after-upgrading-to-dbrt-6/m-p/27639#M19504</link>
<description>&lt;P&gt;My organization has an S3 bucket mounted to the Databricks filesystem under &lt;PRE&gt;&lt;CODE&gt;/dbfs/mnt&lt;/CODE&gt;&lt;/PRE&gt;. When using Databricks Runtime 5.5 and below, the following logging code works correctly:&lt;/P&gt;
&lt;PRE&gt;&lt;CODE&gt;import logging

log_file = '/dbfs/mnt/path/to/my/bucket/test.log'
logger = logging.getLogger('test-logger')
logger.setLevel(logging.INFO)
handler = logging.FileHandler(str(log_file))
handler.setLevel(logging.INFO)
logger.addHandler(handler)

logger.info('test')
&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;After upgrading to Databricks runtime 6.1, the above code produces a logging error "OSError: [Errno 95] Operation not supported". Here's the stack trace that is printed:&lt;/P&gt;
&lt;PRE&gt;&lt;CODE&gt;Traceback (most recent call last):
  File "/databricks/python/lib/python3.7/logging/__init__.py", line 1038, in emit
    self.flush()
  File "/databricks/python/lib/python3.7/logging/__init__.py", line 1018, in flush
    self.stream.flush()
OSError: [Errno 95] Operation not supported
&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;The strange thing is that regular Python file I/O works fine with the same file. (i.e. I can &lt;PRE&gt;&lt;CODE&gt;open()&lt;/CODE&gt;&lt;/PRE&gt; and &lt;PRE&gt;&lt;CODE&gt;write()&lt;/CODE&gt;&lt;/PRE&gt; to that filepath successfully.) Any idea what's going on?&lt;/P&gt; 
&lt;P&gt;&lt;/P&gt;</description>
    <pubDate>Mon, 28 Oct 2019 14:49:59 GMT</pubDate>
    <dc:creator>zachary_jones</dc:creator>
    <dc:date>2019-10-28T14:49:59Z</dc:date>
    <item>
      <title>Python logging: 'Operation not supported' after upgrading to DBRT 6.1</title>
      <link>https://community.databricks.com/t5/data-engineering/python-logging-operation-not-supported-after-upgrading-to-dbrt-6/m-p/27639#M19504</link>
<description>&lt;P&gt;My organization has an S3 bucket mounted to the Databricks filesystem under &lt;PRE&gt;&lt;CODE&gt;/dbfs/mnt&lt;/CODE&gt;&lt;/PRE&gt;. When using Databricks Runtime 5.5 and below, the following logging code works correctly:&lt;/P&gt;
&lt;PRE&gt;&lt;CODE&gt;import logging

log_file = '/dbfs/mnt/path/to/my/bucket/test.log'
logger = logging.getLogger('test-logger')
logger.setLevel(logging.INFO)
handler = logging.FileHandler(str(log_file))
handler.setLevel(logging.INFO)
logger.addHandler(handler)

logger.info('test')
&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;After upgrading to Databricks runtime 6.1, the above code produces a logging error "OSError: [Errno 95] Operation not supported". Here's the stack trace that is printed:&lt;/P&gt;
&lt;PRE&gt;&lt;CODE&gt;Traceback (most recent call last):
  File "/databricks/python/lib/python3.7/logging/__init__.py", line 1038, in emit
    self.flush()
  File "/databricks/python/lib/python3.7/logging/__init__.py", line 1018, in flush
    self.stream.flush()
OSError: [Errno 95] Operation not supported
&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;The strange thing is that regular Python file I/O works fine with the same file. (i.e. I can &lt;PRE&gt;&lt;CODE&gt;open()&lt;/CODE&gt;&lt;/PRE&gt; and &lt;PRE&gt;&lt;CODE&gt;write()&lt;/CODE&gt;&lt;/PRE&gt; to that filepath successfully.) Any idea what's going on?&lt;/P&gt; 
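&lt;P&gt;For concreteness, the plain-I/O pattern that keeps working looks like this (a minimal local sketch; the temp-file path stands in for the &lt;PRE&gt;&lt;CODE&gt;/dbfs/mnt&lt;/CODE&gt;&lt;/PRE&gt; path above, which can't be reproduced outside Databricks):&lt;/P&gt;

```python
import os
import tempfile

# Illustrative path; on Databricks this would be the mounted /dbfs/mnt/... path
path = os.path.join(tempfile.gettempdir(), 'plain_io_demo.log')

# open() + write() + close(): one sequential append, with no later flush/seek
# on a stream that stays open
with open(path, 'a') as f:
    f.write('test\n')

with open(path) as f:
    print(f.read(), end='')
```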
&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 28 Oct 2019 14:49:59 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/python-logging-operation-not-supported-after-upgrading-to-dbrt-6/m-p/27639#M19504</guid>
      <dc:creator>zachary_jones</dc:creator>
      <dc:date>2019-10-28T14:49:59Z</dc:date>
    </item>
    <item>
      <title>Re: Python logging: 'Operation not supported' after upgrading to DBRT 6.1</title>
      <link>https://community.databricks.com/t5/data-engineering/python-logging-operation-not-supported-after-upgrading-to-dbrt-6/m-p/27640#M19505</link>
      <description>&lt;P&gt;&lt;/P&gt;
&lt;P&gt;&amp;gt; The strange thing is that regular Python file I/O works fine with the same file. (i.e. I can open() and write() to that filepath successfully.) Any idea what's going on?&lt;/P&gt;&lt;P&gt;If you try any further operation after opening the file, it throws the same error. I'm experiencing the same issue.&lt;/P&gt;
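&lt;P&gt;A local sketch of why logging behaves differently from a one-shot open()/write(): the handler keeps the stream open and calls flush() after every record, and flush() is exactly the call failing in the traceback above. (&lt;PRE&gt;&lt;CODE&gt;CountingFileHandler&lt;/CODE&gt;&lt;/PRE&gt; is a hypothetical name for illustration; this runs against a local temp file, not DBFS.)&lt;/P&gt;

```python
import logging
import os
import tempfile

flush_calls = []

class CountingFileHandler(logging.FileHandler):
    def flush(self):
        # record every flush the logging machinery triggers
        flush_calls.append(1)
        super().flush()

path = os.path.join(tempfile.gettempdir(), 'flush_count_demo.log')
logger = logging.getLogger('flush-demo')
logger.setLevel(logging.INFO)
h = CountingFileHandler(path, mode='w')
logger.addHandler(h)

logger.info('one')
logger.info('two')
print(len(flush_calls))  # one flush per emitted record
logger.removeHandler(h)
h.close()
```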
&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 21 Nov 2019 14:53:31 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/python-logging-operation-not-supported-after-upgrading-to-dbrt-6/m-p/27640#M19505</guid>
      <dc:creator>SanghyunLee</dc:creator>
      <dc:date>2019-11-21T14:53:31Z</dc:date>
    </item>
    <item>
      <title>Re: Python logging: 'Operation not supported' after upgrading to DBRT 6.1</title>
      <link>https://community.databricks.com/t5/data-engineering/python-logging-operation-not-supported-after-upgrading-to-dbrt-6/m-p/27641#M19506</link>
<description>&lt;P&gt;According to the limitations section in the docs, starting with runtime 6.0, random writes to DBFS files are no longer supported.&lt;/P&gt;</description>
      <pubDate>Thu, 21 Nov 2019 16:18:57 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/python-logging-operation-not-supported-after-upgrading-to-dbrt-6/m-p/27641#M19506</guid>
      <dc:creator>Etch</dc:creator>
      <dc:date>2019-11-21T16:18:57Z</dc:date>
    </item>
    <item>
      <title>Re: Python logging: 'Operation not supported' after upgrading to DBRT 6.1</title>
      <link>https://community.databricks.com/t5/data-engineering/python-logging-operation-not-supported-after-upgrading-to-dbrt-6/m-p/27642#M19507</link>
      <description>&lt;P&gt;&lt;/P&gt;
&lt;P&gt;It's probably worth trying to override the handler's &lt;PRE&gt;&lt;CODE&gt;emit&lt;/CODE&gt;&lt;/PRE&gt; method ... &lt;A href="https://docs.python.org/3/library/logging.html#handlers" target="_blank"&gt;https://docs.python.org/3/library/logging.html#handlers&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;This works for me:&lt;/P&gt;
&lt;PRE&gt;&lt;CODE&gt;class OurFileHandler(logging.FileHandler):
 def emit(self, record):
   # copied from https://github.com/python/cpython/blob/master/Lib/logging/__init__.p
   if self.stream is None:
     self.stream = self._open()
   try:
     msg = self.format(record)
     stream = self.stream
     # issue 35046: merged two stream.writes into one.
     stream.write(msg + self.terminator)
     self.flush()
   except RecursionError:  # See issue 36272
     raise
   except Exception:
     self.handleError(record)


&lt;/CODE&gt;&lt;/PRE&gt; 
&lt;P&gt;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE&gt;# logger must be defined
# use the custom handler in place of logging.FileHandler
ch = OurFileHandler(log_file_path)
formatter = logging.Formatter(
    '%(asctime)s: %(name)s (%(levelname)s): %(message)s'
)
ch.setFormatter(formatter)
logger.addHandler(ch)
&lt;/CODE&gt;&lt;/PRE&gt; 
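&lt;P&gt;Put together as a self-contained sketch (logging to a local temp file, since the /dbfs path from the question can't be reproduced outside Databricks; the timestamp is omitted from the format so the output is deterministic):&lt;/P&gt;

```python
import logging
import os
import tempfile

class OurFileHandler(logging.FileHandler):
    # emit adapted from cpython's StreamHandler.emit: one stream.write()
    # per record (message and terminator merged, issue 35046), then one flush()
    def emit(self, record):
        if self.stream is None:
            self.stream = self._open()
        try:
            msg = self.format(record)
            self.stream.write(msg + self.terminator)
            self.flush()
        except RecursionError:  # see issue 36272
            raise
        except Exception:
            self.handleError(record)

log_file_path = os.path.join(tempfile.gettempdir(), 'our_handler_demo.log')
logger = logging.getLogger('our-handler-demo')
logger.setLevel(logging.INFO)
ch = OurFileHandler(log_file_path, mode='w')
formatter = logging.Formatter('%(name)s (%(levelname)s): %(message)s')
ch.setFormatter(formatter)
logger.addHandler(ch)

logger.info('test')
ch.close()

with open(log_file_path) as f:
    print(f.read().strip())  # our-handler-demo (INFO): test
```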
&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 10 Sep 2020 05:06:20 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/python-logging-operation-not-supported-after-upgrading-to-dbrt-6/m-p/27642#M19507</guid>
      <dc:creator>lycenok</dc:creator>
      <dc:date>2020-09-10T05:06:20Z</dc:date>
    </item>
  </channel>
</rss>

