<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Best practice for logging in Databricks notebooks? in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/best-practice-for-logging-in-databricks-notebooks/m-p/24092#M16715</link>
    <description>&lt;P&gt;What is the best practice for logging in Databricks notebooks?&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I have a bunch of notebooks that run in parallel through a workflow. I would like to keep track of everything that happens, such as errors coming from a stream, and I would like these logs to be maintained somewhere, either in DBFS or in a storage account.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I got the built-in logging module working, but I have to manually transfer the log file from a temp folder in &lt;I&gt;file: &lt;/I&gt;to dbfs:/FileStore/log_folder/text.log. DBFS throws an error if the FileHandler is pointed directly at a dbfs: path.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;This basically works for my purposes, but what is the actual best practice for doing this in Databricks?&lt;/P&gt;</description>
    <pubDate>Wed, 02 Nov 2022 22:30:05 GMT</pubDate>
    <dc:creator>Gim</dc:creator>
    <dc:date>2022-11-02T22:30:05Z</dc:date>
    <item>
      <title>Best practice for logging in Databricks notebooks?</title>
      <link>https://community.databricks.com/t5/data-engineering/best-practice-for-logging-in-databricks-notebooks/m-p/24092#M16715</link>
      <description>&lt;P&gt;What is the best practice for logging in Databricks notebooks?&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I have a bunch of notebooks that run in parallel through a workflow. I would like to keep track of everything that happens, such as errors coming from a stream, and I would like these logs to be maintained somewhere, either in DBFS or in a storage account.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I got the built-in logging module working, but I have to manually transfer the log file from a temp folder in &lt;I&gt;file: &lt;/I&gt;to dbfs:/FileStore/log_folder/text.log. DBFS throws an error if the FileHandler is pointed directly at a dbfs: path.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;This basically works for my purposes, but what is the actual best practice for doing this in Databricks?&lt;/P&gt;</description>
      <pubDate>Wed, 02 Nov 2022 22:30:05 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/best-practice-for-logging-in-databricks-notebooks/m-p/24092#M16715</guid>
      <dc:creator>Gim</dc:creator>
      <dc:date>2022-11-02T22:30:05Z</dc:date>
    </item>
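The workaround the question describes (attach a FileHandler to a local driver-side file, then move the finished file to DBFS) can be sketched as follows. The logger name, log message, and paths here are hypothetical; on Databricks the final copy would target the /dbfs FUSE mount (e.g. /dbfs/FileStore/log_folder/) or use dbutils.fs.cp with a dbfs:/ URI, while this self-contained sketch copies between temp directories instead:

```python
import logging
import shutil
import tempfile
from pathlib import Path

# Hypothetical local path; on a Databricks driver this would live under /tmp.
LOCAL_LOG = Path(tempfile.gettempdir()) / "notebook_run.log"

def get_notebook_logger(name, log_path):
    # FileHandler needs a plain POSIX file it can append to, which is why
    # pointing it straight at a dbfs:/ URI fails; log locally instead.
    logger = logging.getLogger(name)
    logger.setLevel(logging.INFO)
    handler = logging.FileHandler(log_path)
    handler.setFormatter(
        logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s")
    )
    logger.addHandler(handler)
    return logger

def flush_and_copy(logger, local_path, dest_path):
    # Close handlers first so the file is fully flushed before copying.
    for h in list(logger.handlers):
        h.close()
        logger.removeHandler(h)
    shutil.copy(str(local_path), str(dest_path))

logger = get_notebook_logger("my_notebook", LOCAL_LOG)
logger.error("stream failed: example error")

# Stand-in for /dbfs/FileStore/log_folder/ in this sketch.
dest = Path(tempfile.mkdtemp()) / "notebook_run.log"
flush_and_copy(logger, LOCAL_LOG, dest)
print(dest.read_text().strip())
```

Closing and removing the handlers before the copy matters: otherwise buffered records may not have reached the local file yet, and a rerun of the notebook would stack duplicate handlers on the same named logger.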
    <item>
      <title>Re: Best practice for logging in Databricks notebooks?</title>
      <link>https://community.databricks.com/t5/data-engineering/best-practice-for-logging-in-databricks-notebooks/m-p/24094#M16717</link>
      <description>&lt;P&gt;Please consider integrating Databricks with Datadog: &lt;A href="https://www.datadoghq.com/blog/databricks-monitoring-datadog/" target="_blank"&gt;https://www.datadoghq.com/blog/databricks-monitoring-datadog/&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 03 Nov 2022 09:09:21 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/best-practice-for-logging-in-databricks-notebooks/m-p/24094#M16717</guid>
      <dc:creator>Hubert-Dudek</dc:creator>
      <dc:date>2022-11-03T09:09:21Z</dc:date>
    </item>
    <item>
      <title>Re: Best practice for logging in Databricks notebooks?</title>
      <link>https://community.databricks.com/t5/data-engineering/best-practice-for-logging-in-databricks-notebooks/m-p/24096#M16719</link>
      <description>&lt;P&gt;@Gimwell Young As @Debayan Mukherjee mentioned, if you configure verbose audit logging at the workspace level, logs will be delivered to the storage bucket you provided during configuration. From there you can pull the logs into any licensed log-monitoring tool, e.g. Splunk. The same configuration can be used to monitor Unity Catalog logs. As @Hubert Dudek mentioned, if you configure Datadog you will get a graphical view of the resources consumed by the workspace, such as the number of active clusters, active jobs, etc.&lt;/P&gt;</description>
      <pubDate>Thu, 03 Nov 2022 14:42:35 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/best-practice-for-logging-in-databricks-notebooks/m-p/24096#M16719</guid>
      <dc:creator>karthik_p</dc:creator>
      <dc:date>2022-11-03T14:42:35Z</dc:date>
    </item>
    <item>
      <title>Re: Best practice for logging in Databricks notebooks?</title>
      <link>https://community.databricks.com/t5/data-engineering/best-practice-for-logging-in-databricks-notebooks/m-p/24093#M16716</link>
      <description>&lt;P&gt;Configuring verbose audit logs and configuring audit log delivery is one of the best practices.&lt;/P&gt;&lt;P&gt;&lt;A href="https://docs.databricks.com/administration-guide/account-settings/audit-logs.html" target="_blank"&gt;https://docs.databricks.com/administration-guide/account-settings/audit-logs.html&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 03 Nov 2022 06:14:33 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/best-practice-for-logging-in-databricks-notebooks/m-p/24093#M16716</guid>
      <dc:creator>Debayan</dc:creator>
      <dc:date>2022-11-03T06:14:33Z</dc:date>
    </item>
  </channel>
</rss>

